• What people on TikTok are really talking about when they say "cute winter boots"
    www.fastcompany.com
    Those searching for cute winter boots on TikTok at the moment might be a little confused. A recent movement of the same name has nothing to do with footwear. It's a code phrase being used to discuss resistance to President Trump and his immigration policies while skirting censorship or bans on the platform.

    Many users have posted videos talking about their "cute winter boots" while showing warnings or slides of information to their viewers at the same time. Some posts see users discuss details about protests or recent developments using a notebook or pieces of paper. Meanwhile, the sound over the video is unrelated or uses trending audio in order to avoid the videos being flagged.

    When TikTok users mention cute winter boots protecting people from "ice," they're referencing U.S. Immigration and Customs Enforcement (ICE). Nearly 1,000 immigration arrests were carried out on Sunday as Trump's promise of mass deportations began. ICE officials have since been directed by Trump officials to raise the number of people they arrest from a few hundred per day to at least 1,200 to 1,500.

    Coded language is commonplace on social media

    "Cute winter boots" is one example of algospeak, a system of coded language designed to bypass algorithmic filters and spread warnings and information about such deportations. Another example is the phrase "Senator, I'm Singaporean," a quote from TikTok CEO Shou Chew's response to Senator Tom Cotton during a congressional hearing, where Cotton's questions implied that Chew was a Chinese government agent. TikTok users now frequently leave this phrase in comment sections to subtly warn others about potentially sensitive or flagged content in the videos.

    The cute winter boots trend also exploits the platform's algorithm, which favors product-focused content, to maximize visibility. Creators often pair their videos with unrelated but highly searchable pop-culture keywords, such as Taylor Swift and Sabrina Carpenter, to further boost their reach. Some of these videos also link to TikTok Shop, but instead of boots, they offer educational items like Night by Elie Wiesel, a Holocaust memoir, or gear useful for protests, such as protective equipment.

    For those actually on the hunt for cute winter boots, you're better off searching elsewhere at the moment.
  • Inside the Wilson factory: Here's how Super Bowl footballs are made so quickly
    www.fastcompany.com
    As soon as this year's Super Bowl matchup was set, workers at the Wilson Sporting Goods football factory jumped into action. The factory in the rural village of Ada, Ohio, makes the game balls used by every NFL team along with many of the nation's top college programs and high schools. But this time of year it's all about the Super Bowl. The Philadelphia Eagles and Kansas City Chiefs will face off for the Lombardi Trophy for the second time in three years on Feb. 9 in New Orleans. Here's a look at the footballs, by the numbers:

    How many balls are made for the big game?
    The two teams will each get a shipment of 108, plus a dozen more for the kickers, all stamped with the Super Bowl logo and team names. Some of the balls will be for practices, while the best ones will be set aside by the quarterbacks. About 50 of those will be bagged and locked away for each team until it's time for kickoff.

    How are NFL footballs different from other footballs?
    Wilson makes five different sizes for players of all levels, from the pros to youth leagues. Some have different patterns. NFL balls are notable for the lack of a stripe on the ends. They're also embedded with a chip that tracks the ball's position on the field, how far it travels, and its trajectory.

    How long does it take to make a football?
    Normally it takes three days from start to finish. But the first batch of footballs must be sent to the Super Bowl teams Monday, within about 18 hours, so they have enough time to break them in for practices and the game. That's why the workers start production right away the night of the conference title games.

    How are they made?
    It's a 20-step process, most of it by hand, from cutting out the four leather panels that are sewn together with 250 stitches to putting in the laces. For NFL footballs, the work goes to the factory's most experienced and skilled workers. Certain parts of the process require a handmade feel, said Kevin Murphy, general manager of Wilson Team Sports. "It's like making a beautiful, sculpted pair of shoes."

    How do they become game ready?
    Throughout production the balls are weighed, measured, and inspected for flaws. Once finished, they're checked again. By the time they're packaged and ready for shipping, each one will have been touched by about 50 workers.

    How many does the factory normally produce?
    It churns out roughly 500,000 footballs each year, or about 2,500 per day. NFL teams go through several hundred during a season. For the Super Bowl, Wilson will make between 10,000 and 20,000 commemorative balls that will be sold by retailers nationwide and at the game site. If there's high demand, the factory will keep producing the souvenir balls well after.

    How long has Wilson made footballs for the NFL?
    Since 1941, Wilson has made every football used by the league. Its factory in Ada has been making the official game balls since 1955. This year the company opened a new plant in the village that allows for more production and a museum. Fans can tour the factory, too.

    John Seewer, Associated Press
  • In Seattle, a Meeting of 5,444 Mathematical Minds
    www.nytimes.com
    Participants at this year's Joint Mathematics Meetings explored everything from the role of A.I. to the hyperbolic design of a patchwork denim skirt.
  • Microsoft 365 just raised its subscription prices. Here's a better deal
    www.macworld.com
    In a world dominated by subscriptions, Microsoft 365's recent price hike has left many looking for a more cost-effective alternative. Instead of shelling out dollars monthly, get lifetime access to all the essential Microsoft apps (Word, Excel, PowerPoint, Outlook, OneNote, and Teams) for just $84.97 (reg. $219.99) with Microsoft Office Home & Business for Mac.

    This one-time purchase ensures you'll worry less about rising software fees and more about what you can do with that software. Unlike subscription models, this lifetime license grants you full access to the software without monthly charges or renewals. It's ideal for professionals, students, and anyone who needs reliable tools for work, school, or personal projects.

    Whether you need offline access while traveling or want the confidence of having all your documents stored directly on your device, this Office suite has you covered. With no internet required for use, you can stay productive anywhere.

    The convenience of owning your software outright means you can focus on your work, not on future pricing changes. Take advantage of this limited-time offer and make a smart investment in your productivity. Don't miss out: this price won't last forever!

    Grab Microsoft Office Home & Business for Mac 2021 at $84.97 (reg. $219.99), available only until 2/2 at 11:59 PM PST. Microsoft Office Home & Business for Mac 2021: Lifetime License, $84.97. See Deal. StackSocial prices subject to change.
  • Should you upgrade to Matter, security system discussion, & Tonie Box review on HomeKit Insider
    appleinsider.com
    On this episode of the HomeKit Insider Podcast, we step through the week's news, revisit the question of whether to upgrade to Matter, and go hands-on with a kids' smart speaker.

    It was a busy week in the world of smart home, with a lot of news to catch up on. We started by revisiting the Schlage Sense Pro, which Schlage has since confirmed will support Aliro later this year.

    Another report has emerged on the timeline for Apple's smart home display. Bloomberg says it will be released late this year, which lines up with Kuo's prediction based on supply chain analysis. Continue Reading on AppleInsider | Discuss on our Forums
  • Looking back at 15 years of the iPad, Apple's revolutionary tablet
    appleinsider.com
    Announced on January 27, 2010, the iPad wasn't really an iPhone and not quite a Mac. The latest iPads blend the best of both worlds, an evolution now a decade and a half in the making.

    Apple's iPad Pro with Magic Keyboard

    With trackpad support, a revamped file system, and no shortage of computing power, the latest iPad Pro lineup competes in usability with any hybrid or touchscreen notebook out there for nearly every task. It hasn't always been that way, though. Continue Reading on AppleInsider | Discuss on our Forums
  • Leveraging Hallucinations in Large Language Models to Enhance Drug Discovery
    www.marktechpost.com
    Researchers have highlighted concerns about hallucinations in LLMs, which generate plausible but inaccurate or unrelated content. However, these hallucinations hold potential in creativity-driven fields like drug discovery, where innovation is essential. LLMs have been widely applied in scientific domains such as materials science, biology, and chemistry, aiding tasks like molecular description and drug design. While traditional models like MolT5 offer domain-specific accuracy, LLMs often produce hallucinated outputs when not fine-tuned. Despite their lack of factual consistency, such outputs can provide valuable insights, such as high-level molecular descriptions and potential compound applications, thereby supporting exploratory processes in drug discovery.

    Drug discovery, a costly and time-intensive process, involves evaluating vast chemical spaces and identifying novel solutions to biological challenges. Previous studies have used machine learning and generative models to assist in this field, with researchers exploring the integration of LLMs for molecule design, dataset curation, and prediction tasks. Hallucinations in LLMs, often viewed as a drawback, can mimic creative processes by recombining knowledge to generate novel ideas. This perspective aligns with creativity's role in innovation, exemplified by groundbreaking accidental discoveries like penicillin. By leveraging hallucinated insights, LLMs could advance drug discovery by identifying molecules with unique properties and fostering high-level innovation.

    Researchers at ScaDS.AI and Dresden University of Technology hypothesize that hallucinations can enhance LLM performance in drug discovery. Using seven instruction-tuned LLMs, including GPT-4o and Llama-3.1-8B, they incorporated hallucinated natural-language descriptions of molecules' SMILES strings into prompts for classification tasks. The results confirmed their hypothesis, with Llama-3.1-8B achieving an 18.35% ROC-AUC improvement over the baseline. Larger models and Chinese-generated hallucinations demonstrated the greatest gains. Analyses revealed that hallucinated text provides unrelated yet insightful information that aids predictions. This study highlights hallucinations' potential in pharmaceutical research and offers new perspectives on leveraging LLMs for innovative drug discovery.

    To generate hallucinations, SMILES strings of molecules are translated into natural language using a standardized prompt in which the system is defined as an expert in drug discovery. The generated descriptions are evaluated for factual consistency using the HHM-2.1-Open model, with MolT5-generated text as the reference. Results show low factual consistency across LLMs, with ChemLLM scoring 20.89% and others averaging 7.42% to 13.58%. Drug discovery tasks are formulated as binary classification problems, predicting specific molecular properties via next-token prediction. Prompts include SMILES strings, descriptions, and task instructions, with models constrained to output "Yes" or "No" based on the highest probability.

    The study examines how hallucinations generated by different LLMs affect performance in molecular property prediction tasks. Experiments use a standardized prompt format to compare predictions based on SMILES strings alone, SMILES with MolT5-generated descriptions, and hallucinated descriptions from various LLMs. Five MoleculeNet datasets were analyzed using ROC-AUC scores. Results show that hallucinations generally improve performance over SMILES or MolT5 baselines, with GPT-4o achieving the highest gains. Larger models benefit more from hallucinations, but improvements plateau beyond 8 billion parameters. Temperature settings influence hallucination quality, with intermediate values yielding the best performance enhancements.

    In conclusion, the study explores the potential benefits of hallucinations in LLMs for drug discovery tasks. By hypothesizing that hallucinations can enhance performance, the research evaluates seven LLMs across five datasets using hallucinated molecule descriptions integrated into prompts. Results confirm that hallucinations improve LLM performance compared to baseline prompts without them. Notably, Llama-3.1-8B achieved an 18.35% ROC-AUC gain, and GPT-4o-generated hallucinations provided consistent improvements across models. Findings reveal that larger model sizes generally benefit more from hallucinations, while factors like generation temperature have minimal impact. The study highlights hallucinations' creative potential in AI and encourages further exploration of drug discovery applications.

    Check out the Paper. All credit for this research goes to the researchers of this project.

    Sana Hassan
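    The constrained next-token prediction step the study describes can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the `predict_property` helper and the logit values are hypothetical, and in the real pipeline the logits would come from the LLM's output head for the "Yes" and "No" answer tokens.

    ```python
    import math

    def predict_property(candidate_logits):
        """Pick a binary molecular-property label from next-token logits.

        candidate_logits: dict mapping the allowed answer tokens ("Yes"/"No")
        to the model's raw next-token logits. Restricting the softmax to the
        two permitted tokens turns free-form generation into a classifier.
        """
        # Numerically stable softmax over just the two allowed answers
        m = max(candidate_logits.values())
        exps = {tok: math.exp(v - m) for tok, v in candidate_logits.items()}
        total = sum(exps.values())
        probs = {tok: e / total for tok, e in exps.items()}
        # The answer with the highest probability is the prediction
        return max(probs, key=probs.get), probs

    # Hypothetical logits for illustration only
    label, probs = predict_property({"Yes": 2.1, "No": 0.4})
    print(label)  # Yes
    ```

    The ROC-AUC scores reported in the study would then be computed from the "Yes" probabilities across a dataset rather than from the hard labels.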
  • Top-Rated Chinese AI App DeepSeek Limits Registrations Amid Cyberattacks
    thehackernews.com
    Jan 28, 2025 · Ravie Lakshmanan · Artificial Intelligence / Technology

    DeepSeek, the Chinese AI startup that has captured much of the artificial intelligence (AI) buzz in recent days, said it's restricting registrations on the service, citing malicious attacks. "Due to large-scale malicious attacks on DeepSeek's services, we are temporarily limiting registrations to ensure continued service," the company said on an incident report page. "Existing users can log in as usual. Thanks for your understanding and support." Users attempting to sign up for an account are shown a similar message, stating that "registration may be busy" and that they should wait and try again.

    "With the popularity of DeepSeek growing, it's not a big surprise that they are being targeted by malicious web traffic," Erich Kron, security awareness advocate at KnowBe4, said in a statement shared with The Hacker News. "These sorts of attacks could be a way to extort an organization by promising to stop attacks and restore availability for a fee, it could be rival organizations seeking to negatively impact the competition, or it could even be people who have invested in a competing organization and want to protect their investment by taking out the competition."

    DeepSeek, founded in 2023, is a Chinese upstart that's "dedicated to making AGI [artificial general intelligence] a reality," according to a description on its Hugging Face page. The company has become the talking point of the AI world, with its iOS chatbot app reaching the top of Apple's Top Free Apps chart in the U.S. this week, dethroning OpenAI's ChatGPT.

    The company has released a series of reasoning and mixture-of-experts language models under an MIT license that it claims can outperform its Silicon Valley rivals while also being trained at a fraction of the cost, something of an achievement in the face of U.S. sanctions that prohibit the sale of advanced AI chips to Chinese companies.

    "During the pre-training stage, training DeepSeek-V3 on each trillion tokens requires only 180K H800 GPU hours, i.e., 3.7 days on our cluster with 2048 H800 GPUs," the company said in a study. "Consequently, our pre-training stage is completed in less than two months and costs 2664K GPU hours. Combined with 119K GPU hours for the context length extension and 5K GPU hours for post-training, DeepSeek-V3 costs only 2.788M GPU hours for its full training. Assuming the rental price of the H800 GPU is $2 per GPU hour, our total training costs amount to only $5.576M."

    That being said, the platform has been found to censor responses to sensitive topics like Tiananmen Square, Taiwan, and the treatment of Uyghurs in China. Its privacy policy also notes that users' personal information, including device and network connection information, usage patterns, and payment details, is hosted on "secure servers located in the People's Republic of China," a move that's likely to pose fresh concerns for Washington amid the TikTok ban.

    "We are living in a timeline where a non-U.S. company is keeping the original mission of OpenAI alive: truly open, frontier research that empowers all," said Jim Fan, senior research manager and lead of Embodied AI (GEAR Lab) at NVIDIA. OpenAI CEO Sam Altman called DeepSeek's R1 reasoning model "impressive" and said it's "legit invigorating to have a new competitor."
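    The training-cost figures DeepSeek quotes are internally consistent, which a quick arithmetic check confirms (note the $2/hour H800 rental price is the company's stated assumption, not an independent market quote):

    ```python
    # Cost breakdown DeepSeek reports for V3's full training run,
    # in thousands of H800 GPU hours
    pretraining = 2664        # pre-training stage
    context_extension = 119   # context length extension
    post_training = 5         # post-training

    total_k_gpu_hours = pretraining + context_extension + post_training
    print(total_k_gpu_hours)  # 2788, i.e. the 2.788M GPU hours in the study

    rental_usd_per_gpu_hour = 2.0  # assumed H800 rental price
    total_cost = total_k_gpu_hours * 1000 * rental_usd_per_gpu_hour
    print(total_cost)  # 5576000.0, i.e. the $5.576M figure

    # And the per-trillion-token claim: 180K GPU hours spread over 2048 GPUs
    days_per_trillion_tokens = 180_000 / 2048 / 24
    print(round(days_per_trillion_tokens, 1))  # 3.7
    ```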