• Newsmax IPO draws comparisons to GameStop meme stock mania, as NMAX price surges another 160% today
    www.fastcompany.com
    Shares of Newsmax continued to rise on Tuesday following a massive public debut on Monday, in which shares shot up more than 700% for the best-performing first day since 2022. Investors who bought in at the $10 IPO price are reaping a massive potential return. When trading opened on Monday morning, Newsmax shares, trading under the NMAX ticker, rose from $10 to close at nearly $78. After hours, values continued to rise, and when trading opened on Tuesday, the stock saw another surge. As of roughly 2 p.m. ET, shares were trading at around $215, an increase of almost 160%.
    The trading has been intense, and was even halted numerous times on Monday due to volatility. That volatility has evidently caught the attention of retail investors, some of whom, on social media and trader forums, are comparing it to the GameStop meme stock mania from a few years ago. Newsmax may also have some other similarities to GameStop: notably, it's a company that's been losing money (it lost $55.5 million during the first six months of 2024), and, as a cable news network, it's competing in a field that is losing steam, similarly to GameStop's physical video game retail model.
    Also, Newsmax is a competitor to Fox News, which has been able to beat its chief competitors MSNBC and CNN in recent years and recently had its best February on record, averaging 3.1 million primetime viewers for the entire month. Newsmax, which tends to offer programming that is often even further to the right than Fox's, is hoping to tap into that audience and pilfer some of those viewers for itself.
    As for what's playing out in the markets? It's something of a cherry on top for Newsmax. The company was founded by Christopher Ruddy in 1998; its cable news network launched in 2014 and now, with Trump back in the White House, appears to be flying high.
  • Beat the AirTag bulge with this $24 card-sized tracker
    www.macworld.com
    Sitting on an AirTag isn't comfortable, and it ruins your wallet. The MagTag Ultra Slim Tracker Card offers the same smart tracking power without the bulge. Now just $23.97 (reg. $42.99), this 1.5mm tracking device is just about the size and thickness of a credit card, so it fits flush into your wallet, work lanyard, passport pouch, or luggage tag.
    MagTag works with the Apple Find My app, accessing the global real-time tracking power of Apple's Find My device network. Whether heading out on vacation or running errands, you'll get left-behind alerts, a loud beep to help you locate it, and peace of mind knowing your valuables are protected.
    The sleek tracker is also rechargeable and holds up to 5 months of battery life per charge. It's Qi wireless charging compatible and IP68 waterproof and dustproof, so it can handle some necessary roughness. The built-in keyring hole even lets you attach it to keys or bags for added flexibility.
    Slip the MagTag Ultra Slim Tracker into your cart now for just $23.97 (reg. $42.99) and streamline your setup before this 44% off deal disappears.
    MagTag Ultra Slim Tracker Card: Works with Apple Find My App. See Deal. StackSocial prices subject to change.
  • How to set up two-factor authentication for your Apple Account and iCloud
    www.macworld.com
    If you aren't using two-factor authentication to protect your Apple Account and iCloud account, you really should. In the past, hackers claimed to have millions of stolen iCloud credentials and demanded that Apple pay a ransom. So even as Apple does what it can to protect your data, events like this, though unlikely, can happen. And with all the valuable info on your iPhone these days, you want to make sure nobody can reset it.
    But guess what? Using two-factor authentication should protect you completely. It's easy to set up, so take a minute and do it now. What's more, some Apple services and features require you to have 2FA enabled on your account as an extra security precaution. Here's how to set it up on a Mac or iOS device. (Apple Account users who don't have a compatible device can still use an older two-step verification system. See below for more.)
    One note: Apple changed the name of Apple ID to Apple Account in iOS 18 and macOS Sequoia.
    iPhone and iPad
    Follow these steps on an iPad or iPhone. These steps are done with iOS 18 or later and should be similar in iPadOS. The device must be protected with a passcode (Settings > Touch ID/Face ID & Passcode).
    1. Launch the Settings app.
    2. Tap your Apple Account profile at the top. Obviously, you need to be signed in with the account you want to protect with two-factor authentication.
    3. Tap Sign-In & Security.
    4. Find the Two-Factor Authentication setting and turn it on.
    5. Next, enter a phone number where you can receive a text message or a phone call with a two-factor code. You can also specify whether you want a text or a call.
    6. You'll get that text message or call and enter the six-digit verification code on the next screen.
    That's it! Two-factor is on, and this is your official Trusted Device. The next time you sign in to iCloud.com, or set up your iCloud account on a new device, you'll have to first enter your username and password, and then be prompted to enter a code. That code will arrive in a pop-up on your trusted devices or be texted/phoned to the number you provided; or you can come back to this screen and tap Get Verification Code.
    Mac
    Setting this up on a Mac requires nearly the same steps as on an iPhone or iPad. These instructions are done with macOS Sequoia 15.3.2.
    1. Open System Settings.
    2. In your profile at the top, click Apple Account.
    3. The Apple Account pane should open. In the main section, click on Sign-In & Security.
    4. Find the Two-Factor Authentication setting and turn it on.
    5. You might need to verify your identity by answering your security questions.
    6. Enter a phone number you can use to receive verification codes, and choose whether you want to get text messages or calls.
    7. Enter the code that's sent to you right away to finish up.
    If your Mac is running macOS Mojave or an earlier operating system, here are the instructions:
    1. Open System Preferences and select iCloud. Click the Account Details button, and sign in if prompted.
    2. In the Security tab, click the button labeled Turn on two-factor authentication. Read the message and click Continue.
    3. Verify your identity by answering your security questions.
    4. Enter a phone number you can use to receive verification codes, and choose whether you want to get text messages or calls.
    5. Enter the code that's sent to you right away to finish up.
    What if my device is too old?
    If your Mac is using an operating system older than El Capitan, or your iOS device isn't running iOS 9 or later, you can still use two-step verification, which is slightly different from two-factor authentication, mostly because it relies on a text message being sent to a phone number, while the newer authentication is baked more seamlessly into the operating systems. Plus, the older verification method requires you to hold onto a Recovery Key in case you ever lose your password. Two-step verification always sends you a text message. With the newer two-factor authentication, you'll get this cool pop-up on nearby trusted devices signed in to the same iCloud account. You can read more about the differences from Apple as well as from our own Glenn Fleishman. Apple still provides a way to enable two-step verification, by following this link, signing in, and following the instructions.
  • Save $100 on Apple's iPad mini 7 at Amazon, now discounted to $399
    appleinsider.com
    April iPad mini 7 deals have arrived, with the tablet dipping to $399 at Amazon. Marking the return of the best price we've seen on the iPad mini 7, Amazon has dropped the 128GB Wi-Fi model to $399. All four colorways are eligible for the $100 discount, which provides shoppers with the lowest price on the compact model that makes a great travel companion. (Image credit: Apple)
  • Elden Ring Nightreign Trailer Highlights the Ironeye Nightfarer
    gamingbolt.com
    FromSoftware's Elden Ring Nightreign has received a new trailer showcasing a new Nightfarer: the Ironeye. As a ranged DPS, its primary weapon is a bow and arrow (with Legolas-like agility to match). Check it out below.
    The Ironeye can slide and fire three arrows in a spread, though you can also aim over the shoulder for more precise hits. It can also unleash fire arrows, dash through and slice enemies with a dagger, and fire more powerful shots with a short delay. For those who have ever wanted the power of an Anor Londo archer from Dark Souls 3, its Ultimate Art fires a single heavy piercing arrow with shockwave capabilities. If that wasn't enough, the Ironeye can even shoot an enemy at point-blank range and yank the arrow out as a riposte. Its overall stats are unknown, but it could be on the weaker side when it comes to survivability.
    Elden Ring Nightreign launches on May 30th for Xbox Series X/S, Xbox One, PS4, PS5, and PC. Stay tuned for more character trailers ahead of release.
  • Atomfall tops 1.5m players | News-in-brief
    www.gamesindustry.biz
    Rebellion's survival game, released on March 27, 2025, is the most successful launch in the developer's history. Image credit: Rebellion. News by Sophie McEvoy, Staff Writer. Published on April 2, 2025.
    This is a News-in-brief article, our short format linking to an official source for more information. Read more about this story by following the link below: Atomfall tops 1.5m players
  • Meta AI Proposes Multi-Token Attention (MTA): A New Attention Method which Allows LLMs to Condition their Attention Weights on Multiple Query and Key Vectors
    www.marktechpost.com
    Large Language Models (LLMs) significantly benefit from attention mechanisms, enabling the effective retrieval of contextual information. Nevertheless, traditional attention methods primarily depend on single-token attention, where each attention weight is computed from a single pair of query and key vectors. This design inherently constrains the model's ability to discern contexts requiring the integration of multiple token signals, thereby limiting its effectiveness on complex linguistic dependencies. For example, identifying sentences simultaneously containing both "Alice" and "rabbit" is challenging because conventional attention mechanisms struggle to integrate multiple separate attention signals efficiently without substantially increasing model complexity.
    Meta AI addresses this limitation by introducing Multi-Token Attention (MTA), an advanced attention mechanism that conditions attention weights simultaneously on multiple query and key vectors. MTA integrates convolution operations over queries, keys, and attention heads, thus enhancing the precision and efficiency of contextual information retrieval. Specifically, the MTA framework consists of two convolutional components: key-query convolution, which aggregates multiple token signals within individual attention heads, and head mixing convolution, which facilitates information sharing among different attention heads. Additionally, the implementation employs group normalization with depth-dependent scaling to stabilize gradient flow, further improving model training stability and efficacy.
    At a technical level, MTA modifies conventional attention calculations by incorporating a two-dimensional convolution operation on the attention logits prior to softmax normalization. This convolution allows adjacent queries and keys to influence attention scores mutually, thus enabling the attention mechanism to identify contextual relationships involving multiple tokens more precisely. Consequently, the model efficiently aggregates local token interactions without substantially increasing the number of parameters or the dimensionality of attention vectors. Moreover, head convolution promotes effective knowledge transfer among attention heads, selectively amplifying relevant context signals while mitigating less pertinent information. Collectively, these enhancements yield a more robust attention mechanism capable of capturing complex multi-token interactions.
    Empirical evaluations validate the efficacy of MTA across several benchmarks. In a structured motivating task explicitly designed to illustrate the shortcomings of single-token attention mechanisms, MTA demonstrated near-perfect performance, achieving an error rate of only 0.1%, in contrast to standard Transformer models that exhibited error rates above 50%. Further large-scale experiments involving an 880M-parameter model trained on 105 billion tokens showed MTA consistently outperforming baseline architectures. MTA achieved superior validation perplexity scores across datasets such as arXiv, GitHub, and Wikipedia. Specifically, in tasks requiring extended context comprehension, such as the Needle-in-the-Haystack and BabiLong benchmarks, MTA significantly exceeded the performance of standard Transformer models. In the Needle-in-the-Haystack task with 4K-token contexts containing multiple needles, MTA attained accuracies ranging from 67% to 97.6%, surpassing standard models by substantial margins.
    In summary, Multi-Token Attention (MTA) presents a refined advancement in attention mechanisms by addressing fundamental limitations of traditional single-token attention. Leveraging convolutional operations to concurrently integrate multiple query-key interactions, MTA enhances the ability of language models to handle intricate contextual dependencies. These methodological improvements facilitate more precise and efficient performance, particularly in scenarios involving complex token interactions and long-range contextual understanding. Through targeted modifications to standard attention mechanisms, MTA contributes meaningfully to the evolution of more sophisticated, accurate, and computationally efficient language models.
    Check out the Paper. All credit for this research goes to the researchers of this project.
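    To make the key-query convolution idea concrete, here is a minimal single-head NumPy sketch. This is not Meta's implementation: the kernel shape, the zero-masking of future keys before the convolution, and the function names are all illustrative assumptions; the real model learns the kernels per head and adds head mixing plus group normalization, which are omitted here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mta_attention(Q, K, V, kernel, neg=-1e9):
    """Toy single-head key-query convolution (illustrative only).
    Q, K, V: (seq, d) arrays; kernel: (kq, kk) with odd dimensions.
    The kernel would be learned in a real model; here it is an input."""
    seq, d = Q.shape
    logits = Q @ K.T / np.sqrt(d)              # standard (query, key) logits
    causal = np.tril(np.ones((seq, seq), dtype=bool))
    logits = np.where(causal, logits, 0.0)     # zero future keys so they
                                               # cannot leak through the conv
    kq, kk = kernel.shape
    pq, pk = kq // 2, kk // 2
    padded = np.pad(logits, ((pq, pq), (pk, pk)))
    mixed = np.empty_like(logits)
    for i in range(seq):                       # naive 2D cross-correlation:
        for j in range(seq):                   # each logit becomes a weighted
            mixed[i, j] = np.sum(kernel * padded[i:i + kq, j:j + kk])
    mixed = np.where(causal, mixed, neg)       # re-apply the causal mask,
    return softmax(mixed, axis=-1) @ V         # then normalize as usual
```

    With a one-hot kernel (1 at the center, 0 elsewhere) this reduces exactly to standard causal attention; any larger kernel lets neighboring query and key positions shape each attention weight, which is the core of the key-query convolution component described above.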