WWW.TECHSPOT.COM
Microsoft relaunches Recall on Copilot+ Windows PCs after privacy overhaul

Editor's take: We already have too much personal information floating around on the internet, and it seemingly leaks on a monthly basis. So, is having snapshots of your screen taken every few seconds, stored and indexed – even locally – safe? These images could contain sensitive information that anyone with access to your computer can see. To me, it sounds like a bad idea. Let me hear your thoughts in the comments.

Microsoft has officially launched its controversial Recall feature, but only for users of new Copilot+ PCs. The AI-powered tool, which captures screenshots every few seconds to build a searchable timeline of on-screen activity, returns nearly a year after Microsoft postponed it following a wave of privacy backlash. This time, the company disabled it by default and made it removable.

Unveiled in 2024 alongside Microsoft's Copilot+ PC initiative, Recall aimed to act as an AI memory for the device. It indexes everything from visited websites to opened documents and past chats, allowing users to "scroll back" in time with natural language queries. However, early builds showed that Recall often captured sensitive material – passwords, private messages, and financial data – without redaction.

Security researchers and privacy advocates quickly raised concerns about the tool's implications. Critics warned that if an attacker or malicious software gained access to Recall's local archive, it could expose a user's most private data. Microsoft responded by pulling the feature from preview builds, placing it in beta for five months, and reworking its privacy safeguards. Over the last couple of weeks, Windows Insiders tested the updated version in the preview channel, presumably without complaint.

The updated feature launching this week includes several key changes. Most notably, Recall is now an opt-in feature. Users must manually enable it, and data is processed locally on the device – not uploaded to the cloud. Access to Recall's timeline also requires Windows Hello biometric authentication.

Microsoft has added controls to pause data capture, exclude certain apps or websites, and delete stored content. Users can uninstall Recall completely through the system settings menu. To do so, go to Settings > System > Installed apps, search for "Recall," and select Uninstall. To opt in, head to Settings > Privacy & Security > Recall & snapshots and toggle the switch.

Microsoft is launching Recall exclusively on Copilot+ PCs, a new class of Windows 11 laptops with built-in NPUs for local AI processing. Most existing PCs are not compatible and will not receive the feature.
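For those who prefer scripting the removal, Recall is also exposed as a Windows optional feature. Below is a minimal, hypothetical sketch that shells out to DISM from an elevated Python session – it assumes the optional-feature name "Recall" that Microsoft has documented for Copilot+ PCs, so verify it on your own build before relying on it:

```python
import subprocess

# Minimal sketch: query, then disable, the Recall optional feature via DISM.
# Assumes the feature name "Recall" (as documented for Copilot+ PCs) and an
# elevated session on Windows; verify the name on your build first.
def recall_state() -> str:
    result = subprocess.run(
        ["dism", "/online", "/get-featureinfo", "/featurename:Recall"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def remove_recall() -> None:
    subprocess.run(
        ["dism", "/online", "/disable-feature", "/featurename:Recall"],
        check=True,
    )

if __name__ == "__main__":
    print(recall_state())
```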
-
WWW.TECHSPOT.COM
Nvidia RTX 5060 reportedly launching on May 19, one day after AMD's Radeon RX 9060 XT

Something to look forward to: Computex looks set to be the battleground where Nvidia and AMD reveal their mainstream GPUs from the latest generation. Nvidia's quiet unveiling of the RTX 5060 Ti and 5060 has sparked suspicion that it wants to downplay the 8GB VRAM on both cards.

Nvidia told board partners it will launch the RTX 5060 on May 19, anonymous sources told VideoCardz. Reviews will go live the same day, likely giving customers little time to weigh third-party benchmarks before buying what may become the most popular card in the Blackwell lineup.

The company revealed full specifications earlier this month but withheld launch and review embargo dates. With 3,840 CUDA cores, a 2.5GHz boost clock, and a $300 price tag, the RTX 5060 offers slightly better value than the 4060 and a meaningful upgrade over the 3060 – aside from its limited VRAM.

Marketing for the RTX 5060 Ti and standard 5060 has focused primarily on the 16GB Ti variant. Our reviews show the $430 GPU brings decent 4K gaming within reach for mainstream consumers, but cutting its memory pool to 8GB completely changes the story.

Nvidia likely delayed reviews of the 8GB variant to obscure severe performance limitations. Our results show that this amount of VRAM already bottlenecks many high-end games – and the gap will likely widen as new titles demand more memory. This story will repeat with the RTX 5060, which only offers an 8GB configuration.

Steam survey data consistently ranks 60-class cards among the most popular, making the 5060 a key release for Nvidia. However, the release-day embargo suggests the company may be setting a trap for many of its customers.

Meanwhile, the release date of the RTX 5060 falls one day after AMD's response to the 16GB RTX 5060 Ti – the Radeon RX 9060 XT. While details on AMD's latest mainstream cards remain scarce, including the release window for the standard 9060, the company is likely to follow Nvidia's VRAM configuration pattern, frustrating users who can't afford to spend over $300 on a GPU.

Supply chain disruptions from inflation and tariffs have impacted Nvidia and AMD, each unwilling to sacrifice their margins on memory modules. This reluctance will likely hinder progress in GPU development, limiting meaningful improvements at the mainstream price tier.
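Those headline specs translate into a simple peak-throughput estimate. A back-of-the-envelope sketch, assuming the usual convention of two FP32 operations (one fused multiply-add) per CUDA core per clock – marketing TFLOPS, not measured performance:

```python
# Back-of-the-envelope FP32 throughput for the RTX 5060's published specs.
# Assumes 2 FP32 ops (one FMA) per CUDA core per clock, the standard way
# GPU TFLOPS are quoted; real-world throughput will differ.
cuda_cores = 3840
boost_clock_ghz = 2.5

tflops = cuda_cores * boost_clock_ghz * 2 / 1000
print(f"Peak FP32: {tflops:.1f} TFLOPS")  # ~19.2 TFLOPS
```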
-
WWW.TECHSPOT.COM
UK bans gaming controller exports to Russia to hinder military use

What just happened? Three years after Russia invaded Ukraine, the UK has joined the European Union in a new round of sanctions targeting technology exports. The latest measures include restrictions on video game controllers and other consumer electronics, with officials stating their goal is to degrade Russia's military capabilities.

While the Pentagon now uses custom-built controllers to operate its systems, Russia continues to rely on off-the-shelf video game controllers manufactured in the West. Authorities in the UK aim to block exports of these devices after reports revealed that Russian forces have repurposed them to control drones used in air strikes on Ukraine.

The controller ban is part of 150 new trade sanctions imposed by the UK, which is now aligning with the EU's earlier strategy. Foreign Office Minister Stephen Doughty says Russia has exploited British consumer markets for too long by purchasing seemingly harmless gaming products and repurposing them for military use. "Video game consoles will no longer be exploited to kill people in Ukraine," he said.

Officials in London say the new measures aim to clamp down on Russia's efforts to circumvent trade restrictions and acquire equipment for its military. Doughty confirmed that gaming controllers are part of a broader package of export bans designed to weaken Russia's armed forces.

Other banned products include software for identifying new fuel sources such as oil and gas. The sanctions also cover exports of chemicals, electronics, metals, machinery, and software tools linked to the energy and defense sectors. Doughty said these technologies could contribute to Russia's weapons production.

The UK announced the new sanctions following one of Russia's deadliest strikes on Kyiv in recent weeks. On Thursday, Russian forces launched an overnight barrage of "dumb" missiles and 215 drones, killing nine people and injuring dozens more. Russia claimed the attack targeted Ukraine's military facilities.

After the attack, UK Foreign Minister David Lammy condemned Russia for targeting Ukrainians while Kyiv officials were in London to discuss a potential peace treaty. Even President Donald Trump, who previously expressed trust in Vladimir Putin's word regarding the invasion of Ukraine, called on the Russian leader to "stop" killing civilians once and for all.
-
WWW.TECHSPOT.COM
Nvidia's liquid-cooled AI racks promise 25x energy and 300x water efficiency

The big picture: As artificial intelligence and high-performance computing continue to drive demand for increasingly powerful data centers, the industry faces a growing challenge: how to cool ever-denser racks of servers without consuming unsustainable amounts of energy and water. Traditional air-based cooling systems, once adequate for earlier generations of server hardware, are now being pushed to their limits by the intense thermal output of modern AI infrastructure.

Nowhere is this shift more evident than in Nvidia's latest offerings. The company's GB200 NVL72 and GB300 NVL72 rack-scale systems represent a significant leap in computational density, packing dozens of GPUs and CPUs into each rack to meet the performance demands of trillion-parameter AI models and large-scale inference tasks.

But this level of performance comes at a steep cost. While a typical data center rack consumes between seven and 20 kilowatts (with high-end GPU racks averaging 40 to 60 kilowatts), Nvidia's new systems require between 120 and 140 kilowatts per rack. That's more than seven times the power draw of conventional setups.

This dramatic rise in power density has rendered traditional air-based cooling methods inadequate for such high-performance clusters. Air simply cannot remove heat fast enough to prevent overheating, especially as racks grow increasingly compact. To address this, Nvidia has adopted direct-to-chip liquid cooling – a system that circulates coolant through cold plates mounted directly onto the hottest components, such as GPUs and CPUs. This approach transfers heat far more efficiently than air, enabling denser, more powerful configurations.

Unlike traditional evaporative cooling, which consumes large volumes of water to chill air or water circulated through a data center, Nvidia's approach uses a closed-loop liquid system. In this setup, coolant continuously cycles through the system without evaporating, virtually eliminating water loss and significantly improving water efficiency. According to Nvidia, its liquid cooling design is up to 25 times more energy efficient and 300 times more water efficient than conventional cooling methods – a claim with substantial implications for both operational costs and environmental sustainability.

The architecture behind these systems is sophisticated. Heat absorbed by the coolant is transferred via rack-level liquid-to-liquid heat exchangers – known as Coolant Distribution Units – to the facility's broader cooling infrastructure. These CDUs, developed by partners like CoolIT and Motivair, can handle up to two megawatts of cooling capacity, supporting the immense thermal loads produced by high-density racks. Additionally, warm-water cooling reduces reliance on mechanical chillers, further lowering both energy consumption and water usage.

However, the transition to direct liquid cooling presents challenges. Data centers are traditionally built with modularity and serviceability in mind, using hot-swappable components for quick maintenance. Fully sealed liquid cooling systems complicate this model, as breaking a hermetic seal to replace a server or GPU risks compromising the entire loop. To mitigate these risks, direct-to-chip systems use quick-disconnect fittings with dripless seals, balancing serviceability with leak prevention.

Still, deploying liquid cooling at scale often requires a substantial redesign of a facility's physical infrastructure, demanding a significant upfront investment. Despite these hurdles, the performance gains offered by Nvidia's Blackwell-based systems are convincing operators to move forward with liquid cooling retrofits. Nvidia has partnered with Schneider Electric to develop reference architectures that accelerate the deployment of high-density, liquid-cooled clusters. These designs, featuring integrated CDUs and advanced thermal management, support up to 132 kilowatts per rack.
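To get a feel for the scale of the thermal problem, the required coolant flow follows from the basic heat balance Q = ṁ·c·ΔT. A minimal sketch, assuming a water-like coolant and a 10 K temperature rise across the rack – both illustrative assumptions, not Nvidia operating figures:

```python
# Coolant flow needed to carry a rack's heat load: Q = m_dot * c_p * dT.
# Assumes a water-like coolant (c_p ~ 4186 J/(kg*K), ~1 kg per liter) and
# a 10 K inlet-to-outlet rise; illustrative values, not Nvidia's specs.
rack_power_w = 140_000   # upper end of the range cited above
c_p = 4186               # J/(kg*K)
delta_t = 10             # K

flow_kg_s = rack_power_w / (c_p * delta_t)
print(f"{flow_kg_s:.2f} kg/s (~{flow_kg_s * 60:.0f} L/min)")  # ~3.34 kg/s, ~200 L/min
```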
-
WWW.TECHSPOT.COM
TP-Link's router pricing and China ties under US government investigation

In a nutshell: TP-Link Systems, one of the most popular router brands in the United States, has become the subject of a criminal antitrust investigation by the Department of Justice. According to reports, the China-linked company's pricing strategies and potential national security risks will be examined by the DoJ and the Commerce Department.

The affordability of TP-Link's routers is part of what makes them so popular. Prosecutors at the DoJ are examining whether the company engaged in predatory pricing to undercut competitors and dominate the US market, writes Bloomberg. The probe began in 2024 under President Biden and continues today under the Trump administration.

TP-Link is also being investigated by the Commerce Department over whether its ties to China pose a security threat. It was reported in December that an office of the Commerce Department had subpoenaed TP-Link, and that its routers could be banned in the US over national security concerns.

TP-Link was founded by brothers Zhao Jianjun and Zhao Jiaxing in 1996. In 2008, TP-Link USA was set up to market and service products in North America, but ownership, management, and the supply chain all still reported to the Shenzhen-based TP-Link parent. In 2024, TP-Link USA completed a merger with TP-Link's non-Chinese operations to form TP-Link Systems Inc., headquartered in Irvine, California. This "organisational separation" from the Chinese company ensures each side has its own shareholding structure, board, R&D, production, marketing, and support teams. Regulators and lawmakers are still reviewing whether the structural split truly insulates TP-Link's US arm from Chinese legal jurisdiction – hence the current antitrust and national-security probes.

TP-Link has around 65% of the US market for routers used in homes and small businesses. Twelve of the top twenty best-selling routers on Amazon are TP-Link models, including the number one (TP-Link AX1800 WiFi 6 Router V4) and number two (TP-Link Dual-Band AX3000 Wi-Fi 6 Router Archer AX55) top sellers.

In October 2024, Microsoft exposed a complex network of compromised devices that Chinese hackers used to launch highly evasive password spray attacks against Microsoft Azure customers. The network, dubbed CovertNetwork-1658, had been actively stealing credentials since August 2023. The attacks used a botnet of thousands of small office and home office (SOHO) routers, cameras, and other internet-connected devices. At its peak, the botnet comprised more than 16,000 devices, most of them TP-Link routers.

There's a history of security flaws being discovered in TP-Link routers. A critical vulnerability with a CVSS score of 10.0 was found in the Archer C5400X tri-band gaming router in May 2024, and in 2023, it was reported that Chinese state hackers were infecting TP-Link routers with custom, malicious firmware. The latter incident came soon after the US government said Mirai botnet operators were using TP-Link routers for DDoS attacks.
-
WWW.TECHSPOT.COM
YouTube at 20: The Video-Sharing Site That Conquered the Internet

The world's biggest video sharing platform, YouTube, just turned 20. What started inauspiciously in February 2005 as a modest experiment by three former PayPal employees – Chad Hurley, Steve Chen, and Jawed Karim – has since reshaped media, culture, and entertainment on a global scale.

The first-ever upload on YouTube was a grainy, 19-second clip of Karim at the San Diego Zoo – hardly a sign of the media giant it would become. At the time, YouTube's impact on the media landscape was so minimal that it wasn't even mentioned in The Guardian's coverage of TV's digital revolution at the Edinburgh TV Festival.

Editor's Note: Guest author Alex Connock is a Fellow at the Said Business School, University of Oxford and Lecturer at St Hugh's College Oxford. He is also Professor of Practice at Exeter University, and Head of the Department of Creative Business at the National Film and Television School. He is Vice Chair of UNICEF UK and a trustee of the Halle Orchestra. This article is republished from The Conversation under a Creative Commons license.

Twenty years on, it's a different story. YouTube is a massive competitor to TV, an engagement beast, uploading as much new video every five minutes as the 2,400 hours BBC Studios produces in a whole year. The 26-year-old YouTube star MrBeast earned US$85 million in 2024 from videos ranging from live Call of Duty play-alongs to handing out 1,000 free cataract operations.

As a business, YouTube is now worth some $455 billion. That is a spectacular 275 times return on the $1.65 billion Google paid for it in 2006. At Chrome-era YouTube's current value, Google could today buy British broadcaster ITV about 127 times over. YouTube has gross revenue similar to streaming giant Netflix – but without the financial inconvenience of making shows, since most of the content is uploaded for free.

YouTube's first video: a 19-second look at the elephants of San Diego Zoo.

YouTube has 2.7 billion monthly active users, or 40% of the entire global population outside China, where it is blocked. It is also now one of the biggest music streaming sites, and the second biggest social network (after Facebook), plus a paid broadcast channel for 100 million subscribers. YouTube has built a video Library of Babel, its expansive shelves lined eclectically with Baby Shark Dance, how to fix septic tanks, who would win a shooting war between Britain and France … and quantum physics.

The site has taken over global children's programming to the point where Wired magazine pointed out that the future of the genre "isn't television." But there are flaws, too: fact checkers have described it as a conduit for disinformation. So how did all that happen? Eight key innovations have helped YouTube achieve its success.

1. How new creativity is paid for

Traditional broadcast and print uses either the risk-on, fixed cost of hiring an office full of staff producers and writers, or the variable but risky approach of one-off commissioning from freelancers. Either way, the channel goes out of pocket, and if the content fails to score with viewers, it loses money. YouTube did away with all that, flipping the risk profile entirely onto the creator and not paying upfront at all. It doesn't have to deal with the key talent going out clubbing all night and being late to the set, not to mention other boring aspects of production like insurance, cash flow, or contracts.

2. The revenue model of media

YouTube innovated by dividing any earnings with the creator, via an advertising income split of roughly 50% (the exact amount varies in practice). This incentivises creators to study the science of engagement, since it makes them more money. MrBeast has a team employed just to optimise the thumbnails for his videos.

3. Advertising

Alongside parent company Google/Alphabet, and especially with the introduction of YouTube Analytics and other technologies, the site adrenalised programmatic video advertising, where ad space around a particular viewer is digitally auctioned off to the highest bidder, in real time. That means when you land on a high-rating Beyoncé video and see a pre-roll ad for Grammarly, the advertiser algorithmically liked the look of your profile, so it bid money to show you the ad. When that system works, it is ultra efficient – a key reason why the broad, demographics-based broadcast TV advertising market is so challenged. (A toy sketch of the mechanism appears at the end of this article.)

Also see: YouTube by the numbers: uncovering YouTube's ghost town of billions of unwatched, ignored videos

4. Who makes content

About 50 million people now consider themselves professional creators, many of them on YouTube. Influencers have used the site to build businesses without mediation from (usually white and male) executives in legacy media. This has driven, at its best, a major move towards the democratization and globalization of content production. Brazil and Kenya both have huge YouTube creator economies, giving global distribution to diverse voices that realistically would have been disintermediated in the 20th-century media ecology.

5. The way we tell stories

Traditional TV ads and films start slow and build to a climax. Not so YouTube videos – and even more so YouTube Shorts – which prioritize a big emotive hit in the first few seconds for engagement, and regular further hits to keep people there. MrBeast's leaked internal notes describe how to do sequential escalation, meaning moving to more elaborate or extreme details as a video goes on: "An example of a one thru three minute tactic we would use is crazy progression," he says, reflecting his deep homework. "I spent basically five years of my life studying virality on YouTube."

6. Copyright

Back in 2015, if someone stole your intellectual property – say, old episodes of Mr Bean – and re-broadcast it on their own channel, you would call a media lawyer and sue. Now there is a better option – Content ID – to take the money instead. Through digital rights monetization, owners can algorithmically discover their own content and claim the ad revenue, a material new income stream for producers.

7. Video technicalities

Most technical innovations in video production have found their way to the mainstream via YouTube, such as 360-degree video, 4K, VR, and other tech acronyms. And now YouTube has started to integrate generative AI into its programme-producing suite for creators, with tight integration of Google's Veo tools. These will offer, according to CEO Neal Mohan, "billions of people around the world access to AI". This is another competitive threat to traditional producers, because bedroom creators can now make their own visual-effects-heavy fan-fiction episodes of Star Wars.

8. News

YouTube became a rabbit hole of disinformation, misinformation, and conspiracy, via a reinforcement-learning algorithm that prioritizes view time but not editorial accuracy. Covid conspiracy fans got to see "5G health risk" or "chemtrail" videos, because the algorithm knew they might like them, too.

How can the big, legacy media brands respond? Simple. By meeting the audience where the viewers are, and putting their content on YouTube. The BBC has 14.7 million YouTube subscribers. ITV is exploiting its catalogue to put old episodes of Thunderbirds on there. Meanwhile, as of February 2025, Channel 4 also announced success in reaching young viewers via YouTube, with full episode views "up 169% year-on-year, surpassing 110 million organic views in the UK".
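As promised above, here is a toy sketch of the mechanics behind innovations 2 and 3: a sealed-bid, second-price auction for a single ad impression, with the winner's payment split with the creator. The auction format, the bid values, and the flat 50% share are illustrative assumptions, not YouTube's actual systems:

```python
# Toy model of programmatic ad bidding plus the creator revenue split.
# Real exchanges are far more complex; the second-price rule, the bids,
# and the flat 50% share below are illustrative assumptions only.
def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Sealed-bid second-price auction: highest bidder wins, pays runner-up's bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

impression_bids = {"Grammarly": 0.042, "AcmeVPN": 0.031, "MegaCola": 0.018}
winner, price = run_auction(impression_bids)
creator_share = 0.50 * price  # "roughly 50%" ad income split
print(f"{winner} wins, pays ${price:.3f}; creator earns ${creator_share:.3f}")
```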
-
WWW.TECHSPOT.COM
Chrome is worth around $50 billion, DuckDuckGo CEO guesstimates

Big quote: The ongoing antitrust trial against Google has placed the search giant's Chrome web browser at the center of a heated debate over the future of internet search and competition. During testimony on Wednesday, Gabriel Weinberg, CEO of rival (but much smaller) search engine DuckDuckGo, told the court that Chrome could fetch a sale price of up to $50 billion if regulators force Google to divest the popular browser.

Weinberg described his estimate as a "back-of-the-envelope" calculation based on Chrome's vast user base and global reach – a figure that far exceeds previous estimates, such as the $20 billion valuation offered by Bloomberg analyst Mandeep Singh last November.

Weinberg added that such a price tag would be well beyond DuckDuckGo's financial capabilities, remarking, "That's out of DuckDuckGo's price range." Nevertheless, he confirmed that DuckDuckGo would be interested in acquiring Chrome if cost were not a barrier, underscoring the browser's strategic value in the search ecosystem.

The trial, overseen by U.S. District Judge Amit Mehta, follows his earlier ruling that Google illegally maintained a search monopoly – partly through default agreements and preferential payments to partners like Apple. The Department of Justice, joined by a coalition of states, is now seeking remedies that could include forcing Google to sell Chrome in order to foster greater competition in the search market.

Interest in Chrome extends beyond DuckDuckGo. Executives from OpenAI and Perplexity have also testified that they would consider acquiring the browser if it were put up for sale as a result of the court's decision. Nick Turley, Head of Product at OpenAI, argued that deeper integration between Chrome and OpenAI's technology could create a more seamless, AI-first experience for users. Turley revealed that OpenAI had previously approached Google about a partnership to power ChatGPT with Google's search API, but the request was declined last August – leaving OpenAI to rely on Microsoft's Bing for search results.

Perplexity's Chief Business Officer, Dmitry Shevelenko, echoed the sentiment, stating that his company would be eager to enter into a distribution agreement or even acquire Chrome if it became independent from Google. He described the difficulties smaller companies face in competing with Google's entrenched distribution channels and revenue-sharing agreements, which he characterized as the "root cause" of Google's dominance.

As the trial continues, the future of Chrome remains uncertain. Google is not offering Chrome for sale voluntarily and is expected to appeal any ruling that mandates its divestment. The outcome could have far-reaching implications for the balance of power in internet search, digital advertising, and the rapidly evolving field of artificial intelligence.
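Weinberg didn't show his working, but the shape of a user-base-driven estimate is easy to reproduce. A hypothetical sketch using the roughly 4 billion Chrome users cited in related trial coverage – the implied per-user values are illustrative output, not figures from the testimony:

```python
# Implied per-user value for the two Chrome valuations in circulation.
# Uses the ~4 billion user figure cited in related trial coverage; this is
# a back-of-the-envelope illustration, not Weinberg's actual arithmetic.
chrome_users = 4_000_000_000

for label, valuation in (("Singh", 20e9), ("Weinberg", 50e9)):
    print(f"{label}: ${valuation / 1e9:.0f}B -> ${valuation / chrome_users:.2f} per user")
# Singh: $20B -> $5.00 per user; Weinberg: $50B -> $12.50 per user
```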
-
WWW.TECHSPOT.COM
Oblivion Remastered quietly keeps the modding legacy alive – without Bethesda's help

Bottom line: The Elder Scrolls IV: Oblivion Remastered doesn't officially support mods. However, that hasn't stopped fans from experimenting – and finding that quite a few legacy mods for the original 2006 release still work with a bit of manual effort.

Users on Reddit and the Bethesda Game Studios Discord noted that several classic .esp files function more or less as intended when dropped into the Remastered version. Ars Technica confirmed it through basic testing, using a 2008 mod to add overpowered gear to the starting prison cell.

Getting these legacy mods working isn't as streamlined as with the Oblivion Mod Manager, but it's not hard either. Players have found that dragging the file into the same directory as the downloadable content – Content/Dev/ObvData/Data – and adding the filename to the Plugins.txt list enables them in-game. Simple tweaks, cheats, and visual changes often function as intended since the "heart" of the remaster is the original Oblivion engine. More complex mods with new assets are less consistent, sometimes causing crashes or graphical glitches when they conflict with the Unreal Engine 5 visual overhaul.

The lack of official mod support is surprising, especially given this newly discovered legacy mod compatibility. Bethesda published the remaster, but Virtuos handled the development. The studio likely didn't have time to implement a full modding interface, prioritizing a stable core experience at launch instead.

Mod support could still come in a future update. Skyrim, for example, didn't see official mod tools on PlayStation until years after release. Of course, that was largely Sony's fault. However, even on PC and Xbox, they arrived later. Given how central modding has always been to The Elder Scrolls community, it's hard to imagine Virtuos or Bethesda ignoring the demand completely.

Virtuos is probably more concerned with fixing launch bugs, which some players have already reported. The foundation is there, considering some legacy mods work and new ones are already showing up on Nexus Mods – like UI tweaks or faster-walking NPCs. Official tools or better integration could easily arrive in a future update. In the meantime, players who know their way around a directory folder are already reinstalling old favorites.

The fact that a remastered game running on Unreal Engine can still recognize content made nearly 20 years ago is a testament to how deep modding hooks ran in the original. Whether intentional or not, the same community that kept the original Oblivion alive is already shaping its second life.
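The manual steps described above are simple enough to script. A minimal sketch, using the Content/Dev/ObvData/Data path and Plugins.txt convention players have reported – the install root and the .esp filename are hypothetical placeholders for your own setup:

```python
from pathlib import Path
import shutil

# Drop a legacy Oblivion .esp into the remaster's data directory and register
# it in Plugins.txt, mirroring the manual steps players describe. The install
# root below is a placeholder; point it at your own copy of the game.
install_root = Path(r"C:\Games\OblivionRemastered")  # hypothetical location
data_dir = install_root / "Content" / "Dev" / "ObvData" / "Data"
plugins = data_dir / "Plugins.txt"

def install_mod(esp: Path) -> None:
    shutil.copy2(esp, data_dir / esp.name)
    lines = plugins.read_text().splitlines() if plugins.exists() else []
    if esp.name not in lines:
        lines.append(esp.name)
        plugins.write_text("\n".join(lines) + "\n")

install_mod(Path("OverpoweredStarterGear.esp"))  # hypothetical mod file
```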
-
WWW.TECHSPOT.COM
Discord co-founder steps down, new CEO appointed as IPO preparations ramp up

In brief: Discord will enter the next phase of its journey under new leadership. Co-founder and CEO Jason Citron shared the news with employees in a recent memo, which was also made public on the company's blog.

Discord was conceived by Citron and Stanislav Vishnevskiy, who wanted to create something akin to a virtual living room where friends could chat with each other about games they like. In fact, Final Fantasy XI was a big inspiration for the chat service, which is now used by hundreds of millions of people each month. The service went live in May 2015. Its focus is still on gamers, but others have also adopted Discord for things like workplace chat and discussing various hobbies or interests.

Humam Sakhnini has been brought in to succeed Citron. Sakhnini arrives with more than 15 years of experience in the gaming industry, including a stint at Activision Blizzard where he served as chief strategy officer overseeing franchises like Call of Duty and World of Warcraft. He also led King as president from 2019 through early 2022.

Sakhnini will officially take over on April 28, but he won't be tossed into the deep end without a life preserver. Citron will stick around to help with the onboarding process and serve in an advisory capacity. He will also retain his position on the board of directors, we are told.

In March 2024, Discord revealed it had over 200 million monthly active users and was planning an IPO in the not-too-distant future. That still has not happened, and it is interesting that it won't be coming to fruition under Citron's watch.

Citron said building Discord has been one of the most rewarding experiences of his life, adding that he is deeply grateful for the many friendships, challenges, and achievements along the way. Looking ahead at what will be needed from Discord's CEO over the coming years, Citron said it's time for him to "hire himself out of a job."

Image credit: Ella Don, Alexander Shatov
-
WWW.TECHSPOT.COM
Trump tariffs push top PC makers Lenovo, HP, and Dell toward Saudi Arabia

So much winning: According to multiple analysts, Donald Trump introduced his unprecedented tariff plan in an effort to force manufacturers to return to the US. If that was truly the goal, the tariffs ended up being instrumental in achieving the exact opposite outcome.

The so-called "reciprocal tariffs" imposed by the Trump administration could push major PC manufacturers to find new production hubs, and it likely won't be in the US. Earlier this month, laptop makers were already forced to halt shipments to the US due to tariff-related uncertainty and logistical chaos. Now, some of the world's largest PC brands appear to be eyeing Saudi Arabia as their next manufacturing base.

According to a recent report by DigiTimes, Lenovo, HP, and Dell are actively exploring new manufacturing initiatives in the Middle Eastern kingdom. Lenovo publicly announced its plans earlier this year, stating that the move is part of a broader strategy to diversify operations and gain privileged access to markets in the Middle East and Africa. Lenovo's initiative is backed by a $2 billion investment from Saudi Arabia's Public Investment Fund, a massive $620 billion fund aimed at transforming the kingdom's economy beyond its dependence on fossil fuels.

PIF is also expected to play a role in supporting HP and Dell's potential relocations, although progress on those fronts has been slower. The two US-based OEMs have dispatched teams to Saudi Arabia after being approached by local government authorities. These scouting teams are tasked with assessing the situation on the ground and identifying potential sites for new manufacturing facilities. Sources indicate the new plants would likely be located near Riyadh, the capital of Saudi Arabia.

In addition, Riyadh officials have extended invitations to several original design manufacturers, including Foxconn, Quanta, Wistron, Compal, and Inventec. These companies are capable of both designing and manufacturing their own products and typically require specific industrial conditions to meet their production goals. To attract OEMs and ODMs, Saudi Arabia is offering a range of exclusive incentives, including covering the full cost of constructing the new facilities.

Relocating to Saudi Arabia could offer manufacturers strategic advantages amid current global economic volatility. While Donald Trump has imposed a steep 245 percent tariff on imports from China, Saudi Arabia faces a relatively modest 10 percent reciprocal tariff. For OEMs, improved access to Middle East and African markets is an appealing proposition, while ODMs may also leverage existing operations in Mexico to circumvent US tariffs altogether.
-
WWW.TECHSPOT.COM
Google tells some remote workers to return to the office or face termination

In brief: Google is following Amazon's hardline approach for workers who refuse to come back to the office. The company is demanding that some of its full-time remote employees return to their closest office locations for at least three days per week or face losing their jobs.

Google was one of the companies that decided to play it safe when it came to bringing workers back following the pandemic. It introduced hybrid work models in April 2022, though managers were able to approve fully remote exceptions.

Google started tightening the leash in June 2023, when it began tracking hybrid employees' office badges to find out if they were coming in on the days they were supposed to. These attendance records are kept for performance reviews. Google also asked fully remote staff to consider coming back three days each week. 2023 was also the year that Google's HR told managers to review long-term remote arrangements and require some people to move closer to an office or convert to hybrid. More changes came last year when Google decided to approve remote requests only in "exceptional" cases.

Now, according to internal documents viewed by CNBC, Google is pushing many remote staffers to switch to hybrid or face the consequences. Employees in Google Technical Services have been told to either come in at least three days per week or take a voluntary exit package. Remote employees in the unit are also being offered a one-time relocation expense to move within 50 miles of an office. Moreover, remote employees in Google's People Operations division who live within 50 miles of an office must switch to hybrid work this month or their roles will be eliminated.

Earlier this year, Google offered voluntary exit packages to employees working in the Platforms and Devices group, which includes the recently merged Pixel and Android teams. The offer was extended to full-time employees in its People Operations division in March.

Sergey Brin, the billionaire who co-founded Google with Larry Page in 1998 and is now assisting with its AI efforts, recently said that 60-hour in-office weeks were crucial for Google to come out on top in the ultra-competitive AI industry. Former Google CEO Eric Schmidt also made his feelings on remote work clear in 2024, saying that "Google decided that work-life balance and going home early, and working from home, was more important than winning." He later partially backtracked following an outcry over his comments.
-
WWW.TECHSPOT.COM
Nintendo Switch 2 teardown confirms Nvidia Tegra T239 chip, SK Hynix memory, and other details

TL;DR: A teardown of the newly announced Nintendo Switch 2 has seemingly confirmed key hardware specs not yet officially announced. According to screenshots published by a reliable hardware modder, the device uses an Nvidia Tegra processor and SK Hynix memory.

The teardown was performed by YouTuber and X user @KurnalSalts, known for his deep-dive videos on Arm chips used in smartphones, laptops, AR headsets, and other gadgets. According to his since-deleted post, the Switch 2 is powered by Nvidia's Tegra T239 SoC, which comes with an Arm Cortex-X1 high-performance core, three Cortex-A78 performance cores, and four Cortex-A55 efficiency cores, paired with a custom Ampere-based GPU with 12 SMs and 1,536 CUDA cores.

The tipster also revealed that the Switch 2 uses memory modules from SK Hynix, though the exact memory configuration remains unconfirmed. Earlier rumors suggested that it uses 12GB of LPDDR5 RAM in dual-channel mode with a 128-bit memory interface. The teardown also revealed a 256GB UFS 3.1 flash storage module from SK Hynix and what looks like a Wi-Fi chip from MediaTek.

Kurnal says he will publish his trademark deep dive into the Tegra chip and the rest of the hardware in the near future. Hardware enthusiasts are hoping that the promised video will reveal more details about the Switch 2, including the process node used to manufacture the Nvidia SoC.

The processor powering the Switch 2 is believed to be made by Samsung, but there has been significant speculation in the recent past about whether it's based on the company's 8nm DUV foundry node or the newer 5nm EUV process. While the Digital Foundry YouTube channel is doubling down on the 8nm rumors, some Nintendo communities on Reddit and X believe that Nvidia switched to the 5nm technology for its new SoC.

Nintendo announced the Switch 2 in January before sharing more details earlier this month. The new console features multiple hardware and software upgrades over its predecessor, including a bigger display, improved controls, enhanced audio, and 4K output for TV. Priced at $449.99, it goes on pre-order today at Best Buy, Target, Walmart, and GameStop. It is slated to hit store shelves in North America on June 5.
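The leaked GPU figures are at least internally consistent: Ampere streaming multiprocessors carry 128 FP32 CUDA cores apiece, so 12 SMs work out to exactly the 1,536 cores reported. A quick sanity check, with the rumored memory bus included under an assumed LPDDR5 data rate:

```python
# Sanity-checking the leaked Tegra T239 numbers.
sms = 12
cores_per_sm = 128         # FP32 CUDA cores per Ampere SM
print(sms * cores_per_sm)  # 1536 - matches the teardown

# Rumored (unconfirmed) 128-bit LPDDR5 bus; 6400 MT/s is an assumed,
# typical LPDDR5 data rate, not a figure from the teardown.
bus_bytes = 128 // 8
print(bus_bytes * 6400 / 1000, "GB/s peak")  # ~102.4 GB/s
```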
-
Nintendo files DMCA subpoena ordering Discord to identify "Teraleak" Pokémon leaker

What just happened? Nintendo, probably the most litigious games company in the world, has requested a DMCA subpoena ordering Discord to reveal the identity of the person behind last year's Pokémon "Teraleak." The leaker allegedly hacked Pokémon developer Game Freak and posted a slew of data covering not only unreleased and upcoming work, but also personal information about employees.

In October 2024, Discord user GameFreakOUT posted 1 terabyte of data to a Discord server called FreakLeak that included so much unseen Pokémon-related material that the community gave it the nickname Teraleak. The leak included information about upcoming projects such as Pokémon Legends: Z-A, minutes from a Pokémon Company meeting, concept art from the original 1997 anime, source code for the DS games Pokémon HeartGold and SoulSilver, and much more.

The source of all this was Game Freak's internal servers, which meant it also included employee details. Game Freak said data on more than 2,000 current and former members of staff had been breached in August 2024.

Six months after the Teraleak incident, Polygon reports that Nintendo filed a request for a subpoena on April 18 in the US District Court for the Northern District of California. The subpoena would order Discord to reveal GameFreakOUT's name, address, phone number, and email address. Nintendo issued DMCA requests at the time of the leak to try to remove the data, but it can still be found online.

Nintendo hasn't said what it will do if the subpoena is granted and the leaker's identity is revealed, though it's easy to guess its plans. The company is famous for coming down hard on those it calls copyright infringers. The most famous case is that of Gary Bowser, who was sentenced to 40 months in federal prison (he was released after a year) for selling modchips and jailbreaks used to circumvent Nintendo's security measures. He also has to pay Nintendo a total of $14.5 million.

While the leaker in this instance never sold anything, Nintendo in 2021 took two Pokémon Sword and Shield leakers to court. They had to pay The Pokémon Company $150,000 each in damages and attorneys' fees. The size of the Teraleak, the fact that it included employee details, and the alleged hacking mean GameFreakOUT would likely receive a harsher punishment.

Nintendo owns around 33% of Pokémon. Game Freak and The Pokémon Company own the other two-thirds.

Masthead: Michael Rivera
-
WWW.TECHSPOT.COM
4chan has been offline for over a week, and it's probably not coming back

TL;DR: 4chan has long been one of the internet's most infamous communities, playing a central role in the rise of various memes and controversial incidents. Following a cyberattack that abruptly took the site offline, comments from an anonymous former moderator suggest its chances of returning are slim.

Access to 4chan was disrupted on April 14, and troves of internal data appeared online shortly after. Members of Soyjak, a rival forum, claimed responsibility, declaring victory in a feud that has spanned several years. The attackers claimed they had infiltrated 4chan's systems for over a year before executing the attack. At least one admin account was compromised, a previously deleted board was revived and defaced, and visiting 4chan now results in a 503 error message.

Information from the leaks indicates that the site's source code was extremely outdated and vulnerable to numerous exploits. The hack also exposed administrators' email addresses and revealed that they could see the IP addresses of everyone who posted on the ostensibly anonymous forum. Additionally, the attackers obtained the personal information of paid subscribers.

Speaking on condition of anonymity, a 4chan moderator told TechCrunch that the damage likely extends beyond what has been publicly revealed. Given that the attackers appeared to gain complete control over the forum – and access has not been restored more than a week later – its return seems unlikely.

4chan was founded in 2003 by Christopher "moot" Poole, then a high school student, as an alternative space for discussing anime culture. Poole based the site's design and code on Japan's 2chan, one of the world's largest online communities.

A reference to the viral "Chicken Jockey" meme from the Minecraft movie is likely 4chan's final post.

Poole's English-language version quickly gained notoriety for its irreverent and often offensive content. In 2015, he sold 4chan to Hiroyuki Nishimura, the founder of 2channel (not to be confused with 2chan). The site's cultural influence and infamy only grew throughout the 2010s. Widely known trends like Pepe the Frog, wojaks, rage comics, and trolling were popularized on 4chan, but the forum also gained a reputation as one of the internet's darkest corners.

4chan played a central role in the rise of movements like QAnon, the incel community, GamerGate, and the alt-right. The site was also linked to multiple mass shootings, the 2014 celebrity photo leak scandal, and other serious incidents. 4chan's influence on 21st-century culture is undeniable, but if it really is gone, many won't miss it.
-
WWW.TECHSPOT.COM
Nintendo apologizes as Switch 2 demand overwhelms supply in Japan

Facepalm: You knew it would happen. Despite producing and sitting on inventory for a year and locking pre-orders behind a Nintendo Online subscription, there will not be enough Switch 2s to go around in Japan – and just wait, it will happen in the US too. The company will hold a second lottery sometime after launch but still won't have enough to cover initial orders. It's the Switch 1 launch all over again.

On Monday, the My Nintendo Store in Japan opened its first round of pre-orders for the long-awaited Switch 2 – and closed them just as fast. Within hours, the store marked the new system's availability as "sold out," with pre-order applications closed until at least May 6 – a day after launch.

Nintendo President Shuntaro Furukawa said pre-orders in Japan alone amounted to more than 2.2 million, far exceeding the company's ability to deliver. The number even exceeds what Nintendo expects to supply during a second lottery round.

"In order to avoid the trouble of those who were not selected in the first lottery sale having to reapply, My Nintendo Store will automatically carry over those who were not selected in the first lottery sale to the second lottery sale," Furukawa explained via X. "However, even including the quantity for the second lottery sale, we cannot fulfill all of the applications we received. We deeply apologize for not being able to meet your expectations despite our prior preparations."

Unlike typical online pre-orders, Nintendo structured this release as a lottery. Users had to log in with a verified Nintendo account and register interest during a limited window. Winners will be randomly selected and notified after the application period ends. However, it's not entirely random. The company said it would give higher priority to users who have paid for at least one year of Nintendo Online and logged at least 50 hours of gameplay by April 2. It's a harsh restriction designed to curb scalping bots and mass purchases, similar to strategies employed during the PlayStation 5 shortage – but did it work?

Although Nintendo's lottery made it harder for automated bots and bulk buyers, anecdotal reports suggest that some scalpers have adapted. Japanese resellers on platforms like Mercari and Yahoo Auctions have already listed Switch 2 "pre-order reservations" at inflated prices despite the lottery still being open outside Japan. Mind you, Nintendo has not even announced winners yet. These listings don't guarantee a console, only an entry into the lottery – yet some buyers who missed the draw are willing to take the risk.

These early gray-market listings make clear the pre-order system isn't airtight. However, it's a far cry from the chaos of earlier console launches. In past cycles, scalpers openly boasted about automated systems that could buy dozens of units in seconds. The Switch 2's limited, application-based rollout has at least forced them to work harder.

Nintendo hasn't revealed how many units it allocated for this first wave, so it's difficult to gauge how much of the sellout reflects genuine demand versus opportunistic flipping. What is clear, though, is that interest in the Switch 2 is high – and the company's efforts to rein in scalpers, while imperfect, have shifted the landscape.

Image credit: The Shortcut
-
WWW.TECHSPOT.COM
Meta expands Ray-Ban smart glasses with live translation, visual AI, and new frames

In brief: Meta is broadening the reach and capabilities of its Ray-Ban smart glasses, unveiling a suite of new features and style options that signal a significant step forward in wearable technology.

The company announced that its live translation tool, previously limited to select early adopters, is now rolling out to all markets where Ray-Ban Meta glasses are available. This update enables real-time translation in English, French, Italian, and Spanish, allowing users to hold conversations across language barriers and hear instant translations through their glasses. For travelers and users without reliable internet access, the feature can also function offline, provided the necessary language packs are downloaded in advance.

Meta is also introducing new color and lens combinations for the Skyler frame, including a shiny chalky gray paired with sapphire transitions lenses and a shiny black option that can be fitted with either clear or green-tinted lenses.

Meta is also pushing the boundaries of what smart glasses can do with the company's AI assistant, Meta AI. In the United States and Canada, users will soon be able to engage in more natural, free-flowing conversations with Meta AI, which can continuously process visual information from the glasses' camera. This "see what you see" capability allows the assistant to provide context-aware responses – whether identifying landmarks, offering cooking advice, or translating a foreign menu in real time. The feature, previously in beta, is now poised for general release.

Communication features are also expanding. The glasses will soon support sending and receiving direct messages, photos, and both audio and video calls via Instagram, complementing existing integrations with WhatsApp, Messenger, and native phone messaging apps.

Music lovers will find new reasons to embrace the update, as Meta is extending support for popular streaming services such as Spotify, Amazon Music, Apple Music, and Shazam beyond North America. Users in more regions can now control music playback and access information about the songs they're listening to, provided their default language is set to English.

The international rollout continues, with Meta confirming plans to launch the Ray-Ban Meta smart glasses in Mexico, India, and the United Arab Emirates, though specific release dates remain under wraps. Meanwhile, users in the European Union will soon gain access to Meta AI and its visual search capabilities, further bridging the gap in feature availability across regions.
-
WWW.TECHSPOT.COM
Google Chrome abandons plans to phase out third-party cookies

What just happened? In a significant reversal that will send ripples through the advertising industry, Google has announced that it will no longer introduce a standalone prompt for third-party cookies in its Chrome browser. The decision marks a dramatic departure from the company's long-standing plan to phase out cookies entirely – a plan that has been in the works for several years and was closely monitored by regulators, advertisers, and privacy advocates alike.

The announcement, delivered by Anthony Chavez, VP of Privacy Sandbox at Google, confirmed that Chrome users will continue to manage their third-party cookie preferences through existing privacy and security settings, rather than being presented with a new, explicit prompt. "We've made the decision to maintain our current approach to offering users third-party cookie choice in Chrome, and will not be rolling out a new standalone prompt for third-party cookies," Chavez wrote in a blog post on April 22. He emphasized that users can still choose the best option for themselves within Chrome's settings.

This policy shift effectively halts Google's multi-year campaign to eliminate third-party cookies from Chrome, a browser that commands over 60 percent of the global market. The original plan, announced in 2020, aimed to bring Chrome in line with competitors like Firefox and Safari, which had already blocked third-party cookies by default. Google's approach, however, was more cautious, citing the need to balance user privacy with the economic realities of the ad-supported web.

The company's Privacy Sandbox initiative was intended to develop alternative technologies that would enable targeted advertising while preserving user privacy. These included tools such as the Topics API and various new APIs for ad measurement and fraud prevention. Despite these efforts, industry feedback revealed deep concerns. Many in ad tech argued that the proposed replacements couldn't match the scalability or real-time processing capabilities of third-party cookies, while publishers worried about revenue loss and the technical complexity of implementing new systems.

Regulatory scrutiny also played a decisive role in Google's change of course. In April 2024, the UK's Competition and Markets Authority (CMA) intervened, requesting a pause in the rollout over concerns that Google's dominance in both browsers and digital advertising could be further entrenched by the proposed changes. The CMA demanded assurances that any new system would not unfairly advantage Google's own ad products.

Meanwhile, privacy advocates and organizations such as the Electronic Frontier Foundation continued to criticize Google's alternatives, arguing they still enabled user tracking and introduced new privacy concerns. Chavez acknowledged these divergent perspectives in his post, noting ongoing engagement with both industry stakeholders and regulators. While the complete removal of third-party cookies is now off the table, he said the Privacy Sandbox project will continue in a modified form. Google plans to keep developing privacy features – such as IP Protection for Incognito users – and will gather additional feedback before updating its roadmap for future technologies.

Critics responded swiftly. The Movement for an Open Web, a group that had previously challenged Google's plans before the CMA, described the announcement to The Verge as "an admission of defeat." They argued that Google's attempt to reshape the digital advertising ecosystem in its own favor was ultimately stymied by regulatory and industry resistance.

For now, third-party cookies will remain a fixture in Chrome, leaving the digital advertising industry grappling with the implications.
-
WWW.TECHSPOT.COM
Running on empty: California is about to run out of license plate combinations

TL;DR: California is on pace to run out of license plate character combinations by the end of the year. Since 1980, the state has used a sequence consisting of one number, followed by three letters, followed by three more numbers. Fortunately, the state already has a solution, and it's relatively simple: it is just going to reverse the current order.

The current sequence for non-commercial vehicles started with 1AAA000 and will end with 9ZZZ999. Try as I might, I was unable to determine exactly why California went with the one number / three letters / three numbers sequence to begin with. Perhaps it has something to do with regional registrations or sheer consistency?

Why not just allow each of the seven character slots to be either a number or a letter? If my math is correct, that'd result in more than 78 billion possible combinations – plenty for the foreseeable future and beyond. Other options could include adding an eighth character or reusing retired sequences, although that would probably get cumbersome from a legal standpoint.

Instead, once 9ZZZ999 has been issued, the next plate will be in the format of three digits, three letters, and one number. So, 000AAA1 or 001AAA1, depending on how the state decides to do it. 100AAA1 could also be an option if the "no leading zeroes" rule stands.

License plate collectors will no doubt be keeping an eye on the transition, and it is expected that the last old-format plate and the first new-format plate could become quite valuable. Most residents, on the other hand, likely won't even notice the change.

It's worth reiterating that this change only applies to standard-issue, non-commercial plates. Like many other states, California also offers special interest license plates for those interested in supporting specific causes or organizations, such as breast cancer awareness, environmental causes, pets, or colleges. You'll usually pay extra for these, and depending on what you choose, a portion of the fee could go to support said organization.
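The numbers in the article check out. A quick count of the current scheme (the leading digit runs 1 through 9, per the 1AAA000 starting plate) and of the fully alphanumeric seven-character scheme speculated about above:

```python
# Counting California plate combinations under the schemes discussed above.
letters, digits = 26, 10

# Current format, 1AAA000 through 9ZZZ999: leading digit 1-9.
current = 9 * letters**3 * digits**3
print(f"{current:,}")  # 158,184,000

# Seven fully alphanumeric slots, as speculated in the article.
print(f"{36**7:,}")    # 78,364,164,096 -> "more than 78 billion"
```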
-
WWW.TECHSPOT.COM
OpenAI wants to buy Chrome if Google is forced to sell

The big picture: If Google really is forced to sell Chrome – as proposed by the DOJ after the company was ruled a monopoly in its antitrust trial – OpenAI could emerge as a potential buyer. The ChatGPT maker has admitted it's interested in acquiring the world's most popular browser and turning it into an "AI-first" experience.

Following Judge Amit Mehta's ruling last year that Google held a monopoly in online search, the Justice Department pushed for the immediate sale of Chrome. The remedies phase of the trial began this week in Washington.

Nick Turley, head of product for ChatGPT at OpenAI, was one of the DOJ's witnesses on Tuesday. He testified that OpenAI had contacted Google last year about a partnership that would improve ChatGPT. The chatbot already uses Bing's search data, but Turley mentioned there had been "significant quality issues" with a company referred to as "Provider No. 1," likely a reference to Microsoft.

"We believe having multiple partners, and in particular Google's API, would enable us to provide a better product to users," OpenAI told Google in an email that was revealed during the trial. Google turned down the offer, believing the deal could harm its lead in search. Turley added that OpenAI doesn't have any partnership with Google today.

Turley was also asked if OpenAI would be interested in purchasing Chrome if Google is forced to sell its browser. "Yes, we would, as would many other parties," he replied.

In November 2024, reports claimed OpenAI was considering releasing a Chromium-based web browser with ChatGPT integration that could compete with Chrome. The company hired two key Chrome developers last year, Ben Goodger and Darin Fisher, founding members of the Chrome team.

Chrome has dominated the global browser market since its user share passed Internet Explorer's in 2012. Today, it has a 66% overall share and 4 billion users. Second-place Safari has an 18% share.

If OpenAI were to buy Chrome, Turley predicted that it would become an "AI-first" experience. That means tight integration with ChatGPT and other OpenAI products, while data from those billions of users could be used to train its AI systems.

Another judge ruled this month that Google built and maintained an illegal monopoly in key segments of the online display advertising industry. The decision could lead to the government breaking up Google's advertising operations.
-
WWW.TECHSPOT.COM
Intel's overclocking tool offers 7.5% performance gains without voiding warranty

What just happened? Intel has released a new overclocking tool called "200S Boost" that can increase performance on select Intel systems without voiding the warranty. The utility is specifically designed for Core Ultra 200S series processors, which received a mixed response from reviewers following their launch late last year.

The new feature offers a simple overclocking option for systems powered by unlocked Arrow Lake chips when paired with compatible Z890 motherboards and supported memory modules. It will be available via a BIOS update on motherboards from ASRock, Asus, MSI, Gigabyte, and other major vendors.

200S Boost will reportedly allow users to achieve higher fabric, die-to-die, and memory frequencies, resulting in performance gains for low-latency workloads such as gaming. Intel claims the feature can increase inter-die fabric frequencies on 200S-series processors from 2.6GHz to 3.2GHz and die-to-die frequencies from 2.1GHz to 3.2GHz (both at VccSA ≤ 1.2V). It can also help overclock DDR5 memory from 6,400 MT/s to 8,000 MT/s.

The best part about 200S Boost is that using it to tune your PC won't void your warranty if something goes wrong with the CPU. Intel continues to offer its standard three-year limited warranty on these processors, whether they've been overclocked using the feature or run at default settings without any modifications.

Supported CPUs include:
Intel Core Ultra 9 285K
Intel Core Ultra 7 265K
Intel Core Ultra 7 265KF
Intel Core Ultra 5 245K
Intel Core Ultra 5 245KF

Tom's Hardware tested the new tool and found that it delivers an average performance boost of around 7.5%. Their test system featured a Core Ultra 9 285K CPU, an MSI MEG Z890 ACE motherboard, and an RTX 5090 Founders Edition graphics card.

On the memory side, six configurations and three memory speeds were tested across 16 games at 1080p. Moving from 6,400 MT/s to 8,000 MT/s, Baldur's Gate 3 saw the largest performance gain at 11.6 percent, while A Plague Tale: Requiem showed a more modest 3.7 percent improvement. Tom's also noted that relatively affordable DDR5-7200 memory kits delivered nearly the same performance in most games and applications, potentially making them a better choice for improved system stability.

The publication additionally benchmarked several productivity applications to evaluate the impact of the new overclocking tool. It found that software known to benefit from overclocked hardware saw slight performance gains, while other applications remained largely unaffected.
-
WWW.TECHSPOT.COM
Intel to adopt TSMC's next-gen 2nm process for upcoming Nova Lake CPUs

Something to look forward to: Upcoming chips from Intel, Apple, and AMD will utilize next-gen semiconductors featuring gate-all-around (GAA) transistors. While Intel is set to debut its own 18A process incorporating GAA later this year, reports suggest that the company's CPUs slated for 2026 will actually be among the first to adopt TSMC's version of the technology.

Intel plans to build its upcoming Nova Lake CPUs on TSMC's forthcoming 2nm semiconductor process node, according to Economic Daily News. If accurate, next year's desktop PCs could be among the first devices to feature 2nm technology, alongside the iPhone 18 Pro.

Both TSMC and Intel declined to comment on the report. However, the Taiwanese chipmaker is expected to begin 2nm trial production at its Hsinchu plant soon, aiming to improve yield rates ahead of mass production in the second half of the year. TSMC's 2nm node, which will utilize GAA transistors to reduce power leakage and enhance performance, is also expected to power some AMD chips and the flagship iPhone SoC scheduled for release in 2026.

Intel has already used TSMC's 3nm process for the compute tiles on its Arrow Lake Core Ultra 200 chips, and may upgrade to the foundry's 2nm node for the same tiles in Nova Lake. Nova Lake is expected to succeed Arrow Lake in desktop and possibly high-end laptop CPUs next year.

According to Tom's Hardware, Nova Lake will also require a new motherboard socket, LGA 1954, which could feature over 2,000 pins. Reliable leaker Olrak recently shared shipping manifests from NBD.ltd referencing voltage regulator testing tools and various jig models that mention LGA1954.

Meanwhile, Intel is preparing to integrate GAA technology into its 18A node, which is currently in risk production and is expected to enter mass production in time for the launch of the company's Panther Lake processors later this year. A brief for the upcoming 2025 VLSI Symposium notes that GAA and backside power delivery will improve 18A's density scaling and performance by over 30 percent compared to Intel 3.

Panther Lake, scheduled for release in the second half of 2025, will follow Intel's low-power Core Ultra 200V laptop processors. The company's Clearwater Forest server processors, expected in 2026, will also use 18A. With 18A, Intel aims to regain a competitive edge by bringing GAA and backside power delivery to market ahead of TSMC. TSMC, in turn, plans to introduce its version of backside power delivery a year later with its A16 node.
-
WWW.TECHSPOT.COM
Apple and Meta hit with combined $797 million fine for violating EU's DMA antitrust rules

What just happened? The European Commission has just hit Apple and Meta with combined fines of almost $1 billion. These are the first fines handed out by the Commission under its Digital Markets Act (DMA), and they arrive just after President Trump threatened to levy tariffs against any countries that penalize US companies.

Apple was handed the larger fine of 500 million euros ($570 million), while Meta has to pay 200 million euros ($228 million), for a combined total of 700 million euros, or $797 million. In addition to its fine, Apple has been slapped with a cease-and-desist order requiring it to make further product changes by June. If it fails to comply with this order, the Commission can fine it for every additional day it refuses to cooperate.

The penalties come after a year-long investigation in which the Commission found that Meta forced Facebook and Instagram users to either pay a subscription fee to avoid ads or consent to their personal data being used for targeted advertising.

In response to the Commission's findings, Meta has modified its ad approach in the EU, now offering unpaid users a version of the platforms with fewer unskippable, full-screen personalized ads. However, in a compliance report published on March 6, the company argued that it has "continued to receive additional demands that go beyond what is written in the law," despite taking steps to align with the DMA. The Commission is currently examining this model to determine if it complies with the rules.

Apple, meanwhile, broke the DMA's steering rule. This requires gatekeepers – Apple, Meta, Alphabet, Amazon, ByteDance, and Microsoft – to allow business users (like app developers or online sellers) to steer customers to offers or alternative distribution channels outside the gatekeeper's platform, without penalties or restrictions.

There was some good news for the companies. The Commission has closed an investigation into Apple's compliance with the DMA's rules on browsers and default apps following changes the company introduced. Moreover, Facebook Marketplace will no longer be designated as a regulated service, so it no longer falls under the DMA's remit.

An Apple representative said it will appeal the decision, which it called "yet another example of the European Commission unfairly targeting" the company and forcing it to "give away (its) technology for free."

"We have spent hundreds of thousands of engineering hours and made dozens of changes to comply with this law, none of which our users have asked for. Despite countless meetings, the Commission continues to move the goal posts every step of the way," the representative said.

Meta said it also plans to appeal the ruling. "The European Commission is attempting to handicap successful American businesses while allowing Chinese and European companies to operate under different standards," said Joel Kaplan, Meta's chief global affairs officer. "This isn't just about a fine; the Commission forcing us to change our business model effectively imposes a multi-billion-dollar tariff on Meta while requiring us to offer an inferior service. And by unfairly restricting personalized advertising the European Commission is also hurting European businesses and economies."

Apple and Meta must pay the fines within 60 days or risk further financial penalties.
Under its rules, the Commission could have fined Meta up to $16 billion and Apple $39 billion based on their earnings last year.
-
WWW.TECHSPOT.COM
Nvidia GeForce RTX 5060 Ti 8GB Review: Instantly Obsolete

Nvidia succeeded in delaying reviews of their 8GB RTX 5060 Ti, but they couldn't hide them forever. After about a week's delay, we can show you just how underwhelming this product really is.

In case you missed it, Nvidia launched the GeForce RTX 5060 Ti last week, available in both 8GB and 16GB configurations. Leading up to this release, we learned that Nvidia was actively holding back the 8GB model from reviews. This meant it wouldn't appear in initial coverage but would still be available for purchase when reviews went live – or shortly thereafter – which is exactly what happened.

We picked up the Asus Prime model for $720 AUD, currently the most affordable RTX 5060 Ti in stock. The most affordable 16GB model at the time was the MSI Ventus 2X for $880, a 22% premium.

The issue is that both the 8GB and 16GB versions are branded simply as RTX 5060 Ti. While the GPU configuration is identical, the difference in memory capacity makes them fundamentally different products. 8GB of VRAM in 2025 for a GPU as fast as the RTX 5060 Ti – which, to be fair, isn't exactly impressive in terms of performance per dollar – is still too little memory for what this class of GPU is expected to handle. There are countless real-world scenarios where the RTX 5060 Ti will suffer severely, or even become unusable, due to the limited 8GB frame buffer. We're going to explore a few of those cases today.

The core problem with a product like the RTX 5060 Ti is that many buyers will base their expectations on the performance of the 16GB model, only to choose the cheapest version available – which will often be the 8GB model. That results in people unknowingly buying a product that is arguably already obsolete.

To be clear, 8GB of VRAM is still enough for the majority of games today, and in many cases where it's not, lowering the visual settings can still provide a playable experience. However, it's no longer sufficient for an optimal experience in many of the latest titles – and this situation is only going to deteriorate over the next few years.

Most people buying a GeForce 50-series graphics card right now, especially something like the RTX 5060 Ti, are likely planning to use it for at least the next three years, if not longer. We don't even want to imagine how poorly 8GB cards will perform by then. It'll likely mirror what we're currently seeing with 4GB GPUs. And let's be honest: when spending over $400, do you really want to be constantly worrying about VRAM? Tweaking settings just to squeeze under the memory cap shouldn't be necessary at this price point. That's a compromised and frustrating experience.

The 8GB model is supposed to have an MSRP of $380, but the lowest listing we've found on Newegg is $420, with most models priced at $440 or higher. Meanwhile, 16GB models start at $430, though many go for $480 or more. Even in the worst-case scenario, you're looking at only a 14% premium for double the memory. That makes the 8GB model a serious trap for buyers unaware that two distinct versions of the same product exist.

The point is, there should never have been an 8GB version of the RTX 5060 Ti, and Nvidia knows it. They know this is a weak product. They know that 8GB of VRAM in 2025 is far from adequate. And they also know that many of you know this. But they also know they can make a lot of money from it, because many gamers aren't tech-savvy and will just buy the cheapest option. They're also counting on the pre-built PC market.
By selling people a product that's already outdated, Nvidia ensures they'll return sooner than they otherwise would – whether for the next generation or even a mid-cycle refresh. We've been saying it for years: this is planned obsolescence. And if you can't see it now, well, you're Nvidia's favorite type of customer.

Test Notes and System Specs

This review takes a different approach from our usual lineup of blue bar graphs, performance summaries, and cost-per-frame evaluations. To highlight just how poorly the 8GB RTX 5060 Ti performs in 2025, we purchased one and spent several days running extensive side-by-side tests. In the graphs, you'll find data such as average frame rate, 1% low FPS, VRAM usage, and frametime graphs that highlight stuttering and frame pacing issues. These issues are easier to showcase in the video version of this review if you want to check that out.

For all of this testing, we're using a PCIe 5.0-enabled AM5 system with the 9800X3D and 32GB of DDR5-6000 CL30 memory. This setup is essentially a best-case scenario for running over the VRAM buffer with an RTX 5060 Ti. Performance will be worse on PCIe 4.0 systems, and drastically worse on PCIe 3.0.

Gaming Benchmarks

The Last of Us Part II

4K DLSS Quality, Very High Preset

We'll start with The Last of Us Part II, running at 4K. Now, you might think 4K is a questionable choice for a product like this, but we'd argue otherwise for a few reasons. Firstly, the RTX 5060 Ti is capable of 4K gaming – especially with upscaling like DLSS set to Quality mode. Secondly, high-quality 4K high refresh rate monitors now cost less than the RTX 5060 Ti itself and can deliver a truly stunning visual experience.

The key takeaway here is that the 16GB card averaged 68 fps, while the 8GB model barely surpassed 30 fps, with frequent frametime spikes. By the end of our test, the 16GB version delivered 120% better 1% low performance – a massive difference. We know some will dismiss 4K results, so let's move to 1440p.

1440p DLSS Quality, Very High Preset

Performance does improve for the 8GB model at 1440p, but it still suffers from severe frametime issues. Meanwhile, the 16GB card delivers frame rates around 30% higher. We're using the Asus Prime model for both versions, and while the 16GB card clocks about 1% higher, that minor difference doesn't explain the performance gap. By the end of testing, the 16GB version was 34% faster on average and delivered 215% better 1% lows.

1440p DLSS Quality, Very High Preset + Frame Generation

Frame generation has been a major selling point for both the GeForce 40 and 50 series. But when enabled, the additional VRAM requirements severely affect the 8GB model, dropping 1% lows into the single digits – making for a truly poor experience.

1440p DLSS Quality, High Preset

What if we drop down to the High preset? At 1440p with DLSS enabled, we encounter occasional stutters on the 8GB card, but overall performance is acceptable. Still, the 16GB card maintains a clear lead – 18% faster on average, with 35% better 1% lows.

1440p DLSS Quality, Medium Preset

Switching to the Medium preset cleans up frame pacing for the 8GB model, making the experience smoother. Even then, the 16GB model still delivers 12% higher average performance and 11% better 1% lows.

1440p DLSS Quality, Low Preset

Even with the Low preset, the 16GB version remains 8% faster on average. It's not a huge difference, but still surprising to see any gain considering we're nearing the absolute limit of the 8GB card's capability.
1080p Native, Very High Preset

At native 1080p using the Very High preset, the game demands over 9GB of VRAM. This causes the 8GB model to struggle significantly, with terrible frametime performance – even at this lower resolution. As a result, the 16GB card delivered 25% better average frame rates and an astounding 320% improvement in 1% lows. Let's move on to another title.

Final Fantasy XVI

1080p Native, Ultra Preset

Here we're testing Final Fantasy XVI at native 1080p using the Ultra preset. While the 8GB model offers a playable experience, the 16GB version still comes in 14% faster. That said, 1080p is quite low by 2025 standards – so let's jump up to 1440p.

1440p Native, Ultra Preset

At native 1440p, the 16GB card pulls ahead by a significant margin – 58% higher average frame rate and 218% better 1% lows. While we personally wouldn't play this game at around 50 fps, we know some prioritize visual fidelity over frame rates.

1440p DLSS Quality, Ultra Preset

Using DLSS for upscaling delivers what we'd consider a more optimal setup. The 16GB card averaged 74 fps, making it 80% faster than the 8GB model, which only managed 41 fps.

1440p DLSS Quality + Frame Generation, Ultra Preset

Out of curiosity, we enabled frame generation. It helped the 8GB model somewhat, but the 16GB card was still 24% faster on average, with 30% better 1% lows. Despite the seemingly decent frame rates, the experience on the 8GB model felt noticeably worse.

1440p Native, High Preset

Finally, running the game at native 1440p with the High preset allowed the 8GB card to roughly match the 16GB version, trailing by just a few percent.

Indiana Jones and the Great Circle

1080p Native, Medium Preset

Getting the 8GB RTX 5060 Ti to run Indiana Jones and the Great Circle wasn't easy. Only the Low and Medium presets worked, and even then, we couldn't test 4K with DLSS enabled using the Medium preset, as the game would immediately crash to desktop. At native 1080p using just the Medium preset, the 8GB card averaged 114 fps, while the 16GB version was 7% faster. Overall, the 8GB card handled this scenario fairly well.

1440p Native, Medium Preset

However, moving to 1440p, the 8GB card begins to struggle. While it maintains decent frame time consistency, the 1% lows were 90% higher on the 16GB model, and average frame rates improved by 82%. Under these conditions, the 16GB card delivered a far superior experience.

1080p Native, Ultra Preset

The 16GB version also performed well at 1080p using the Ultra preset – and even at 1440p with upscaling. In contrast, the 8GB model crashed to desktop whenever we attempted anything above the Medium preset. In this case, the difference was night and day.

Hogwarts Legacy

1440p Native, Ultra Preset

When playing Hogwarts Legacy at native 1440p with the Ultra preset, both GPUs performed similarly in terms of average frame rate. However, the 8GB card suffered from texture pop-in issues – something we've previously seen with the 8GB RTX 4060 Ti.

1440p Native, High Preset, Ray Tracing High

Performance issues for the 8GB card became much more severe once ray tracing was enabled. Using the High preset with High ray tracing at 1440p, the 16GB model delivered 62% higher average frame rates and a massive 483% improvement to 1% lows. The 8GB version, by comparison, was an unplayable stuttering mess.

1440p Native, Medium Preset, Ray Tracing Medium

We then tried the Medium preset with Medium ray tracing, and the 8GB model still exhibited frequent frame time spikes and poor 1% lows.
The 16GB card was up to 96% faster in this scenario.

Horizon Forbidden West

4K DLSS Performance, Very High Preset

Next up is Horizon Forbidden West at 4K using DLSS in Performance mode with the Very High preset. These settings were completely unplayable on the 8GB card, which delivered 1% lows of just 9 fps. In contrast, the 16GB card averaged 72 fps and maintained 1% lows above 60 fps, making it 350% faster on average and 589% faster in 1% lows.

1440p DLSS Quality, Very High Preset

Dropping to 1440p with DLSS set to Quality helped the 8GB card somewhat, but the game was still not playable, with 1% lows of only 18 fps. The 16GB model, meanwhile, was even faster than it was at 4K, averaging nearly 100 fps – 234% faster overall, or 344% faster based on 1% lows.

1080p Native, Very High Preset

At 1080p native resolution with the Very High preset, the 8GB model was finally playable, although frame pacing issues remained. The 16GB card, on the other hand, delivered a flawless experience, averaging 100 fps with 1% lows above 80 fps – making it nearly 80% faster on average, with 131% better 1% lows.

1440p DLSS Quality, High Preset

At 1440p with DLSS set to Quality and using the High preset (not even the highest), the 16GB card still pulled significantly ahead. A quick note: we mistakenly scaled the overlay stats in RivaTuner during testing, as we had to revisit the 16GB model to ensure both GPUs were tested under the same lighting and time-of-day conditions. Despite this minor discrepancy, both cards were tested under identical settings. Under these conditions, the 16GB version was 148% faster in average frame rate and 193% faster in 1% lows. The 8GB model also struggled with frame pacing, resulting in a noticeably jittery experience.

1440p DLSS Quality, Medium Preset

Finally, at 1440p with DLSS Quality and the Medium preset, we found a configuration where the 8GB model was usable – not great, but functional. However, frame time issues persisted, and the 16GB card still delivered 47% higher 1% lows.

Space Marine 2

4K DLSS Quality, Ultra Preset, 4K Textures

Space Marine 2 offers an interesting look at the 8GB vs. 16GB comparison, and a key reminder of why benchmark graphs alone don't always tell the full story. Using the Ultra preset with the 4K texture pack enabled, at 4K with DLSS set to Quality, both GPUs delivered frame rates comfortably above 60 fps. Oddly enough, the 8GB card appeared slightly faster – until you realize it's not fully rendering the game.

Looking at static image comparisons, it's immediately obvious how poor the 8GB presentation is compared to the 16GB model. The differences are striking. The low-resolution, muddy textures on the 8GB card are worsened by frequent pop-in as the game attempts to load higher-resolution assets – resulting in a flickery mess. Under these settings, very few textures render correctly with the 8GB card, while the 16GB model delivers both excellent visuals and performance.

We've seen this issue before – games like Halo Infinite and Forspoken come to mind – where performance may look fine on paper, but the actual presentation is severely degraded due to limited VRAM.

A Plague Tale: Requiem

4K DLSS Quality, Ultra Preset

Next, in A Plague Tale: Requiem, we tested at 4K using DLSS Quality with the Ultra preset. While the 16GB model didn't break performance records, it still averaged 52 fps – 33% faster than the 8GB card – and delivered 51% better 1% lows.
4K DLSS Performance, Ultra Preset

Using DLSS Performance mode at 4K yields more realistic results for this title. Here, both cards averaged over 60 fps, but the 16GB model still managed 23% better 1% lows.

4K DLSS Performance, High Preset + Ray Tracing On

Aiming for over 80 fps, we enabled ray tracing at 4K with DLSS Performance and the High preset. The 16GB card averaged 86 fps with 1% lows of 72 fps. The 8GB card, however, was unable to maintain playable performance and actually crashed during this test.

Assassin's Creed Shadows

1440p DLSS Balanced, Very High Preset

Testing Assassin's Creed Shadows at 1440p with DLSS set to Balanced and the Very High preset, the 8GB card struggled with poor frame pacing. The 16GB card was 52% faster in 1% lows and 50% faster on average. Most importantly, the game was enjoyable at 66 fps – but not at 44 fps with erratic frame times on the 8GB model.

1440p DLSS Balanced + Frame Generation, Very High Preset

Enabling frame generation didn't fix the stuttering issues on the 8GB model. In this configuration, the 16GB version was 62% faster in average frame rate.

Cyberpunk 2077

1440p DLSS Quality, Ray Tracing: Medium

At 1440p with DLSS Quality and ray tracing set to Medium, the 8GB card struggled badly. While the 16GB model averaged 54 fps – a 26% uplift – the real difference was in 1% lows, which were 73% higher, making gameplay on the 8GB card noticeably worse.

4K DLSS Performance, Ray Tracing: Low

If you're hoping to enjoy ray tracing at 4K with DLSS Performance and the Low RT preset, the 8GB card falls apart – delivering just 29 fps on average with 1% lows of 16 fps. The 16GB card was up to 206% faster.

Marvel Rivals

4K DLSS Performance, Ultra Preset

In Marvel Rivals at 4K with DLSS Performance and the Ultra preset, the 16GB model averaged 69 fps with 1% lows of 53 fps. While not ideal for a competitive shooter, it's at least playable. Compared to the 8GB version, the 16GB card delivered 30% better average frame rates and an 89% improvement in 1% lows.

Spider-Man 2

4K DLSS Performance, Very High Preset

At 4K with DLSS Performance and the Very High preset, the game was unplayable on the 8GB card, with terrible frame pacing and an average frame rate below 20 fps. In contrast, the 16GB version averaged just over 60 fps, with a 533% improvement in 1% lows.

1440p DLSS Quality, Very High Preset

Dropping to 1440p with DLSS Quality, the 8GB model remained broken and unusable, while the 16GB card averaged over 80 fps with 1% lows of 47 fps – resulting in smooth and enjoyable gameplay.

1440p DLSS Quality, High Preset

Switching to the High preset at 1440p, the 8GB card was still inadequate. The 16GB version averaged over 90 fps, offering a 221% increase in average frame rate and a 358% boost in 1% lows.

1080p Native, Very High Preset

We then dropped to native 1080p, where upscaling isn't ideal, using the Very High preset. Even here, the 8GB card averaged just 49 fps. The 16GB card was nearly 60% faster, averaging 78 fps.

1080p Native, High Preset

Finally, using the High preset at 1080p, the 8GB model still performed poorly, with erratic frame times and subpar overall performance. The 16GB version was 35% faster in average frame rate and 58% faster in 1% lows. Simply put, 8GB of VRAM is a terrible choice for Spider-Man 2.

Star Wars Jedi: Survivor

4K DLSS Performance, Epic Preset

Moving on to Star Wars Jedi: Survivor, we didn't even enable ray tracing here. At 4K using DLSS Performance with the Epic preset, the 16GB model averaged 68 fps, while the 8GB version was limited to just 32 fps.
That's a 113% performance advantage in favor of the 16GB card and a 104% improvement in 1% lows. Additionally, the 8GB card rendered the game incorrectly, with missing textures and visual artifacts.

Alan Wake 2

1440p DLSS Quality, High Preset

A game where the 8GB model performed well was Alan Wake 2. Here, we observed very similar performance between both versions of the RTX 5060 Ti. As we mentioned at the start of this review, this is what you'll see in the majority of games.

God of War Ragnarök

4K DLSS Performance, Ultra Preset

God of War Ragnarök is another example where 8GB of VRAM was mostly adequate. There was a small performance hit, but at 4K with DLSS set to Performance and the Ultra preset enabled, the 8GB card managed fine – at least in our test area. It's possible other sections of the game could expose limitations.

Black Myth: Wukong

1440p DLSS Quality, Very High Preset

Black Myth: Wukong also ran reasonably well on the 8GB model, though the 16GB card did deliver slightly better performance. This could change in more demanding areas or during longer gameplay sessions. The important point is that even in games where performance seems fine, we're operating right on the edge of the 8GB VRAM buffer.

What We Learned

We just looked at 15 games, each with numerous examples where the 8GB version of the RTX 5060 Ti is clearly held back by its memory buffer. In each case, the 16GB model delivered a consistently superior experience under the same conditions.

It's painfully obvious that an 8GB frame buffer is no longer satisfactory for PC gaming beyond the most entry-level products, which shouldn't cost more than about $150. Since such products no longer exist in the current market, neither should 8GB VRAM configurations. Realistically, 12GB should be the new minimum. But if you're aiming for an uncompromised experience, especially over the typical lifespan of a GPU, then 16GB is what you need – and we're confident that this will become increasingly obvious in the coming years.

With that in mind, don't let anyone convince you that the RTX 5060 Ti isn't suitable for 4K gaming. As we've shown, with enough VRAM, it absolutely is. And 1440p gaming with upscaling is exactly what a $400 GPU should be capable of – and it is, provided it comes with 16GB of VRAM. We've seen people argue that the RTX 5060 Ti is meant for 1080p gaming, but as clearly demonstrated here, that's simply not true – and that line of reasoning does a disservice to consumers.

Our take is straightforward: if you're buying a new graphics card in 2025 and spending around $300, it should come with at least 12GB of VRAM. If you're spending $400 or more, it needs to have 16GB. And anything beyond that – like a future RTX 5080 – should offer at least 24GB, especially with next-gen consoles expected to launch within the lifespan of these GPUs.

Unfortunately, we suspect this will turn into another RTX 3080 10GB situation – except at a much higher price. And once again, this benefits Nvidia, as their planned obsolescence strategy will push RTX 5080 owners to upgrade sooner than necessary.

As for the GeForce RTX 5060 Ti, our biggest issue is that the 8GB and 16GB versions share the same name. Realistically, there should only be one configuration. The performance gap between them is too substantial to ignore. Ideally, the RTX 5060 Ti should come with 16GB, and a non-Ti version could offer 12GB – or better yet, 16GB as well.
There's no justification for an 8GB model in this product tier.

Another factor worth considering is resale value. Based on sold listings on eBay, the 16GB version of the RTX 4060 Ti is fetching 42% more than the 8GB version, despite only costing up to 25% more at launch. Expect that disparity to grow even wider for the RTX 5060 Ti. Two years from now, the 8GB versions will likely be nearly worthless in comparison.

To wrap things up: the 8GB version of the RTX 5060 Ti is possibly the worst trap we've ever seen laid for mainstream gamers. Nvidia has really outdone themselves here. Their efforts to suppress early reviews and exploit uninformed buyers are unacceptable. And they're planning to do the same with the RTX 5060 – but rest assured, we'll be there to cover that as well. This is yet another reason why the GeForce 50 series may go down as Nvidia's worst generation ever. And the most frustrating part? It didn't have to be this way.

For now, we're done with the 8GB RTX 5060 Ti. But on the bright side, Nvidia has inadvertently provided us with the perfect case study for examining VRAM limitations moving forward.

Shopping Shortcuts:
Nvidia RTX 5060 Ti 16GB on Amazon
Nvidia RTX 5060 Ti 8GB on Amazon
Nvidia GeForce RTX 5070 on Amazon
AMD Radeon RX 9070 on Amazon
AMD Radeon RX 9070 XT on Amazon
Nvidia GeForce RTX 5080 on Amazon
Nvidia GeForce RTX 5090 on Amazon
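A note on methodology for readers new to these metrics: the "1% low" figures quoted throughout are derived from frametime captures. Conventions vary between outlets, and our exact tooling isn't shown here, but the sketch below (TypeScript, illustrative names) shows one common way to compute it – averaging the slowest 1% of frames and converting back to FPS.

```typescript
// One common "1% low" convention: average the slowest 1% of frametimes,
// then convert that mean frametime back into a frames-per-second figure.
function onePercentLow(frametimesMs: number[]): number {
  const sorted = [...frametimesMs].sort((a, b) => b - a); // slowest frames first
  const count = Math.max(1, Math.floor(sorted.length * 0.01));
  const worst = sorted.slice(0, count);
  const avgMs = worst.reduce((sum, t) => sum + t, 0) / worst.length;
  return 1000 / avgMs;
}

// Example: ~60 fps frametimes (16.7 ms) with a 100 ms hitch every 100 frames.
const capture = Array.from({ length: 1000 }, (_, i) => (i % 100 === 0 ? 100 : 16.7));
console.log(onePercentLow(capture).toFixed(1)); // ≈ 10.0 fps
```

This is why a card can post a respectable average while still feeling awful to play: a handful of long frames barely moves the mean, but it craters the 1% low.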
-
WWW.TECHSPOT.COM
CATL's new battery tech promises 800-km range and five-minute charging

Forward-looking: CATL has announced a series of breakthroughs that could reshape the EV industry, promising batteries that are cheaper, lighter, faster to recharge, and more resilient in extreme temperatures, all while extending driving range. The company, which supplies a third of the world's EV batteries to major automakers including GM and Tesla's Shanghai plant, unveiled the advances just ahead of the Shanghai Auto Show.

At a press event reminiscent of a high-profile car launch, China's leading battery maker detailed innovations that could bring electric cars closer to price and performance parity with their gasoline-powered counterparts within the next few years. Batteries account for at least a third of an EV's cost, making CATL's progress particularly significant for automakers worldwide.

One of the most notable developments is CATL's new approach to auxiliary batteries. Traditionally, EVs have relied on a single large battery pack, but CATL's design introduces a secondary battery that shares space in the vehicle's underbody. This auxiliary battery is the first commercially available EV battery to eliminate graphite from one of its poles, which could eventually reduce costs and increase volumetric energy density by 60 percent.

According to Gao Huan, CATL's chief technology officer for EVs in China, this innovation could either extend a car's range or allow for smaller battery packs, freeing up more passenger space. The auxiliary battery also serves as a backup – an increasingly important feature as more vehicles adopt self-driving technologies that demand uninterrupted power supplies.

CATL's co-president for research and development, Ouyang Chuying, indicated that these graphite-free batteries could appear in production vehicles within two to three years, though he declined to name specific automakers. However, the company acknowledged that removing graphite comes with trade-offs, namely that such batteries recharge more slowly and have a shorter lifespan.

CATL has also made strides in charging speed for its main batteries. The latest iteration of its flagship Shenxing battery cell can add 520 kilometers (about 320 miles) of range with just five minutes of charging, surpassing even the recent advancements announced by rival BYD and placing CATL ahead of Western competitors like Tesla and Mercedes-Benz. The second-generation Shenxing battery offers an 800-kilometer range on a single charge and a peak charging speed equivalent to 2.5 kilometers of added range per second. CATL's Gao emphasized that the new batteries do not compromise on energy density and are slated to be installed in more than 67 electric vehicle models this year.

In addition to lithium-based innovations, CATL is pushing forward with sodium-ion battery technology. The company's new Naxtra brand of sodium-ion batteries, set to enter mass production in December, promises over 90 percent charge retention even at temperatures as low as minus 40 degrees Celsius. This makes them especially attractive for vehicles operating in the frigid climates of northern China, where traditional lead-acid batteries often fail. The first customer for these batteries will be freight trucks from First Auto Works, based in Changchun, a region known for its harsh winters. Sodium-ion batteries are considered a safer and more affordable alternative to lithium-based cells, largely because sodium is abundant and inexpensive.
The new Naxtra battery boasts an energy density of 175 watt-hours per kilogram, nearly matching the widely used lithium iron phosphate batteries. CATL's founder, Robin Zeng, has suggested that sodium-ion batteries could eventually replace up to half of the market for lithium iron phosphate batteries, which the company currently dominates.

Beyond technical specifications, CATL has demonstrated the safety of its sodium-ion batteries through rigorous stress tests, including puncturing and cutting the cells without causing fires or explosions – a notable shift from the company's stance just five years ago. These batteries are also being positioned as a solution for internal combustion vehicles, offering compatibility with existing electrical systems, though some models may require modifications to accommodate the new battery size.

CATL's rapid pace of innovation comes even as the company faces increased competition and market pressures. Last month, the company reported a 15 percent growth in net profit for 2024, its slowest rate in six years, amid a prolonged price war in China's EV market. Still, with over 18 million cars equipped with its batteries operating in more than 66 countries, CATL's influence on the future of electric mobility remains formidable.
-
Doom can now run in a self-contained QR code. Sort of

In context: QR codes were originally designed to efficiently track the types and quantities of automobile parts. Today, thanks to smartphones and mobile apps, their use has expanded far beyond that. If you really know your trade, you could even try packing a functional program into a single QR code – and maybe run Doom on it, because why not?

A resourceful developer named Kuber Mehta has taken the "Can it run Doom?" meme to new heights with a wild new project that pushes the boundaries of extremely limited execution environments. While the Backdooms project doesn't technically run the original Doom engine inside a QR code, Mehta says he was directly inspired by id Software's legendary shooter – as well as the viral "Backrooms" creepypasta – to develop his concept.

Backdooms is a compressed, self-extracting program encoded entirely within a single QR code. When scanned, it launches an infinitely generated HTML environment resembling Doom-style corridors, which players can navigate and interact with. The game runs entirely in modern web browsers and doesn't require an internet connection – the entire game is stored in the URL itself.

Mehta, a computer science and artificial intelligence student in New Delhi, spent a week exploring how to maximize QR code storage and compression. He ultimately chose a Doom-like interactive experience to demonstrate his progress, but the same technique could, in theory, be used to encode lightweight web apps within QR codes, unlocking new possibilities for ultra-portable software delivery.

The developer chronicled his journey on the MindDump blog, where he explained the absurd premise – running code within a 3KB QR code – alongside the origin of the idea and the detailed process behind creating Backdooms. Notably, Mehta had to rely on a technique called minification – or in this case, extremely aggressive minification – to squeeze a functional HTML program into such a tiny space. This compressed code generates graphics, Doom-like corridors, enemies to shoot at, and even music.

A breakthrough came when Mehta received a helpful hint from a chatbot, which suggested using DecompressionStream – a little-known Web API available in all modern browsers. Thanks to this component, the Backdooms code can be dynamically decompressed and executed directly in the browser. The game can be played on desktops, smartphones, and potentially other devices via a link or by scanning the QR code available on the project's GitHub page.

Though only loosely related to Doom, Backdooms keeps the "Can it run Doom?" tradition alive. Developers continue to push the boundaries of where the open-source FPS engine can run. Recent feats include running Doom on a Collector's Edition game box, inside TypeScript's type system, within a Microsoft Word document, and even directly on a GPU.
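To make the trick concrete, here's a minimal sketch of the self-extracting pattern Backdooms relies on – not Mehta's actual code. It assumes the compressed HTML travels as a base64-encoded gzip payload in the URL fragment; the function name and payload location are illustrative.

```typescript
// Inflate a base64-encoded gzip payload back into HTML using the
// DecompressionStream Web API (available in all modern browsers).
async function inflate(base64Payload: string): Promise<string> {
  // Decode base64 into the raw compressed bytes carried by the URL.
  const bytes = Uint8Array.from(atob(base64Payload), (c) => c.charCodeAt(0));

  // Stream the bytes through the browser's built-in gzip decompressor.
  const stream = new Blob([bytes])
    .stream()
    .pipeThrough(new DecompressionStream("gzip"));

  return await new Response(stream).text();
}

// Hypothetical usage: a tiny stub page reads everything after the "#"
// and replaces itself with the decompressed game.
inflate(location.hash.slice(1)).then((html) => {
  document.open();
  document.write(html);
  document.close();
});
```

The appeal of this approach is that the QR code only has to hold a small bootstrap plus the compressed payload; the browser already ships the decompressor, so none of the roughly 3KB budget is spent on decompression code.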
-
New Nvidia GeForce hotfix driver addresses crashes and black screen issues

GeForce Hotfix Display Driver version 576.15 is based on our latest Game Ready Driver 576.02.

A GeForce driver is an incredibly complex piece of software. We have an army of software engineers constantly adding features and fixing bugs. These changes are checked into the main driver branches, which are eventually run through a massive QA process and released. Since we have so many changes being checked in, we usually try to align driver releases with significant game or product releases.

This process has served us pretty well over the years, but it has one significant weakness: sometimes a change that is important to many users might end up sitting and waiting until we are able to release the driver. The GeForce Hotfix driver is our way of trying to get some of these fixes out to you more quickly. These drivers are basically the same as the previously released version, with a small number of additional targeted fixes. The fixes that make it in are based partly on your feedback in the Driver Feedback threads and partly on how realistic it is for us to quickly address them. These fixes (and many more) will be incorporated into the next official driver release, at which time the Hotfix driver will be taken down.

To be sure, these Hotfix drivers are beta, optional, and provided as-is. They are run through a much-abbreviated QA process. The sole reason they exist is to get fixes out to you more quickly. The safest option is to wait for the next WHQL-certified driver. But we know that many of you are willing to try these out. These hotfix drivers represent a lot of additional work by our engineering teams; I hope they provide value for you. We'll try it out and see if people like the idea and want us to continue.

What's New: This hotfix addresses the following issues:
[RTX 50 series] Some games may display shadow flicker/corruption after updating to GRD 576.02 [5231537]
Lumion 2024 crashes on GeForce RTX 50 series graphics card when entering render mode [5232345]
GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]
[RTX 50 series] Some games may crash while compiling shaders after updating to GRD 576.02 [5230492]
[GeForce RTX 50 series notebook] Resume from Modern Standby can result in black screen [5204385]
[RTX 50 series] SteamVR may display random V-SYNC micro-stutters when using multiple displays [5152246]
[RTX 50 series] Lower idle GPU clock speeds after updating to GRD 576.02 [5232414]

For questions, please visit the FAQ: Nvidia DCH/Standard Display Drivers for Windows FAQ
-
WWW.TECHSPOT.COM
The Oblivion remake is real, it's gorgeous, and it's out now

At last: Get ready to close the gate of Oblivion. Bethesda just made it official: The Elder Scrolls IV: Oblivion Remastered launched today. Much of the game has been rebuilt from scratch, so it's not just a cosmetic refresh – it's got a modernized UI, streamlined leveling, and much more.

A massive leak last week revealed almost everything fans wanted to know about the long-rumored Oblivion remake. The cache included screenshots and side-by-side comparisons, and an Xbox Support representative even let it slip that the game would launch on April 21. Well, it's a day late, but Bethesda greeted us this morning with a live feed officially revealing the reboot, and from the looks of it, all the rumors were true aside from the release date. Even some of the side-by-side comparisons appear to have come straight from Bethesda's presentation (masthead).

However, Bethesda managed to throw us a few surprises. The most pleasant is that TES: Oblivion Remastered is available as of "right now." The April release is somewhat surprising – May seemed more likely, but maybe that was just me being pessimistic.

What's more surprising is that it is available on most platforms, including PlayStation 5. Considering Microsoft has kept most of Bethesda's newer titles away from its biggest rival, it is remarkable that it didn't at least make it a timed exclusive. Xbox may view the Oblivion remake differently than new releases like Starfield and TES VI, which are exclusives (for now). Whatever the case, it's a smart move – millions of PS5 owners will snap this title up, substantially boosting sales.

Also read: 26 Years of The Elder Scrolls

On appearances alone, there is little reason for fans not to pick up this carefully done remaster, unless it turns out to be buggy – a real possibility given Bethesda's track record. It is one of the finer makeovers I have seen recently. It looks gorgeous. I've included screenshots throughout, but please do check out the live footage in the masthead – stills just don't do it justice.

The game didn't just get a new coat of paint. Design studio Virtuos rebuilt all models and environments from scratch. Virtuos said it used the Oblivion game engine as the heart of the game, while Unreal 5 produced the stunning visual aesthetic and special effects. "We've leveraged nearly every major feature from the latest version of Unreal 5," said Virtuos Executive Producer Alex Murphy. Utilizing Unreal Engine clearly paid off in spades.

The only real question is, did the developers give the old Oblivion engine any love? I recall revisiting Oblivion on the PS3 a few years ago and had to give up because the control scheme felt too clunky and outdated. Virtuos said that it updated a lot of gameplay elements, like the user interface and experience. Leveling is not as janky anymore – no more hopping around like a crazy rabbit just to level up that agility stat. However, Murphy failed to mention anything about the game controls. Overlooking control modernization would be a rookie misstep, so here's hoping that Virtuos remembered something so simple yet fundamental to the player experience.

Some fans in the forums questioned whether Bethesda would include the two Oblivion DLCs, Knights of the Nine and Shivering Isles, or if it would split them off to sell separately and push a deluxe bundle. Good news: Oblivion Remastered includes all original DLC.
Bad news (depending on how you view it): there is a deluxe version, which offers two weapon and armor skins, a digital artbook, and the soundtrack. The Standard Edition is $50, while the Deluxe costs $60. If you don't want to commit to the deluxe bundle, you can always upgrade later for $10.

The Elder Scrolls IV: Oblivion Remastered is available on PC through Steam, Xbox Series X|S, and PlayStation 5.

I'm anxious to hear early reviews, especially from our readers. It will also be interesting to see if this opens the doors for other TES remasters. Morrowind, anyone?
-
WWW.TECHSPOT.COM
New SD Express 8.0 cards double the speed of today's fastest microSDs

Forward-looking: Most highly recommended microSD cards offer maximum read speeds of around 250 MB/s, but the impending release of the Nintendo Switch 2 has increased demand for significantly faster memory cards. Just as microSD Express technology begins to gain mainstream acceptance, one vendor has introduced cards on a newer standard that doubles the theoretical performance.

Adata has unveiled a new performance tier for SD Express cards, boasting a 1.6 GB/s maximum read speed and a 1.2 GB/s write speed – roughly double the fastest models currently available. Although the company hasn't disclosed release details, users shopping for Nintendo Switch 2 memory cards will have another, faster (and likely more expensive) option.

The SD Express standard was introduced in 2018 with version 7.0, offering read speeds of up to 1 GB/s by leveraging NVMe SSD technology. However, because few portable devices required such high transfer rates, SD Express languished in obscurity for years, while most users and manufacturers stuck with more affordable and established options.

Also read: microSD and SD Card Buying Guide

Nintendo's upcoming Switch 2 handheld could change that when it launches on June 5. It is expected to be the first mass-market device to require microSD Express cards, and stores across Japan have already reported selling out of them. Games purchased on physical Switch 2 game cards or installed on memory cards will benefit from significantly faster load times compared to the original Switch. For example, Nintendo recently demonstrated that The Legend of Zelda titles can load new areas more than twice as quickly on the Switch 2, taking just a few seconds.

[Image: Lexar microSD Express card]

Online retail listings show that only SanDisk and Lexar currently offer microSD Express cards, featuring read speeds of around 900 MB/s. Adata has raised the bar with the new SD 8.0 standard, although it remains unclear how quickly other vendors will follow suit. Adata also has yet to reveal pricing details, but these next-gen memory cards are unlikely to be cheap. SanDisk's 256 GB SD 7.0 cards start at $60, while Lexar's 512 GB models retail for around $100.

In addition, Adata recently announced several new flash memory and SSD products. The UE720 is a USB 3.2 Gen2 flash drive with read and write speeds of 500 MB/s and 450 MB/s, respectively, available in capacities up to 256 GB. The company's new EC680 M.2 SSD enclosure uses a USB 3.2 Gen2x1 interface and a Type-C connector to achieve read/write speeds of approximately 1,050/1,000 MB/s. It supports 2230, 2242, and 2280 form factors.
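To put those interface numbers in perspective, here's a rough back-of-envelope comparison in TypeScript. The 20 GB install size is an arbitrary example of ours, and the figures are theoretical sequential-read maxima rather than real-world results:

```typescript
// Best-case time to read a 20 GB game install at each card's quoted peak speed.
const speedsMBps: Record<string, number> = {
  "Typical UHS-I microSD": 250,
  "microSD Express (SD 7.0)": 900,
  "Adata SD Express 8.0": 1600,
};

const installMB = 20 * 1000; // 20 GB, using decimal units as card vendors do

for (const [card, mbps] of Object.entries(speedsMBps)) {
  console.log(`${card}: ~${(installMB / mbps).toFixed(0)} s`);
}
// Prints roughly 80 s, 22 s, and 13 s respectively.
```

Real-world load times also depend on the host interface and access patterns, so treat these as upper bounds on the benefit.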
-
WWW.TECHSPOT.COM
New study reveals cybersecurity threats in next-generation DNA sequencing

A hot potato: Next-generation DNA sequencing (NGS) faces mounting scrutiny over its cyber vulnerabilities. While NGS has revolutionized fields ranging from cancer diagnostics to infectious disease tracking, a new study warns that the systems enabling these advances could also be exploited as a gateway for hackers and malicious actors.

The research, published in IEEE Access and led by Dr. Nasreen Anjum of the University of Portsmouth's School of Computing, is the first to systematically map cyber-biosecurity threats across the entire NGS workflow. NGS technology, which allows for rapid and cost-effective sequencing of DNA and RNA, underpins not only cancer research and drug development but also agricultural innovation and forensic science. Its ability to process millions to billions of DNA fragments simultaneously has dramatically lowered the cost and increased the speed of genome analysis, making it a staple in labs worldwide.

However, the study highlights a less-discussed side of this technological leap: the growing number of vulnerabilities at each stage of the NGS pipeline. From sample preparation to sequencing and data analysis, each step relies on specialized instruments, complex software, and networked systems. According to Dr. Anjum, these interconnected processes create multiple points where security can be breached.

As vast genomic datasets are increasingly stored and shared online, the risk of cybercriminals accessing and misusing this sensitive information grows. The study warns that such breaches could enable not only privacy violations or identity tracing but also more sinister possibilities, such as data manipulation or the creation of synthetic DNA-encoded malware. "Protecting genomic data isn't just about encryption – it's about anticipating attacks that don't yet exist," Dr. Anjum said, calling for a fundamental rethink in how the field approaches security.

The research was conducted with experts from Anglia Ruskin University, the University of Gloucestershire, Najran University, and Shaheed Benazir Bhutto Women's University. The team identified several emerging threats, including AI-driven manipulation of genomic data and advanced re-identification techniques that could compromise individual privacy. These risks, they argue, extend beyond the individual to threaten scientific integrity and even national security. Despite these dangers, Dr. Anjum notes that cyber-biosecurity remains a neglected area, with fragmented protections and little collaboration between the disciplines of computer science, bioinformatics, biotechnology, and security.

To address these challenges, the study recommends a suite of practical solutions: secure sequencing protocols, encrypted data storage, and AI-powered anomaly detection systems. The authors urge governments, regulatory bodies, and academic institutions to prioritize investment in research, education, and policy development to close the current gaps in biosecurity.

The urgency of these recommendations is heightened by the rapid drop in sequencing costs and the proliferation of NGS applications. Where sequencing a human genome once cost tens of thousands of dollars, some companies now offer the service for as little as $200, with prices expected to fall further. This affordability has democratized access to genomic data and expanded the attack surface for potential cyber threats.
-
WWW.TECHSPOT.COM
Microsoft warns AI is making it faster and easier to create online scams

In brief: It seems one profession that really loves generative AI is that of the cybercriminal. Microsoft warns that the technology has evolved to the point where creating an online scam can now take minutes rather than days or weeks, and requires little technical knowledge.

In its latest edition of the Cyber Signals report, Microsoft writes that AI has started to lower the technical bar for fraud and cybercrime actors looking for their own productivity tools. The range of cyber scams AI can be used for is extensive. The tools can, for example, help create social engineering lures by scanning and scraping the web to build detailed profiles of employees or other targets. There are also cases of complex fraud schemes that use AI-enhanced product reviews and AI-generated storefronts, with scammers creating entire sham websites and fake e-commerce brands, complete with fabricated business histories and customer testimonials. Scammers can even use AI for customer service chatbots that can lie about unexplained charges and other anomalies.

It's long been reported that advancing deepfake technology is making it a popular tool for scammers. It has been used to create fake celebrity endorsements and impersonate friends and family members, and, as Microsoft notes, to conduct job interviews – on both the hiring and applying side – via video calls. The company notes that lip-syncing delays, robotic speech, or odd facial expressions are giveaway signs that the person on the other end of a video call might be a deepfake.

Microsoft recommends that consumers be wary of limited-time deals, countdown timers, and suspicious reviews. They should also cross-check domain names and reviews before making purchases, and avoid payment methods that lack fraud protections, such as direct bank transfers and cryptocurrency payments.

Tech support scams are also on the rise. While AI doesn't always play a part in these incidents, tech support scammers often pretend to be legitimate IT support from well-known companies and use social engineering tactics to gain the trust of their targets. The Windows Quick Assist tool, which lets someone use a remote connection to view a screen or take it over to fix problems, is regularly used in these scams. As such, Microsoft is adding warnings to Quick Assist and now requires users to check a box acknowledging the security implications of sharing their screen. Microsoft also recommends using Remote Help instead of Quick Assist for internal tech support.

While the report focuses on the dangers of AI scams, it also notes that Microsoft continues to protect its platforms and customers from cybercriminals. Between April 2024 and April 2025, Microsoft stopped $4 billion worth of fraud attempts, rejected 49,000 fraudulent partnership enrollments, and blocked about 1.6 million bot signup attempts per hour.