• ICO launches major review of cookies on UK websites
    www.computerweekly.com
The Information Commissioner's Office (ICO) has embarked on a major review of cookie usage and compliance across some of the UK's 1,000 most-frequented websites, as it prioritises giving consumers more choice and confidence in how their data is collected, stored and used.

The regulator has already investigated the top 200 UK websites and said it found concerns with 134 of them (67%), which have been communicated to their owners.

The ICO said it wanted to set out clearly the expectation that website operators must comply with data protection law by giving users meaningful choices and control over how their data is used.

"Uncontrolled tracking intrudes on the most private parts of our lives and can lead to harm. For example, gambling addicts being targeted with more betting ads due to their browsing history, or LGBTQ+ people altering their online behaviour for fear of unintended disclosure of their sexuality," said ICO executive director of regulatory risk Stephen Almond.

"Our ambition is to ensure everybody has meaningful choice over how they are tracked online, and what we're publishing today sets out how we intend to achieve that.

"Last year, we saw significant improvements in compliance among the top 200 websites, in what was a promising step forward for the industry. Now, we are expanding our focus to the top 1,000 websites, and beyond that to apps and connected TVs.

"We'll continue to hold organisations to account, but we're also here to make it easier for publishers to adopt compliant, privacy-friendly business models."
"By combining advice, guidance and targeted enforcement, we aim to create an environment where businesses can succeed and people can have trust and control over their online experiences," said Almond.

The concept of giving end users meaningful control is one to which the ICO is cleaving in its 2025 strategy, through which it hopes to address the significant harm that can occur to ordinary people when online tracking practices are abused or misused.

Complementing this strategy are a number of new measures to support businesses in adopting privacy-friendly practices and business models. These include the publication of draft guidance on tracking people using storage and access technologies such as cookies and fingerprinting; final guidance on the use of so-called consent-or-pay business models, to help businesses balance tech innovation and revenue with data protection law; and potential reforms to support the use of new, privacy-preserving adtech, such as contextual models.

The final guidance on consent-or-pay models, which is now available for organisations to review via the ICO's website, covers the practice of offering a choice between agreeing to receive personalised, targeted advertising in order to access a service for free, or paying for the service to avoid those adverts.

In essence, it clarifies how organisations can use these models to give website visitors meaningful control while still supporting their own economic viability. It includes a set of best practices against which organisations should be prepared to assess their models, to demonstrate that their users have freedom of choice and consent.

"Tracking should work for everyone," said Almond, "giving people clear choices and confidence in how their information is used, while enabling businesses to operate fairly and responsibly."
"Our strategy ensures both."

Read more about the ICO's work:
• The Open Rights Group is urging the Information Commissioner's Office to revise its light-touch approach to public sector data protection issues.
• An ICO tool is designed to make it easier for small businesses and sole traders operating online to create bespoke data privacy notices for compliance purposes.
• The UK's data protection watchdog joins forces with a law enforcement agency to provide more support for organisations that fall victim to cyber crime and ransomware attacks.
• Samsung unpacks Galaxy AI's personal data engine
    www.computerweekly.com
By Cliff Saran, Managing Editor
Published: 23 Jan 2025 14:00

The headlines are the Galaxy S25 line-up, but the hardware is powered by Samsung's ambition to develop an AI ecosystem.

The 165 minutes of slick videos and presentations at Wednesday night's Galaxy Unpacked event set out how the tech giant wants everyone to interact with artificial intelligence (AI). The mobile phone manufacturer used the event to unveil its latest S25 family of smartphone devices, powered by Galaxy AI.

The devices are built around the Snapdragon 8 Elite for Galaxy processor; Samsung said it worked with Qualcomm to develop unique customisations, which it claims deliver a performance boost of 40% in the NPU (for AI processing), 37% in the CPU (for improved application performance) and 30% in the GPU (graphics processing) compared with the previous generation.

From a data security perspective, the Galaxy S25 introduces post-quantum cryptography, which the company said safeguards personal data against emerging threats that could increase as quantum computing evolves. Its One UI 7 software has been updated with what Samsung describes as an extra, fortified layer of device safety designed for the age of AI and hyperconnectivity. There are also Maximum Restrictions settings, enhanced Theft Protection, and a new Knox Matrix dashboard to monitor the security status across a connected device ecosystem.

While presenters at the Unpacked event spoke about keeping personal data safe, the main focus was on Galaxy AI and how it makes the S25 more useful as a personal digital assistant. During his presentation, Drew Blackard, vice-president of product management, said Galaxy AI has redefined how people interact with technology by providing convenient experiences.

"Millions more have used Galaxy AI to experience deeper, more meaningful connections."
"This includes breaking down language barriers in real time, making conversations simpler and more efficient," he said.

DeepMind CEO and co-founder Demis Hassabis took to the stage, proclaiming: "AI will be agentic systems that are able to accomplish tasks and do useful things for you." He said Google has partnered with Samsung to bring some of its groundbreaking capabilities to the Gemini app.

Samsung's head of customer experience, Jay Kim, discussed the personalised AI that Samsung has developed. "Our goal is to enable personalised AI experiences by integrating AI agents and multimodal capabilities on all our devices," he said, adding that this requires open collaboration, which leads to better user experiences and helps grow the AI ecosystem with developers and partners.

On-device AI processes data locally, using what Samsung calls its Personal Data Engine, which is protected by Samsung's dedicated security hardware, Knox Vault.

Kim said: "The cloud AI will access your data only when it's necessary, and your data is deleted shortly after your request is completed. This means you don't have to worry about your information being used to train AI models or for advertising. Our secure hybrid AI approach keeps your data private and secure while delivering a cutting-edge, personalised AI."

Ian Horrocks, a professor of computer science at Oriel College, Oxford University, said: "At Oxford Semantic Technologies, we have been developing the technology behind the Personal Data Engine, known as knowledge graph technology. Using knowledge graphs allows computers to better understand information, and thereby draw more accurate conclusions and deliver more accurate insights."

According to Horrocks, this improves the usefulness of smartphones and other devices: "Until recently, smartphone services and applications have operated independently of one another, limiting their ability to provide insightful responses."
"The Personal Data Engine allows us to overcome this limitation by understanding a user's experience along with the surrounding context, and storing it in the user's personal knowledge graph."

The company has also chosen to implement the C2PA standard with the Galaxy S25 series. This adds a digital watermark to AI-generated work, and its origins are captured in metadata.

Discussing the Unpacked event, Ben Woods, principal analyst at CCS Insight, said: "At a time when improvements to hardware capabilities and product design are largely incremental, Samsung is doubling down on its AI story. AI is a boon for someone who needs an upgrade, but not enough to move the needle for consumers who already have a relatively up-to-date phone."

Read more about on-device AI:
• Apple adopts ChatGPT to put the A in AI: at its worldwide developer conference, Apple updated macOS and iOS, making Siri more context-aware.
• A fifth of new PCs shipped in Q3 were AI-optimised: PC manufacturers are working hard to showcase the benefits of premium devices that use neural processing units to deliver on-device AI acceleration.
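The knowledge graph approach Horrocks describes can be pictured as a store of subject-predicate-object facts that an assistant chains together at query time. The following is a minimal, generic sketch in Python; the class, data and queries are hypothetical illustrations of the technique, not Samsung's Personal Data Engine or Oxford Semantic Technologies' actual API.

```python
# Minimal sketch of a personal knowledge graph as subject-predicate-object
# triples. All data here is made up; this shows the general technique only.

class KnowledgeGraph:
    def __init__(self):
        self.triples = set()  # each entry is a (subject, predicate, object) tuple

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        """Return triples matching the pattern; None acts as a wildcard."""
        return [
            t for t in self.triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)
        ]

kg = KnowledgeGraph()
kg.add("user", "attended", "meeting_with_anna")
kg.add("meeting_with_anna", "location", "cafe_mila")
kg.add("user", "prefers", "vegetarian_food")

# Linked facts let an assistant answer a contextual question such as
# "where did I meet Anna?" by chaining two lookups:
meeting = kg.query(s="user", p="attended")[0][2]
place = kg.query(s=meeting, p="location")[0][2]
print(place)  # cafe_mila
```

Chaining the "attended" fact to the "location" fact is what lets one question span data that previously lived in separate apps, which is exactly the limitation Horrocks says the knowledge graph removes.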
  • Samsung Galaxy S25 Ultra hands-on: Why I'm nearly sold on the flagship Android phone
    www.zdnet.com
Samsung's Big Bad Galaxy phone has arrived, and it's almost everything I expected. The Galaxy S25 Ultra, which starts at a bold $1,299 price tag, remains the company's do-everything handset, with a more durable display than ever, an upgraded camera system with some useful editing tricks, and the infamous S Pen stylus. Oh, and yes, AI is everything, everywhere, all at once.

Also: Everything announced at Samsung Unpacked 2025: Galaxy S25, Gemini AI, more

While last year's Galaxy S24 series ushered in Samsung's Galaxy AI era, this year feels more like the company cementing itself as the leader in the space, besting Apple and even Google (for now) on features and capabilities. Of course, how decorative that honor is ultimately depends on how invested you are in using AI.

I spent an intimate afternoon with the new Galaxy S25 phones, with a particular focus on the flagship Galaxy S25 Ultra, and wasn't too surprised to find myself equal parts fascinated and equal parts bored. After all, this is now Samsung's fifth iteration of the S-series Ultra model, and the changes year over year have been modest, to be generous. But if even a part of you is considering upgrading to the latest Galaxy flagship, here are the changes you should know about.

1. A slimmer, more durable design

The Galaxy S Ultra has always been a big-phone, two-handed affair, and that holds true this year. In the hand, the S25 Ultra feels like the fanciest calculator ever (and I say that in a nice way), with its squared-off edges and tall stature.

Fortunately, the device is not as top-heavy as I remember previous models being. In fact, the S25 Ultra is lighter than its predecessor by 16 grams, and Samsung says it's the company's thinnest-ever S-series device.
That is, until the Galaxy S25 Edge releases.

Also: I went hands-on with the Samsung Galaxy S25 - and the AI features were surprisingly polished

The other notable design upgrade is the Corning Gorilla Armor 2 layered over the front display. It doesn't take away from the anti-reflection surface treatment that made headlines on last year's S24 Ultra, yet promises improved drop protection and scratch resistance. I remember my S24 Ultra picking up numerous micro-scratches after just a week of use, so I'm hopeful the S25 Ultra will hold up better this time around.

Now, if only Samsung could bring back the glory days of psychedelic Aura Glow finishes. These S25 Ultra colors are not it.

2. The best AI phone by Samsung (and Google)

Under the hood is a new Qualcomm Snapdragon 8 Elite chip, the same processor powering competing Android phones like the OnePlus 13 and Honor Magic 7 Pro. The most notable benefit of the chipset comes in the form of AI experiences, specifically a 40% increase in NPU performance compared to last year's Snapdragon 8 Gen 3.

In layman's terms, on-device AI features should take up less bandwidth and power to operate while also running more quickly. I saw the results in person, placebo effect or not, as my S25 Ultra demo unit swiftly removed subjects (along with their shadows) from the background of photos via Generative Edit and pulled up Gemini search results in a matter of seconds.

Also: I may finally ditch my Google Pixel for a Samsung Galaxy phone in 2025. Here's why

You can also chalk up the performance to the enhanced Gemini app, which can now be fired up with a long press of the power button. Previously, Samsung phones were limited to Bixby, or Google Assistant if you made some higher-level, backend modifications.

The new Gemini app can handle natural-sounding, conversational chains of actions, like telling the AI assistant to look up the upcoming schedule of your favorite sports team and add the date(s) to your calendar.
While the agent-like capability mainly works across Google and Samsung apps, some third-party services like Spotify are supported, too.

Considering last year's S24 series introduced Google's Circle to Search feature, and this year's S25 series is doing the same for the enhanced Gemini experience, it wouldn't be farfetched to say that Samsung, not Google, makes the best AI phone for Android users right now.

3. Camera features fit for professionals

One of the key reasons to buy a Galaxy Ultra phone is still the camera, though Samsung hasn't improved the specifications much this year, at least on paper. The only hardware change this year is the 50-megapixel ultrawide lens (up from 12MP), which yields greater detail and vividness, especially when capturing macro photos. The rest of the lenses, such as the 200MP wide, 50MP 5x telephoto, and 10MP 3x telephoto, are identical to last year's S24 Ultra. Boo.

Also: The best Android phones to buy in 2025

There's a bigger focus on the software side of things, with Samsung introducing helpful photo and video features like 10-bit HDR video recording by default to capture a wider dynamic range, a new Audio Eraser feature that lets you adjust the volumes of various frequencies (classified as audio subjects like voices, wind, and noise), and the ability to record in Galaxy Log mode.

The latter feature allows you to better spot overexposed areas and adjust the dynamic range as you're recording video, while also applying LUTs in post for better color and light control. How all of these new features fare in the real world is what I'm most interested in seeing. I'll be testing the S25 Ultra over the next few weeks, so stay tuned for updates.

The reason to skip the Galaxy S25 Ultra

This one's fairly obvious, but with how enticing Samsung's and other major retailers' offers can be, let this be a reminder that you don't have to upgrade your phone every year or two.
When I first tried the S25 Ultra, I was quite surprised by how similar it felt overall to older flagships like the S22 and S23 Ultra.

Samsung has also confirmed that most of the new AI features, especially the ones embedded in One UI 7, will eventually trickle down to older Galaxy phones, so if the latest software is part of the reason you're considering upgrading, you should reconsider.

The Samsung Galaxy S25 Ultra is available for preorder today with a starting price of $1,299 for the variant with 12GB of RAM and 256GB of storage. You'll have a choice of Titanium Silverblue, Titanium Whitesilver, Titanium Gray, and Titanium Black if you're shopping through a major retailer or carrier, with more colorful options available if you buy directly from Samsung.
  • This iPhone power bank was an essential lifeline on the bustling streets of Marrakesh
    www.zdnet.com
    The Sharge CarbonMag 5K made my trip stress-free. I would have been completely lost without my iPhone.
• iOS 18.3 Warning: You Should Turn Off This New iPhone Setting ASAP
    www.forbes.com
Apple's iOS 18.3 is coming soon, with a bunch of new features and security updates for your iPhone. But iOS 18.3 also comes with a warning about a new setting that Apple has turned on by default, and you may want to switch it off.

The iPhone maker's AI-enabled Apple Intelligence launched in iOS 18.1, and Siri integration with ChatGPT followed in iOS 18.2. However, users currently have to turn this on to use the features on their iPhone.

Not any more: from iOS 18.3, due to arrive next week, Apple Intelligence will be on by default, according to Apple-focused website 9to5Mac.

The iOS 18.3 change is confirmed by Apple's beta release notes, which read: "For users new or upgrading to iOS 18.3, Apple Intelligence will be enabled automatically during iPhone onboarding. Users will have access to Apple Intelligence features after setting up their devices."

"To disable Apple Intelligence, users will need to navigate to the Apple Intelligence & Siri Settings panel and turn off the Apple Intelligence toggle," Apple said.

The iOS 18.3 move comes as figures show the AI-enabled features are not that popular with iPhone users. In fact, 73% of iPhone users said AI features add little to no value, according to a recent survey by Sellcell.

I asked Apple to comment on this iOS 18.3 news and will update this story if the iPhone maker responds.

Why You Might Want To Toggle This iOS 18.3 Feature Off

By its very nature, AI requires a lot of data to operate, and this automatically enabled feature in iOS 18.3 is no different. It goes further with ChatGPT integration, which can send data off to OpenAI, although Apple does get your permission before this happens.

Apple has lots of safeguards for data privacy and security when using Apple Intelligence, such as its Private Cloud Compute.
However, your device is still more secure and private when you don't have AI enabled.

"These algorithms need huge amounts of data to build upon and grow," says Jake Moore, global cybersecurity advisor at ESET. "Auto-enabling features by default is a surefire way to gain access to as much data as legally possible."

Default settings such as this iOS 18.3 one may still hand over private information under the radar and without your explicit knowledge, Moore warns. "Therefore, it is important to reduce or limit the release of personal information where possible," he says.

If you want to turn off Apple Intelligence after upgrading to iOS 18.3, go to Settings > Apple Intelligence & Siri on your iPhone and turn the toggle to Off.
  • Nvidia GeForce RTX 5090 Review
    www.techspot.com
Exciting times for us computer enthusiasts, as we can finally showcase the new GeForce RTX 5090 and the next generation of Nvidia GPUs, codenamed Blackwell, with the new flagship graphics card priced at $2,000.

It's been two years since Nvidia released the mighty GeForce RTX 4090, an insane $1,600 GPU that smashed the previous-generation flagship by a 60% margin; that is, it was 60% faster on average at 4K. This made it an extremely powerful and exciting option for high-end gaming, even if it was undeniably expensive.

So, what's on offer here, and how can Nvidia justify a $2,000 price tag for the RTX 5090?

Nvidia has faced some challenges this generation. While the RTX 50 series takes advantage of cutting-edge technologies such as PCI Express 5.0 and GDDR7 memory, the GPU is built using the same TSMC 4N process as the previous generation. Without improvements to the production node, significant performance gains would require an architectural overhaul, which isn't yet on the table.

RTX 4090 FE on the left, 5090 FE on the right

Key specifications (all five cards use the TSMC 4N process; core config is shaders / TMUs / ROPs):
• GeForce RTX 5090: $2,000 MSRP, released January 30, 2025. 750 mm² die, 21760 / 680 / 192, 96 MB L2, 2407 MHz boost, 32 GB GDDR7 at 28 Gbps, 512-bit bus / 1792 GB/s, 575 W total board power.
• GeForce RTX 4090: $1,600 MSRP, released October 12, 2022. 608.5 mm² die, 16384 / 512 / 176, 72 MB L2, 2520 MHz boost, 24 GB GDDR6X at 21 Gbps, 384-bit bus / 1008 GB/s, 450 W.
• GeForce RTX 5080: $1,000 MSRP, released January 30, 2025. 378 mm² die, 10752 / 336 / 128, 64 MB L2, 2617 MHz boost, 16 GB GDDR7 at 30 Gbps, 256-bit bus / 960 GB/s, 360 W.
• GeForce RTX 4080 Super: $1,000 MSRP, released January 31, 2024. 379 mm² die, 10240 / 320 / 112, 64 MB L2, 2550 MHz boost, 16 GB GDDR6X at 23 Gbps, 256-bit bus / 736 GB/s, 320 W.
• GeForce RTX 4080: $1,200 MSRP, released November 16, 2022. 379 mm² die, 9728 / 304 / 112, 64 MB L2, 2505 MHz boost, 16 GB GDDR6X at 22.4 Gbps, 256-bit bus / 717 GB/s, 320 W.

Therefore, Nvidia's solution was to create a bigger and more powerful GPU. The die is now 23% larger, featuring 33% more cores.
It comes equipped with 32 GB of 28 Gbps GDDR7 memory on a 512-bit memory bus, delivering a bandwidth of 1,792 GB/s, a hefty 78% increase over the RTX 4090.

The RTX 5090 is a powerhouse, but it comes with an even steeper price tag, making it 25% more expensive than the RTX 4090. Given that price increase, we expect it to deliver performance far beyond what the specs suggest.

RTX 4090 vs RTX 5090 Thermals

Before we dive into the blue bar graphs, let's take a look at how Nvidia's Founders Edition version of the RTX 5090 performs compared to the RTX 4090 FE card. For this comparison, we tested The Last of Us Part I at 4K with maxed-out settings.

After an hour of load inside an enclosed ATX case, the RTX 5090 reached a peak GPU temperature of 73°C, which is remarkable given how quiet and compact the card is. The fan speed peaked at 1,600 RPM and remained inaudible over our case fans, which are already very quiet.

The cores averaged a clock speed of 2,655 MHz, while GPU power averaged 492 watts. The memory temperature peaked at 88°C, with an operating frequency of 2,334 MHz, providing a transfer speed of 28 Gbps.

In comparison, the RTX 4090 FE model peaked at 68°C, with a memory temperature of 80°C and its fans spinning just below 1,500 RPM. Clearly, the RTX 5090 runs slightly hotter and louder. However, given that the RTX 5090 consumed, on average, 35% more power during testing and is a significantly smaller card, these results are nothing short of remarkable.

We are incredibly impressed with what Nvidia has achieved here. The RTX 5090 might be the most impressive graphics card we've ever seen. You would never guess, just by looking at it, how much thermal load this cooler can handle so efficiently. It's an outstanding achievement.
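Stepping back to the memory spec for a moment, the quoted 1,792 GB/s bandwidth follows directly from the bus width and per-pin data rate; a quick sketch of the arithmetic:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
# RTX 5090: 512-bit bus with 28 Gbps GDDR7.
rtx5090_bw = 512 / 8 * 28   # bus width in bytes times gigabits per second per pin
print(rtx5090_bw)           # 1792.0 GB/s

# RTX 4090: 384-bit bus with 21 Gbps GDDR6X.
rtx4090_bw = 384 / 8 * 21
print(rtx4090_bw)           # 1008.0 GB/s

# Generational uplift, matching the quoted 78% increase:
print(f"{rtx5090_bw / rtx4090_bw - 1:.0%}")  # 78%
```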
Now, let's see how it performs.

Test System Specs
• CPU: AMD Ryzen 7 9800X3D
• Motherboard: MSI MPG X870E Carbon WiFi (BIOS 7E49v1A23, ReBAR enabled)
• Memory: G.Skill Trident Z5 RGB DDR5-6000 [CL30-38-38-96]
• Graphics cards: GeForce RTX 4070, 4070 Super, 4070 Ti, 4070 Ti Super, 4080, 4080 Super, 4090, 5090; Radeon RX 7700 XT, 7800 XT, 7900 GRE, 7900 XT, 7900 XTX
• ATX case: MSI MEG Maestro 700L PZ
• Power supply: MSI MPG A 1000G ATX 3.0 80 Plus Gold 1000W
• Storage: MSI Spatium 1TB M470 PCIe 4.0 NVMe M.2
• Operating system: Windows 11 24H2
• Display drivers: Nvidia GeForce Game Ready 566.36 WHQL, AMD Radeon Adrenalin 24.12.1

Gaming Benchmarks

Marvel Rivals

Starting with Marvel Rivals at 1440p, we see that the RTX 5090 delivers 30% more performance than the RTX 4090. While this is a decent performance improvement, factoring in the 25% price increase makes it considerably less exciting.

At 4K resolution, the margin increases slightly to 33%. This is a solid uplift, but the extreme price premium dampens the enthusiasm.

S.T.A.L.K.E.R. 2: Heart of Chornobyl

S.T.A.L.K.E.R. 2 isn't the most optimized game, and as a result, the RTX 5090 maxes out at 94 fps at 1440p. This makes it only 22% faster than the RTX 4090, offering a very mild performance gain.

At 4K, however, the RTX 5090 achieves a more reasonable 42% performance gain, rendering an average of 71 fps.

Counter-Strike 2

Next, we have Counter-Strike 2. At 1440p, the RTX 5090 is slightly slower than the RTX 4090, although the 1% lows are notably stronger. It's worth mentioning that the RTX 5090 was slower than the RTX 4090 at 1080p in multiple instances. This suggests a possible overhead issue with the Blackwell architecture, or perhaps the RTX 5090's large core count isn't being efficiently utilized at lower resolutions. Further investigation is needed here.

Even at 4K, the RTX 5090 only offers an 8% performance increase over the RTX 4090.
The issue doesn't appear to be a CPU bottleneck, given the higher frame rates observed at 1440p.

God of War Ragnarök

Performance in God of War Ragnarök is outstanding at 1440p, hitting 268 fps on the ultra preset. However, this is only 22% faster than the RTX 4090, which is disappointing given the 25% higher cost.

At 4K, the RTX 5090 scales better, achieving a 36% performance improvement with 195 fps compared to 143 fps on the RTX 4090, a much more favorable result.

Delta Force

In Delta Force, the RTX 5090 provides just 17% more performance than the RTX 4090 at 1440p. However, frame rates here are extreme and likely approaching a CPU bottleneck.

At 4K, the margin extends to 27%, rendering 160 fps. While this is an improvement, it's still not an impressive uplift, especially considering the 25% higher price and the two-year gap between releases.

Warhammer 40,000: Space Marine 2

Space Marine 2 is a very CPU-limited game, and at 1440p we appear to be hitting the limits of the 9800X3D processor. Oddly, the RTX 5090 is 4% slower than the RTX 4090 here. As observed in other instances at 1080p, this could indicate an overhead issue or inefficiencies in workloads that limit the RTX 5090's performance.

At 4K, the RTX 5090 resolves this problem, delivering a 30% performance increase over the RTX 4090. While this is a decent uplift, it is undercut by the 25% price hike.

Star Wars Jedi: Survivor

In Star Wars Jedi: Survivor, the RTX 5090 delivers just a 14% improvement over the RTX 4090 at 1440p. However, with an average of 191 fps, performance remains impressive overall.

At 4K, the RTX 5090 crosses the 100 fps threshold with 102 fps, making it 21% faster than the RTX 4090. Still, this is a disappointing margin given the higher cost.

A Plague Tale: Requiem

In A Plague Tale: Requiem, the RTX 5090 delivers a 21% performance improvement over the RTX 4090 at 1440p.
The results are partly CPU-limited, as suggested by similar 1% lows between the two GPUs.

At 4K, the RTX 5090 pulls ahead with a 42% performance uplift, making this one of the better margins seen in the benchmarks.

Cyberpunk 2077: Phantom Liberty

In Cyberpunk 2077: Phantom Liberty, the RTX 5090 struggles to deliver noteworthy gains at 1440p, with just a 19% improvement over the RTX 4090. The 1% lows are also similar, indicating other system limitations may be at play.

At 4K, the margin improves to 32%. While the overall performance is excellent, this result remains underwhelming. It's worth noting that the second-highest preset was used, and ray tracing was not enabled for this test.

Dying Light 2 Stay Human

Frame rates in Dying Light 2 using the high preset are extreme at 1440p, reaching 198 fps with the RTX 5090. However, this makes it only 24% faster than the RTX 4090.

Even at 4K, the performance gain remains modest at 25% over the RTX 4090, which scales directly with the 25% price increase.

Dragon Age: The Veilguard

In Dragon Age: The Veilguard, frame rates are limited to just under 130 fps at 1440p using the ultra preset, which selectively applies some ray tracing effects. While the focus of this portion of the review is on rasterization performance, ray tracing plays a role here.

When increasing the resolution to 4K, the RTX 5090 averages 96 fps, only 10% faster than the RTX 4090. This is a very disappointing result.

War Thunder

War Thunder runs at extremely high frame rates, even with the highest quality preset enabled. At 1440p, the performance is clearly CPU-limited, which we confirmed by testing at 1080p.

Moving to 4K removes the CPU bottleneck, but even then, the RTX 5090 is only 15% faster than the RTX 4090.
Granted, with frame rates well over 300 fps, performance is more than sufficient for gameplay, but in terms of relative performance, the RTX 5090 is underwhelming here.

Marvel's Spider-Man Remastered

Marvel's Spider-Man Remastered is heavily CPU-limited at 1440p, with both the RTX 4090 and RTX 5090 capped at 222 fps.

At 4K, the CPU bottleneck is mostly removed, but the RTX 5090 still appears slightly limited, averaging 212 fps. As a result, the RTX 5090 is just 26% faster than the RTX 4090.

Hogwarts Legacy

Hogwarts Legacy is another title that is mostly CPU-limited at 1440p, resulting in similar performance between the RTX 4090 and RTX 5090.

Increasing the resolution to 4K allows the RTX 5090 to pull ahead, delivering a 31% performance improvement. While the performance is excellent overall, the value remains questionable.

The Last of Us Part I

In The Last of Us Part I, the RTX 5090 provides a solid performance uplift at 1440p, where it is 28% faster than the RTX 4090, averaging 204 fps. This results in excellent overall performance.

At 4K, the RTX 5090 offers a 40% performance increase, averaging 125 fps. This is a strong result, especially when compared to most other titles.

Star Wars Outlaws

The RTX 5090 achieves over 100 fps in Star Wars Outlaws at 1440p using the ultra preset. With ray tracing forced on, the RTX 5090 is 22% faster than the RTX 4090.

Oddly, the margin decreases at 4K, where the RTX 5090 is just 19% faster than the RTX 4090. Typically, we expect the RTX 5090 to show greater advantages at higher resolutions, but that isn't the case here.

Starfield

Finally, in Starfield, the RTX 5090 is only 4% faster than the RTX 4090 at 1440p using ultra-quality settings, limiting performance to 125 fps.

At 4K, the RTX 5090 improves slightly, but is still just 7% faster than the RTX 4090.
There seems to be a limitation in this title that prevents the RTX 5090 from delivering the margins seen in other games at 4K.

Performance Summary

Although we did not include 1080p data for individual games, here are the average results across the 17 games tested. As seen, both the RTX 4090 and RTX 5090 are heavily CPU-limited at this resolution, making them ideal for CPU benchmarking rather than GPU evaluation.

Even at 1440p, the RTX 5090 is often heavily limited by the CPU, resulting in just a 12% performance improvement over the RTX 4090 across the 17 games tested.

At 4K, we can see the potential of the GeForce RTX 5090: it delivers an average performance improvement of 27%. That looks solid on raw numbers, but it's somewhat disappointing from a value perspective, considering it costs 25% more than the 4090. This is why we've been joking internally, calling it the "4090 Ti", as it really feels like that's what it is.

Even if the RTX 5090 maintained the same $1,600 MSRP as the RTX 4090, it would still feel underwhelming as a next-generation flagship GPU. For comparison, the RTX 4090 was on average 60% faster than the RTX 3090 Ti, while launching at a lower price. It was also 73% faster than the RTX 3090, with only a 7% price increase. By comparison, the RTX 5090's performance and value fall far short of expectations for a generational leap.

Power Consumption

Now, let's look at power consumption. Most of our power data was recorded at 1440p, which is not ideal for measuring the full power usage of the RTX 5090, but we supplemented this with additional tests for clarity. In Starfield at 1440p, the RTX 5090 increased power consumption by 12% compared to the RTX 4090.

In Star Wars Outlaws, we observed a 17% increase in power usage at 1440p, rising from 532 watts to 624 watts.
Interestingly, in Space Marine 2, where the RTX 5090 performed worse than the RTX 4090 at 1440p, power consumption decreased by 15%, demonstrating that the RTX 5090 is highly efficient when not operating at full load.

To better evaluate power usage, we re-tested the Radeon RX 7900 XTX, RTX 4090, and RTX 5090 at 4K in three games where the RTX 5090 performed well: Dying Light 2, Cyberpunk 2077, and A Plague Tale: Requiem.

In these tests, the RTX 5090 increased power consumption by 37-41%, depending on the game. These results align more closely with the performance gains seen in these titles. Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption.

Finally, we re-ran those same power tests with a 60 fps cap, which yielded some interesting results. In A Plague Tale: Requiem, power consumption for the RTX 5090 was nearly identical to the RTX 4090, with just a 2% increase. In Cyberpunk 2077, the RTX 5090 showed an 8% increase, while in Dying Light 2, it consumed 15% more power.

Ray Tracing Performance

RT - Metro Exodus Enhanced

Metro Exodus Enhanced remains one of the few ray tracing games that provides a truly transformative experience with ray tracing enabled, so we felt it was important to include.

As a side note before we show you the results: we've encountered issues testing Metro Exodus Enhanced with Radeon GPUs of late. While the game has worked in the past, enabling ray tracing now causes system crashes with Radeon GPUs, regardless of whether AMD or Intel systems are used. AMD has replicated the problem and is aware of the issue, but unfortunately, a fix was not available in time for this review. As a result, we decided to exclude Radeon data and focus solely on RTX 4090 and RTX 5090 performance.

At 1080p, the RTX 5090 was 21% faster than the RTX 4090, and at 1440p, the margin increased to 33%.
We did not test 4K ray tracing performance, as most titles deliver poor and often unplayable performance at that resolution, even with upscaling. However, Metro Exodus Enhanced would likely perform well on both the RTX 4090 and RTX 5090.

RT - Alan Wake II

In Alan Wake II, with quality upscaling enabled, the RTX 5090 was just 19% faster than the RTX 4090 at 1080p. Moving to 1440p did not significantly improve the results, with the RTX 5090 showing only an 18% performance gain. Overall, these are weak gains for the RTX 5090, and even with ray tracing enabled, performance only just breaks the 100 fps barrier.

RT - Cyberpunk 2077: Phantom Liberty

Using the ultra ray tracing preset with quality upscaling, Cyberpunk 2077: Phantom Liberty shows the RTX 5090 performing comparably to the RTX 4090 at 1080p, likely due to CPU limitations. At 1440p, the RTX 5090 pulls ahead slightly, offering an 11% performance increase with an average of 129 fps.

RT - Marvel's Spider-Man Remastered

In Marvel's Spider-Man Remastered, performance is heavily CPU-limited at both 1080p and 1440p. This is problematic, as frame rates are capped at 128 fps at 1440p, a limit achieved even by the RTX 4080 Super. While 4K benchmarks might provide more insight, the 128 fps cap at lower resolutions is concerning. Although this is solid performance overall, for those with high-refresh-rate monitors, it may not be enough.
Furthermore, it's unlikely that many users spending $2,000 or more on a graphics card would settle for gaming at 60 fps, which is what would likely occur at 4K without upscaling.

RT - Dying Light 2 Stay Human

In Dying Light 2, using the high ray tracing preset with quality upscaling, the RTX 5090 achieved an average of 208 fps at 1080p, making it 18% faster than the RTX 4090. At 1440p, where CPU limitations are not a factor, the RTX 5090 was only 22% faster than the RTX 4090, an underwhelming result given the price premium.

RT - Black Myth: Wukong

With the very high ray tracing preset, the RTX 5090 delivered 123 fps at 1080p with quality upscaling, providing a 34% performance improvement over the RTX 4090. At 1440p, the RTX 5090 maintained a similar margin, being 36% faster and rendering an average of 98 fps. While this is a reasonable step forward relative to past products, the overall performance remains less impressive, especially since upscaling is required.

Ray Tracing Performance Summary

We used a five-game average for the ray tracing data, since Metro Exodus Enhanced had to be excluded due to the issues with Radeon GPUs. On average, the RTX 5090 was 14% faster than the RTX 4090 at 1080p with upscaling. At 1440p, the RTX 5090 showed an average performance increase of just 17%. Notably, even with upscaling, the average frame rate at 1440p was just 123 fps, far from impressive for a graphics card priced at $2,000.

Cost per Frame

Here's how the current and previous-generation mid-range to high-end GPUs compare in terms of value, based on MSRP. At $2,000, the RTX 5090 offers only a 1.5% improvement in value per frame compared to the RTX 4090. In other words, after more than two years, there's no meaningful improvement in cost per frame. The RTX 5090 is essentially just a faster RTX 40 series GPU.

If we consider the best retail pricing for mid-2024 and assume the RTX 5090 will sell for $2,000, the value proposition looks slightly better.
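The cost-per-frame figure follows directly from list prices and the average 4K uplift; a minimal sketch (frame rates normalized to the RTX 4090, since only the ratios matter) lands at roughly 1.6%, in line with the ~1.5% figure above given rounding:

```python
def cost_per_frame(price: float, fps: float) -> float:
    """Dollars paid per average frame per second."""
    return price / fps

# Normalize the RTX 4090 to 100 fps; the RTX 5090 averaged 27% faster at 4K
cpf_4090 = cost_per_frame(1599, 100)
cpf_5090 = cost_per_frame(1999, 127)

# How much less you pay per frame with the newer card, as a percentage
value_gain = (cpf_4090 - cpf_5090) / cpf_4090 * 100
print(f"{value_gain:.1f}% better value per frame")
```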
However, realistically, do we believe the RTX 5090 will actually sell for $2,000? Probably not. If anything, the retail price is likely to climb higher, making the value situation even worse. At $2,000, the RTX 5090 already represents poor value, and anything higher would make it an even tougher sell.

What We Learned: It's the World's Fastest Gaming GPU, But...

The GeForce RTX 5090 is now the world's fastest gaming GPU, no surprise there. What is shocking, however, is that in our testing, it was on average just 27% faster than the RTX 4090 at 4K, while costing at least 25% more. This is why we've referred to it as the RTX 4090 Ti, because, let's be honest, that's exactly what it is. Nvidia has tried to disguise this by marketing DLSS 4 multi-frame generation as a game-changing feature, akin to dangling a shiny set of keys to distract gamers.

Speaking of DLSS 4, we haven't mentioned frame generation much in this review, despite Nvidia heavily promoting it as a key feature of the GeForce 50 series. This omission might seem odd, but we believe frame generation deserves a separate, dedicated analysis, and we're already working on an in-depth DLSS 4 review that will explore the technology in greater detail. The reason we tackle topics like frame generation and upscaling separately is that testing these features properly is complex. It's less about frame rates and more about image quality and, in the case of frame generation, latency.

To summarize briefly, frame generation doesn't deliver what Nvidia's marketing claims. It's not a true performance-enhancing feature; you're not genuinely going from 60 fps to 120 fps.
Instead, you're getting the appearance of smoother gameplay, albeit with potential graphical artifacts, but without the tangible benefits of higher frame rates, such as improved input latency.

That doesn't mean frame generation is useless or that it's not a good technology. It can be helpful in certain scenarios, but Nvidia has weaponized the feature to mislead consumers, making claims like the upcoming RTX 5070 being faster than the RTX 4090, which is fundamentally untrue.

We also strongly believe that showcasing frame generation performance in benchmark graphs is misleading. And while Nvidia would love for us to do just that, we see this as a slippery slope for gamers: a race to the bottom, where winning benchmarks would become about who can spit out the most interpolated frames, input and visual quality be damned.

As it stands, DLSS 3 and DLSS 4 frame generation are best described as frame-smoothing technologies. Under the right conditions, they can be effective, but they don't truly boost FPS performance. Moreover, they're entirely unsuitable for competitive shooters or fast-paced games where the goal of high frame rates is to reduce input latency. Nvidia's narrative that all gamers will or should use frame generation couldn't be further from reality.

Notes about CPU Pairing with the RTX 5090 and Ray Tracing

Moving on to CPU performance: it's clear from the 1440p data we gathered that anyone investing in an RTX 5090 needs a high-end CPU, such as the 9800X3D.
Even with the Zen 5 3D V-Cache processor, you'll frequently encounter CPU bottlenecks, especially if you aim for high refresh rates with ray tracing enabled.

Speaking of ray tracing, you're almost certainly going to find reviews where the RT performance of the RTX 5090 relative to the RTX 4090 is more impressive than what we saw for the majority of our testing, and this will come down to the quality settings used. Our testing focused on real-world scenarios that prioritize frame rates above 60 fps, as we believe most gamers spending $2,000 on a GPU won't settle for console-like frame rates.

To provide a bit more context: in Black Myth: Wukong, we tested at 1440p using DLSS quality upscaling, where the RTX 5090 delivered 98 fps, a 24% improvement over the RTX 4090. If we disable upscaling, which we feel most gamers using ray tracing won't do, the frame rate of the RTX 5090 drops to 64 fps, but it was then 45% faster than the RTX 4090, a far more impressive margin. This is comparable to what we see at 4K using DLSS upscaling, though again we're only gaming at around 60 fps, which some gamers will find acceptable, but I personally find less than desirable, especially when spending so much money.

Ultimately, the point is that the RTX 5090 can be 40-50% faster than the RTX 4090, depending on the game and settings. However, as demonstrated in this review, when targeting high frame rates, the difference is typically much smaller.

Bottom Line

All things considered, the GeForce RTX 5090 is an impressive performer that nonetheless falls short of meeting the expectations for a next-generation flagship GPU. It doesn't move the needle forward in terms of value or innovation and could easily fit into the GeForce 40 series lineup. If Nvidia had launched this as an RTX 4090 Ti, few would have batted an eye.

We understand that Nvidia couldn't do much given the limitations of the current process node.
However, they still could have delivered a more exciting product series. Even at $1,600, the RTX 5090 would have been far more appealing: still not amazing, but much better than it is now. Without a process node upgrade, this release doesn't come close to the vastly more significant leap we saw from the RTX 3090 to the RTX 4090. It's also clear that as Nvidia cements its position as the leader in AI hardware, GeForce has taken a back seat to the big money in AI (just check out this graph, it's insane).

We still expect the RTX 5090 to age well. While today's 27% average performance gain over the RTX 4090 is underwhelming, this margin will likely increase over time, potentially reaching 40% in more games.

Unfortunately, this also means the more affordable models in the GeForce RTX 50 series will probably be underwhelming, offering only minor performance gains over the GPUs they replace. Nvidia could have addressed this by providing better VRAM configurations. For example, 12 GB on the RTX 5070 is simply unacceptable; it should have at least 16 GB. If Nvidia had done this, the RTX 5070 might have been a worthwhile upgrade over the RTX 4070 and a much more significant step up from the RTX 3070.

For those looking for a more positive take, the good news is that the RTX 5090 is faster than the RTX 4090, pushing 4K gaming closer to high-refresh-rate experiences. If you already had oodles of money to blow on a graphics card and missed out on the RTX 4090, the RTX 5090 could be a great addition to your gaming setup.

In summary, the RTX 5090 is 25% more expensive than the RTX 4090, delivers an average of 27% more performance, includes 33% more VRAM, and consumes around 30% more power. Interpret that as you like.
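Those summary percentages follow directly from the spec-sheet numbers (list prices, VRAM capacities, and board power ratings); a quick check:

```python
def pct(old: float, new: float) -> int:
    """Generation-over-generation increase, rounded to a whole percent."""
    return round((new - old) / old * 100)

print(pct(1599, 1999))  # price: $1,599 -> $1,999, i.e. 25
print(pct(24, 32))      # VRAM: 24 GB -> 32 GB, i.e. 33
print(pct(450, 575))    # TDP: 450 W -> 575 W, i.e. 28 ("around 30%")
```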
For now, our review is complete, with a closer look at DLSS 4 coming soon. Let us know your thoughts on Nvidia's new flagship graphics card in the comments.

Shopping Shortcuts:
Nvidia GeForce RTX 5090 on Amazon
Nvidia GeForce RTX 5080 on Amazon (soon)
AMD Radeon RX 7900 XTX on Amazon
Nvidia GeForce RTX 4070 Ti Super on Amazon
Nvidia GeForce RTX 4070 Super on Amazon
AMD Radeon RX 7800 XT on Amazon
AMD Radeon RX 7900 XT on Amazon
  • Bing search results in Edge are obscuring Chrome links, promoting Microsoft's browser
    www.techspot.com
WTF?! Google and Microsoft have spent years engaging in dirty tricks campaigns designed to push people onto their respective browsers, Chrome and Edge. The latest tactic is one employed by the Windows maker: Edge hides Chrome's download links for some users when they perform a Bing search for the browser.

As noticed by Windows Latest, searches for Chrome using Edge and via Bing (when signed out of your Microsoft account) on Windows 11 result in a "promoted by Microsoft" banner appearing at the top of the search results. The banner is a recommendation by the Redmond firm, advising users there's no need to download a new web browser and highlighting that Edge offers a fast, secure, and modern web experience that saves time and money. It also comes with the obligatory "Try now" button.

Forcing obtrusive ads for its products down people's throats isn't new territory for Microsoft, of course. But this one arguably goes a little further by hiding the Chrome download links that are beneath the banner, and the small portion of the top Google result that is visible appears mostly blurred out.

(Image: Courtesy of Windows Latest)

It's easy to see the search results by clicking on the "See more" button further down the screen, and most people who do a search for Chrome likely intend to download it, no matter what Microsoft claims. However, less tech-savvy users may be persuaded by the banner's claims.

The other thing to note is that few people are likely to encounter this banner. Google has an almost 90% share of the global search engine market, whereas Bing has 4%. It's a similar story in the browser market: Chrome has a 68.3% share, Edge has just under 5%. It also appears that not everyone is seeing the banner. I couldn't get it to show, so it might be limited to a small set of users or certain locations.

Microsoft's war against Chrome goes back a long way.
Some examples of its pushiness include the company telling people in 2021 that the rival browser was "so 2008" and Edge was better. There were also full-size Edge ads that appeared on the Chrome website, and Edge was accused of stealing data from Chrome without users' consent in January.

Google isn't a stranger to using such tactics, either. The company shows prompts to Edge users recommending Chrome, and in 2020 it showed a message that read "Google recommends switching to Chrome to use extensions securely" whenever Edge users visited the Chrome Web Store, though Google quickly removed that message.
  • We now know why AMD chose to delay RDNA 4... well, kind of
    www.digitaltrends.com
AMD hasn't been very forthcoming when it comes to information about its RX 9000 series GPUs, but we just got an update as to why the cards won't be available until sometime in March. The company cites software optimization and FSR 4 as the two reasons why it most likely decided to delay the launch of RDNA 4. But is that all there is to it, or is AMD waiting to see some of Nvidia's best graphics cards before pulling the trigger on the RX 9070 XT?

The update comes from David McAfee, AMD's vice president and general manager of the Ryzen CPU and Radeon graphics division. A couple of days ago, McAfee took to X (Twitter) to announce that AMD was excited to launch the RX 9000 series in March. This caused a bit of an uproar, with many enthusiasts wondering why AMD was choosing to wait so long.

"I really appreciate the excitement for RDNA4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range..." - David McAfee (@McAfeeDavid_AMD), January 22, 2025

McAfee now explains that AMD is taking "a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles." While a bit vague, this confirms what many leakers have been saying: RDNA 4 is ready hardware-wise, but now AMD seems eager to improve it on the software side. While not a bad approach, it does feel like there might be more to it. After all, the GPUs have been spotted with preorders opening on January 22, and several retailers appear to already have them in stock.
McAfee admits as much, saying: "We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."

There are more signs pointing to the fact that AMD may have had other plans for the release date of RDNA 4. VideoCardz spotted a Reddit ad posted by the official AMD account, and the ad clearly states that the GPUs are available, saying: "When the stakes are high, every play counts - play now with the ultimate performance of AMD Radeon RX 9000 series graphics cards." It also gives us a closer look at the RX 9070 XT in its Made By AMD (MBA) design.

If late January was the initial plan for the RX 9000 series, it's clear that AMD pulled back. Greater availability for FSR 4 titles and improved drivers are both good reasons to delay, but it's possible that AMD might also want to see how Nvidia's RTX 5070 Ti will fare when it (most likely) hits the market in February.

There's one good thing here, though, and that's GPU availability. McAfee implies there'll be plenty of RDNA 4 cards to go around, which is great, especially considering that Nvidia's RTX 50-series might be very hard to come by at launch.
  • Nvidia RTX 5090 review: fast, but not nearly twice as fast
    www.digitaltrends.com
Nvidia GeForce RTX 5090
MSRP: $1,999.00

Nvidia is, once again, leaving its mark on the flagship throne with the RTX 5090.

Pros:
Unrivaled 4K gaming performance
Innovative, attractive Founder's Edition design
DisplayPort 2.1 and 4:2:2 encoding
32GB of memory for AI workloads
DLSS 4 is a treat...

Cons:
...when it works properly
Insanely expensive
Power requirements are off the charts

The RTX 5090 is a hard GPU to review. By the numbers, it's undoubtedly the best graphics card you can buy. That's what happens when you're the only one in town making this class of GPU, and as it stands now, Nvidia is. If you want the best of the best and don't mind spending $2,000 to get it, you don't need to read the rest of this review, though I'd certainly appreciate it if you did.

No, the RTX 5090 is about everything else that RTX 50-series GPUs represent. It delivers that flagship gaming performance, but it also ushers in an entirely new architecture, DLSS 4, and the era of neural rendering. And on those points, the dissection of the RTX 5090 is far more nuanced.

Jacob Roach / Digital Trends

The RTX 5090 is angled toward PC gamers who want the best of the best regardless of the price, but it's also the first taste we've gotten of Nvidia's new Blackwell architecture in desktops. The big change is neural rendering. With RTX 50-series GPUs, Nvidia is introducing neural shaders along with DirectX, though we won't see the fruits of that labor play out for quite some time.

For immediate satisfaction, Nvidia has DLSS 4. This feature is coming to all RTX graphics cards, replacing the convolutional neural network (CNN) that DLSS previously used with a new transformer model. Nvidia says this leads to a quality boost across the board.
For the RTX 5090, the more important addition is DLSS Multi-Frame Generation, which promises up to 4X frame generation in 75 games on day one. DLSS 4 is coming to all RTX graphics cards, but DLSS Multi-Frame Generation is exclusive to RTX 50-series GPUs, including the RTX 5090.

                     RTX 5090        RTX 4090
Architecture         Blackwell       Ada Lovelace
Process node         TSMC N4         TSMC N4
CUDA cores           21,760          16,384
Ray tracing cores    170 (4th-gen)   144 (3rd-gen)
Tensor cores         680 (5th-gen)   576 (4th-gen)
Base clock speed     2017MHz         2235MHz
Boost clock speed    2407MHz         2520MHz
VRAM                 32GB GDDR7      24GB GDDR6X
Memory speed         30Gbps          21Gbps
Bus width            512-bit         384-bit
TDP                  575W            450W
List price           $1,999          $1,599

Although it might seem like Nvidia could just flip a switch and enable DLSS Multi-Frame Generation on all of its GPUs, that's not exactly the case. Nvidia says with 4X frame generation and Ray Reconstruction enabled, there are five AI models running on your GPU for each rendered frame. To manage all of that, the RTX 5090 includes an AI management processor, or AMP, which handles scheduling of these different workloads across the ray tracing, Tensor, and CUDA cores.

Outside of AI hardware, the RTX 5090 brings 32GB of GDDR7 memory. Nvidia bumped up the capacity from 24GB on the RTX 4090, though that doesn't have a ton of applications in games. The extra memory really helps AI workloads, where training large models can easily saturate 32GB of memory. The bigger boost is GDDR7, which is twice as efficient as GDDR6 while providing twice as high a data rate.

Nvidia also redesigned its ray tracing and Tensor cores for Blackwell, both of which it says are built for the new Mega Geometry feature. The bigger standout for me is the media encoding engine, however. Nvidia now supports 4:2:2 video encoding, along with DisplayPort 2.1 output. Those are some significant upgrades over the RTX 4090, regardless of what the benchmarks say.

Jacob Roach / Digital Trends

Twice as fast as the RTX 4090? Not quite.
Based on my results, the RTX 5090 is about 30% faster than the RTX 4090 when the new DLSS Multi-Frame Generation feature isn't brought into the mix. And it's a feature you might want to leave out of the mix in some titles, as I'll dig into later in this review. That sounds like a solid generational jump, but I went back to my RTX 4090 review for a sanity check. It's not nearly as big as what we've seen previously.

With the RTX 4090, Nvidia provided over an 80% generational improvement, which is massive. Here, it's actually more of a lateral move. The RTX 5090 is 30% faster than the RTX 4090, but it's also 25% more expensive, at least at list price. That said, good luck finding an RTX 4090 in stock at $2,000, much less at list price. The RTX 5090 may not be the generational improvement I expected, but the reality for buyers is still that it's the best option for flagship performance.

The average is brought down by a handful of games where the RTX 5090 doesn't show a huge increase. In Assassin's Creed Mirage, for example, there's about a 17% uplift. Similarly, in Forza Motorsport, the improvement shrinks to just 14%. Those aren't exactly the margins I was hoping for when Nvidia announced a new flagship GPU, and especially one that comes in at a significantly higher price.

Jacob Roach / Digital Trends

Make no mistake; there are still big wins. As you can see above, I measured a massive 54% improvement in Cyberpunk 2077, which is really impressive. In the previous generation, the RTX 4090 was the only GPU that could run this game at 4K Ultra without upscaling and still achieve 60 frames per second (fps). Now, the RTX 5090 is comfortably reaching into the triple digits. This is the kind of improvement I expected to see across the board.

Cyberpunk 2077 isn't a one-off, thankfully.
Although the improvements aren't quite as large across the board, I saw similarly impressive uplifts in Horizon Zero Dawn Remastered, Returnal, and Dying Light 2. The improvement may not be above 80% like we saw in the previous generation, but there's still a clear improvement. If you want the best of the best, Nvidia is claiming that throne with the RTX 5090.

It's just the expectations that are important. Despite some big wins, I suspect most games will look like Black Myth: Wukong, Red Dead Redemption 2, and Call of Duty Modern Warfare 2. You're getting a nice chunk of extra performance, no doubt, but that lift doesn't fundamentally change the gameplay experience in quite the same way that the RTX 4090 did.

Looking over my 4K data, it became clear that the RTX 5090 establishes somewhat of a new normal. The RTX 4090 had an outsized generational improvement, as Nvidia continued to navigate the waters of how it wanted to market its flagships moving forward. The RTX 5090 is disappointing by comparison, and I'm not sure there's much reason for RTX 4090 owners to run out and buy Nvidia's latest. But for those that want the best, it's hard arguing with the numbers the RTX 5090 puts up.

It's easy to argue, however, with Nvidia's misleading claims. We're nowhere near twice the performance of an RTX 4090, and the company confirmed to me that it's seeing a 30% average uplift internally, as well. That's the kind of improvement I'd expect to see out of an 80-class card, but it looks like the death of Moore's Law has to hit everyone at some point.

Jacob Roach / Digital Trends

Even at 1440p, it's very easy to run into a CPU bottleneck with the RTX 5090. You can see that just from looking at the averages above; the RTX 5090 shrinks down to just a 22% lead over the RTX 4090. All of my data here is fresh, and run with a top-of-the-line Ryzen 9 9950X.
In short, if you plan to use the RTX 5090 at 1440p, you're giving up a serious chunk of its performance potential, and you're probably better off with a used RTX 4090.

Forza Motorsport, and especially Red Dead Redemption 2, show the problem here. The RTX 5090 is still able to squeeze out a win across games at 1440p, but the margins are much thinner. That's not a critique of the graphics card, but it is the reality of trying to run this monstrous GPU at any resolution below 4K. There are still some solid wins for Nvidia's latest, particularly in games that scale well on your CPU. Cyberpunk 2077 is once again a standout victory, but you can see similarly large improvements in Dying Light 2 and Returnal.

Jacob Roach / Digital Trends

One game that's worth zooming in on is Black Myth: Wukong. This is the only game in my test suite that I run with upscaling enabled by default, and it shows what can happen when forcing upscaling on at a lower resolution. The RTX 5090 is providing a 20% improvement, but as you continue to push down the internal resolution, that lead will continue to flatline. Regardless, the RTX 5090 really isn't built for 1440p. You can use it at this resolution, but you're giving up a chunk of what the RTX 5090 is truly capable of.

Jacob Roach / Digital Trends

The idea of using an RTX 5090 at 1080p is a little silly, but I still ran the card through all of the games I tested at this resolution. Here, the CPU bottleneck becomes more extreme, pushing the RTX 5090 down to just a 15% lead over the RTX 4090. You could see that as disappointing, but frankly, I see this resolution as unrealistic for a $2,000 graphics card.

However, looking at 1080p data is still valuable, at least at a high level. It's important to remember that DLSS Super Resolution renders the game at a lower internal resolution, so the advantage of the RTX 5090 slips a bit with DLSS upscaling turned on.
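To make the internal-resolution point concrete: DLSS Super Resolution's quality mode is commonly documented as rendering at roughly two-thirds of the output resolution per axis (the exact scale factor is Nvidia's and is assumed here), so even a 4K output is rendered internally at about 1440p, which is where CPU bottlenecks start to bite:

```python
def internal_resolution(width: int, height: int, scale: float = 2 / 3):
    """Approximate internal render resolution for a given output and per-axis scale."""
    return round(width * scale), round(height * scale)

# Assumed quality-mode scale of ~2/3 per axis at a 4K output
print(internal_resolution(3840, 2160))  # -> (2560, 1440)
```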
The RTX 5090 can easily make up that gap with DLSS Multi-Frame Generation, and even push much further, but these results are a good reminder of the bottlenecks you can run into when using flagship hardware with upscaling.

Nvidia dominates when it comes to ray tracing, so it's no surprise that the RTX 5090 enjoys a top slot among the games I tested. However, the improvements aren't as large as I expected. Nvidia has solved, for lack of a better word, real-time ray tracing. Games that aren't pushing full-on path tracing are seeing less of an improvement, largely due to the fact that lighter forms of ray tracing are fair game for GPUs as weak as the Intel Arc B580.

Dying Light 2 is a good example of this dynamic. When this game released a few years back, it was one of the most demanding titles you could play on PC. But even at 4K with the highest graphics preset and no help from upscaling, the RTX 5090 makes Dying Light 2 look like child's play with a comfortable 90 fps average. In Returnal, the situation is even more extreme. This is one of the lighter ray tracing games available, and sure enough, the RTX 5090 crosses triple digits without breaking a sweat, even at 4K.

Things get interesting when looking at the more demanding ray tracing games, though. Cyberpunk 2077, once again, serves as a mile marker for the RTX 5090. It's the first GPU to get close to 60 fps at 4K with the RT Ultra preset, which is quite the achievement. Of course, it's possible to push the RT Overdrive preset as well (more on that in the next section), but looking at raw performance, Nvidia is pushing to new heights.

The next frontier is path tracing, and for that, I used Black Myth: Wukong. The RTX 5090 provides a great experience, even at the Cinematic graphics preset in the game. But games like Black Myth, such as Alan Wake 2 and Cyberpunk 2077, that have a path tracing mode still need to resort to upscaling, introducing the CPU more into the mix and limiting the performance uplift.
Maybe in the next few generations we'll see a native 60 fps in this title from an Nvidia flagship.

There really isn't much to talk about when it comes to ray tracing on the RTX 5090, and that's exactly how Nvidia wants it. In the vast majority of games, you're looking at rasterized performance that comfortably clears 60 fps at native 4K and can easily climb into the triple digits. Ray tracing still forces some upscaling wizardry in titles like Black Myth: Wukong, but for the most part, you can flip on ray tracing without a second thought. That's the way it should be.

Jacob Roach / Digital Trends

The chart above is the story Nvidia wants to tell about DLSS 4. Nvidia didn't make this chart, nor did it tell me to make it, but there's a clear narrative that emerges from the data here. Even factoring in PC latency, which is the main issue with frame generation technology, DLSS 4 is doing some magical things. You're going from an unplayable frame rate to something that can fully saturate a 4K, 240Hz monitor like the MSI MPG 321URX. And you're doing so with around half of the average PC latency of native rendering. The devil is in the details here, however, and Nvidia has a few little devils to contend with.

Here's a different side of the story. Above, you can see a short section of gameplay (I'm going for five stars here) in Cyberpunk 2077 with the RT Overdrive preset. I'm using DLAA for a little boost to image quality, and I'm using the 4X frame generation mode. Given that I'm playing on a 4K display with a 138Hz refresh rate, these seem like ideal settings for my setup. Watch the video, and you tell me if it looks like an ideal experience.

I can point out a lot of problems here, picking out single frames with various visual artifacts between each swipe of the mouse, but you don't need to pixel peep to see the issue. There's an unnatural motion blur over everything, and the edges of objects are mere suggestions rather than being locked in place.
You don't need a trained eye to see that this is a bad experience. You don't need a point of comparison, even. You can watch this video in a vacuum and see that DLSS 4 has some clear limitations. That's not a damning critique of DLSS 4. It's a wonderful tool, but you need to use it correctly.

Like any frame generation tech, your experience will rapidly deteriorate when you feed the frame generation algorithm a low base frame rate, as I did in Cyberpunk 2077. Nvidia wants you to use Super Resolution to get to a playable base frame rate of near 60 fps, and then click on Multi-Frame Generation to saturate a high-refresh-rate display. Using Multi-Frame Generation alone, especially if you're hovering around 30 fps, will give you a bad experience.

Marvel Rivals - DLSS 4 Gameplay

Cyberpunk 2077 shows the worst of what DLSS 4 has to offer, but Marvel Rivals shows the best. This is one of various games that uses Nvidia's new DLSS Override feature, allowing you to add up to 4X Multi-Frame Generation to games with DLSS Frame Generation through the Nvidia app. Not only is the base frame rate high enough here (well over 60 fps, even with DLAA turned on), but you also have a third-person camera. There are some minor artifacts, but nothing that ruins the experience, and nothing you'd even notice during gameplay.

Alan Wake 2 - DLSS 4 Gameplay

Similarly, the artifacting isn't nearly as bad in Alan Wake 2 as it is in Cyberpunk 2077. Here, once again, I'm starting with a base frame rate of around 30 fps and using Multi-Frame Generation to make up the difference. There are some artifacts, and I'd recommend using a combination of Super Resolution and Frame Generation instead. But the experience is at least better compared to Cyberpunk 2077 due to the camera angle.

You don't want to just crank DLSS 4 to 4X mode and call it a day. It needs to be fed with a base frame rate of ideally 60 fps.
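The base-frame-rate point can be sketched numerically: multi-frame generation multiplies the number of displayed frames, but input is still sampled once per rendered frame, so responsiveness tracks the base frame time. This is illustrative arithmetic under that simplifying assumption, not Nvidia's actual pipeline:

```python
def displayed_fps(base_fps: float, generated_per_rendered: int) -> float:
    """Frames shown on screen: one rendered frame plus N generated frames."""
    return base_fps * (1 + generated_per_rendered)

def base_frame_time_ms(base_fps: float) -> float:
    """Interval between rendered frames; input is sampled at roughly this rate."""
    return 1000 / base_fps

# 4X mode = 3 generated frames per rendered frame
print(displayed_fps(30, 3))               # 120 fps displayed
print(round(base_frame_time_ms(30), 1))   # still ~33.3 ms between sampled inputs
```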
Although the latency doesn't significantly increase with up to three generated frames (something Nvidia should be applauded for on its own), the number of visual artifacts does. Realistically, I suspect DLSS 4 will more often run in 2X or 3X mode alongside Super Resolution. That, in a lot of games, will provide a much better experience than relying on Multi-Frame Generation alone.

Over the past few generations, Nvidia has increasingly relied on DLSS to market its graphics cards, and that same playbook is at work here. It's just not the same selling point that it once was. Super Resolution is still pulling a lot of the weight, and even a single generated frame is enough to saturate most gaming monitors, even as refresh rates climb. There's still a use for 4X Multi-Frame Generation, and with the right circumstances, it works extremely well. But when it comes time to spend $2,000 on a graphics card, I would seriously consider how much DLSS Multi-Frame Generation is offering over a $7 utility like Lossless Scaling. For my money, it isn't providing much of an advantage.

This is where you need to carefully consider your setup. You want to be using Multi-Frame Generation alongside Super Resolution in those prestige games like Cyberpunk 2077 and Alan Wake 2, and if you don't have a monitor capable of producing that high of a refresh rate, that second or third generated frame goes to waste. Unlike DLSS 3, Multi-Frame Generation isn't a feature that just works on its own; it needs to work as part of the rest of your gaming rig.

Jacob Roach / Digital Trends

Nvidia's CEO hit the nail on the head when defending the price of the RTX 5090: "When someone would like to have the best, they just go for the best." If there's one thing I can say with absolute certainty, especially considering the lack of flagship competition from AMD, it's that the RTX 5090 is the best.
It doesn't matter if it's $1,500, $2,000, or $2,500. Nvidia's CEO is right when he says that the appetite for this type of product doesn't factor in price nearly as much as it does for more inexpensive options.

The question isn't if the RTX 5090 is the best; it is. The question is if you need the best, and there's a bit more discussion there. The generational improvements are here, but they don't touch what we saw with the RTX 4090. DLSS 4 is incredible, but it falls apart when it's not fed the right information. And 32GB of GDDR7 memory is welcome, but it's only delivering a benefit in AI workloads, not in games.

If you're sitting on an RTX 4090, there's not much reason to upgrade here. There's a performance boost, but the real value lies in DLSS 4, and that's something that's very easy to get around without spending $2,000. The RTX 5090 really shines for everyone else. Maybe you had to skip the RTX 40-series due to poor availability, or maybe the RTX 2080 Ti you have just isn't providing the grunt that it used to. In those situations, the RTX 5090 is great. But if you're in the market to spend $2,000 on a graphics card, you probably don't need me to convince you.
  • 2025 Oscar Nominees, from Emilia Pérez to Wicked
    www.wsj.com
Emilia Pérez and The Brutalist are expected to contend for best picture.