www.theverge.com
Nvidia's 50-series desktop GPUs have been off to a slightly rocky start, with quality control issues, underwhelming performance gains over the last generation, and the occasional melting cable. Its new laptop GPUs, led by the flagship RTX 5090, may be on a similarly modest trajectory, but with one added benefit for laptop gamers: efficiency.

I've been testing the RTX 5090 laptop GPU in Razer's new Blade 16 laptop, which Nvidia is using as a showcase for its top-tier mobile card. My time with it has been more limited than I'd hoped, since my first review unit exhibited some strange graphical anomalies and was prone to blue-screen crashes during basic productivity tasks. Nvidia sent me a replacement, but I've had less than two days with it as of press time. From what I've seen so far, it is indeed a little faster than the mobile 4090, but like the desktop 50-series cards, its biggest improvements are in DLSS and Multi Frame Generation rather than raw performance.

The new mobile RTX 5090 has 24GB of GDDR7 VRAM instead of the 16GB of GDDR6 found on the 4090. It's also got more powerful Tensor cores, slightly more CUDA cores, and an extra hardware video encoder for better livestreaming and video exports, but the same 175W TDP. We'll have more to say about the laptop housing this new GPU soon.

The 5090's specs translate to small gains over the 4090 in synthetic benchmarks, such as a 13 percent higher Geekbench GPU score and a 14 percent higher score in 3DMark's standard Time Spy test at 2560 x 1440 resolution.

To test the 5090's uplift in actual games, I ran an array of benchmarks using Cyberpunk 2077 and Black Myth: Wukong at the native 2560 x 1600 of both Blade 16s and at 4K on an Alienware AW3225QF monitor. The biggest difference I saw was in Cyberpunk 2077 at 2.5K resolution and Ultra settings without DLSS, frame generation, or ray tracing enabled. There, the 5090 was 24 percent faster than the 4090.
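For readers who want to check the math behind the uplift figures in this review, here's a minimal sketch of the calculation, using the standard (new − old) / old formula and the average fps numbers from my benchmarks:

```python
# Percent uplift of one average-fps result over another; the fps figures
# below come from my benchmark runs on the two Blade 16 configurations.
def uplift_percent(old_fps: float, new_fps: float) -> float:
    """Percent improvement of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100

# Cyberpunk 2077 Ultra at 2.5K (no DLSS, no RT):
# the 4090 averaged 74fps, the 5090 averaged 92fps.
print(f"{uplift_percent(74, 92):.0f}%")  # prints "24%"
```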
That sounds great, but once you enable ray tracing and DLSS 4 (something most people would likely do on either of these laptops), the gap narrows to a delta of just 5 percent. When you enable 2x frame generation, the 5090 is 14 percent faster than the 4090, and going all out with 4x Multi Frame Gen more than doubles what the 4090 can do at 2x frame gen. If you're playing at 4K, the 5090 still wins, but the gap narrows to within single digits.

| Benchmark | Blade 16 RTX 4090 average fps @ 4K | Blade 16 RTX 5090 average fps @ 4K | Blade 16 RTX 4090 average fps @ 2.5K | Blade 16 RTX 5090 average fps @ 2.5K |
| --- | --- | --- | --- | --- |
| Black Myth: Wukong TSR | 26 | 28 | 45 | 48 |
| Black Myth: Wukong FSR | 41 | 43 | 62 | 66 |
| Black Myth: Wukong FSR Frame Gen | 35 | 39 | 54 | 58 |
| Black Myth: Wukong DLSS | 46 | 48 | 68 | 70 |
| Black Myth: Wukong DLSS Frame Gen | 66 | 70 | 104 | 111 |
| Cyberpunk 2077 Ultra | 41 | 48 | 74 | 92 |
| Cyberpunk 2077 Ultra RT & FSR 3.1 Quality | 40 | 40 | 63 | 66 |
| Cyberpunk 2077 Ultra RT FSR Frame Gen | 61 | 70 | 118 | 124 |
| Cyberpunk 2077 Ultra RT & DLSS Quality | 39 | 40 | 64 | 67 |
| Cyberpunk 2077 Ultra RT & DLSS Frame Gen x2 | 67 | 70 | 103 | 117 |
| Cyberpunk 2077 Ultra RT & DLSS Frame Gen x4 | N/A | 123 | N/A | 209 |
| Cyberpunk 2077 Full Ultra RT no DLSS | 12 | 12 | 21 | 23 |
| Cyberpunk 2077 Full Ultra & DLSS Quality | 23 | 25 | 39 | 45 |
| Cyberpunk 2077 Full Ultra RT & DLSS 4 Quality Frame Gen x2 | 43 | 45 | 67 | 81 |
| Cyberpunk 2077 Full Ultra RT & DLSS 4 Quality Frame Gen x4 | N/A | 83 | N/A | 147 |

Black Myth: Wukong, on the other hand, is a little less forgiving. In this graphically demanding Souls-adjacent RPG, the 5090 is only a few percent faster than the 4090. The highest average the 5090 could muster was 111fps at 2.5K with DLSS and standard frame generation, and the 4090 wasn't far behind at 104. If Wukong supported Multi Frame Generation, the 5090 could no doubt separate itself further, but that would likely add a couple extra milliseconds of latency.
Most people will never notice or care about that, but some players may feel any latency added to their pinpoint dodge-roll timing is a nonstarter.

Like the 50-series desktop cards, the 5090 laptop GPU looks its best in games that support DLSS 4 Multi Frame Generation. Over 100 games now support Nvidia's highlight feature, and you can also use override settings in the Nvidia app to force it in unsupported games. Frame generation, which both Nvidia and AMD offer, uses AI models to generate extra frames and insert them between traditionally rendered frames. This gives the appearance of smoother gameplay but doesn't improve input latency; it actually increases it slightly. AMD and previous Nvidia cards can do 2x frame generation; the 50 series can do up to 4x.

There's an ongoing debate in the world of PC gaming about these "fake frames": some folks are perfectly content with AI upscaling and generated frames, and others view it as a band-aid solution with too many compromises to image quality and game feel. I think it depends on your personal experience, what kinds of games you like to play and what you're used to playing them on, and even whether you have an eye (or even a care) for this stuff. It can be great for the right games, and it may even make more sense on a 16-inch laptop screen than on a jumbo 4K monitor right in front of your face.

I forced Multi Frame Generation on Assassin's Creed Shadows, and I was pleasantly surprised by its performance and buttery smoothness on ultra settings. Shadows is maybe a perfect example of a single-player game where frame generation is a nonissue. It's got dodge rolling, but it's not the kind of game that really requires frame-perfect timing. It felt A-okay to me, and boy did feudal Japan look lovely on a 2.5K OLED at an average of 167fps. Many of those frames may be fake, but they sure are pretty.

The obvious games to avoid with frame generation are competitive shooters.
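To make the latency tradeoff concrete, here's a simplified back-of-the-envelope model (my own illustration, not Nvidia's math): with N-x frame generation, only one in every N displayed frames is traditionally rendered, so input responsiveness tracks the rendered rate rather than the displayed rate.

```python
# Simplified model: with N-x frame generation, 1 of every N displayed frames
# is traditionally rendered, and input latency roughly tracks the rendered
# rate. Real pipelines add a small queuing penalty on top of this.
def rendered_fps(displayed_fps: float, factor: int) -> float:
    return displayed_fps / factor

# Cyberpunk 2077 at 2.5K with 4x Multi Frame Gen averaged 209fps displayed,
# which works out to roughly 52 traditionally rendered frames per second.
print(rendered_fps(209, 4))  # prints 52.25
```

That's why a 209fps average with 4x Multi Frame Gen can still feel closer to a ~52fps game in the hands, even though it looks far smoother.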
While I felt fine playing Marvel Rivals on the 5090 with 4x frame generation reaching nearly 200fps, I was playing pretty casually with a hero that doesn't require precision accuracy. For something like Valorant or Counter-Strike 2, I'd advise against it. But to be fair, those ultra-competitive games typically have a low performance barrier to entry and practically run on a potato.

One area where the 5090 does seem significantly better so far is power efficiency. In our benchmarks, the new GPU had a lower average wattage, by 20 to 29 percent, than the 4090. That should help improve battery life when playing games away from a power cord.

GPU power (watts) at 2.5K

| Benchmark | Blade 16 RTX 4090 average GPU wattage | Blade 16 RTX 5090 average GPU wattage |
| --- | --- | --- |
| Black Myth: Wukong TSR | 174 | 138 |
| Black Myth: Wukong FSR | 171 | 137 |
| Black Myth: Wukong FSR Frame Gen | 145 | 134 |
| Black Myth: Wukong DLSS | 170 | 135 |
| Black Myth: Wukong DLSS Frame Gen | 163 | 131 |
| Cyberpunk 2077 Ultra | 159 | 134 |
| Cyberpunk 2077 Ultra RT & FSR 3.1 Quality | 153 | 127 |
| Cyberpunk 2077 Ultra RT FSR Frame Gen | 154 | 133 |
| Cyberpunk 2077 Ultra RT & DLSS Quality | 154 | 126 |
| Cyberpunk 2077 Ultra RT & DLSS Frame Gen x2 | 162 | 130 |
| Cyberpunk 2077 Ultra RT & DLSS Frame Gen x4 | N/A | 133 |
| Cyberpunk 2077 Full Ultra RT no DLSS | 169 | 131 |
| Cyberpunk 2077 Full Ultra & DLSS Quality | 167 | 138 |
| Cyberpunk 2077 Full Ultra RT & DLSS 4 Quality Frame Gen x2 | 167 | 137 |
| Cyberpunk 2077 Full Ultra RT & DLSS 4 Quality Frame Gen x4 | N/A | 133 |

Lower wattages can also mean lower temperatures; I was able to play games with the plugged-in laptop on my actual lap and not toast my legs. But this also hinges on an individual laptop's thermal design and how well it cools. The Blade 16 so far seems to do it well, but it's also a totally different design than the 2024 model with the 4090 in it. Of course, you're not going to see the same level of performance when running graphically intensive games on battery power.
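The power savings in the table are computed the same way as the fps uplifts; here's a minimal sketch using the Black Myth: Wukong TSR row as the example:

```python
# Percent power saving of the 5090 vs. the 4090, per benchmark, computed
# from the average GPU wattage figures in the power table.
def power_saving_percent(w_4090: float, w_5090: float) -> float:
    return (w_4090 - w_5090) / w_4090 * 100

# Black Myth: Wukong TSR at 2.5K: 4090 averaged 174W, 5090 averaged 138W.
print(f"{power_saving_percent(174, 138):.0f}% lower")  # prints "21% lower"
```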
I'll have to test this more to see how much of a difference it makes in real-world usage.

The RTX 5090 is Nvidia's most powerful and expensive laptop GPU, and you can expect to pay $4,000 or more for a laptop that runs it. The Blade 16 review configuration costs $4,499.99. It shows some promise for games that support DLSS 4 and Multi Frame Generation, and maybe for gaming that's not always tethered to a wall. But if I were shopping right now, I wouldn't ignore a heavy discount on a laptop with an RTX 4090. (The 4090 desktop supply, on the other hand, has dried up.)

We'll continue to test the RTX 5090, as well as the rest of the 50-series mobile GPUs, as they appear in new gaming laptop models over the next few weeks and months. I'm looking forward to testing the RTX 5080 in particular, and seeing if that's the sweet spot for laptop gamers.

Photography by Antonio G. Di Benedetto / The Verge