• WWW.ZDNET.COM
    This iPhone power bank was an essential lifeline on the bustling streets of Marrakesh
    The Sharge CarbonMag 5K made my trip stress-free. I would have been completely lost without my iPhone.
  • WWW.FORBES.COM
    iOS 18.3 Warning: You Should Turn Off This New iPhone Setting ASAP
    Apple's iOS 18.3 is coming soon, with a bunch of new features and security updates for your iPhone. But iOS 18.3 also comes with a warning about a new setting that Apple has turned on by default and you may want to switch it off.

The iPhone maker's AI-enabled Apple Intelligence launched in iOS 18.1, including Siri integration with ChatGPT in iOS 18.2. However, users currently have to turn this on to use the features on their iPhone. Not anymore: from iOS 18.3, due to arrive next week, Apple Intelligence will be on by default, according to Apple-focused website 9to5Mac.

The iOS 18.3 change is confirmed by Apple's beta release notes, which read: "For users new or upgrading to iOS 18.3, Apple Intelligence will be enabled automatically during iPhone onboarding. Users will have access to Apple Intelligence features after setting up their devices."

"To disable Apple Intelligence, users will need to navigate to the Apple Intelligence & Siri Settings panel and turn off the Apple Intelligence toggle," Apple said.

The iOS 18.3 move comes as figures show the AI-enabled features are not that popular with iPhone users. In fact, 73% of iPhone users said AI features add little to no value, according to a recent survey by Sellcell. I asked Apple to comment on this iOS 18.3 news and will update this story if the iPhone maker responds.

Why You Might Want To Toggle This iOS 18.3 Feature Off

By its very nature, AI requires a lot of data to operate, and this automatically enabled feature in iOS 18.3 is no different. It gets worse with ChatGPT integration, which can send data off to OpenAI, although Apple does get your permission before this happens. Apple has lots of safeguards for data privacy and security when using Apple Intelligence, such as its Private Cloud Compute. However, your device is still more secure and private when you don't have AI enabled.

"These algorithms need huge amounts of data to build upon and grow," says Jake Moore, global cybersecurity advisor at ESET. "Auto enabling features by default is a surefire way to gain access to as much data as legally possible."

Default settings such as this iOS 18.3 one may still hand over private information "under the radar and without your explicit knowledge," Moore warns. Therefore, it is important to reduce or limit the release of personal information where possible, he says.

If you want to turn off Apple Intelligence after upgrading to iOS 18.3, go to your iPhone Settings > Apple Intelligence & Siri and turn the toggle to Off.
  • WWW.TECHSPOT.COM
    Nvidia GeForce RTX 5090 Review
    Exciting times for us computer enthusiasts as we can finally showcase the new GeForce RTX 5090 and the next generation of Nvidia GPUs, codenamed Blackwell, with the new flagship graphics card priced at $2,000.

It's been two years since Nvidia released the mighty GeForce RTX 4090, an insane $1,600 GPU that smashed the previous-generation flagship by a 60% margin, that is, 60% faster on average at 4K. This made it an extremely powerful and exciting option for high-end gaming, even if it was undeniably expensive. So, what's on offer here, and how can Nvidia justify a $2,000 price tag for the RTX 5090?

Nvidia has faced some challenges this generation. While the RTX 50 series takes advantage of cutting-edge technologies such as PCI Express 5.0 and GDDR7 memory, the GPU is built using the same TSMC 4N process as the previous generation. Without improvements to the production node, significant performance gains would require an architectural overhaul, which isn't yet on the table.

[Image: RTX 4090 FE on the left, 5090 FE on the right]

                    RTX 5090        RTX 4090        RTX 5080        RTX 4080 Super  RTX 4080
Price (US MSRP)     $2,000          $1,600          $1,000          $1,000          $1,200
Release Date        Jan 30, 2025    Oct 12, 2022    Jan 30, 2025    Jan 31, 2024    Nov 16, 2022
Process             TSMC 4N         TSMC 4N         TSMC 4N         TSMC 4N         TSMC 4N
Die Size            750 mm2         608.5 mm2       378 mm2         379 mm2         379 mm2
Core Config         21760/680/192   16384/512/176   10752/336/128   10240/320/112   9728/304/112
L2 Cache            96 MB           72 MB           64 MB           64 MB           64 MB
GPU Boost Clock     2407 MHz        2520 MHz        2617 MHz        2550 MHz        2505 MHz
Memory Capacity     32 GB           24 GB           16 GB           16 GB           16 GB
Memory Speed        28 Gbps         21 Gbps         30 Gbps         23 Gbps         22.4 Gbps
Memory Type         GDDR7           GDDR6X          GDDR7           GDDR6X          GDDR6X
Bus / Bandwidth     512-bit /       384-bit /       256-bit /       256-bit /       256-bit /
                    1792 GB/s       1008 GB/s       960 GB/s        736 GB/s        717 GB/s
Total Board Power   575 W           450 W           360 W           320 W           320 W

Therefore, Nvidia's solution was to create a bigger and more powerful GPU. The die is now 23% larger, featuring 33% more cores.
It comes equipped with 32 GB of 28 Gbps GDDR7 memory on a 512-bit wide memory bus, delivering a bandwidth of 1,792 GB/s, a hefty 78% increase over the RTX 4090. The RTX 5090 is a powerhouse, but it comes with an even steeper price tag, making it 25% more expensive than the RTX 4090. Given that price increase, we expect it to deliver performance far beyond what the specs suggest.

RTX 4090 vs RTX 5090 Thermals

Before we dive into the blue bar graphs, let's take a look at how Nvidia's Founders Edition version of the RTX 5090 performs compared to the RTX 4090 FE card. For this comparison, we tested The Last of Us Part 1 at 4K with maxed-out settings. After an hour of load inside an enclosed ATX case, the RTX 5090 reached a peak GPU temperature of 73°C, which is remarkable given how quiet and compact the card is. The fan speed peaked at 1,600 RPM and remained inaudible over our case fans, which are already very quiet. The cores averaged a clock speed of 2,655 MHz, while GPU power averaged 492 watts. The memory temperature peaked at 88°C, with an operating frequency of 2,334 MHz, providing a transfer speed of 28 Gbps.

In comparison, the RTX 4090 FE model peaked at 68°C, with a memory temperature of 80°C, and its fans spinning just below 1,500 RPM. Clearly, the RTX 5090 runs slightly hotter and louder. However, given that the RTX 5090 consumed, on average, 35% more power during testing and is a significantly smaller card, these results are nothing short of remarkable. We are incredibly impressed with what Nvidia has achieved here. The RTX 5090 might be the most impressive graphics card we've ever seen. You would never guess, just by looking at it, how much thermal load this cooler can handle so efficiently. It's an outstanding achievement.
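As a sanity check on the bandwidth figures quoted above, peak memory bandwidth follows directly from bus width and per-pin data rate. A minimal sketch using the numbers from the spec table (nothing here is from Nvidia's tooling, just arithmetic):

```python
def bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * speed_gbps

rtx_5090 = bandwidth_gbs(512, 28)  # 512-bit bus, 28 Gbps GDDR7 -> 1792.0 GB/s
rtx_4090 = bandwidth_gbs(384, 21)  # 384-bit bus, 21 Gbps GDDR6X -> 1008.0 GB/s

uplift = rtx_5090 / rtx_4090 - 1   # ~0.78, the 78% increase cited above
```

The 78% jump comes from both a wider bus (512 vs 384 bits) and faster memory (28 vs 21 Gbps), which multiply together.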
Now, let's see how it performs.

Test System Specs
CPU: AMD Ryzen 7 9800X3D
Motherboard: MSI MPG X870E Carbon WiFi (BIOS 7E49v1A23, ReBAR enabled)
Memory: G.Skill Trident Z5 RGB DDR5-6000 [CL30-38-38-96]
Graphics Cards: GeForce RTX 4070, RTX 4070 Super, RTX 4070 Ti, RTX 4070 Ti Super, RTX 4080, RTX 4080 Super, RTX 4090, RTX 5090; Radeon RX 7700 XT, RX 7800 XT, RX 7900 GRE, RX 7900 XT, RX 7900 XTX
ATX Case: MSI MEG Maestro 700L PZ
Power Supply: MSI MPG A 1000G ATX 3.0 80 Plus Gold 1000W
Storage: MSI Spatium 1TB M470 PCIe 4.0 NVMe M.2
Operating System: Windows 11 24H2
Display Drivers: Nvidia GeForce Game Ready 566.36 WHQL, AMD Radeon Adrenalin 24.12.1

Gaming Benchmarks

Marvel Rivals

Starting with Marvel Rivals at 1440p, we see that the RTX 5090 delivers 30% more performance than the RTX 4090. While this is a decent performance improvement, factoring in the 25% price increase makes it considerably less exciting. At 4K resolution, the margin increases slightly to 33%. This is a solid uplift, but the extreme price premium dampens the enthusiasm.

S.T.A.L.K.E.R. 2: Heart of Chornobyl

S.T.A.L.K.E.R. 2 isn't the most optimized game, and as a result, the RTX 5090 maxes out at 94 fps at 1440p. This makes it only 22% faster than the RTX 4090, offering a very mild performance gain. At 4K, however, the RTX 5090 achieves a more reasonable 42% performance gain, rendering an average of 71 fps.

Counter-Strike 2

Next, we have Counter-Strike 2. At 1440p, the RTX 5090 is slightly slower than the RTX 4090, although the 1% lows are notably stronger. It's worth mentioning that the RTX 5090 was slower than the RTX 4090 at 1080p in multiple instances. This suggests a possible overhead issue with the Blackwell architecture, or perhaps the RTX 5090's large core count isn't being efficiently utilized at lower resolutions. Further investigation is needed here. Even at 4K, the RTX 5090 only offers an 8% performance increase over the RTX 4090.
The issue doesn't appear to be a CPU bottleneck, given the higher frame rates observed at 1440p.

God of War Ragnarök

Performance in God of War Ragnarök is outstanding at 1440p, hitting 268 fps on the ultra preset. However, this is only 22% faster than the RTX 4090, which is disappointing given the 25% higher cost. At 4K, the RTX 5090 scales better, achieving a 36% performance improvement with 195 fps compared to 143 fps on the RTX 4090, a much more favorable result.

Delta Force

In Delta Force, the RTX 5090 provides just 17% more performance than the RTX 4090 at 1440p. However, frame rates here are extreme and likely approaching a CPU bottleneck. At 4K, the margin extends to 27%, rendering 160 fps. While this is an improvement, it's still not an impressive uplift, especially considering the 25% higher price and the two-year gap between releases.

Warhammer 40,000: Space Marine 2

Space Marine 2 is a very CPU-limited game, and at 1440p, we appear to be hitting the limits of the 9800X3D processor. Oddly, the RTX 5090 is 4% slower than the RTX 4090 here. As observed in other instances at 1080p, this could indicate an overhead issue or inefficiencies in workloads that limit the RTX 5090's performance. At 4K, the RTX 5090 resolves this problem, delivering a 30% performance increase over the RTX 4090. While this is a decent uplift, it is undercut by the 25% price hike.

Star Wars Jedi: Survivor

In Star Wars Jedi: Survivor, the RTX 5090 delivers just a 14% improvement over the RTX 4090 at 1440p. However, with an average of 191 fps, performance remains impressive overall. At 4K, the RTX 5090 crosses the 100 fps threshold with 102 fps, making it 21% faster than the RTX 4090. Still, this is a disappointing margin given the higher cost.

A Plague Tale: Requiem

In A Plague Tale: Requiem, the RTX 5090 delivers a 21% performance improvement over the RTX 4090 at 1440p.
The results are partly CPU-limited, as suggested by similar 1% lows between the two GPUs. At 4K, the RTX 5090 pulls ahead with a 42% performance uplift, making this one of the better margins seen in the benchmarks.

Cyberpunk 2077: Phantom Liberty

In Cyberpunk 2077: Phantom Liberty, the RTX 5090 struggles to deliver noteworthy gains at 1440p, with just a 19% improvement over the RTX 4090. The 1% lows are also similar, indicating other system limitations may be at play. At 4K, the margin improves to 32%. While the overall performance is excellent, this result remains underwhelming. It's worth noting that the second-highest preset was used, and ray tracing was not enabled for this test.

Dying Light 2 Stay Human

Frame rates in Dying Light 2 using the high preset are extreme at 1440p, reaching 198 fps with the RTX 5090. However, this makes it only 24% faster than the RTX 4090. Even at 4K, the performance gain remains modest at 25% over the RTX 4090, which scales directly with the 25% price increase.

Dragon Age: The Veilguard

In Dragon Age: The Veilguard, frame rates are limited to just under 130 fps at 1440p using the ultra preset, which selectively applies some ray tracing effects. While the focus of this portion of the review is on rasterization performance, ray tracing plays a role here. When increasing the resolution to 4K, the RTX 5090 averages 96 fps, only 10% faster than the RTX 4090. This is a very disappointing result.

War Thunder

War Thunder runs at extremely high frame rates, even with the highest quality preset enabled. At 1440p, the performance is clearly CPU-limited, which we confirmed by testing at 1080p. Moving to 4K removes the CPU bottleneck, but even then, the RTX 5090 is only 15% faster than the RTX 4090.
Granted, with frame rates well over 300 fps, performance is more than sufficient for gameplay, but in terms of relative performance, the RTX 5090 is underwhelming here.

Marvel's Spider-Man Remastered

Marvel's Spider-Man Remastered is heavily CPU-limited at 1440p, with both the RTX 4090 and RTX 5090 capped at 222 fps. At 4K, the CPU bottleneck is mostly removed, but the RTX 5090 still appears slightly limited, averaging 212 fps. As a result, the RTX 5090 is just 26% faster than the RTX 4090.

Hogwarts Legacy

Hogwarts Legacy is another title that is mostly CPU-limited at 1440p, resulting in similar performance between the RTX 4090 and RTX 5090. Increasing the resolution to 4K allows the RTX 5090 to pull ahead, delivering a 31% performance improvement. While the performance is excellent overall, the value remains questionable.

The Last of Us Part I

In The Last of Us Part I, the RTX 5090 provides a solid performance uplift at 1440p, where it is 28% faster than the RTX 4090, averaging 204 fps. This results in excellent overall performance. At 4K, the RTX 5090 offers a 40% performance increase, averaging 125 fps. This is a strong result, especially when compared to most other titles.

Star Wars Outlaws

The RTX 5090 achieves over 100 fps in Star Wars Outlaws at 1440p using the ultra preset. With ray tracing forced on, the RTX 5090 is 22% faster than the RTX 4090. Oddly, the margin decreases at 4K, where the RTX 5090 is just 19% faster than the RTX 4090. Typically, we expect the RTX 5090 to show greater advantages at higher resolutions, but that isn't the case here.

Starfield

Finally, in Starfield, the RTX 5090 is only 4% faster than the RTX 4090 at 1440p using ultra-quality settings, limiting performance to 125 fps. At 4K, the RTX 5090 improves slightly but is still just 7% faster than the RTX 4090.
There seems to be a limitation in this title that prevents the RTX 5090 from delivering the margins seen in other games at 4K.

Performance Summary

Although we did not include 1080p data for individual games, here are the average results across the 17 games tested. As seen, both the RTX 4090 and RTX 5090 are heavily CPU-limited at this resolution, making them ideal for CPU benchmarking rather than GPU evaluation. Even at 1440p, the RTX 5090 is often heavily limited by the CPU, resulting in just a 12% performance improvement over the RTX 4090 across the 17 games tested.

At 4K we can see the potential of the GeForce RTX 5090, where it delivers an average performance improvement of 27%. That looks solid on raw numbers, but it's somewhat disappointing from a value perspective considering it costs 25% more than the 4090. This is why we've been joking internally, calling it the 4090 Ti, as it really feels like that's what it is.

Even if the RTX 5090 maintained the same $1,600 MSRP as the RTX 4090, it would still feel underwhelming as a next-generation flagship GPU. For comparison, the RTX 4090 was on average 60% faster than the RTX 3090 Ti, while launching at a lower price. It was also 73% faster than the RTX 3090 with only a 7% price increase. By comparison, the RTX 5090's performance and value fall far short of expectations for a generational leap.

Power Consumption

Now, let's look at power consumption. Most of our power data was recorded at 1440p, which is not ideal for measuring the full power usage of the RTX 5090, but we supplemented this with additional tests for clarity. In Starfield at 1440p, the RTX 5090 increased power consumption by 12% compared to the RTX 4090. In Star Wars Outlaws, we observed a 17% increase in power usage at 1440p, rising from 532 watts to 624 watts.
Interestingly, in Space Marine 2, where the RTX 5090 performed worse than the RTX 4090 at 1440p, power consumption decreased by 15%, demonstrating that the RTX 5090 is highly efficient when not operating at full load.

To better evaluate power usage, we re-tested the Radeon RX 7900 XTX, RTX 4090, and RTX 5090 at 4K in three games where the RTX 5090 performed well: Dying Light 2, Cyberpunk 2077, and A Plague Tale: Requiem. In these tests, the RTX 5090 increased power consumption by 37 to 41%, depending on the game. These results align more closely with the performance gains seen in these titles. Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption.

Finally, we re-ran those same power tests with a 60 fps cap, which yielded some interesting results. In A Plague Tale: Requiem, power consumption for the RTX 5090 was nearly identical to the RTX 4090, with just a 2% increase. In Cyberpunk 2077, the RTX 5090 showed an 8% increase, while in Dying Light 2, it consumed 15% more power.

Ray Tracing Performance

RT - Metro Exodus Enhanced

Metro Exodus Enhanced remains one of the few ray tracing games that provides a truly transformative experience with ray tracing enabled, so we felt it was important to include. As a side note before we show you the results, we've encountered issues testing Metro Exodus Enhanced with Radeon GPUs as of late. While the game has worked in the past, enabling ray tracing now causes system crashes with Radeon GPUs, regardless of whether AMD or Intel systems are used. AMD has replicated the problem and is aware of the issue, but unfortunately, a fix was not available in time for this review. As a result, we decided to exclude Radeon data and focus solely on the RTX 4090 and RTX 5090 performance. At 1080p, the RTX 5090 was 21% faster than the RTX 4090, and at 1440p, the margin increased to 33%.
We did not test 4K ray tracing performance, as most titles deliver poor and often unplayable performance at that resolution, even with upscaling. However, Metro Exodus Enhanced would likely perform well on both the RTX 4090 and RTX 5090.

RT - Alan Wake II

In Alan Wake II, with quality upscaling enabled, the RTX 5090 was just 19% faster than the RTX 4090 at 1080p. Moving to 1440p did not significantly improve the results, with the RTX 5090 showing only an 18% performance gain. Overall, these are weak gains for the RTX 5090, and even with ray tracing enabled, the performance only just breaks the 100 fps barrier.

RT - Cyberpunk 2077: Phantom Liberty

Using the ultra ray tracing preset with quality upscaling, Cyberpunk 2077: Phantom Liberty shows the RTX 5090 performing comparably to the RTX 4090 at 1080p, likely due to CPU limitations. At 1440p, the RTX 5090 pulls ahead slightly, offering an 11% performance increase with an average of 129 fps.

RT - Marvel's Spider-Man Remastered

In Marvel's Spider-Man Remastered, performance is heavily CPU-limited at both 1080p and 1440p. This is problematic, as frame rates are capped at 128 fps at 1440p, a limit achieved even by the RTX 4080 Super. While 4K benchmarks might provide more insight, the 128 fps cap at lower resolutions is concerning. Although this is solid performance overall, for those with high-refresh-rate monitors, it may not be enough.
Furthermore, it's unlikely that many users spending $2,000 or more on a graphics card would settle for gaming at 60 fps, which is what would likely occur at 4K without upscaling.

RT - Dying Light 2 Stay Human

In Dying Light 2 using the high ray tracing preset with quality upscaling, the RTX 5090 achieved an average of 208 fps at 1080p, making it 18% faster than the RTX 4090. At 1440p, where CPU limitations are not a factor, the RTX 5090 was only 22% faster than the RTX 4090, making this an underwhelming result given the price premium.

RT - Black Myth: Wukong

With the very high ray tracing preset, the RTX 5090 delivered 123 fps at 1080p with quality upscaling, providing a 34% performance improvement over the RTX 4090. At 1440p, the RTX 5090 maintained a similar margin, being 36% faster and rendering an average of 98 fps. While this is a reasonable step forward relative to past products, the overall performance remains less impressive, especially since upscaling is required.

Ray Tracing Performance Summary

We used a five-game average for the ray tracing data since Metro Exodus Enhanced had to be excluded due to the issues with Radeon GPUs. On average, the RTX 5090 was 14% faster than the RTX 4090 at 1080p with upscaling. At 1440p, the RTX 5090 showed an average performance increase of just 17%. Notably, even with upscaling, the average frame rate at 1440p was just 123 fps, far from impressive for a graphics card priced at $2,000.

Cost per Frame

Here's how the current and previous-generation mid-range to high-end GPUs compare in terms of value, based on MSRP. At $2,000, the RTX 5090 offers only a 1.5% improvement in value per frame compared to the RTX 4090. In other words, after more than two years, there's no meaningful improvement in cost per frame. The RTX 5090 is essentially just a faster RTX 40 series GPU. If we consider the best retail pricing for mid-2024 and assume the RTX 5090 will sell for $2,000, the value proposition looks slightly better.
However, realistically, do we believe the RTX 5090 will actually sell for $2,000? Probably not. If anything, the retail price is likely to climb higher, making the value situation even worse. At $2,000, the RTX 5090 already represents poor value, and anything higher would make it an even tougher sell.

What We Learned: It's the World's Fastest Gaming GPU, But...

The GeForce RTX 5090 is now the world's fastest gaming GPU, no surprise there. What is shocking, however, is that in our testing, it was on average just 27% faster than the RTX 4090 at 4K, while costing at least 25% more. This is why we've referred to it as the RTX 4090 Ti, because, let's be honest, that's exactly what it is. Nvidia has tried to disguise this by marketing DLSS 4 multi-frame generation as a game-changing feature, akin to dangling a shiny set of keys to distract gamers.

Speaking of DLSS 4, we haven't mentioned frame generation much in this review, despite Nvidia heavily promoting it as a key feature of the GeForce 50 series. This omission might seem odd, but we believe frame generation deserves a separate, dedicated analysis. We're already working on an in-depth DLSS 4 review, which will explore the technology in greater detail soon. The reason we tackle topics like frame generation and upscaling separately is that testing these features properly is complex. It's less about frame rates and more about image quality and, in the case of frame generation, latency.

To summarize briefly, frame generation doesn't deliver what Nvidia's marketing claims. It's not a true performance-enhancing feature; you're not genuinely going from 60 fps to 120 fps.
Instead, you're getting the appearance of smoother gameplay, albeit with potential graphical artifacts, but without the tangible benefits of higher frame rates such as improved input latency. That doesn't mean frame generation is useless or that it's not a good technology. It can be helpful in certain scenarios, but Nvidia has weaponized the feature to mislead consumers, making claims like the upcoming RTX 5070 being faster than the RTX 4090, which is fundamentally untrue.

We also strongly believe that showcasing frame generation performance in benchmark graphs is misleading. And while Nvidia would love for us to do just that, we see this as a slippery slope for gamers: a race to the bottom, where winning benchmarks would become about who can spit out the most interpolated frames, input and visual quality be damned.

As it stands, DLSS 3 and DLSS 4 frame generation are best described as frame-smoothing technologies. Under the right conditions, they can be effective, but they don't truly boost FPS performance. Moreover, they're entirely unsuitable for competitive shooters or fast-paced games where the goal of high frame rates is to reduce input latency. Nvidia's narrative that all gamers will or should use frame generation couldn't be further from reality.

Notes about CPU Pairing with the RTX 5090 and Ray Tracing

Moving on to CPU performance, it's clear from the 1440p data we gathered that anyone investing in an RTX 5090 needs a high-end CPU, such as the 9800X3D.
Even with the Zen 5 3D V-Cache processor, you'll frequently encounter CPU bottlenecks, especially if you aim for high refresh rates with ray tracing enabled. Speaking of ray tracing, you're almost certainly going to find reviews where the RT performance of the RTX 5090 relative to the 4090 is more impressive than what we saw for the majority of our testing, and this will come down to the quality settings used. Our testing focused on real-world scenarios that prioritize frame rates above 60 fps, as we believe most gamers spending $2,000 on a GPU won't settle for console-like frame rates.

To provide a bit more context: in Black Myth: Wukong, we tested at 1440p using DLSS quality upscaling, where the RTX 5090 delivered 98 fps, a 24% improvement over the RTX 4090. If we disable upscaling, which we feel most gamers using ray tracing won't do, the frame rate of the 5090 drops to 64 fps, but it is then 45% faster than the 4090, a far more impressive margin. This is comparable to what we see at 4K using DLSS upscaling, though again we're only gaming at around 60 fps, which some gamers will find acceptable, but I personally find it less than desirable, especially when spending so much money.

Ultimately, the point is that the RTX 5090 can be 40-50% faster than the RTX 4090, depending on the game and settings. However, as demonstrated in this review, when targeting high frame rates, the difference is typically much smaller.

Bottom Line

All things considered, the GeForce RTX 5090 is an impressive performer that falls short of meeting the expectations for a next-generation flagship GPU. It doesn't move the needle forward in terms of value or innovation and could easily fit into the GeForce 40 series lineup. If Nvidia had launched this as an RTX 4090 Ti, few would have batted an eye. We understand that Nvidia couldn't do much given the limitations of the current process node.
However, they still could have delivered a more exciting product series. Even at $1,600, the RTX 5090 would have been far more appealing: still not amazing, but much better than it is now. Without a process node upgrade, this release doesn't come close to the leap we saw from the RTX 3090 to the RTX 4090, which was vastly more significant. It's also clear that as Nvidia cements its position as the leader in AI hardware, GeForce seems to have taken a back seat to the big money in AI (just check out this graph, it's insane).

We still expect the RTX 5090 to age well. While today's 27% average performance gain over the RTX 4090 is underwhelming, this margin will likely increase over time, potentially reaching 40% in more games. Unfortunately, this also means the more affordable models in the GeForce RTX 50 series will probably be underwhelming, offering only minor performance gains over the GPUs they replace. Nvidia could have addressed this by providing better VRAM configurations. For example, 12 GB on the RTX 5070 is simply unacceptable; it should have at least 16 GB. If Nvidia had done this, the RTX 5070 might have been a worthwhile upgrade over the RTX 4070 and a much more significant step up from the RTX 3070.

For those looking for a more positive take, the good news is that the RTX 5090 is faster than the RTX 4090, pushing 4K gaming closer to high-refresh-rate experiences. If you already had oodles of money to blow on a graphics card and missed out on the RTX 4090, the RTX 5090 could be a great addition to your gaming setup.

In summary, the RTX 5090 is 25% more expensive than the RTX 4090, delivers an average of 27% more performance, includes 33% more VRAM, and consumes around 30% more power. Interpret that as you like.
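Those summary numbers are what drive the cost-per-frame verdict. A rough sketch of the calculation using the MSRPs and the 27% average 4K uplift quoted above (the baseline frame rate is an arbitrary placeholder, since it cancels out of the relative comparison):

```python
# Cost per frame = price / average fps. The absolute fps value cancels out
# of the ratio, so a placeholder baseline of 100 fps is fine here.
baseline_fps = 100.0                          # arbitrary RTX 4090 reference
rtx_4090_cpf = 1600 / baseline_fps            # $/frame at $1,600 MSRP
rtx_5090_cpf = 2000 / (baseline_fps * 1.27)   # 27% faster, $2,000 MSRP

# Relative change in cost per frame (negative = better value)
change = rtx_5090_cpf / rtx_4090_cpf - 1
print(f"{change:+.1%}")
```

This works out to roughly a 1.6% reduction in cost per frame, in line with the ~1.5% value-per-frame improvement cited in the review: the 27% performance gain only barely outruns the 25% price hike.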
For now, our review is complete, with a closer look at DLSS 4 coming soon. Let us know your thoughts on Nvidia's new flagship graphics card in the comments.

Shopping Shortcuts:
Nvidia GeForce RTX 5090 on Amazon
Nvidia GeForce RTX 5080 on Amazon (soon)
AMD Radeon RX 7900 XTX on Amazon
Nvidia GeForce RTX 4070 Ti Super on Amazon
Nvidia GeForce RTX 4070 Super on Amazon
AMD Radeon RX 7800 XT on Amazon
AMD Radeon RX 7900 XT on Amazon
  • WWW.TECHSPOT.COM
    Bing search results in Edge are obscuring Chrome links, promoting Microsoft's browser
    WTF?! Google and Microsoft have spent years engaging in dirty tricks campaigns designed to push people onto their respective browsers, Chrome and Edge. The latest tactic is one employed by the Windows maker: Edge hides Chrome's download links for some users when they perform a Bing search for the browser.

As noticed by Windows Latest, searches for Chrome using Edge and via Bing (when signed out of your Microsoft account) on Windows 11 result in a "promoted by Microsoft" banner appearing at the top of the search results. The banner is a recommendation by the Redmond firm, advising users there's no need to download a new web browser and highlighting that Edge offers a fast, secure, and modern web experience that saves time and money. It also comes with the obligatory "Try now" button.

Forcing obtrusive ads for its products down people's throats isn't new territory for Microsoft, of course. But this one arguably goes a little further by hiding the Chrome download links that are beneath the banner, and the small portion of the top Google result that is visible appears mostly blurred out.

It's easy to see the search results by clicking on the "See more" button further down the screen, and most people who do a search for Chrome likely intend to download it, no matter what Microsoft claims. However, less tech-savvy users may be persuaded by the banner's claims.

The other thing to note is that few people are likely to encounter this banner. Google has an almost 90% share of the global search engine market, whereas Bing has 4%. It's a similar story in the browser market: Chrome has a 68.3% share, Edge has just under 5%. It appears that not everyone is seeing the banner. I couldn't get it to show, so it might be limited to a small set of users or certain locations. Microsoft's war against Chrome goes back a long way.
Some examples of its pushiness include the company telling people in 2021 that the rival browser was "so 2008" and Edge was better. There were also full-size Edge ads that appeared on the Chrome website, and Edge was accused of stealing data from Chrome without users' consent in January.Google isn't a stranger to using such tactics, either. The company shows prompts to Edge users recommending Chrome, and in 2020 it showed a message that read "Google recommends switching to Chrome to use extensions securely" whenever Edge users visited the Chrome Web Store, though Google quickly removed that message.
  • WWW.DIGITALTRENDS.COM
    We now know why AMD chose to delay RDNA 4 (well, kind of)
    AMD hasn't been very forthcoming when it comes to information about its RX 9000 series GPUs, but we just got an update as to why the cards won't be available until sometime in March. The company cites software optimization and FSR 4 as the two reasons why it most likely decided to delay the launch of RDNA 4. But is that all there is to it, or is AMD waiting to see some of Nvidia's best graphics cards before pulling the trigger on the RX 9070 XT?

The update comes from David McAfee, AMD's vice president and general manager of the Ryzen CPU and Radeon graphics division. A couple of days ago, McAfee took to X (Twitter) to announce that AMD was excited to launch the RX 9000 series in March. This caused a bit of an uproar, with many enthusiasts wondering why AMD was choosing to wait so long.

"I really appreciate the excitement for RDNA4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range ..." - David McAfee (@McAfeeDavid_AMD), January 22, 2025

McAfee now explains that AMD is taking a little extra time to "optimize the software stack for maximum performance and enable more FSR 4 titles." While a bit vague, this confirms what many leakers have been saying: RDNA 4 is ready hardware-wise, but now, AMD seems eager to improve it on the software side. While not a bad approach, it does feel like there might be more to it. After all, the GPUs have been spotted with preorders opening on January 22, and several retailers appear to already have them in stock.
McAfee admits as much, saying: "We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."

There are more signs pointing to the fact that AMD may have had other plans for the release date of RDNA 4. VideoCardz spotted a Reddit ad posted by the official AMD account, and the ad clearly states that the GPUs are available, saying: "When the stakes are high, every play counts. Play now with the ultimate performance of AMD Radeon RX 9000 series graphics cards." It also gives us a closer look at the RX 9070 XT in its Made By AMD (MBA) design.

If late January was the initial plan for the RX 9000 series, it's clear that AMD pulled back. Greater availability for FSR 4 titles and improved drivers are both good reasons to delay, but it's possible that AMD might also want to see how Nvidia's RTX 5070 Ti will fare when it (most likely) hits the market in February.

There's one good thing here, though, and that's GPU availability. McAfee implies there'll be plenty of RDNA 4 cards to go around, which is great, especially considering that Nvidia's RTX 50-series might be very hard to come by at launch.
  • WWW.DIGITALTRENDS.COM
    Nvidia RTX 5090 review: fast, but not nearly twice as fast
    Nvidia GeForce RTX 5090
    MSRP: $1,999.00
    Nvidia is, once again, leaving its mark on the flagship throne with the RTX 5090.
    Pros: Unrivaled 4K gaming performance; innovative, attractive Founders Edition design; DisplayPort 2.1 and 4:2:2 encoding; 32GB of memory for AI workloads; DLSS 4 is a treat...
    Cons: ...when it works properly; insanely expensive; power requirements are off the charts

The RTX 5090 is a hard GPU to review. By the numbers, it's undoubtedly the best graphics card you can buy. That's what happens when you're the only one in town making this class of GPU, and as it stands now, Nvidia is. If you want the best of the best and don't mind spending $2,000 to get it, you don't need to read the rest of this review, though I'd certainly appreciate it if you did.

No, the RTX 5090 is about everything else that RTX 50-series GPUs represent. It delivers that flagship gaming performance, but it also ushers in an entirely new architecture, DLSS 4, and the era of neural rendering. And on those points, the dissection of the RTX 5090 is far more nuanced.

The RTX 5090 is angled toward PC gamers who want the best of the best regardless of the price, but it's also the first taste we've gotten of Nvidia's new Blackwell architecture in desktops. The big change is neural rendering. With RTX 50-series GPUs, Nvidia is introducing neural shaders along with DirectX, though we won't see the fruits of that labor play out for quite some time.

For immediate satisfaction, Nvidia has DLSS 4. This feature is coming to all RTX graphics cards, replacing the convolutional neural network (CNN) that DLSS previously used with a new transformer model. Nvidia says this leads to a quality boost across the board.
For the RTX 5090, the more important addition is DLSS Multi-Frame Generation, which promises up to 4X frame generation in 75 games on day one. DLSS 4 is coming to all RTX graphics cards, but DLSS Multi-Frame Generation is exclusive to RTX 50-series GPUs, including the RTX 5090.

                        RTX 5090          RTX 4090
    Architecture        Blackwell         Ada Lovelace
    Process node        TSMC N4           TSMC N4
    CUDA cores          21,760            16,384
    Ray tracing cores   170 (4th-gen)     144 (3rd-gen)
    Tensor cores        680 (5th-gen)     576 (4th-gen)
    Base clock speed    2,017MHz          2,235MHz
    Boost clock speed   2,407MHz          2,520MHz
    VRAM                32GB GDDR7        24GB GDDR6X
    Memory speed        30Gbps            21Gbps
    Bus width           512-bit           384-bit
    TDP                 575W              450W
    List price          $1,999            $1,599

Although it might seem like Nvidia could just flip a switch and enable DLSS Multi-Frame Generation on all of its GPUs, that's not exactly the case. Nvidia says that with 4X frame generation and Ray Reconstruction enabled, there are five AI models running on your GPU for each rendered frame. To manage all of that, the RTX 5090 includes an AI management processor, or AMP, which handles scheduling of these different workloads across the ray tracing, Tensor, and CUDA cores.

Outside of AI hardware, the RTX 5090 brings 32GB of GDDR7 memory. Nvidia bumped up the capacity from 24GB on the RTX 4090, though that doesn't have a ton of applications in games. The extra memory really helps AI workloads, where training large models can easily saturate 32GB of memory. The bigger boost is GDDR7, which is twice as efficient as GDDR6 while providing twice the data rate.

Nvidia also redesigned its ray tracing and Tensor cores for Blackwell, both of which it says are built for the new Mega Geometry feature. The bigger standout for me is the media encoding engine, however. Nvidia now supports 4:2:2 video encoding, along with DisplayPort 2.1 output. Those are some significant upgrades over the RTX 4090, regardless of what the benchmarks say.

Twice as fast as the RTX 4090? Not quite.
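One place the 5090 genuinely does come close to doubling up is memory bandwidth, which follows directly from the spec table: peak bandwidth is bus width times per-pin data rate, divided by 8 bits per byte. A minimal sketch using the table's figures (the function name is ours, for illustration):

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (pins) * per-pin rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

rtx_5090 = memory_bandwidth_gbps(512, 30)  # 512-bit bus, 30Gbps GDDR7
rtx_4090 = memory_bandwidth_gbps(384, 21)  # 384-bit bus, 21Gbps GDDR6X

print(rtx_5090)             # 1920.0 GB/s
print(rtx_4090)             # 1008.0 GB/s
print(rtx_5090 / rtx_4090)  # ~1.9x generational gain
```

That roughly 1.9x jump in raw bandwidth is the clearest hardware win in the table, even if it doesn't translate directly into frame rates.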
Based on my results, the RTX 5090 is about 30% faster than the RTX 4090 when the new DLSS Multi-Frame Generation feature isn't brought into the mix. And it's a feature you might want to leave out of the mix in some titles, as I'll dig into later in this review. That sounds like a solid generational jump, but I went back to my RTX 4090 review for a sanity check. It's not nearly as big as what we've seen previously.

With the RTX 4090, Nvidia provided over an 80% generational improvement, which is massive. Here, it's actually more of a lateral move. The RTX 5090 is 30% faster than the RTX 4090, but it's also 25% more expensive, at least at list price. That said, good luck finding an RTX 4090 in stock at $2,000, much less at list price. The RTX 5090 may not be the generational improvement I expected, but the reality for buyers is still that it's the best option for flagship performance.

The average is brought down by a handful of games where the RTX 5090 doesn't show a huge increase. In Assassin's Creed Mirage, for example, there's about a 17% uplift. Similarly, in Forza Motorsport, the improvement shrinks to just 14%. Those aren't exactly the margins I was hoping for when Nvidia announced a new flagship GPU, and especially one that comes in at a significantly higher price.

Make no mistake; there are still big wins. As you can see above, I measured a massive 54% improvement in Cyberpunk 2077, which is really impressive. In the previous generation, the RTX 4090 was the only GPU that could run this game at 4K Ultra without upscaling and still achieve 60 frames per second (fps). Now, the RTX 5090 is comfortably reaching into the triple digits. This is the kind of improvement I expected to see across the board.

Cyberpunk 2077 isn't a one-off, thankfully.
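As an aside on value: the 30%-faster, 25%-pricier framing above works out to a nearly flat price-to-performance ratio, which a few lines of arithmetic make plain (the percentages and list prices are the review's figures; the calculation is ours):

```python
uplift = 1.30                  # RTX 5090 vs RTX 4090 average, per the review
price_new, price_old = 1999, 1599  # list prices in USD

price_ratio = price_new / price_old      # how much more the 5090 costs
perf_per_dollar = uplift / price_ratio   # relative value vs the 4090

print(f"{(price_ratio - 1) * 100:.0f}% pricier")                    # 25% pricier
print(f"{(perf_per_dollar - 1) * 100:.0f}% more perf per dollar")   # 4% more perf per dollar
```

In other words, at list price the 5090 delivers only about 4% more performance per dollar than its predecessor, which is why the review calls it a lateral move.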
Although the improvements aren't quite as large across the board, I saw similarly impressive uplifts in Horizon Zero Dawn Remastered, Returnal, and Dying Light 2. The improvement may not be above 80% like we saw in the previous generation, but there's still a clear improvement. If you want the best of the best, Nvidia is claiming that throne with the RTX 5090.

It's just the expectations that are important. Despite some big wins, I suspect most games will look like Black Myth: Wukong, Red Dead Redemption 2, and Call of Duty Modern Warfare 2. You're getting a nice chunk of extra performance, no doubt, but that lift doesn't fundamentally change the gameplay experience in quite the same way that the RTX 4090 did.

Looking over my 4K data, it became clear that the RTX 5090 establishes somewhat of a new normal. The RTX 4090 had an outsized generational improvement, as Nvidia continued to navigate the waters of how it wanted to market its flagships moving forward. The RTX 5090 is disappointing by comparison, and I'm not sure there's much reason for RTX 4090 owners to run out and buy Nvidia's latest. But for those who want the best, it's hard to argue with the numbers the RTX 5090 puts up.

It's easy to argue, however, with Nvidia's misleading claims. We're nowhere near twice the performance of an RTX 4090, and the company confirmed to me that it's seeing a 30% average uplift internally, as well. That's the kind of improvement I'd expect to see out of an 80-class card, but it looks like the death of Moore's Law has to hit everyone at some point.

Even at 1440p, it's very easy to run into a CPU bottleneck with the RTX 5090. You can see that just from looking at the averages above; the RTX 5090 shrinks down to just a 22% lead over the RTX 4090. All of my data here is fresh, and run with a top-of-the-line Ryzen 9 9950X.
In short, if you plan to use the RTX 5090 at 1440p, you're giving up a serious chunk of its performance potential, and you're probably better off with a used RTX 4090.

Forza Motorsport, and especially Red Dead Redemption 2, show the problem here. The RTX 5090 is still able to squeeze out a win across games at 1440p, but the margins are much thinner. That's not a critique of the graphics card, but it is the reality of trying to run this monstrous GPU at any resolution below 4K.

There are still some solid wins for Nvidia's latest, particularly in games that scale well on your CPU. Cyberpunk 2077 is once again a standout victory, but you can see similarly large improvements in Dying Light 2 and Returnal.

One game that's worth zooming in on is Black Myth: Wukong. This is the only game in my test suite that I run with upscaling enabled by default, and it shows what can happen when forcing upscaling on at a lower resolution. The RTX 5090 is providing a 20% improvement, but as you continue to push down the internal resolution, that lead will continue to flatline.

Regardless, the RTX 5090 really isn't built for 1440p. You can use it at this resolution, but you're giving up a chunk of what the RTX 5090 is truly capable of.

The idea of using an RTX 5090 at 1080p is a little silly, but I still ran the card through all of the games I tested at this resolution. Here, the CPU bottleneck becomes more extreme, pushing the RTX 5090 down to just a 15% lead over the RTX 4090. You could see that as disappointing, but frankly, I see this resolution as unrealistic for a $2,000 graphics card.

However, looking at 1080p data is still valuable, at least at a high level. It's important to remember that DLSS Super Resolution renders the game at a lower internal resolution, so the advantage of the RTX 5090 slips a bit with DLSS upscaling turned on.
The RTX 5090 can easily make up that gap with DLSS Multi-Frame Generation, and even push much further, but these results are a good reminder of the bottlenecks you can run into when using flagship hardware with upscaling.

Nvidia dominates when it comes to ray tracing, so it's no surprise that the RTX 5090 enjoys a top slot among the games I tested. However, the improvements aren't as large as I expected. Nvidia has solved, for lack of a better word, real-time ray tracing. Games that aren't pushing full-on path tracing are seeing less of an improvement, largely due to the fact that lighter forms of ray tracing are fair game for GPUs as weak as the Intel Arc B580.

Dying Light 2 is a good example of this dynamic. When this game released a few years back, it was one of the most demanding titles you could play on PC. But even at 4K with the highest graphics preset and no help from upscaling, the RTX 5090 makes Dying Light 2 look like child's play with a comfortable 90 fps average.

In Returnal, the situation is even more extreme. This is one of the lighter ray tracing games available, and sure enough, the RTX 5090 crosses triple digits without breaking a sweat, even at 4K.

Things get interesting when looking at those more demanding ray tracing games, though. Cyberpunk 2077, once again, serves as a mile marker for the RTX 5090. It's the first GPU to get close to 60 fps at 4K with the RT Ultra preset, which is quite the achievement. Of course, it's possible to push the RT Overdrive preset as well (more on that in the next section), but looking at raw performance, Nvidia is pushing to new heights.

The next frontier is path tracing, and for that, I used Black Myth: Wukong. The RTX 5090 provides a great experience, even at the Cinematic graphics preset in the game. But games like Black Myth, such as Alan Wake 2 and Cyberpunk 2077, that have a path tracing mode still need to resort to upscaling, introducing the CPU more into the mix and limiting the performance uplift.
Maybe in the next few generations we'll see a native 60 fps in this title from an Nvidia flagship.

There really isn't much to talk about when it comes to ray tracing on the RTX 5090, and that's exactly how Nvidia wants it. In the vast majority of games, you're looking at rasterized performance that comfortably clears 60 fps at native 4K and can easily climb into the triple digits. Ray tracing still forces some upscaling wizardry in titles like Black Myth: Wukong, but for the most part, you can flip on ray tracing without a second thought. That's the way it should be.

The chart above is the story Nvidia wants to tell about DLSS 4. Nvidia didn't make this chart, nor did it tell me to make it, but there's a clear narrative that emerges from the data here. Even factoring in PC latency, which is the main issue with frame generation technology, DLSS 4 is doing some magical things. You're going from an unplayable frame rate to something that can fully saturate a 4K, 240Hz monitor like the MSI MPG 321URX. And you're doing so with around half of the average PC latency of native rendering.

The devil is in the details here, however, and Nvidia has a few little devils to contend with.

Here's a different side of the story. Above, you can see a short section of gameplay (I'm going for five stars here) in Cyberpunk 2077 with the RT Overdrive preset. I'm using DLAA for a little boost to image quality, and I'm using the 4X frame generation mode. Given that I'm playing on a 4K display with a 138Hz refresh rate, these seem like ideal settings for my setup. Watch the video, and you tell me if it looks like an ideal experience.

I can point out a lot of problems here, picking out single frames with various visual artifacts between each swipe of the mouse, but you don't need to pixel peep to see the issue. There's an unnatural motion blur over everything, and the edges of objects are mere suggestions rather than being locked in place.
You don't need a trained eye to see that this is a bad experience. You don't need a point of comparison, even. You can watch this video in a vacuum and see that DLSS 4 has some clear limitations. That's not a damning critique of DLSS 4. It's a wonderful tool, but you need to use it correctly.

Like any frame generation tech, your experience will rapidly deteriorate when you feed the frame generation algorithm a low base frame rate, like I did in Cyberpunk 2077. Nvidia wants you to use Super Resolution to get to a playable base frame rate of near 60 fps, and then click on Multi-Frame Generation to saturate a high refresh rate display. Using Multi-Frame Generation alone, especially if you're hovering around 30 fps, will give you a bad experience.

Cyberpunk 2077 shows the worst of what DLSS 4 has to offer, but Marvel Rivals shows the best. This is one of several games that uses Nvidia's new DLSS Override feature, allowing you to add up to 4X Multi-Frame Generation to games with DLSS Frame Generation through the Nvidia app. Not only is the base frame rate high enough here (well over 60 fps, even with DLAA turned on), but you also have a third-person camera. There are some minor artifacts, but nothing that ruins the experience and nothing you'd even notice during gameplay.

Similarly, the artifacting isn't nearly as bad in Alan Wake 2 as it is in Cyberpunk 2077. Here, once again, I'm starting with a base frame rate of around 30 fps and using Multi-Frame Generation to make up the difference. There are some artifacts, and I'd recommend using a combination of Super Resolution and Frame Generation instead. But the experience is at least better compared to Cyberpunk 2077 due to the camera angle.

You don't want to just crank DLSS 4 to 4X mode and call it a day. It needs to be fed a base frame rate of ideally 60 fps.
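The arithmetic behind that advice is simple: generated frames multiply the base frame rate, so the right multiplier depends on both your base fps and your monitor's refresh ceiling. A hedged sketch (the ~60 fps floor is the review's rule of thumb; the function is illustrative, not an Nvidia API):

```python
def pick_framegen_mode(base_fps: float, refresh_hz: int, min_base: float = 60.0):
    """Choose the smallest frame-generation multiplier (1x-4x) that saturates the display.

    Returns None if the base frame rate is too low for frame generation
    to feel good, per the review's ~60 fps guideline.
    """
    if base_fps < min_base:
        return None  # raise the base fps with Super Resolution first
    for mult in (1, 2, 3, 4):
        if base_fps * mult >= refresh_hz:
            return mult
    return 4  # even 4x can't saturate this display; cap at the maximum

print(pick_framegen_mode(70, 240))  # 4 -> 280 fps saturates a 240Hz panel
print(pick_framegen_mode(70, 138))  # 2 -> 140 fps is enough for a 138Hz display
print(pick_framegen_mode(30, 240))  # None: base frame rate too low
```

The third case is exactly the Cyberpunk 2077 scenario above: starting from ~30 fps, no multiplier fixes the underlying problem.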
Although the latency doesn't significantly increase with up to three generated frames (something Nvidia should be applauded for on its own), the number of visual artifacts does. Realistically, I suspect DLSS 4 will more often run in 2X or 3X mode alongside Super Resolution. That, in a lot of games, will provide a much better experience than relying on Multi-Frame Generation alone.

Over the past few generations, Nvidia has increasingly relied on DLSS to market its graphics cards, and that same playbook is at work here. It's just not the same selling point that it once was. Super Resolution is still pulling a lot of the weight, and even a single generated frame is enough to saturate most gaming monitors, even as refresh rates climb. There's still a use for 4X Multi-Frame Generation, and in the right circumstances, it works extremely well. But when it comes time to spend $2,000 on a graphics card, I would seriously consider how much DLSS Multi-Frame Generation is offering over a $7 utility like Lossless Scaling. For my money, it isn't providing much of an advantage.

This is where you need to carefully consider your setup. You want to be using Multi-Frame Generation alongside Super Resolution in those prestige games like Cyberpunk 2077 and Alan Wake 2, and if you don't have a monitor capable of producing that high of a refresh rate, that second or third generated frame goes to waste. Unlike DLSS 3, Multi-Frame Generation isn't a feature that just works on its own; it needs to work as part of the rest of your gaming rig.

Nvidia's CEO hit the nail on the head when defending the price of the RTX 5090: "When someone would like to have the best, they just go for the best." If there's one thing I can say with absolute certainty, especially considering the lack of flagship competition from AMD, it's that the RTX 5090 is the best.
It doesn't matter if it's $1,500, $2,000, or $2,500; Nvidia's CEO is right when he says that the appetite for this type of product doesn't factor in price nearly as much as it does for more inexpensive options.

The question isn't if the RTX 5090 is the best; it is. The question is if you need the best, and there's a bit more discussion there. The generational improvements are here, but they don't touch what we saw with the RTX 4090. DLSS 4 is incredible, but it falls apart when it's not fed the right information. And 32GB of GDDR7 memory is welcome, but it's only delivering a benefit in AI workloads, not in games.

If you're sitting on an RTX 4090, there's not much reason to upgrade here. There's a performance boost, but the real value lies in DLSS 4, and that's something that's very easy to get around without spending $2,000. The RTX 5090 really shines for everyone else. Maybe you had to skip the RTX 40-series due to poor availability, or maybe the RTX 2080 Ti you have just isn't providing the grunt that it used to. In those situations, the RTX 5090 is great. But if you're in the market to spend $2,000 on a graphics card, you probably don't need me to convince you.
  • WWW.WSJ.COM
    2025 Oscar Nominees, from Emilia Pérez to Wicked
    Emilia Pérez and The Brutalist are expected to contend for best picture.
  • ARSTECHNICA.COM
    600 kW fast-charging pitstops are coming to Formula E
    600 kW fast-charging pitstops are coming to Formula E. After a couple of years' delay, mid-race recharging is ready to go. Jonathan M. Gitlin, Jan 23, 2025. Credit: Oscar Lumley/LAT Images

Now 11 seasons in, Formula E has come a long way from its sometimes-chaotic early days and those mid-race pitstops to change cars. Car swaps went away a long time ago, but when the series gets back to racing next month in Jeddah, Saudi Arabia, the mid-race pit stop will be back. Except this time, the cars will be quickly recharged through their CCS2 ports by powerful 600 kW fast chargers.

The new race feature, which Formula E is calling "pit boost," is a 30-second pitstop, during which time the car receives a 600 kW fast charge (more than twice as much power as a Tesla Supercharger) that adds 10 percent (3.85 kWh) to the battery's state of charge. It's mandatory for every car in the race, but a team is only allowed to charge one of its two cars at a time, and only within a specified window of time during the race.

Some people are probably going to be unimpressed with the length of the stops. While they're shorter than you might see at a prototype stop at Le Mans or Daytona, you also won't see mechanics running around changing tires. We're also talking an order of magnitude longer than a current Formula 1 pitstop, which will no doubt be used as ammunition by Formula E's detractors, just as the lap-time comparisons are.

But the laws of physics are what they are. You can only safely put so much energy into a battery in a given time. Any recharge of a significant percentage of the battery's state of charge would take several minutes, and that's really not conducive to an entertaining sprint race.
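The pit boost figures hang together under basic energy arithmetic, where energy (kWh) is power (kW) times time (hours). A quick check using the article's numbers shows the charger can't sustain its 600 kW peak for the whole stop:

```python
def energy_kwh(power_kw: float, seconds: float) -> float:
    """Energy delivered at constant power over a duration, in kWh."""
    return power_kw * seconds / 3600

theoretical = energy_kwh(600, 30)  # if peak power held for the full 30 s
delivered = 3.85                   # what pit boost actually adds (10% of charge)

print(theoretical)                   # 5.0 kWh at a sustained 600 kW
print(delivered / (30 / 3600))       # 462.0 kW average across the stop
print(delivered / 0.10)              # 38.5 kWh implied usable battery capacity
```

The gap between the 5 kWh theoretical maximum and the 3.85 kWh delivered reflects charge-rate tapering, exactly the battery-physics constraint the article describes.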
(And it's not like the TV director would spend the entire time focusing on a car in the pits as opposed to all the other cars racing on track.)

Trying new stuff

Pit boost joins attack mode in the Formula E box of tricks. The sport has never been afraid to try new things as a way to entertain the fans, and although not every experiment has been a raging success, others cannot be written off as failures. Fan boost, where people voted online to give three drivers extra power during parts of the race, was not well-received and contributed little to the show. But "attack mode," a time-limited power-up that's activated by driving over a specific part of the racetrack, has added an interesting strategy component to the races and delivered plenty of excitement.

Formula E has also shown it can respond to criticism: this year there's a new compound in Hankook tires that's far racier than the durable-but-gripless rubber of the last two seasons. And in addition to adding more power, during attack mode the front electric motor is allowed to actually send power to those wheels; normally it is relegated to just harvesting energy under deceleration.

Formula E wants pit boost to do something similar, but it neither replaces attack mode nor is tied to it; it's now a second thing that race strategists are going to have to deal with.

"I think it's gonna be fantastic because it's gonna create a little bit of jeopardy into the race. And there are teams that are gonna be using that energy in different ways in different moments.
And definitely, you know, it's gonna bring that excitement that we want," said Alberto Longo, Formula E's co-founder and chief championship officer.

The team view

"Pit boost is mainly a strategic element, which, like any change of this nature, will impact our approach, as a stop mid-race isn't something we've had in the Championship since the Gen1 era," said Frederic Espinos, team director at the Lola Yamaha ABT Formula E Team.

"Choosing the right moment to pit, which balances the risk of losing track position with the extra power boost, will be essential, especially as only one car at a time will be able to come in for the pit boost, so we will have to go for slightly different strategies, even if naturally it would seem going as soon as possible will be the best option. To add further jeopardy, pit boost will be available during the safety car, which could present a beneficial opportunity if you are lucky, but if not it could destroy your race," Espinos said.

"Equally as important to the strategy of when to take pit boost is the execution of it. Although it isn't somewhere you can really gain time, there is the potential to lose a lot, so practice will be key as we introduce this new feature. Ultimately, we can expect added unpredictability in the race and a lot of learnings for the teams, particularly in the early stages," Espinos said.

Not every race will feature pit boost; the idea is to give the series some variation. So one race of the Jeddah double-header will require pit boost, but the other will not. "What we thought for season 11 is that basically, let's launch this new system in places where we can do a completely different race the day after," Longo said. "Let's implement that in places where basically, in only 24 hours, you're gonna see a completely different race. So you are actually going to be seeing the difference...
by the pit boost," Longo said.

That's the plan for this season, at any rate; if it proves a success, Longo said it would probably be rolled out much more widely next year. Formula E says it's confident in the reliability of the chargers, or it wouldn't be introducing them; it called off pit boost in both 2023 and 2024, after all. But there will be a couple of spares in the paddock in case a team (or two) experiences a failure, according to Formula E's head of championship, Pablo Martino.

The Jeddah double-header will be held next month, on February 14-15. If the first two rounds of this season were anything to go by, it should be worth watching.

Jonathan M. Gitlin is the Automotive Editor at Ars Technica. He has a BSc and PhD in Pharmacology. In 2014 he decided to indulge his lifelong passion for the car by leaving the National Human Genome Research Institute and launching Ars Technica's automotive coverage. He lives in Washington, DC.
  • ARSTECHNICA.COM
    Cadillac gives the Lyriq a race car-inspired glow-up
    No Blackwing, though. Now there's a Lyriq with 615 horsepower and Le Mans-inspired sounds. Jonathan M. Gitlin, Jan 23, 2025. Credit: Cadillac

The Cadillac Lyriq was the first of a new breed of General Motors' electric vehicles. Built around a common battery platform (which used to be called Ultium), the midsize SUV has been on sale for about three years now, and for model year 2026, there's a new version available, the first Cadillac EV to wear the V-Series badge.

"V-Series captures the spirit of Cadillac, embodying our relentless pursuit of engineering excellence through our racing and production vehicles," said John Roth, vice president of Global Cadillac. "LYRIQ-V takes this commitment a step further in the EV era, pushing our performance pedigree of V-Series to new heights with a powerful, personalized and high-tech driving experience that fits perfectly into our customers' lives," Roth said.

As with other Cadillac V-Series cars, you can expect a much higher power output than the base models. In this case, that's a hefty 615 hp (459 kW) and 650 lb-ft (880 Nm), not quite double the output of the single-motor Lyriq we drove back in 2023. The Lyriq-V uses a pair of motors to achieve that output, powered by the same 102 kWh battery pack as in the normal Lyriq.

That's sufficient for an EPA range estimate of 285 miles (459 km), which is less than the regular all-wheel drive Lyriq, but the range hit is probably down to the Lyriq-V's 22-inch wheels. (When fitted to the regular Lyriq, the larger wheels also sap some of that car's 307-mile EPA range.)

There are some other new additions to the Lyriq-V to go with that increased output, like V-mode, which delivers a 0-60 mph time of 3.3 seconds when you engage launch control.
There's also Competitive mode, which Cadillac says "enables a suite of traction management features specifically engineered to increase vehicle agility."

Super Cruise is standard on the Lyriq-V. Push the red V button when you want to have fun.

To go with this improved canyon-carving ability, the new front seats have more side bolstering, and there's a sporty new soundtrack, with powertrain sounds that were inspired by Cadillac's V-Series.R sports prototype, which make use of the 23-speaker Dolby Atmos sound system. There are also bigger front brakes from Brembo and some unique styling parts, like the front fascia and side rockers.

Oh, and did we mention it still features Apple CarPlay, unlike GM's slightly more recent (but still Ultium-based) EVs?

Production of the Lyriq-V begins in the coming weeks at GM's Spring Hill factory in Tennessee, with a starting price of $79,990 (including destination charge).
  • WWW.INFORMATIONWEEK.COM
    Why Enterprises Struggle to Drive Value with AI
    Lisa Morgan, Freelance Writer | January 23, 2025 | 10 Min Read

Artificial intelligence is virtually everywhere, whether enterprises have an AI strategy or not. As AI capabilities continue to get more sophisticated, businesses are trying to capitalize on them, but they haven't done enough foundational work to succeed. While it's true that companies have been increasing their AI budgets over the last several years, it's become clear that the ROI of such efforts varies significantly based on many dynamics, such as available talent, budget, and a sound strategy. Now, organizations are questioning the value of such investments to the point of pulling back in 2025.

According to Anand Rao, distinguished service professor of applied data science and artificial intelligence at Carnegie Mellon University, the top three challenges are ROI measurement, realization, and maintenance.

"If the work I'm doing takes three hours and now it takes a half an hour, that's easily quantifiable, [but] human performance is variable," says Rao. "The second way is having a baseline. We don't [understand] human performance, but we are saying AI is 95% better than a human, but which human? The top-most performer, an average performer, or the new employee?"

When it comes to realizing ROI, there are different ways to look at it. For example, if AI saves 20% of five people's time, perhaps one of them could be eliminated. However, if those five people are now spending more time on higher-value tasks, then it would be unwise to let any of them go, because they are providing more value to the business.

The other challenge is maintenance, because AI models need to be monitored and maintained to remain trustworthy. Also, as humans use AI more frequently, they get more adept at doing so while the AI is learning from the human, which may increase performance.
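Rao's "easily quantifiable" case reduces to simple arithmetic; a minimal sketch using the figures from his examples (the function names are ours, for illustration):

```python
def hours_saved_per_task(before_h: float, after_h: float) -> float:
    """Time saved per task: the 'easily quantifiable' ROI case Rao describes."""
    return before_h - after_h

# Rao's example: a three-hour task now takes half an hour
print(hours_saved_per_task(3.0, 0.5))  # 2.5 hours saved per task

# The headcount framing: AI saves 20% of five people's time
people, fraction_saved = 5, 0.20
print(people * fraction_saved)  # 1.0 full-time equivalent freed up
```

The arithmetic is the easy part; Rao's point is that the baseline (which human, at which skill level) is what makes the inputs to it unreliable.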
"Enterprises are not measuring that either," Rao says. "[T]here's a whole learning curve happening between the human and the AI, and independently the two. That might mean that you may not be able to maintain your ROI, because it may increase or decrease from the base point."

There's also a time element. For example, ChatGPT-4 was introduced in March 2023, and enterprises weren't ready for it, but in six months or less, businesses had started investing systematically to develop their AI strategies. Nevertheless, there's still more to do.

"[T]he crucial fact is that we are still in the very early days of this technology, and things are moving very quickly," says Beatriz Sanz Saiz, global consulting data and AI leader at business management consulting firm EY. "Enterprises should become adept at measuring value realization, risk and safety. CIOs need to rethink a whole set of metrics because they will need to deliver results. Many organizations have a need for a value realization office, so that for everything they do, they can establish metrics upfront to be measured against, whether that is cost savings, productivity, new revenue growth, market share, employee satisfaction [or] customer satisfaction."

The GenAI Impact

While many enterprises have had plenty of success with traditional AI, Kjell Carlsson, head of AI strategy at enterprise MLOps platform Domino Data Lab, estimates that 90% of GenAI initiatives are not delivering results that move the needle on a sustained basis, nor are they on track to do so.

"[M]ost of these organizations are not going after use cases that can deliver transformative impact, nor do they have the prerequisite AI engineering capabilities to deliver production-grade AI solutions," says Carlsson.
Many organizations are under the misconception that merely making private instances of LLMs and business apps with embedded GenAI capabilities available to business users and developers is an effective AI strategy. It is not. While there have been productivity gains from these efforts, in most cases they have been far more modest than expected and have plateaued quickly.

Though driving business value with GenAI has many similarities to doing so with traditional AI and machine learning, it requires expert teams that can design, develop, operationalize and govern AI applications that rely on complex AI pipelines. These pipelines combine data engineering, prompt engineering, vector stores, guardrails, upstream and downstream ML and GenAI models, and integrations with operational systems.

"Successful teams have evolved their existing data science and ML engineering capabilities into AI product and AI engineering capabilities that allow them to build, orchestrate and govern extremely successful AI solutions," says Carlsson.

Sound tech strategies identify a business problem and then select the technologies to solve it, but with GenAI, users have been experimenting before they define a problem to solve or an expected payoff.

"[W]e believe there is promise of transformation with AI, but the practical path is unclear. This shift has led to a lack of focus and measurable outcomes, and the derailment of plenty of AI efforts in the first wave of AI initiatives," says Brian Weiss, chief technology officer at hyperautomation and enterprise AI infrastructure company Hyperscience. "In 2025, we anticipate a more pragmatic or strategic approach where generative AI tools will be used to deliver value by attaching to existing solutions with clearly measurable outcomes, rather than simply generating content. [T]he success of AI initiatives hinges on a strategic approach, high-quality data, cross-functional collaboration and strong leadership."
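The kind of pipeline Carlsson describes (retrieval from a vector store, prompt engineering, a model call, guardrails) can be sketched end to end. This is a toy illustration under heavy assumptions: the keyword-overlap "vector store", the generate() stub, and every name below are hypothetical stand-ins, not any real product's API:

```python
# Toy sketch of the pipeline stages described above:
# retrieval -> prompt assembly -> model call -> guardrail.
# All components are hypothetical stand-ins for real services.

DOCS = {
    "refunds": "Refunds are issued within 14 days of a return.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

def retrieve(query: str) -> str:
    """Stand-in for a vector store: return the doc sharing the most words."""
    words = set(query.lower().split())
    return max(DOCS.values(), key=lambda d: len(words & set(d.lower().split())))

def build_prompt(query: str, context: str) -> str:
    """Prompt-engineering stage: ground the model in retrieved context."""
    return f"Answer using only this context:\n{context}\nQuestion: {query}"

def generate(prompt: str) -> str:
    """Hypothetical model call; a real pipeline would invoke an LLM here."""
    return prompt.splitlines()[1]  # echo the context line as a placeholder

def guardrail(answer: str) -> str:
    """Output check: block empty answers before they reach the user."""
    return answer if answer.strip() else "Escalate to a human agent."

def pipeline(query: str) -> str:
    return guardrail(generate(build_prompt(query, retrieve(query))))
```

Even in this caricature, the point survives: each stage is a separate engineering artifact to build, monitor and govern, which is why Carlsson argues that handing business users a private LLM instance is not, by itself, a strategy.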
Weiss adds that by addressing these areas, enterprises can significantly improve their chances of achieving meaningful ROI from their AI efforts.

Andreas Welsch, founder and chief AI strategist at boutique AI strategy firm Intelligence Briefing, says that early in the GenAI hype cycle, organizations were quick to experiment with the technology. Funding was made available and budgets were consolidated to explore what the technology could offer, but those efforts didn't need to deliver ROI. Times have changed.

"Organizations who have been stuck in the exploration phase without assessing the business value first are now caught off guard when the use case does not deliver a measurable return," says Welsch. "Set up a formal process and governance that assess the business value and measurable return of an AI product or project prior to starting. Secure stakeholder buy-in and establish a regular cadence to measure progress, ensure continued support or stop the project, [and] assess existing applications in your company. Which of those offers AI capabilities that you are not using yet? You don't need to build every app from scratch."

Many Potholes to Navigate

Jamie Smith, CIO at University of Phoenix, says the cost of AI is being reflected more frequently in SaaS contracts, whether the contracts specify it or not.

"We've seen this in the past six months, as the cost to compute using AI rises and rises, and is set to continue to do so as models grow more robust -- and therefore more power hungry. SaaS providers are looking at their utility bills and passing the cost on to businesses," says Smith. "As a result, SaaS contracts -- and partnerships more broadly -- are going to come under a lot more scrutiny. If these costs are rising, then partners' productivity needs to match."

Edward Smyshliaiev, chief technology officer at Hedgefun:D, says many organizations derail their AI ROI through a combination of overambition, under-preparation and a lack of alignment between AI teams and business leaders. AI isn't a magic wand; it's a tool.
"To wield it effectively, companies need to ensure data pipelines are clean and reliable and invest in training staff to interpret and act on AI outputs," says Smyshliaiev. "A shared vision between AI teams and leadership is critical -- everyone must know what success looks like and how to measure it."

Sean Bhardwaj, managing partner at strategic consulting firm Breakthrough Growth Partners, is a fractional chief AI officer and strategist. In that role, he has observed that two of the top reasons enterprises aren't realizing better ROI on their AI initiatives are a lack of foundational strategy and insufficient attention to the human side of AI adoption.

For example, one of his clients wanted to implement AI-driven customer recommendations, only to discover mid-project that its data infrastructure couldn't support them. Similarly, organizations often assume that teams will adopt AI enthusiastically, which isn't necessarily the case.

"Planning for adoption with training and incentives is essential to see real engagement and impact," says Bhardwaj. "I advise companies to see each stage as an investment in capability-building, with each phase laying the groundwork for the next."

All too often, organizations discard AI initiatives that don't meet initial expectations rather than rethinking their approach.

John Bodrozic, co-founder and CIO at homeowner lifecycle platform HomeZada, has observed that enterprises are relying solely on standalone AI to solve problems or find new growth opportunities, with efforts ultimately led by development teams rather than product management teams.

"There are so many areas where AI can impact bottom-line cost savings and top-line revenue growth, but only when these use-case scenarios are explored by cross-functional teams that combine software and AI development specialists with members of the functional team," says Bodrozic.
Without this direct interaction, he says, ROI from AI is challenging at best.

The Business View

A 2023 Gartner report found that only 54% of AI projects get past the proof-of-concept phase, and many of those fail to deliver the promised financial or operational impact. According to Ed Gaudet, CEO and founder of health care risk management solution provider Censinet, companies may believe that AI will make everything better, but they never specify what "better" means.

"Enterprises must take a phased, strategic approach [that requires] defining clear use cases that have actual business value, like automating drudgery, optimizing the supply chain, or leveraging chatbots to deliver a better customer experience," Gaudet says. "Secondly, organizations need to create structural capabilities like a good data governance framework, scalable infrastructure and strong developer and engineering skills. Companies that train their employees in AI have a 43% higher success rate deploying AI projects."

Nicolas Mougin, consulting and support director at global cloud platform Esker, cites rushed implementations as a reason for ROI shortfalls.

"The pressure to stay competitive in a rapidly evolving technological landscape drives many organizations to implement AI without sufficient planning. Instead of conducting thorough needs assessments or piloting solutions, businesses often rush to deploy tools in the hope of gaining an edge," says Mougin. "However, hastily executed projects overlook key considerations such as data readiness, scalability or user adoption."

Edward Starkie, director of GRC at global risk intelligence company Thomas Murray, believes most organizations are not in a suitable position to adopt AI and exploit it to its fullest extent.

"To be successful there is a level of maturity that is required, which [depends] upon having the necessary mechanisms supporting the design, creation and maintenance of the technology in a field which is short of genuine expertise," says Starkie.
"[E]specially at board level, a lack of education is a key contributing factor," he adds. "[Mandates] are being issued without understanding the importance of the core components being in place."

About the Author

Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites, including tech pubs, The Washington Post and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.