• I tested this new smart ring with no subscription, and it could replace my Oura
    www.zdnet.com
ZDNET's key takeaways:
- The RingConn Gen 2 is a subscription-free smart ring that retails for $300.
- It monitors your sleep, activity, stress, and vitals, and it has marathon battery life.
- The only downside is that the user interface feels underdeveloped.

Most smart rings these days claim to offer marathon battery lives but often fall short of their promise. Take the Oura Ring 4, for example, which claims an eight-day battery life but often only lasts me around four days, tops. I still love and willingly recharge that ring, but part of the appeal of smart rings, compared to smartwatches, is their longer battery life.

Also: The best smart rings of 2025: Expert tested and reviewed

If you want a smart ring with a truly competitive battery life, I've got one for you. I've been testing the RingConn Gen 2, a smart ring that boasts a marathon battery life of 10 to 12 days. It comes with other perks and a few drawbacks that I'll get to below.

Right off the bat, the RingConn Gen 2 has some green flags. Unlike competing smart ring brands whose products start at $350 and go all the way up to $400 or $450, this smart ring costs $300 and does not require a subscription to gain full access to your health data. Sizing runs from size 6 through size 14, and the ring comes in three colors: silver, black, and gold.

The build of the ring is more square than circular, but I found myself unbothered by this unique shape. It fits comfortably around my finger with no problems. Despite heavy and frequent wear, the ring doesn't tarnish easily either.

Also: Oura Ring's ovulation tracking beats the calendar method, according to this study

The RingConn Gen 2 comes with a case that extends its already impressively long battery life, recharging the ring from empty for more than 150 days.
As someone who is constantly charging several wearable devices at a time, this long-lasting charging case, which I could use without hooking the smart ring up to an outlet, made me partial to RingConn.

Most smart rings offer daily scores for a few important health metrics: sleep, activity, and readiness. Readiness is calculated from yesterday's activity, how you slept, and other biometric data, like how late your heart rate dropped as you slept.

The RingConn Gen 2 measures your vitals, sleep, activity, and stress but doesn't measure readiness. Instead, it provides a Wellness Balance feature. It takes all of the aforementioned data and displays it in a flower-like graph, with longer petals for the biometrics that are meeting or exceeding the recommended benchmarks and shorter petals for those that aren't. When all your petals are the same length, your wellness is at equilibrium. I liked that I could see all the important data displayed in such a digestible, visual manner right as I opened the app.

RingConn's Wellness Balance compiles your activity, sleep, vitals, and stress scores into a holistic illustration of your health. Screenshot by Nina Raemont/ZDNET

Hardcore trainers use the readiness or energy feature on their smart ring apps to gauge how intense their exercise regimen should be for the day. If that's you, you might be displeased with the Wellness Balance functionality, and I'd recommend the Oura Ring, Ultrahuman Ring, or Galaxy Ring instead.

The app delivers your scores alongside context that helps explain the reasoning behind your sleep or vitals score. I was ill one day while testing the ring and spent the entire day sleeping.
Because of the large amount of time I spent in bed, it told me that too much sleep can slow down my metabolism or lead to weight gain.

Also: The best fitness rings of 2025: Expert tested and reviewed

RingConn says that the battery on its second-generation ring lasts 10 to 12 days, but in my testing, I found that it only lasted seven. Still, that's far longer than the battery lives of other smart rings I've tried, which last four to five days on a single charge. I can say without a doubt that this smart ring has the most impressive battery life of every brand I've tried. Nina Raemont/ZDNET

According to its website, the RingConn Gen 2 also boasts a sleep apnea detection feature with 90.7% accuracy. If you're a chronic snorer looking to learn more about how your breathing affects you throughout the night, the sleep apnea feature could help monitor your condition and answer some of your questions. It flags significant and minor outliers in your sleep throughout the night, providing both a graph that details them and a timeline that shows when your SpO2 fluctuated during the night.

I wore the Oura Ring 4 in tandem with the RingConn Gen 2 and found that the latter seems to underestimate both the time spent asleep and the steps I've taken throughout the day.

Also: Oura Ring 3 vs Oura Ring 4: Should you buy the discounted smart ring or the brand's newest?

The RingConn and Oura scored my sleep efficiency at 87% and 88%, respectively. Oura said I got 11 hours and eight minutes of sleep, while RingConn said I spent 10 hours and 50 minutes asleep. RingConn reported 11,091 steps, while Oura reported 15,259 steps. I've seen in various Reddit threads that the Oura Ring tends to overestimate step count, which could account for the large disparity in steps between the rings.

On a healthy night of sleep, Oura recorded eight hours and two minutes, a sleep efficiency of 94%, and a sleep score of 90.
RingConn recorded an 84 sleep score, seven hours and 45 minutes asleep, and a sleep efficiency of 91%. In both cases, RingConn subtracted roughly 15 to 20 minutes from my night's sleep.

ZDNET's buying advice

I enjoyed most aspects of wearing this subscription-free smart ring, and at $300 ($50 less than competitors), it's a smart ring I'd recommend to those looking for an alternative to Oura's subscription-based services, especially if you want a smart ring with a battery life that will actually last you a week before recharging.

The one area where I noticed RingConn's smart ring lacking was its user interface. The app feels underdeveloped, and some of the messages lacked the personal context that would prove they were generated from my own data. On one good sleep score day, all the app said when I clicked into the sleep tab was: "Good sleep makes you happy." That's my only true gripe, and I hope the recommendations become more tailored and informative in future software updates.

Otherwise, the RingConn Gen 2 is an impressive smart ring with comprehensive health monitoring that sits on the cheaper end of the smart ring spectrum. It accurately tracks sleep, with features like sleep apnea monitoring that could help you uncover your snoring patterns; it's got a marathon battery life (plus a charging case with 150 days' worth of juice in it); and a build you can wear comfortably.
  • The First Playercount Numbers For The Avowed Launch Are In
    www.forbes.com
I remain somewhat confused about the pushback to reporting on playercount numbers, as they often paint very clear pictures of enormous hits (Helldivers 2, Marvel Rivals) or disastrous misses (Suicide Squad, Concord), even if console platform figures are unknown and we're just going on Steam data. Trends there indicate trends elsewhere, most of the time.

Admittedly, Avowed is more complicated. This is a game that, yes, is launching on Steam, but also on Xbox Game Pass with cross-purchase on Battle.net. So it stands to reason that Steam numbers would be lower than they would be otherwise. However, given that this is Xbox we're talking about, with fewer consoles sold than the PS5 (where Avowed hasn't launched), and only a portion of Game Pass's 34 million subscribers having access to day-one launches, it may not be quite as huge a disparity as some think.

What we can do is compare the Avowed launch to other, similar games that also launched on Game Pass at the same time, keeping in mind that Avowed is a smaller-scale game that's a new IP for most (in effect, even if it's based on Pillars of Eternity).

On launch day, yesterday, Tuesday, February 18, Avowed peaked at 17,171 players on Steam, while being the highest-selling game on the platform for a time. This is about 4,000 higher than the previous peak of 13,338, set during early access. The weekend is coming up, which should increase things further. Using the jump from a 9,000 launch to a 13,000 peak during early access over the weekend, we can maybe estimate it reaches 22-23,000 or so.

The most recent comparison would be Indiana Jones and the Great Circle, which peaked lower, at 12,138 concurrent players at launch, despite being a well-known IP with a much-liked studio behind it. Then, of course, there are much higher-profile games where Avowed is only putting up a fraction of those numbers: Starfield, with a 330,723 peak, and Halo Infinite, with a 272,586 peak. Forza Horizon 5 peaked at 81,096.
Way back in the day, Obsidian's far more comparable The Outer Worlds peaked at 20,349.

There are other RPGs to compare it to, albeit with no Game Pass launch. Dragon Age: The Veilguard, for example, released wide on PC, Xbox, and 75 million PS5s, and peaked at 89,418 on Steam. Recent surprise hardcore RPG hit Kingdom Come: Deliverance II maxed out at 256,206 players.

Is Avowed underperforming? Overperforming? The problem with Microsoft is that we just have no real idea what they consider a hit unless they start bragging about it. But even when they do, it doesn't say too much. They said Starfield had the most players at launch of any Bethesda game. But again, it was free with Game Pass.

I think Microsoft believes in Obsidian. I mean, they had them make both this game and The Outer Worlds 2, also out this year, and I think they recognize that it's one of their most solid studio purchases, able to make quality games with budgets not on the scale of some of their biggest, perhaps overly big, blockbusters.

Given that Avowed will essentially be the biggest pure Xbox exclusive of this year (I don't buy that Fable is coming out in 2025), I think this may be a bit of an underperformance in terms of players and also review scores, but not enough to be bad, and I think Obsidian and their future projects will be fine from here.
  • US investigates whether DeepSeek obtained Nvidia chips through Singapore to bypass restrictions
    www.techspot.com
The big picture: Singapore has found itself at the center of an investigation into the distribution of Nvidia's advanced semiconductors. The inquiry comes as Washington examines whether Chinese AI startup DeepSeek has been acquiring chips through the Southeast Asian nation, potentially circumventing US export controls.

Singapore's Second Minister for Trade and Industry, Tan See Leng, addressed the issue in a statement to lawmakers. According to Tan, while Nvidia reported that 22 percent of its sales in the August-October 2024 period were attributed to Singapore, this figure primarily reflects billing practices rather than physical product delivery.

Tan emphasized that the actual physical delivery of Nvidia products to Singapore represented less than one percent of Nvidia's overall revenue for the three-month period ending in October 2024. These deliveries were primarily for major enterprises and government use within Singapore.

The discrepancy between billing attribution and physical delivery is not unique to Nvidia or Singapore. Tan explained that it is common practice for global entities to centralize billing for procured goods and services in their hubs, separate from where products are shipped. This strategy allows multinational companies operating across borders to streamline their financial operations, often billing everything through their headquarters address while shipping items directly to where they're needed.

Nvidia has long acknowledged this practice in its financial reporting, stating that revenue by geographic area is based on the billing location of the customer, which may differ from the end customer and shipping location.

Singapore's position in this matter is particularly sensitive due to its close ties with both China and the United States. The country has become a hub for many Chinese tech companies, including ByteDance's TikTok, which has its headquarters in Singapore.
Meanwhile, Singapore considers the US a key strategic partner in trade and politics, with significant military cooperation. The ongoing US-China trade tensions and technology restrictions have put Singapore in a challenging position. The country is keen to maintain its reputation as a business-friendly hub while also complying with international regulations and export controls.

In response to the allegations surrounding DeepSeek's acquisition of Nvidia chips, Tan said that the Singapore government is cooperating fully with US authorities to investigate the matter. He emphasized that Singapore does not condone businesses using their Singaporean address to circumvent export controls set by other countries.

The investigation comes in the wake of DeepSeek's release of a chatbot called R1, which has demonstrated capabilities comparable to US-developed tools. This development has raised questions about China's progress in AI technology and whether that progress has relied on Western technology.
  • Nvidia GeForce RTX 5070 Ti Review
    www.techspot.com
Nvidia released the RTX 5090 and 5080 about a month ago, well, sort of. Some of you, maybe 10, were able to buy an RTX 5090, with about 80 managing to get a 5080. So, technically, they were released. Some of those RTX 5090s have already melted, meaning even fewer of you currently have one, but hey, more are surely on the way.

But putting that launch aside for a second, we now have the GeForce RTX 5070 Ti to consider. This new GPU is a cut-down version of the 5080, using the same GB203 die. The 5070 Ti is actually far more interesting because it still offers a 16GB VRAM buffer, but instead of costing $1,000 (an outrageous price for a 16GB graphics card in 2025), the MSRP is set at $750.

Now, we know Nvidia's MSRP figures are about as accurate as trying to play Fortnite after a few too many drinks, but the 5070 Ti pricing should be a bit more realistic since this product is likely to face competition from AMD. Of course, it might take a few weeks for prices to settle, but if the 5070 Ti can actually hit the $750 MSRP, it will almost certainly be a much better buy than the RTX 5080.

|                      | RTX 5070 Ti       | RTX 4070 Ti Super | RTX 5080           | RTX 4080 Super    | RTX 4080          |
| Price (MSRP)         | $750              | $800              | $1,000             | $1,000            | $1,200            |
| Release Date         | Feb 20, 2025      | Jan 24, 2024      | Jan 30, 2025       | Jan 31, 2024      | Nov 16, 2022      |
| Process              | TSMC 4N           | TSMC 4N           | TSMC 4N            | TSMC 4N           | TSMC 4N           |
| Die Size             | 378 mm²           | 378.6 mm²         | 378 mm²            | 379 mm²           | 379 mm²           |
| Core Config          | 8960 : 280 : 96   | 8448 : 264 : 112  | 10752 : 336 : 128  | 10240 : 320 : 112 | 9728 : 304 : 112  |
| L2 Cache             | 48 MB             | 48 MB             | 64 MB              | 64 MB             | 64 MB             |
| GPU Boost Clock      | 2452 MHz          | 2610 MHz          | 2617 MHz           | 2550 MHz          | 2505 MHz          |
| Memory Capacity      | 16 GB             | 16 GB             | 16 GB              | 16 GB             | 16 GB             |
| Memory Speed         | 28 Gbps           | 21 Gbps           | 30 Gbps            | 23 Gbps           | 22.4 Gbps         |
| Memory Type          | GDDR7             | GDDR6X            | GDDR7              | GDDR6X            | GDDR6X            |
| Bus Type / Bandwidth | 256-bit, 896 GB/s | 256-bit, 672 GB/s | 256-bit, 960 GB/s  | 256-bit, 736 GB/s | 256-bit, 717 GB/s |
| Total Board Power    | 300W              | 285W              | 360W               | 320W              | 320W              |

That's because it would cost at least 25% less while packing just 17% fewer cores, though they are clocked 6% lower.
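As a side note, the bandwidth figures in the spec table follow directly from bus width and per-pin data rate (GB/s = bus width in bits ÷ 8 × Gbps per pin). A quick sanity check, written for this article rather than taken from it:

```python
def peak_bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer times per-pin rate."""
    return bus_width_bits / 8 * speed_gbps

print(peak_bandwidth_gbs(256, 28))  # RTX 5070 Ti:       896.0 GB/s
print(peak_bandwidth_gbs(256, 30))  # RTX 5080:          960.0 GB/s
print(peak_bandwidth_gbs(256, 21))  # RTX 4070 Ti Super: 672.0 GB/s
# 896 / 960 ≈ 0.93, matching the roughly 7% bandwidth deficit versus the 5080
```

Since both GB203 cards keep the full 256-bit bus, the entire bandwidth gap comes from the slower 28Gbps memory.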
There is a 25% reduction in L2 cache, but since both models use a 256-bit memory bus, memory bandwidth has only been reduced by 7%, due to the use of 28Gbps GDDR7 memory instead of the 5080's 30Gbps.

In other words, the performance downgrade shouldn't be all that significant. And again, with both models featuring a 16GB VRAM buffer, the $750 price of the 5070 Ti seems far more reasonable. Essentially, the 5070 Ti replaces the 4070 Ti Super, coming in $50 cheaper while offering a slight performance upgrade. Of course, we have benchmarked both models, so let's get into the results.

The Cards

For testing, Nvidia sent over the MSI Ventus 3X version of the 5070 Ti, which is overclocked above spec. However, for all testing, we ran the card at reference clock speeds, as we do with all models. For those interested, here's how the Ventus 3X performs out of the box when installed inside an ATX case in a 21°C room. After an hour, we recorded a peak core temperature of 64°C and a memory temperature of 76°C, with a fan speed of 1,700 RPM. This allowed the card to maintain an average core clock frequency of 2,780 MHz while consuming 267 watts. Overall, it's a cool-running graphics card.

We also tested the Gigabyte Aero OC model, which, under the same conditions, peaked at just 59°C with a memory temperature of 64°C, quite a bit cooler than the Ventus 3X. However, the Aero OC is a much larger graphics card.
It also achieved an average core clock frequency of 2,820 MHz, a fan speed of 1,250 RPM, and a power draw of 280 watts.

Test System Specs

- CPU: AMD Ryzen 7 9800X3D
- Motherboard: MSI MPG X870E Carbon WiFi (BIOS 7E49v1A23, ReBAR enabled)
- Memory: G.Skill Trident Z5 RGB DDR5-6000 [CL30-38-38-96]
- Graphics Cards: GeForce RTX 4070, RTX 4070 Super, RTX 4070 Ti, RTX 4070 Ti Super, RTX 4080, RTX 4080 Super, RTX 4090, RTX 5070, RTX 5080, RTX 5090; Radeon RX 7700 XT, RX 7800 XT, RX 7900 GRE, RX 7900 XT, RX 7900 XTX
- ATX Case: MSI MEG Maestro 700L PZ
- Power Supply: MSI MPG A 1000G ATX 3.0 80 Plus Gold 1000W
- Storage: MSI Spatium 1TB M470 PCIe 4.0 NVMe M.2
- Operating System: Windows 11 24H2
- Display Drivers: Nvidia GeForce Game Ready 572.42, AMD Radeon Adrenalin 24.12.1

Gaming Benchmarks

Marvel Rivals

Starting with Marvel Rivals, we see that the 5070 Ti at 1440p performs similarly to the 4080 Super and, therefore, the original RTX 4080, making it just 12% slower than the 5080. It's also 14% faster than the 7900 XT and 17% faster than the 4070 Ti Super, providing a reasonable performance boost over the model it replaces. At 4K, it remains 12% slower than the 5080 and again delivers 4080 Super-like performance, making it around 20% faster than the 4070 Ti Super and 7900 XT.

Stalker 2: Heart of Chornobyl

Next, we have Stalker 2, where the 5070 Ti is much less impressive, something we also observed with the 5080. Here, the 5070 Ti was just a single frame faster than the 4070 Ti Super at 1440p, making it only 8% faster than the 7900 XT.
Interestingly, it was also just 11% slower than the 5080. At 4K, it is only 5% faster than the 4070 Ti Super and 11% faster than the 7900 XT, making these results fairly disappointing.

Counter-Strike 2

The new GeForce 50 series has been generally underwhelming in Counter-Strike 2 at 1440p, and the 5070 Ti is no exception, beating the 4070 Ti Super by a slim 1.5% margin while being 8% slower than the 7900 XT. We do see better results at 4K, but even here, the 5070 Ti is just 10% faster than the 4070 Ti Super and a mere 5% faster than the 7900 XT, so it's not particularly exciting.

God of War Ragnarök

In God of War Ragnarök, the 5070 Ti is 12% faster than the 4070 Ti Super at 1440p, as well as the 7900 XT. At 4K, the lead extends to 17%, which, while not quite the more substantial 20% margin, is still one of the better results we've seen so far.

Delta Force

For some reason, the GeForce 50 series has been underwhelming in Delta Force, and as a result, the 5070 Ti is actually 4% slower than the 4070 Ti Super and slightly slower than the 7900 XT. Increasing the resolution to 4K helps somewhat, but even then, the 5070 Ti is just 4% faster than the 4070 Ti Super, meaning performance with this new generation remains virtually unchanged in this game.

Star Wars Jedi: Survivor

The Star Wars Jedi: Survivor results are extremely unimpressive, with the 5070 Ti being just 9% faster than the 4070 Ti Super and 11% faster than the 7900 XT. As usual, the 4K data is slightly more favorable, but even then, the 5070 Ti is just 12% faster than the 4070 Ti Super.

A Plague Tale: Requiem

A Plague Tale: Requiem provides some of the more favorable data for the 5070 Ti.
At 1440p, it is 20% faster than the 4070 Ti Super, 24% faster than the 7900 XT, and just 11% slower than the RTX 5080. Oddly, however, performance slips slightly at 4K, with the 5070 Ti leading the 4070 Ti Super by 17% and the 7900 XT by 19%, while trailing the 5080 by a 13% margin.

Cyberpunk 2077: Phantom Liberty

Cyberpunk 2077 delivers another underwhelming 1440p result, with the 5070 Ti providing just a 9% uplift over the 4070 Ti Super, or 10% over the 7900 XT. That said, the 4K results are somewhat more impressive, though not outstanding, showing an 18% performance increase over the 4070 Ti Super and a 14% lead over the 7900 XT.

Dying Light 2 Stay Human

The Dying Light 2 results are more positive, showing a 19% improvement over the 4070 Ti Super at 1440p and making the 5070 Ti 17% faster than the 7900 XT. At 4K, the margin extends to 25% over the 4070 Ti Super and 27% over the 7900 XT, making it one of the stronger results we've seen.

Dragon Age: The Veilguard

The Dragon Age: The Veilguard results are less promising. Here, the 5070 Ti somehow trails the 4070 Ti Super by 3%, making it just 9% faster than the 7900 XT. That said, at 4K, it manages to edge out the 4080 Super, coming in 13% slower than the RTX 5080 but 11% faster than the 4070 Ti Super.

War Thunder

The 5070 Ti struggled to show significant improvement over the 4070 Ti Super in War Thunder at 1440p, delivering just 5% more performance, with an average of 378 FPS. That margin increased to 10% at 4K, where it delivered performance comparable to the 7900 XTX while trailing the RTX 5080 by just 10%.

Marvel's Spider-Man Remastered

Spider-Man Remastered showed almost no performance difference between the 5070 Ti and the 4070 Ti Super at 1440p, with the newer model being only 1% faster. Even at 4K, the 5070 Ti delivered just a 5% performance uplift, though that did make it 14% faster than the 7900 XT.

Hogwarts Legacy

In Hogwarts Legacy, the 5070 Ti was 12% faster than the 4070 Ti Super at 1440p, though it remained 15% slower than the 7900 XT. However,
at 4K, the 5070 Ti was able to match the 7900 XT and the RTX 4080, making it 14% faster than the 4070 Ti Super.

The Last of Us Part I

Performance in The Last of Us Part I was disappointing. The 5070 Ti was just 5% faster than the 4070 Ti Super at 1440p and 4% faster than the 7900 XT. At 4K, it managed to beat both the 4070 Ti Super and the 7900 XT by a 12% margin, consistent with other results we've seen.

Star Wars Outlaws

Testing Star Wars Outlaws at 1440p showed only a 5% gain for the 5070 Ti over the 4070 Ti Super, though that did make it 19% faster than the 7900 XT. Even at 4K, performance gains remained weak, with just a 6% uplift, averaging 36 FPS.

Starfield

Lastly, in Starfield, the 5070 Ti was just 9% faster than both the 4070 Ti Super and the 7900 XT at 1440p, averaging 85 FPS, far from impressive. At 4K, performance worsened relative to expectations, with the 5070 Ti just 4% faster than the 4070 Ti Super. It was, however, 10% faster than the 7900 XT, though even that result was disappointing.

Performance Summary

Although we didn't include 1080p performance data in the benchmarks above, we did run the tests and have the average results for reference. Let's take a closer look.

At 1440p, the 5070 Ti was just 7% faster than the 4070 Ti Super and 8% faster than the 7900 XT, while being 12% slower than the 5080. Relative to the 5080, this looks decent, but compared to the model it replaces, the results are underwhelming. The 1440p results mirrored the general trend seen across the benchmarks, with the 5070 Ti providing only modest gains over its predecessor.

As observed in many cases, the 4K data was slightly more favorable. The 5070 Ti was now 11% faster on average than the 4070 Ti Super, 14% faster than the 7900 XT, and 13% slower than the RTX 5080.
While it stacks up relatively well against the 5080, its performance remains weak compared to the models it is replacing.

Power Consumption

For power consumption measurements, we are using the MSI RTX 5070 Ti Ventus 3X OC model, which has been manually downclocked to match Nvidia's official specifications, as we test all GPUs at reference clock speeds. However, MSI may have adjusted settings such as voltages, which could affect efficiency. This likely explains why the 5070 Ti consumes as much power as the RTX 5080 in our testing. In short, the 5070 Ti uses roughly the same amount of power as the RTX 4080 and 4070 Ti Super, making it significantly more efficient than the Radeon RX 7900 XT.

Ray Tracing Performance

RT - Metro Exodus Enhanced

Now, let's take a look at ray tracing performance. We'll start with Metro Exodus Enhanced at 1440p, where the 5070 Ti is 13% faster than the 4070 Ti Super, indicating that RT performance hasn't seen a major improvement. At 4K, we see a 24% performance increase, though gains of over 20% were rare even in rasterization testing.

RT - Alan Wake II

The Alan Wake II results are disappointing. At 1440p, the 5070 Ti was only 2% faster than the 4070 Ti Super. At 4K, the margin increased to 11%, which is better, but the card only managed 31 FPS with upscaling enabled, making it difficult to consider this a playable experience.

RT - Cyberpunk 2077: Phantom Liberty

Cyberpunk 2077 delivered even more disappointing results. In fact, at 1440p, the 5070 Ti was slightly slower than the 4070 Ti Super, albeit by just one frame. Even at 4K, performance remained similar, with the 5070 Ti matching the 4070 Ti Super while being 11% slower than the original RTX 4080.
These are undeniably poor results for Cyberpunk 2077.

RT - Marvel's Spider-Man Remastered

At 1440p, the 5070 Ti is CPU-limited, allowing it to match the RTX 5080 but making it just 8% faster than the 4070 Ti Super. Increasing the resolution to 4K helps somewhat, with the 5070 Ti now 16% faster than the 4070 Ti Super. While not an outstanding result, it is at least an improvement over the 1440p performance.

RT - Dying Light 2 Stay Human

Testing Dying Light 2 at 1440p showed the 5070 Ti outperforming the 4070 Ti Super by 13%, while also being 13% slower than the RTX 5080. At 4K, the margins increased slightly, with the 5070 Ti now 17% faster than the 4070 Ti Super and 14% slower than the RTX 5080.

RT - Black Myth: Wukong

Finally, in Black Myth: Wukong, the 5070 Ti was just 6% faster than the 4070 Ti Super at 1440p, averaging 57 FPS even with quality DLSS upscaling enabled. At 4K, the experience is even worse, with the 5070 Ti delivering a mere 31 FPS with very high-quality ray tracing and quality DLSS upscaling. This is just 7% faster than the 29 FPS recorded on the RTX 4070 Ti Super, making for an equally unplayable experience.

Ray Tracing Performance Summary

Although we did not include individual 1080p results in the per-game breakdown, we do have the average data with quality upscaling enabled here. At 1440p, the 5070 Ti was only 6% faster than the 4070 Ti Super on average, which is an extremely underwhelming result. It was also 9% slower than the RTX 4080 and 14% slower than the RTX 5080. Unfortunately, the 4K results are not much better. The 5070 Ti was just 13% faster than the 4070 Ti Super and 9% slower than the 4080 Super, averaging only 51 FPS.

Overall, the 5070 Ti does little to advance ray tracing performance.
The technology still requires significant hardware investment to deliver a truly enjoyable experience.

Cost per Frame

MSRP

We have several pricing breakdowns to go through, starting with the official MSRP data, which, at least on Nvidia's side, has been highly unreliable in recent times. This is a major issue because if the RTX 5070 Ti were readily available at its $750 MSRP, it would be a solid option in today's market. It would offer the same level of value as the 7800 XT while providing significantly better ray tracing performance and DLSS support. Additionally, it would represent a 16% improvement in cost per frame compared to the 4070 Ti Super, not groundbreaking for a next-gen product, but still a reasonable step forward. If you were in the market for a graphics card in the $800 price range, the 5070 Ti would be an exciting option. However, that's only if it were actually available at $750, which seems highly unlikely.

Retail

But let's stick with Nvidia's $750 pricing narrative and compare it to the best GPU pricing available in mid-2024. As shown, the 5070 Ti holds up well, beating the value of the 7900 XT and essentially wiping that product out. It also offers 14% better value than the 4070 Ti Super, which is a decent improvement, certainly not groundbreaking after 12 months, but still a step in the right direction. And in the GPU market, progress has been hard to come by.

Real-World MSRP

While there will be listings for the RTX 5070 Ti at $750, stock is expected to be extremely limited. Most models will be priced well above MSRP, with typical asking prices closer to $900. This mirrors the situation with the RTX 5080, which has a supposed MSRP of $1,000 but actually sells for at least $1,200, and the 5090, which is marketed at $2,000 but is realistically closer to $2,500. Adjusting for these real-world prices ($1,200 for the RTX 5080 and $900 for the RTX 5070 Ti), the 5070 Ti offers the same value as the 4070 Ti Super.
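Cost per frame, as used throughout this section, is simply price divided by average frame rate, where lower is better. A minimal sketch with made-up round numbers (not the article's benchmark data) showing why paying roughly 12% more for roughly 11% more performance leaves value essentially flat:

```python
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame of average performance; lower is better."""
    return price_usd / avg_fps

# Hypothetical figures: an $800 last-gen card averaging 100 FPS versus
# a $900 newer card that is about 11% faster.
old_value = cost_per_frame(800, 100)   # 8.00 $/frame
new_value = cost_per_frame(900, 111)   # ~8.11 $/frame, i.e. no real gain
```

When the price rises by about as much as the frame rate does, the dollars-per-frame figure barely moves, which is exactly the "same value" scenario described above.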
In other words, there is no improvement in cost per frame, which is exactly what we've come to expect from Nvidia. That said, when compared to the 7900 XT, the 5070 Ti still looks like a strong option, though mostly because AMD botched the 7900 XT's launch pricing so badly.

Ultimately, what this data tells us is that if the RTX 5070 Ti sells for $900 or more, it's a complete flop. It would have been significantly better to buy an RTX 4070 Ti Super a year ago, or even an RTX 4070 Ti two years ago. Having an extra two years of use from your $800 purchase adds far more value than waiting for a card that only delivers 20% more performance at the same price.

Regional / Australian Retail

The situation in Australia (and, we can assume, many other places) is even worse. Based on current retail pricing, the 7900 XT is slightly more competitive, but not by much, certainly not enough to make it a recommended option, given that it only improves cost per frame by 6% compared to the 4070 Ti Super.

The RTX 5070 Ti is supposed to have a recommended retail price (RRP) of $1,509 AUD. While some base models may be listed at that price, stock is extremely limited, and many retailers won't even have those models available. Instead, the cheapest model most buyers will find will cost 8% more, at around $1,630 AUD, though in reality, most models will be priced even higher.

What this means is that in Australia, even the least expensive RTX 5070 Ti models come in at an 11% increase in cost per frame over existing 4070 Ti Super stock. In other words, 23% more money for 11% more performance, so that's crap.

Even at $1,509 AUD, the pricing is bad. A direct conversion from $750 USD equals around $1,180 AUD. After adding taxes, the price should be just under $1,300 AUD, essentially the current price of the 4070 Ti Super.
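That Australian pricing arithmetic can be checked quickly; the exchange rate below (roughly USD 1 = AUD 1.57 at the time) and the 10% GST figure are assumptions for illustration, not values quoted in the article:

```python
USD_TO_AUD = 1.57   # assumed exchange rate, not from the article
GST = 0.10          # Australian goods and services tax

msrp_usd = 750
direct_aud = msrp_usd * USD_TO_AUD   # ~1,178 AUD before tax
with_gst = direct_aud * (1 + GST)    # ~1,295 AUD, i.e. just under $1,300
gap_vs_rrp = 1509 - with_gst         # ~214 AUD of markup beyond tax
```

At the assumed rate, the $1,509 AUD RRP carries roughly $200 AUD that neither the conversion nor GST explains.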
Yet for some reason, another $200 AUD has been tacked on.

What We Learned

So there you have it: if sold at the $750 MSRP, the new GeForce RTX 5070 Ti would be a reasonable purchase, at least in the current market. Setting opinions aside for a moment, these are the facts: compared to the 4070 Ti Super at 4K, the 5070 Ti is, on average, 11% faster, with margins reaching up to 24% in our testing. However, gains of 20% or more were rare, as the 11% average suggests. Ray tracing performance follows a similar pattern, averaging 13% faster, with margins reaching up to 24%. In general, the 5070 Ti is typically less than 15% faster than the 4070 Ti Super.

For those still using an RTX 3070, for example, the 5070 Ti offers, on average, 103% greater performance at 1440p and 155% greater performance at 4K, a massive upgrade. However, the MSRP has increased by 50%, or 22% when adjusted for inflation. The biggest advantage is getting twice as much VRAM, which is particularly important given that the RTX 3070's 8GB buffer is frequently maxed out in modern games.

Of course, all of this is based on the $750 MSRP, which, at least at launch, is unlikely to be a reality and may not be for several months. It will be interesting to see if AMD can capitalize on this and effectively push the 5070 series out of relevance. However, given AMD's track record, that seems unlikely. Still, this is the best opportunity they've had to strike in years.

There's not much more to say about the 5070 Ti. At $750, it would be a decent deal, not amazing or particularly exciting, but still a good option.
However, if forced to pay over $800, it's no longer attractive, and a regretful purchase for those who didn't just snap up a 4070 Ti Super a year ago. For now, we'll have to wait and see where pricing settles, likely in a few months, and, of course, what AMD ultimately brings to the table in response.

Shopping Shortcuts:
Nvidia GeForce RTX 5070 Ti on Amazon
Nvidia GeForce RTX 5080 on Amazon
Nvidia GeForce RTX 5090 on Amazon
AMD Radeon RX 7900 XTX on Amazon
Nvidia GeForce RTX 4070 Super on Amazon
AMD Radeon RX 7800 XT on Amazon
AMD Radeon RX 7900 XT on Amazon
  • The Humane AI Pin was always destined to fail, here's why
    www.digitaltrends.com
    CES 2024 saw a wave of innovations in the AI space, and one of the most hyped was the Humane AI Pin, which launched to much fanfare thanks to the founders' previous work at Apple. Except, it quickly became clear that despite raising hundreds of millions of dollars, the company had a battle to persuade people that this was the future.

I saw the Humane AI Pin at CES last year, and at first, I was excited about a dedicated AI device. However, a high price tag, a required monthly subscription, and a lack of clear purpose made it a hard sell. Yet, the writing was on the wall last year, and today it's official: the company has announced that it is killing the AI Pin. As part of an acquisition of the Humane assets by HP for $116 million, far less than the $230 million the company raised, the AI Pin is being discontinued and all servers will stop working next week.

Yes, your very expensive standalone AI gadget just became a paperweight. Here's why the Humane AI Pin was destined to fail, and where I think the company completely missed the trick.

To understand the promise of the AI Pin, you need to first understand how the company positioned it. It was billed as a wearable computer that promised to free you from your smartphone, but at $699 upfront and a $24 per month subscription, it needed to offer a lot of features. All the initial reviews summarized it correctly: it just didn't work.

Imagine the smarts of your phone built into a small device without a screen, but with a small projector that displays the information in the palm of your hand. When you need to input text, dial a number to make a call, calculate a tip, or do anything else, just ask the assistant on the AI Pin.
These were lofty goals, and the company ultimately failed to even come close. Where things got even murkier was that if you bought the AI Pin, the mandatory subscription wasn't linked to your existing number in any way. There was also no way to transfer that subscription or sell the product to someone else, so you had to believe in the phone-less future, or at least believe the company's marketing that this product was the future. It turns out that few people want this future.

The answer is fairly obvious, and one that I saw coming a year ago. During my nearly 20-year career in technology, I spent many years working directly with customers, and while the Humane AI Pin seemed like a cool product, it tried to change one heavily ingrained user behavior: the use of a phone screen.

The company predicted 100,000 sales in year one but achieved just 10% of this, and many of these were likely reviewers who wanted to understand the hype. On many occasions last year, I was one of these, but once the first reviews came out, it was clear that this product was a bust.

Considering that the Rabbit R1 standalone AI device launched at just $199 just weeks after the Humane AI Pin, it's no surprise that the company has killed the product; it's just a surprise that it took this long. As we found in our Rabbit R1 review, this product also isn't the answer, but it works more as a companion for your phone than trying to replace it, which is the fundamental error that Humane made with its first product.

The reason that the AI Pin was interesting to me is fairly simple: we've all had numerous occasions where it's inconvenient to pull out your phone to calculate something, jot down a note, or pull up navigation directions. Coupled with bigger displays that also drain the battery life of your phone, there is a real problem that needs to be solved. Unfortunately, the Humane AI Pin is not the answer.
Rather than approach it like Rabbit and build a companion for your phone, which likely would not have enabled them to raise anywhere near as much money, the company decided that no one needs a phone. Yet, the problem still exists.

The answer seems fairly straightforward, and it's not one that most third-party companies can build. Instead, it's down to companies like Google, Samsung, and Apple to build more useful AI features into your phone. We're already seeing this with Google Gemini, Galaxy AI, and Apple Intelligence, but they still have a long way to go to solve the use cases for a product like the AI Pin.

Humane was created by two ex-Apple engineers, Imran Chaudhri and Bethany Bongiorno, who both have undoubted pedigree in the technology space. While this pedigree helped the company raise $230 million in funding from some of the biggest companies in technology, it didn't help them kickstart a visionary shift in the industry.

Apple has a pedigree for shifting consumer behaviors, with products like the iPod, iPhone, and Apple Watch inspiring or revolutionizing entire industries. This was down to more than just the product, with the company laser-focused on precision marketing that made its products desirable.

In its online presence, Humane execs like Bethany were very vocal about extolling the benefits of the Humane AI Pin, and there's no doubt that the company had some great feedback from actual users. However, these would have been outweighed by the sheer amount of negative feedback. As Bethany tweeted, the company had a big (and nearly impossible) goal, which was made even harder as its biggest competition, the phone makers themselves, kickstarted the current trend of AI in smartphones.

Ask any founder if raising $230 million in your first four rounds of funding is a worthy achievement and many will probably gladly accept it.
Then ask the same people if they think it's enough to compete in the hottest space against companies like Apple, Google, or Samsung, and the answer will be very different. Ultimately, the Humane AI Pin was destined to fail as it tried to battle the biggest companies in the world with a fraction of the budget. To its credit, Humane has rolled out many new features over the past year, but it did so slower than the very phones it was trying to replace.

Humane tried to change the world with its first product but lacked the foresight to fit in the market first. Had it done so, the company may have found that it had grossly overestimated the demand for this product, and maybe, just maybe, it would be different today. Instead, its assets are now owned by HP; what's the betting that Humane is the next Palm?
  • Google urges iPhone users to switch to standalone Gemini app
    www.digitaltrends.com
    Last fall, Google introduced a standalone Gemini app for iOS. At the same time, the AI assistant remained in the standalone Google app. That's now changing. As 9to5Google first noted, Google is informing iOS users that it will remove Gemini support from the Google app. In doing so, it wants you to rely solely on the Gemini app.

In an email to iOS users, Google says: "We're making some changes to create an even better Gemini experience on iOS. Gemini is now available as its own app, and that's now the best place to use Gemini. To continue using Gemini, download the new Gemini app from the App Store. With the Gemini app, you'll have access to all of the same features and more."

Removing Gemini from the Google app doesn't come as much of a surprise. Before the standalone Gemini app was released, iOS users could easily switch between Google Search and Gemini within the Google app. However, in recent months, some of the newer Gemini features have been exclusive to the Gemini app and are no longer accessible through the Google app. A clear example is Gemini Live, which was never available in the Google app.

The iOS Gemini app allows users to interact with Gemini via text, voice, images, and the camera, offering support in various areas. It can be utilized to learn about new subjects, compose thank-you notes, organize events, and more. Furthermore, it integrates with other Google apps such as Gmail, Maps, and YouTube, simplifying task completion. The Gemini app for iOS is intended to be a flexible tool that can assist with various tasks ranging from simple to complex.
  • The odds of a city-killer asteroid impact in 2032 keep rising. Should we be worried?
    arstechnica.com
    "Humanity has never tried to stop an asteroid impact for real."

Eric Berger, Feb 19, 2025 9:36 am

Illustration of NASA's DART spacecraft with images of the asteroids Dimorphos (left) and Didymos (right) obtained by DART. Rubble pile asteroids like Dimorphos are thought to be bound together with very weak forces and mostly gravity, making them easier to break apart than a single boulder. Credit: NASA/Johns Hopkins APL/Joshua Diaz

An asteroid discovered late last year is continuing to stir public interest as its odds of striking planet Earth less than eight years from now continue to increase. Two weeks ago, when Ars first wrote about the asteroid, designated 2024 YR4, NASA's Center for Near Earth Object Studies estimated a 1.9 percent chance of an impact with Earth in 2032. NASA's most recent estimate has the likelihood of a strike increasing to 3.2 percent. Now that's not particularly high, but it's also not zero.

Naturally the prospect of a large ball of rock tens of meters across striking the planet is a little worrisome. This is large enough to cause localized devastation near its impact site, likely on the order of the Tunguska event of 1908, which leveled some 500 square miles (1,287 square kilometers) of forest in remote Siberia.

To understand why the odds from NASA are changing, and whether we should be concerned about 2024 YR4, Ars connected with Robin George Andrews, author of the recently published book How to Kill an Asteroid.
Good timing with the publication date, eh?

Ars: Why are the impact odds increasing?

Robin George Andrews: The asteroid's orbit is not known to a great deal of precision right now, as we only have a limited number of telescopic observations of it. However, even as the rock zips farther away from Earth, certain telescopes are still managing to spy it and extend our knowledge of the asteroid's orbital arc around the Sun. The odds have fluctuated in both directions over the last few weeks, but overall, they have risen; that's because the amount of uncertainty astronomers have as to its true orbit has shrunk, but Earth has yet to completely fall out of that zone of uncertainty. As a proportion of the remaining uncertainty, Earth is taking up more space, so for now, its odds are rising.

Think of it like a beam of light coming out of the front of that asteroid. That beam of light shrinks as we get to know its orbit better, but if Earth is yet to fall out of that beam, it takes up proportionally more space. So, for a while, the asteroid's impact odds rise. It's very likely that, with sufficient observations, Earth will fall out of that shrinking beam of light eventually, and the impact odds will suddenly fall to zero. The alternative, of course, is that they'll rise close to 100 percent.

Ars: What are we learning about the asteroid's destructive potential?

Andrews: The damage it could cause would be localized to a roughly city-sized area, so if it hits the middle of the ocean or a vast desert, nothing would happen. But it could trash a city, or completely destroy much of one, with a direct hit.

The key factor here (if you had to pick one) is the asteroid's mass. Each time the asteroid gets twice as long (presuming it's roughly spherical), it brings with it 8 times more kinetic energy. So if the asteroid is on the smaller end of the estimated size range, 40 meters, then it will be as if a small nuclear bomb exploded in the sky.
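The size-to-energy relationship Andrews describes follows from impact energy scaling with mass, and mass with the cube of the diameter (for a fixed density and impact speed). A quick sketch of just that proportionality, with no assumed density or velocity values:

```python
def relative_energy(d1_m: float, d2_m: float) -> float:
    """Kinetic energy of a d2-sized impactor relative to a d1-sized one,
    assuming equal density and impact speed: energy scales with mass,
    and mass with diameter cubed."""
    return (d2_m / d1_m) ** 3

print(relative_energy(40, 80))  # 8.0: doubling the size means 8x the energy
print(relative_energy(40, 90))  # ~11.4: the "more than 10x" figure for a 90 m rock
```

This is why the remaining uncertainty in 2024 YR4's size (roughly 40 to 90 meters) matters so much: the top of that range carries over ten times the energy of the bottom.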
At that size, unless it's very iron-rich, it wouldn't survive its atmospheric plunge, so it would explode in mid-air. There would be modest-to-severe structural damage right below the blast, and minor to moderate structural damage over tens of miles. A 90-meter asteroid would, whether it makes it to the ground or not, be more than 10x more energetic; a large nuclear weapon blast, then. A large city would be severely damaged, and the area below the blast would be annihilated.

Ars: Do we have any idea where the asteroid might strike on Earth?

Andrews: The "risk corridor" is currently spread over parts of the eastern Pacific Ocean, northern South America, the Atlantic Ocean, parts of Africa, the Arabian Sea, and South Asia. Additional observations will ultimately narrow this down, if an impact remains possible.

Ars: What key observations are we still waiting for that might clarify the threat?

Andrews: Most telescopes will lose sight of this "small" asteroid in the coming weeks. But the James Webb Space Telescope will be able to track it until May. For the first time, it's been authorized for planetary defense purposes, largely because its infrared eye allows it to track the asteroid farther out than optical light telescopes. JWST will not only improve our understanding of its orbit, but also constrain its size. First observations should appear by the end of March.

JWST may rule out an impact in 2032. But there's a chance we may be stuck with a few-percent impact probability until 2028, when the asteroid makes its next Earth flyby. Bit awkward, if so.

Ars: NASA's DART mission successfully shifted an asteroid's orbit in 2022. Could this technology be used?

Andrews: Not necessarily. DART, a type of spacecraft called a kinetic impactor, was a great success. But it still only changed Dimorphos' orbit by a small amount. Ideally, you want many years of advance notice to deflect an asteroid with something like DART to ensure the asteroid has moved out of Earth's way.
I've often been told that at least 10 years prior to impact is best if you want to be sure to deflect a city-killer-size asteroid. That's not to say deflection is impossible; it just becomes trickier to pull off. You can't just hit it with a colossal spacecraft, because you may fragment it into several still-dangerously-sized pieces. Hit it too softly, and it will still hit Earth, but somewhere that wasn't originally going to be hit. You have to be super careful here.

Some rather clever scientists at the Lawrence Livermore National Laboratory (which has a superb planetary defense contingent) worked out that, for a 90-meter asteroid, you need 10 years to confidently deflect it with a kinetic impactor to prevent an Earth impact. So, to deflect 2024 YR4, if it's 90 meters long and we have just a few years of time, we'd probably need a bigger impactor spacecraft (but don't break it!), or we'd need several kinetic impactors to deflect it (but each has to work perfectly).

Eight years until impact is a little tight. It's not impossible that the choice would be made to use a nuclear weapon to deflect it; this could be very awkward geopolitically, but a nuke would impart a bigger deflection than an equivalent DART-like spacecraft. Or, maybe, they'd opt to try and vaporize the asteroid with something like a 1 megaton nuke, which LLNL says would work with an asteroid this size.

Ars: So it's kind of late in the game to be planning an impact mission?

Andrews: This isn't an ideal situation. And humanity has never tried to stop an asteroid impact for real. I imagine that if 2024 YR4 does become an agreed-upon emergency, the DART team (JHUAPL + NASA, mostly) would join forces with SpaceX (and other space agencies, particularly ESA, but probably others) to quickly build the right mass kinetic impactor (or impactors) and get ready for a deflection attempt close to 2028, when the asteroid makes its next Earth flyby. But yeah, eight years is not too much time. A deflection could work!
But it won't be as simple as just hitting the asteroid really hard in 2028.

Ars: How important is NASA to planetary defense?

Andrews: Planetary defense is an international security concern. But right now, NASA (and America, by extension) is the vanguard. Its planetary defenders are the watchers on the wall, the people most responsible for not just finding these potentially hazardous asteroids before they find us, but also those most capable of developing and deploying tech to prevent any impacts. America is the only nation with (for now!) a well-funded near-Earth object hunting program, and is the only nation to have tested out a planetary defense technique. It's a movie cliché that America is the only nation capable of saving the world from cosmic threats. But, for the time being, even with amazing planetary defense mission contributions from ESA and JAXA, that cliché remains absolutely true.

Eric Berger, Senior Space Editor. Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.
  • How to Make AI Projects Greener, Without the Greenwashing
    www.informationweek.com
    Samuel Greengard, Contributing Reporter | February 19, 2025 | 5 Min Read | Image: Tithi Luadthong via Alamy Stock

For businesses across every industry, artificial intelligence is rapidly reshuffling the deck. The technology opens the door to deeper insights, advanced automation, operational efficiencies, and cost savings. Yet, AI also delivers some baggage. The power-hungry nature of the technology, which impacts everything from data centers to training and using generative AI models, raises critical questions about sustainability. AI could double data center electricity demand in the US by 2030.

As a result, business and IT leaders can easily find themselves caught in the crossfire between AI's benefits and risks. As organizations pursue carbon targets and other sustainability issues, a lack of clarity about the technology -- and perceptions of inconsistencies -- can evoke charges of greenwashing.

Sustainable AI touches everything from the direct energy requirements that power artificial intelligence models to supply chain, reporting, hardware, and data center operations. It can also raise questions about when and where organizations should use AI -- and when they shouldn't. "Sustainable AI is about using AI in ways that minimize environmental impact while promoting sustainability throughout its lifecycle," says Sammy Lakshmanan, a partner at PwC Sustainability. "The goal isn't to just reduce AI's footprint. It's to make AI both effective and sustainable."

Beyond the AI Hype

A growing challenge for CIOs and other tech leaders is to fully understand the impact of AI, including GPUs that devour energy at about a 10x rate over other chips. While no company wants to miss the opportunities that AI can deliver, it's also important to recognize that the technology comes with costs.
"There's a temptation for organizations to get caught up in an AI arms race without looking at the returns," states Autumn Stanish, a director and analyst at Gartner.

A haphazard or inconsistent approach to AI can contribute to the perception that a company is engaging in greenwashing. Many of the common uses of AI link directly to climate change, says David Rolnick, an assistant professor in the School of Computer Science at McGill University. Framing specific AI initiatives as net positives or negatives isn't the right approach, he argues. It's vital to gain a more holistic understanding of how AI impacts sustainability.

Greenwashing problems often revolve around two key issues, Rolnick says. First, companies that use carbon offsets must recognize that they aren't reducing emissions produced by AI systems. Second, sloppy reporting creates more questions than answers. While quantifying the carbon generated from AI is difficult -- especially Scope 3 emissions -- a lack of transparency increases the odds that a company will find itself in the crosshairs of activists and the media.

But there's also the fundamental question of how an organization uses AI, Rolnick says. It's important to put AI to work strategically. There are many places where it can improve efficiency -- particularly when it comes to automating processes and optimizing systems -- but there also are many instances where it doesn't provide any significant advantages. This includes tossing generative AI at every problem. "In many cases, humans make better decisions," he states.

As companies pursue carbon reduction targets, it's important to identify where AI delivers specific strategic advantages -- and how it impacts sustainability in both positive and negative ways.
"Sustainable AI does not happen by accident -- it involves proper governance and engineering to create systems that are efficient and beneficial for productivity and innovation," Lakshmanan explains.

Cracking the Code

Tying an AI strategy into broader sustainability initiatives helps build an energy framework based in renewables -- including wind, solar, and emerging sources of nuclear energy, such as small modular reactors (SMRs). While this approach doesn't directly lower the energy demand for AI, it can significantly curtail carbon output.

The challenge lies in verifying that energy labeled as sustainable or carbon free is genuinely renewable, Lakshmanan points out. As a result, he recommends that organizations adopt transparency tools such as renewable energy certificates (RECs) and power purchase agreements (PPAs) that help track the lifecycle impacts of renewable infrastructure.

There are also practical steps organizations can take to align AI with sustainability initiatives. This includes improvements in data center efficiency, such as better hardware and understanding when CPUs are a better option than GPUs. It also involves responsible data practices such as optimizing AI algorithms and models through pruning and sampling, and with transfer learning, which can significantly decrease computational demands by recycling pre-trained models. Transfer learning involves using a model trained for one task to improve results for a related task.

Training and inferencing models in a horizontal or cross-cutting manner can alleviate the need to repeat processes across departments and groups, Lakshmanan points out. For example, summarizing documents is a repeatable process whether it relates to sustainability or tax documents. "There's no need to train the system twice for the same capability," he explains.

The end goal, Lakshmanan says, is to adopt a holistic approach that spins a tight orbit around both innovation and the greater use of renewables.
For instance, if an organization uses carbon offsets, he recommends pairing the program with a meaningful decarbonization strategy. This ensures offsets complement broader sustainability targets rather than replacing them. It makes AI projects both innovative and environmentally responsible.

Beyond the Algorithm

Avoiding greenwashing accusations also requires sound carbon accounting practices that can measure and track AI emissions. A growing array of consulting firms and private companies offer tools to track AI emissions and optimize energy usage based on real-time grid conditions. Measurement, combined with deeper analysis of AI and data center energy consumption, can boost efficiency in other ways. "There are ways to use AI to analyze and improve power consumption, including putting AI on the edge," says Gillian Crossen, Risk Advisory Principal and Global Technology Leader at Deloitte. Not everything has to go through the data center. AI can also right-size models and produce other insights and gains that offset its power requirements.

Finally, it's important to avoid over-marketing claims or publishing data that presents an unrealistically positive picture to the public and investors, says Thomas P. Lyon, Dow Professor of Sustainable Science, Technology and Commerce at the University of Michigan's Ross School of Business. An organization must be able to fully substantiate its claims about AI and sustainability, typically through metrics and third-party verification.

With transparency across key segments, including customers, investors, partners, and employees, the risks of greenwashing subside. Organizations should step back and think about how they can use AI effectively, Rolnick says. There are legitimate and productive use cases, but there's also a lot of energy waste associated with AI.
Without a detailed assessment and a clear understanding of the various factors, the risks increase.

About the Author: Samuel Greengard writes about business, technology, and cybersecurity for numerous magazines and websites. He is author of the books "The Internet of Things" and "Virtual Reality" (MIT Press).
  • When did life begin on Earth? New evidence reveals a shocking story
    www.newscientist.com
    Image: Ingo Oeland/Alamy

Earth is some 4.5 billion years old. When it formed from colliding rocks around a dim, young sun, it was presumably lifeless, and geologists long thought that life didn't emerge for a billion years or more. This idea came from analysis of moon rocks brought back from the Apollo landings, which indicated Earth was pummelled by space rocks between 4 billion and 3.8 billion years ago, an event called the Late Heavy Bombardment. The implication was that the origin of life as we know it must have begun after that, since any earlier organisms would have been blitzed.

"There's two issues with that," says Philip Donoghue at the University of Bristol, UK. First, models suggest that some life could have survived deep in the oceans. More damningly, it now seems that the Late Heavy Bombardment didn't actually happen. The Apollo missions only created the impression of a huge bombardment over a brief period because they all collected rocks of a similar age.

We now know that, early in Earth's history, large impacts occurred sporadically over hundreds of millions of years. However, we also know that a body the size of Mars collided with Earth just after it was formed, vaporising the planet's surface. If life originated before then, it would have been wiped out, says Donoghue.

Earth's oldest rocks

Life began when inert matter self-organised into living systems, but, despite decades of research, how that happened remains a mystery. Figuring out when it happened is also a big challenge because the fossil record gets worse the further back
  • Electrodes made from bread could replace metal conductors
    www.newscientist.com
    David Bujdos holds a bread-based electrode. Credit: Liz Palmer

Old pieces of bread can be transformed into precisely shaped electrodes using water and heat. The bread-based components could replace metal electrodes in devices while reducing the hundreds of tonnes of bread wasted daily. "Bread has all sorts of stuff in it, such as starch, protein and water," says David Bujdos at the University of Pennsylvania. "You can just heat it at a really high temperature without oxygen and you get the carbon backbone out of that."