TechSpot
Tech Enthusiasts - Power Users - IT Professionals - Gamers
1 person likes this
553 Posts
2 Photos
0 Videos
0 Reviews
Recent Updates
  • Intel proposes new modular standards for laptops and mini PCs to improve repairability
    www.techspot.com
    Forward-looking: Outdated laptops often necessitate complete replacements since they typically only allow users to upgrade memory or storage. Intel aims to address this issue by urging manufacturers to embrace new modular design standards. As with Framework and MNT PCs, these standards would allow users to upgrade motherboards and other components without purchasing a new laptop.
    Intel recently outlined its ideas for allowing users to upgrade and replace individual laptop components. If widely adopted, the new standards could reduce costs and e-waste.
    The company's proposal describes standardized measurements for future laptop motherboards and I/O modules. Following a clear set of parameters might speed up the design process for new components. Additionally, users could upgrade and replace boards, USB-C ports, Thunderbolt ports, and other parts without completely replacing the laptop. An affordable tier for mainstream laptops would accommodate 14- and 16-inch devices, enabling single- or dual-fan upgrades to improve cooling.
    Intel also envisions strengthening modularity and standardization for mini PCs. The company's diagram depicts a 5L chassis with slide rails to facilitate easy swapping of the CPU, memory, GPU, and storage. The I/O ports and other parts would also enable easy repairs.
    Framework and MNT have provided modular laptops for years, but their initiatives remain niche. Framework's products allow customers to swap mainboards, ports, screens, keyboards, and many other components. The company also aims to provide upgradeable dedicated graphics for its 16-inch laptop, but the AMD Radeon RX 7700S is the only available option so far. Meanwhile, MNT recently unveiled the successor to its Arm-based Reform laptop. Owners of the prior model can upgrade to a new SoC, swap other components, and 3D print a new chassis.
    The primary downside of Framework and MNT devices is that upgrades require users to remain within each company's hardware ecosystem. In contrast, Intel's proposal could create an environment resembling the desktop DIY PC market, allowing users to choose between parts from numerous vendors. However, whether the industry would agree on a single modular laptop standard remains uncertain.
    Wide adoption wouldn't just lower upgrade costs and reduce e-waste; it would also hand right-to-repair advocates a major victory. The increasingly popular movement aims to resist manufacturers' efforts to block repairs and maintenance by users or third-party hardware vendors. Right-to-repair advocacy usually centers on machines like smartphones and tractors, but laptop users have also begun complaining about recent models that no longer allow RAM and SSD upgrades.
    0 Comments · 0 Shares · 48 Views
  • AMD confirms microcode vulnerability revealed in beta BIOS update
    www.techspot.com
    What just happened? AMD has confirmed a security vulnerability in some of its processors, which was inadvertently revealed through a beta BIOS update from Asus. The flaw, described as a "microcode signature verification vulnerability," came to light before AMD could officially disclose it, sparking concerns in the cybersecurity community.
    The vulnerability was first noticed by Tavis Ormandy, a security researcher at Google's Project Zero. Ormandy spotted a reference to the flaw in the release notes of an Asus beta BIOS update for one of its gaming motherboards. "It looks like an OEM leaked the patch for a major upcoming CPU vulnerability," Ormandy wrote in a public mailing list post.
    AMD has since acknowledged the issue. The company has not yet specified which of its products are affected but has indicated that mitigations are being developed and deployed.
    The vulnerability appears to be related to the microcode and seems to circumvent the process that ensures only official, AMD-signed microcode can be loaded into the processor. Exploiting it requires not only local administrator access to the targeted system but also the capability to develop and execute malicious microcode, according to AMD. This high bar for exploitation suggests that while the vulnerability is serious, it's not something that could be easily weaponized by casual attackers.
    While the full extent of the vulnerability's impact is not yet known, security experts have begun speculating about its potential consequences. Demi Marie Obenour, a software developer for Invisible Things, suggested that if an attacker could load arbitrary microcode, they might be able to compromise critical security features such as System Management Mode (SMM), Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP), and Dynamic Root of Trust for Measurement (DRTM).
    The discovery of a microcode signature verification vulnerability is not an isolated incident. Over the years, AMD has faced several security challenges across its product lines.
    In March 2018, researchers from CTS Labs uncovered a series of vulnerabilities affecting AMD's Ryzen and Epyc processors. These flaws, collectively known as RYZENFALL, MASTERKEY, CHIMERA, and FALLOUT, posed security risks to both consumer and enterprise-grade processors. Exploiting them required administrative access, according to AMD.
    In August 2024, a more widespread vulnerability named "Sinkclose" was disclosed. This flaw in System Management Mode potentially exposed hundreds of millions of devices to security risks. In this case, exploitation required kernel-level access, making it a threat primarily to "seriously breached systems," AMD said at the time.
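The crux of the flaw is the gate that is supposed to accept only AMD-signed microcode. A minimal, hypothetical sketch of what such a gate looks like conceptually, using an HMAC as a stand-in (the real mechanism is a public-key signature scheme whose details AMD has not published):

```python
import hashlib
import hmac

# Hypothetical stand-in key: real CPUs verify a vendor public-key
# signature, not an HMAC, and AMD's actual scheme is not public.
VENDOR_KEY = b"vendor-secret-key"

def sign_microcode(blob: bytes) -> bytes:
    """What the vendor does: produce a signature over the microcode blob."""
    return hmac.new(VENDOR_KEY, blob, hashlib.sha256).digest()

def load_microcode(blob: bytes, signature: bytes) -> bool:
    """Model of the loader's gate: accept only correctly signed blobs."""
    expected = hmac.new(VENDOR_KEY, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"official microcode update"
assert load_microcode(official, sign_microcode(official))        # accepted
assert not load_microcode(b"malicious blob", sign_microcode(official))  # rejected
```

The reported vulnerability amounts to a way to make that final check pass for attacker-supplied microcode, which is why arbitrary-microcode loading would undermine features built on top of the CPU's trust chain.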
    0 Comments · 0 Shares · 48 Views
  • Subaru vulnerability exposed millions of cars to remote hacking and tracking
    www.techspot.com
    A hot potato: Security researchers have uncovered alarming vulnerabilities in Subaru's Starlink system, potentially exposing millions of vehicles to unauthorized access and extensive location tracking. While Subaru has said that it doesn't sell location data, the potential for misuse is a significant concern.
    The discovery began when Sam Curry, having purchased a 2023 Impreza for his mother, decided to examine its internet-connected features during a Thanksgiving visit. Curry and fellow researcher Shubham Shah found they could hijack control of various vehicle functions, including unlocking doors, honking the horn, and starting the ignition.
    However, what Curry found most disturbing was the ability to access detailed location history. "You can retrieve at least a year's worth of location history for the car, where it's pinged precisely, sometimes multiple times a day," Curry told Wired. He added, "Whether somebody's cheating on their wife or getting an abortion or part of some political group, there are a million scenarios where you could weaponize this against someone."
    The researchers began by identifying a weakness in the password reset functionality on the SubaruCS.com site, an administrative portal intended for Subaru employees. By simply guessing an employee's email address, they could initiate a password reset process, exposing a critical flaw in the system's design.
    Further investigation revealed that while the site did ask for answers to two security questions during the reset process, these were verified using client-side code running in the user's browser rather than on Subaru's servers. This oversight allowed the researchers to easily bypass the security questions, highlighting a significant lapse in the company's cybersecurity measures. "There were really multiple systemic failures that led to this," Shah told Wired.
    Curry and Shah then used LinkedIn to locate the email address of a Subaru Starlink developer, exploiting the vulnerabilities to take over this employee's account, which granted them access to sensitive information and controls. The compromised account allowed the pair to look up any Subaru owner using various personal identifiers such as last name, zip code, email address, phone number, or license plate.
    Moreover, they discovered that they could access and modify Starlink configurations for any vehicle, as well as reassign control of Starlink features. This included the ability to remotely unlock cars, honk horns, start ignitions, and locate vehicles. Most alarmingly, Curry and Shah gained access to detailed location histories of vehicles, with data going back at least a year.
    Subaru quickly patched the security flaws after the researchers reported their findings in late November. However, the incident raises broader concerns about privacy and data security in the automotive industry. The researchers warn that similar vulnerabilities likely exist in other automakers' systems.
    A Subaru spokesperson confirmed to Wired that certain employees can access location data, stating that it's necessary for purposes such as sharing vehicle location with first responders in case of collisions. "All these individuals receive proper training and are required to sign appropriate privacy, security, and NDA agreements as needed," the company said. It also said it doesn't sell location data.
    The discovery is part of a larger trend of security vulnerabilities in connected vehicles. Curry and other researchers have previously identified similar issues affecting multiple car manufacturers, including Acura, Genesis, Honda, Hyundai, Infiniti, Kia, and Toyota. This incident underscores the growing privacy concerns surrounding modern vehicles: a recent report by the Mozilla Foundation highlighted that 92 percent of car manufacturers give owners little to no control over collected data, and 84 percent reserve the right to sell or share this information.
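The password-reset weakness described above boils down to where the security answers were checked. A hypothetical sketch of the flaw class (function names and data are illustrative, not Subaru's actual API):

```python
# Broken design: the server trusts a pass/fail flag computed by
# client-side JavaScript. The attacker controls the client, so they
# can simply send "passed" without knowing any answers.
def reset_password_client_checked(answers_ok_flag: bool) -> bool:
    return answers_ok_flag  # attacker just submits True

# Correct design: the server compares submitted answers against its
# own stored records and never trusts client-side verdicts.
def reset_password_server_checked(submitted: dict, stored: dict) -> bool:
    return all(submitted.get(q) == a for q, a in stored.items())

stored = {"first pet": "rex", "birth city": "osaka"}
assert reset_password_client_checked(True)            # bypass succeeds
assert not reset_password_server_checked({}, stored)  # bypass fails
assert reset_password_server_checked(stored, stored)  # real answers pass
```

The general rule the researchers' finding illustrates: any check performed only in the browser is advisory at best, because every request can be forged.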
    0 Comments · 0 Shares · 57 Views
  • Nvidia prepares to move Maxwell, Pascal, and Volta GPUs to legacy driver status
    www.techspot.com
    In context: Nvidia is preparing to transition its Maxwell, Pascal, and Volta GPU architectures to a legacy driver branch, signaling the end of an era for these iconic products. The transition reflects the company's focus on supporting more recent hardware capabilities, particularly in areas such as AI and ray tracing.
    Nvidia's CUDA 12.8 release notes indicate that support for the older architectures is now considered "feature-complete" and will be frozen in an upcoming release. This move marks a significant shift for Nvidia as it begins to phase out support for the remaining GTX-era architectures. While CUDA support will continue for Maxwell, Pascal, and Volta GPUs, they will no longer receive new features in future updates. It's important to note that this change does not immediately affect GeForce gaming driver support, as Maxwell and Pascal GPUs are still included in the support list for the GeForce RTX series driver.
    Nvidia has not provided a specific date for the end of full support for these three GPU architectures, but the transition is expected to occur soon. Once the change takes effect, the GTX 16 series, based on the same Turing architecture as the RTX 20 series, will be the only remaining GTX GPUs with full support.
    The Maxwell architecture, introduced 11 years ago, is the oldest of the outgoing GPU architectures still supported by Nvidia on the consumer side. It debuted with the GeForce GTX 750 series and was followed by the GTX 900 series. Maxwell brought significant performance-per-watt improvements over its predecessor, Kepler, and was particularly notable for its efficiency in mobile GPUs.
    Pascal, introduced in 2016 with the GeForce GTX 10 series, marked one of Nvidia's most significant architectural advancements of the 2010s. It used TSMC's 16nm FinFET Plus process, doubling the density of Maxwell's 28nm node and delivering substantial performance gains. The GTX 1080, for instance, offered 60-65 percent higher performance than its predecessor, the GTX 980.
    Volta, released in 2017, was primarily focused on AI applications and enterprise use. It introduced Tensor cores, specialized units designed for AI workloads, which provided nine times the performance of Pascal in AI-specific tasks. Volta was largely confined to the enterprise sector, with the Titan V being the only desktop GPU to feature the architecture.
    For Linux users, most distributions will continue to support legacy versions of the Nvidia driver, ensuring that affected cards will remain functional for the foreseeable future. However, users should be aware that they will not receive new features or optimizations moving forward.
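For CUDA users wondering whether a given card falls into the soon-to-be-frozen branch, the standard compute-capability mapping (Maxwell 5.x, Pascal 6.x, Volta 7.0/7.2, Turing 7.5) can be checked with a short sketch. The helper below is illustrative, not an Nvidia API:

```python
# Map a CUDA compute capability to its architecture generation, using
# the well-known sm_XX assignments (Maxwell sm_5x, Pascal sm_6x,
# Volta sm_70/72, Turing sm_75).
def architecture(cc_major: int, cc_minor: int) -> str:
    if cc_major == 5:
        return "Maxwell"
    if cc_major == 6:
        return "Pascal"
    if cc_major == 7:
        return "Volta" if cc_minor < 5 else "Turing"
    return "newer"

def feature_frozen(cc_major: int, cc_minor: int) -> bool:
    """True for the architectures CUDA 12.8 marks feature-complete."""
    return architecture(cc_major, cc_minor) in {"Maxwell", "Pascal", "Volta"}

assert feature_frozen(6, 1)      # e.g. a GTX 1080 (Pascal, sm_61)
assert not feature_frozen(7, 5)  # Turing (GTX 16 / RTX 20) keeps full support
```

In practice the compute capability comes from `nvidia-smi` or a CUDA device query; the point of the sketch is just which generations the freeze covers.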
    0 Comments · 0 Shares · 40 Views
  • Asus Q-Release Slim motherboards can reportedly damage GPU connectors
    www.techspot.com
    PSA: Asus and other PC hardware vendors have spent the last few years exploring ways to make installing and removing motherboard components easier and safer. Asus' latest method aims to eliminate release buttons and latches from PCIe slots, but recent reports indicate that the procedure risks chipping the gold fingers.
    Owners of Asus' AMD X800 and Intel Z800 motherboards should exercise caution when removing their graphics cards. Multiple reports show that the company's recently introduced Q-Release Slim system can scratch the GPU's connection pins.
    A recently uploaded Bilibili video showed how the mechanism can chip the end of a graphics card's short gold finger, the one facing the long gold finger. Hardware Luxx's Andreas Schilling later shared a photo of his GeForce RTX 5090 exhibiting similar damage. Schilling reported that the pins frequently became caught in the PCIe slot during benchmarks involving repeated GPU removal and swapping.
    Asus introduced Q-Release Slim to simplify the process of detaching graphics cards and other PCIe add-in boards. Traditional mechanisms require pressing a latch at the end of the slot to release the gold finger, which often proves difficult because large GPUs can completely obscure the latch. The company began using an alternative method in 2021 involving a release button placed in a more accessible position. MSI demonstrated a similar system at Gamescom last year, suggesting that the trend is catching on.
    However, Asus' Q-Release Slim mechanism aims to simplify things further. With Q-Release Slim, users simply hold the motherboard down with one hand, slightly tilt the card upward, and remove it by pulling toward the I/O bracket. The design prevents the gold fingers from shifting in any other direction.
    Asus is working to address the recent reports of chipped connectors but hasn't yet provided details on a remedy. It remains unclear whether the company might revert to an earlier system or try something new with future motherboards.
    The PCIe slot isn't the only port motherboard manufacturers are trying to improve. MSI also introduced a Q-Release system for the M.2 slot on Z800 boards. Installing and removing NVMe SSDs normally involves screws that can be frustrating due to their small size. With MSI's new system, pressing a lock button behind the drive secures the SSD without the need for screws. Asus uses a similar system with a release latch and a sliding gate that supports 2280 and 2230 drives. Unlike Asus' GPU slot method, SSD Q-Release mechanisms haven't encountered widespread problems thus far.
    0 Comments · 0 Shares · 43 Views
  • Seagate is now sampling 36TB hard drives based on HAMR technology
    www.techspot.com
    TL;DR: Seagate was the first manufacturer to bring heat-assisted magnetic recording (HAMR) hard drives to market and continues to innovate in magnetic storage technology. The company is expanding the capacity of its next-generation storage units, though I/O performance may be a concern for some customers.
    Seagate started shipping hard drives with HAMR tech in December 2024, turning a long-awaited technological advancement into a commercial reality. Now, the storage specialist is announcing that even more advanced HAMR drives, with capacities of up to 36 terabytes, are on the way.
    The 36TB HAMR drives are being shipped to a select group of customers for testing and validation. Like the earlier HAMR units, these new Exos M drives are built on the Mozaic 3+ technology platform to deliver "unprecedented" areal density. The drives use a complex 10-platter design, storing 3.6TB per platter.
    According to Seagate CEO Dave Mosley, the company has already reached over 6TB per disk in its test environments. The goal, he says, is to further increase the data density to 10TB per platter. Seagate also states that Mozaic 3+ is a highly efficient storage platform, enabling the new Exos M drives to lower the total cost of ownership and reduce energy consumption.
    HAMR drives have been specifically engineered to meet the needs of data centers, offering 300 percent more storage capacity within the same physical footprint. Seagate also estimates a 25 percent reduction in cost per terabyte and a 60 percent reduction in power consumption per terabyte. Dell Technologies, an early adopter of the Mozaic 3+ platform, plans to integrate the Exos M 32TB drives into its "high-density" storage products in the near future.
    While touted as the cutting edge of magnetic storage technology, the new drives come with a notable caveat that Seagate does not mention in its press release. According to the product page for HAMR drives, the highest-capacity models (32TB and 36TB) rely on shingled magnetic recording (SMR) to achieve their impressive areal density and overall storage capacity.
    SMR drives use overlapping data tracks, which can negatively impact write operations. However, this limitation may not be a significant issue for data centers and startups training AI language models, as their primary focus is on storing vast amounts of data rather than frequent or intensive write operations.
    As IDC researcher Kuba Stolarski noted, hard disk drives remain a "critical" technology for AI applications and other enterprise-level storage demands. Stolarski highlighted that a significant majority (89 percent) of data stored by leading cloud services is still archived on hard disk drives.
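The write penalty of shingling can be shown with a toy model: because tracks in an SMR zone overlap like roof shingles, rewriting one track forces rewriting every track laid down after it in that zone. The zone size below is illustrative, not a Seagate specification:

```python
# Toy model of SMR write amplification: updating track i in a
# shingled zone requires rewriting tracks i..end of the zone,
# because each track partially overlaps the next one.
def tracks_rewritten(zone_size: int, target_track: int) -> int:
    """Number of tracks physically rewritten to update one logical track."""
    if not 0 <= target_track < zone_size:
        raise ValueError("target track must lie inside the zone")
    return zone_size - target_track  # target plus all overlapped successors

assert tracks_rewritten(zone_size=100, target_track=99) == 1    # last track: cheap
assert tracks_rewritten(zone_size=100, target_track=0) == 100   # first track: whole zone
```

This is why sequential, append-heavy workloads (bulk archival, training-data storage) tolerate SMR well, while random-rewrite workloads suffer.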
    0 Comments · 0 Shares · 44 Views
  • EA shares dip nearly 20% after Dragon Age, FC 25 disappoint during the holidays
    www.techspot.com
    In brief: Shares of Electronic Arts are down nearly 20 percent after the company warned that earnings for the fiscal third quarter would be lower than initially forecast, due in large part to two franchises that underperformed during the holidays. As of writing, shares of EA are trading at $115.80; ahead of the announcement, the stock was changing hands at $142.35.
    EA said its global football franchise, which had experienced two consecutive fiscal years of double-digit net bookings growth, hit a rough patch in the fiscal third quarter. Dragon Age also underperformed during the quarter; its 1.5 million players were nearly 50 percent fewer than EA had anticipated.
    As a result, EA revised its preliminary third-quarter net bookings. The company now expects approximately $2.215 billion, down from the previous forecast of between $2.4 billion and $2.55 billion. Net revenue, meanwhile, should check in around $1.883 billion; previously, EA had forecast net revenue in the range of $1.875 billion to $2.025 billion.
    EA CEO Andrew Wilson said that despite the hiccups, the company remains confident in its long-term strategy and expects a full return to growth in FY26. The executive added that this month, teams conducted a comprehensive gameplay refresh in addition to their annual Team of the Year update in FC 25, and that early feedback has been encouraging. CFO Stuart Canfield concurred, adding that looking to FY26, the company expects growth with the launch of more iconic franchises.
    Electronic Arts had been at the forefront of soccer games with its popular FIFA franchise, which dates back to 1993. A licensing dispute with FIFA in 2022, however, brought an end to the series under that name. The company still makes soccer games, but they're now under the EA Sports FC brand. The most recent, EA Sports FC 25, launched on September 27, 2024, for most major platforms.
    EA is scheduled to release its next earnings report on February 4 and will host a conference call to discuss the results at 5 pm Eastern.
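The size of the bookings revision is easy to check against the figures in the story, measured against the midpoint of the prior guidance range (a quick back-of-the-envelope calculation, not an EA-published metric):

```python
# Figures from the story, in billions of dollars.
prior_low, prior_high = 2.400, 2.550   # prior net bookings guidance range
revised = 2.215                        # new preliminary expectation

midpoint = (prior_low + prior_high) / 2            # 2.475
shortfall_pct = (midpoint - revised) / midpoint * 100

assert abs(midpoint - 2.475) < 1e-9
assert 10 < shortfall_pct < 11  # roughly a 10.5% miss vs. the midpoint
```

So the revised $2.215 billion sits about 10.5 percent below the middle of the old range, and nearly 8 percent below even its low end, which helps explain the sharp stock reaction.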
    0 Comments · 0 Shares · 25 Views
  • Sony's next-gen XM6 headphones might drop this summer
    www.techspot.com
    Highly anticipated: Sony might have a fresh pair of flagship noise-canceling cans dropping soon. A few recent FCC filings have led to speculation that the company is preparing to unleash its next-gen WH-1000XM6 headphones sometime this year.
    The FCC filings give us a sneak peek at what could be Sony's next-gen XM6 headset. Drawings suggest a slightly tweaked physical design compared with the current XM5 model.
    One of the most intriguing changes is detachable earpads. The filing shows the earpads can lift off, with the headphone's nameplate and serial numbers tucked underneath. Sony hid that info along the inner pad edge on the XM5. It's unclear if the earpads use magnets, clips, or some other attachment system to stay put. Screws seem unlikely, given the FCC requires the nameplate to be visible to users. Whatever the method, removable earpads could allow for easier repairs or swapping in different pad materials down the line.
    The headphones also appear to sport a tweaked hinge. Hopefully, this translates to a foldable design. For context, the XM5 only folds flat, not allowing for more compact storage. Users considered this a major downside compared to the XM4, so this is Sony's chance to reverse a previously unfavorable design decision.
    The FCC filing indicates Sony is sticking with Bluetooth 5.3 and LE Audio support. Some minor antenna changes are evident, which could improve wireless performance and range. The Walkman Blog notes that the driver size remains unchanged at 30mm.
    Sony lists the tested unit as a prototype rather than production-ready. Most of Sony's previous FCC filings were for either pre-production or finished models, so take that for what it's worth.
    Nailing down an official unveiling is a guessing game, but the FCC filing requests a short confidentiality period ending July 22, 2025, so we will likely get an official XM6 reveal before then. The most recent rollouts of Sony's XM series have followed a biennial schedule: the XM3 landed in August 2018, the XM4 in August 2020, and the XM5 in May 2022. That cadence means the series is already overdue for a refresh, so a spring or summer 2025 launch for the XM6 seems probable.
    0 Comments · 0 Shares · 22 Views
  • New AMD and Intel graphics drivers optimize Final Fantasy VII Rebirth
    www.techspot.com
    Designed to provide you with a clean, modern, and easy-to-use interface where you can quickly access the latest software features, game stats, performance reports, driver updates, and much more, all from one convenient location. Take advantage of the ALT+R hotkey to open AMD Software: Adrenalin Edition directly in-game, making it even more convenient to adjust your favorite features for a fantastic gaming experience.
    What's New:
    Gaming Highlights
    - Intel Game On Driver support on Intel Arc B-series and A-series graphics GPUs, and Intel Core Ultra with built-in Intel Arc GPUs, for Final Fantasy VII Rebirth.
    Performance Improvements
    - Intel Arc B-series graphics GPUs (vs. the Intel 31.0.101.6257 driver), Final Fantasy VII Rebirth (DX12): up to 12.8% average FPS uplift at 1080p with High settings; up to 11.6% average FPS uplift at 1440p with High settings.
    - Intel Core Ultra Series 2 with built-in Intel Arc GPUs (vs. the Intel 31.0.101.6458 driver), Final Fantasy VII Rebirth (DX12): up to 9.0% average FPS uplift at 1080p with Medium settings.
    Known Issues
    Intel Arc B-Series graphics products:
    - F1 24 (DX12) with XeSS FG: application crashes may occur when dynamically changing XeSS FG settings during gameplay; toggle settings in the game menu before starting a race. The "Alt + Enter" shortcut may change the display mode to fullscreen exclusive, causing crashes.
    - MLPerf: may show intermittent errors on multi-GPU systems; disabling the integrated GPU is recommended.
    - Topaz Labs Photo AI: corruption may occur with certain image enhancement operations.
    - Magix Vegas Pro: corruption may occur when using the style transfer feature.
    - Dassault Systèmes CATIA: application may crash when using the HQAO option.
    - CyberLink PowerDirector: may experience tearing and lag in the preview window and exported video.
    - Adobe Lightroom Classic: lower-than-expected performance. Workaround: under Edit > Preferences > Performance, set Graphics Processor to "Custom" and enable "Use GPU for Display," "Use GPU for Image Processing," and "Use GPU for Export."
    Intel Arc A-Series graphics products:
    - Call of Duty: Black Ops 6 (DX12): shadows may appear darker than expected in certain campaign scenarios.
    - Topaz Gigapixel AI: intermittent crashes may occur during image export.
    - Adobe After Effects: flickering may occur in the preview window during playback.
    - CyberLink PowerDirector: tearing and lag in the preview window and exported video.
    Intel Core Ultra Series 1 with built-in Intel Arc GPUs:
    - PugetBench for DaVinci Resolve Studio V19: intermittent errors with the Extended benchmark preset.
    - DaVinci Resolve Studio v19.0: errors may occur during OpenVINO test scenario rendering.
    - Topaz Video AI: corruption may occur during video export after applying some enhancement models.
    - Adobe After Effects: flickering in the preview window during playback.
    - CyberLink PowerDirector: tearing and lag in the preview window and exported video.
    Intel Core Ultra Series 2 with built-in Intel Arc GPUs:
    - Adobe Premiere Pro: output video may show corruption after performing an 8K AV1 encode.
    - PugetBench for DaVinci Resolve Studio V19: intermittent errors with the Extended benchmark preset.
    - DaVinci Resolve Studio v19.0: errors while rendering OpenVINO test scenarios; intermittent errors during AI/ML scenario rendering.
    - Topaz Video AI: corruption during video export after applying enhancement models.
    - Horizon Zero Dawn Remastered (DX12): crash may occur during gameplay with frame generation enabled.
    - Adobe After Effects: flickering in the preview window during playback.
    - CyberLink PowerDirector: tearing and lag in the preview window and exported video on certain Intel Core Ultra Series 2 mobile processors.
    Intel Graphics Software known issues:
    - Application crashes when navigating pages or dialogue boxes due to framework issues; a fix will be available in IGS with WinAppSDK 1.7.
    - Intermittent crashes when resetting settings via "Reset All Settings" in Windows 10; individual page resets are unaffected.
    - A single application crash may occur on the first re-arrange of metrics in the select-metrics window; subsequent usage is unaffected.
    - FPS Limiter may not function with VSync on and Low Latency Mode enabled.
    - Profile controls may not always supersede global controls.
    - Non-native panel resolution scaling may result in unintended behavior.
    - Incorrect reporting of Xe Cores for some Intel Arc B-Series graphics products.
    Intel Graphics Software Performance Tuning (BETA):
    - Performance and features may behave unexpectedly in multi-GPU scenarios; the performance tuning page may attempt to apply changes to multiple devices simultaneously instead of per the GPU selector.
    Previous Release Notes (AMD Software: Adrenalin Edition):
    Highlights - support for:
    - Atomic Heart
    - Company of Heroes 3
    Fixed Issues:
    - Corruption may be briefly observed when moving Netflix video between displays or from minimized to fullscreen on some AMD products such as the Ryzen 7 6800U.
    - Maximum encode bitrate is limited to 100Mbps for certain applications.
    - AMD Bug Report Tool pop-up or system hang may be observed after a driver upgrade on some hybrid graphics notebooks.
    - Application crash may be observed while playing Hitman 3 with ray tracing settings enabled.
    - Valve Index VR headset may show a blank screen at the 144Hz refresh rate setting on Radeon RX 7000 series GPUs.
    - Certain videos played with Movies & TV may briefly show corruption when moving the window between displays on some AMD graphics products such as the Radeon RX 6700 XT.
    - Situational performance drop may be observed in DirectX 11 based games on Radeon RX 6000 series GPUs with Ryzen processors.
    Known Issues:
    - High idle power has situationally been observed when using select high-resolution, high-refresh-rate displays on Radeon RX 7000 series GPUs.
    - Video stuttering or performance drop may be observed during gameplay plus video playback with some extended display configurations on Radeon RX 7000 series GPUs.
    - Application crash may be observed while opening Premium Gold Packs in EA Sports FIFA 23.
    - Some virtual reality games or apps may experience lower-than-expected performance on Radeon RX 7000 series GPUs.
    - Brief display corruption may occur when switching between video and game windows on some AMD graphics products such as the Radeon RX 6700 XT.
    - Metrics overlay may intermittently resize to 50% after gameplay.
    - Corruption may be observed in Returnal in certain scenes with ray tracing enabled on Radeon RX 6000 series GPUs.
    Earlier release notes - support for:
    - Forspoken: up to 7% increase in performance at 4K using AMD Software: Adrenalin Edition 23.2.1 on the Radeon RX 6950 XT versus the previous driver version 22.11.2. RS-524
    - Dead Space
    - IREE compiler using the MLIR interface on Vulkan.
    - Additional Vulkan extensions.
    - Radeon RX 6000 series GPUs now support newly introduced streaming capabilities, including a pre-filter toggle, a pre-analysis feature, and CAML technology.
    - A new version of AMD Link improves overall connectivity across all supported Radeon products (RX 400 series and newer) so you can game from anywhere on virtually any device.
    - This newly unified driver delivers performance gains since Windows 11 first launched.
    Game Optimizations (all at 4K on the Radeon RX 6950 XT, AMD Software: Adrenalin Edition 23.2.1 versus the previous driver version 22.11.2, unless noted):
    - Marvel's Spider-Man Remastered: up to 4% RS-518
    - Sniper Elite 5: up to 3% RS-519
    - Shadow of the Tomb Raider: up to 6% RS-520
    - Quake II RTX: up to 7% RS-521
    - Hitman 3: up to 4% RS-522
    - Marvel's Guardians of the Galaxy: up to 6% RS-523
    - F1 2022: up to 19% RS-525
    - DOOM Eternal: up to 9% RS-526
    - Borderlands 3: up to 4% RS-527
    - Hogwarts Legacy: up to 4% on the Radeon RX 6950 XT and Radeon RX 7900 XTX (versus 22.11.2 for the 6950 XT and 23.1.2 for the 7900 XTX) RS-530
    Fixed Issues:
    - AMD Software: Adrenalin Edition may fail to launch with the error message "Delayed Write Failed" on Windows 11 version 22H2.
    - Poor performance and load times may be observed while playing SpaceEngine.
    - Corruption may be observed while scrolling the Points Shop in Steam on Radeon RX 6000 series GPUs.
    - Performance drop may be observed during Fortnite and YouTube playback with Enhanced Sync enabled on some AMD graphics products such as the Radeon RX 5700 XT.
    - Corruption or game crash may be observed while playing Door Kickers 2.
    - Missing or flickering textures may be observed while playing Emergency 4.
    - Application crash may be observed when launching Baldur's Gate 3 using the Vulkan API on Radeon RX 7000 series GPUs.
    - Stuttering may be observed while playing Sea of Thieves on Radeon RX 6000 and above series GPUs.
    - Corruption may be observed while playing Battlefield 4 with Post Process Quality set to high or ultra on Radeon RX 6000 and above series GPUs.
    - White foliage may be observed while playing Hogwarts Legacy on Radeon RX 7000 series GPUs.
    - Intermittent system stuttering or UI flickering may occur when two videos play simultaneously in Chromium-based browsers on some multi-display configurations.
    Known Issues:
    - High idle power has situationally been observed when using select high-resolution, high-refresh-rate displays on Radeon RX 7000 series GPUs.
    - Video stuttering or performance drop may be observed during gameplay plus video playback with some extended display configurations on Radeon RX 7000 series GPUs.
    - Application crash may be observed while opening Premium Gold Packs in EA Sports FIFA 23.
    - Some virtual reality games or apps may experience lower-than-expected performance on Radeon RX 7000 series GPUs.
    - AMD Bug Report Tool pop-up or system hang may be observed after a driver upgrade on some hybrid graphics notebooks; the factory reset install option is recommended as a workaround.
    - Corruption may be briefly observed when moving Netflix video between displays or from minimized to fullscreen on some AMD products such as the Ryzen 7 6800U.
    - Certain videos played with Movies & TV may briefly show corruption when moving the window between displays on some AMD graphics products such as the Radeon RX 6700 XT.
    - Brief display corruption may occur when switching between video and game windows on some AMD graphics products such as the Radeon RX 6700 XT.
    - Maximum encode bitrate is limited to 100Mbps for certain applications.
    Compatibility: AMD Software: Adrenalin Edition 23.2.1 is compatible with the following AMD Radeon products:
    - Radeon RX 7900 series graphics
    - Radeon RX 6900/6800/6700/6600/6500/6400 series graphics
    - Radeon RX 5700/5600/5500/5300 series graphics
    - Radeon VII
    - Radeon RX Vega series graphics
    - AMD Radeon Pro Duo
    - Radeon RX 500 / Radeon 500X series graphics
    - Radeon RX 400 series graphics
    Important Notes:
    - AMD Link users running Radeon RX 7000 series GPUs will need to update to a newer version of AMD Link, now available on various platforms.
    - AMD is working with the developers of Hogwarts Legacy to resolve performance issues when enabling ray tracing.
    Package Contents:
    - AMD Software: Adrenalin Edition 23.2.1, driver version 22.40.01.45 for Windows 10 and Windows 11 (Windows Driver Store version 31.0.14001.45012).
    Previous Release Notes - support for:
    - AMD Ryzen 7000 Series Processors
    - Grounded
    Known Issues:
    - Radeon Super Resolution may fail to trigger after changing resolution or HDR settings in games such as Nioh 2.
    - The Oculus dashboard menu and rendered controllers may appear bouncing/wobbly on Oculus Quest 2 with some AMD graphics products such as the Radeon RX 6800 XT.
    - GPU utilization may be stuck at 100% in Radeon performance metrics after closing games on some AMD graphics products such as the Radeon RX 570.
    - While previewing the timeline in VEGAS Pro, some colors may appear inverted.
    - Display may briefly show corruption when
switching between video and game windows on some AMD Graphics Products such as the Radeon RX 6700 XT.When Vertical Refresh Sync is set globally to Always Off, system stuttering or driver timeout may occur during video playback using Radeon RX 6000 series GPUs.Users may encounter dropped frames during video playback using hardware acceleration in browsers on Radeon RX 6000 series GPUs.AMD Noise SuppressionOur newest feature: AMD Noise Suppression reduces background audio noise from your surrounding environment using a real-time deep learning algorithm, providing greater clarity and improved concentration whether you are focused on an important meeting or staying locked-in on a competitive game. To learn more, check out our blog post here!.OpenGL OptimizationsUp to 79% increase in performance in Minecraft @ 4k Fabulous settings, using Radeon Software Adrenalin 22.7.1 on the Radeon RX 6950XT, versus the previous software driver version 22.6.1 RS-491Up to 75% increase in performance in Minecraft @ 4k Fabulous settings, using Radeon Software Adrenalin 22.7.1 on the Radeon RX 6400, versus the previous software driver version 22.6.1 RS-495Radeon Super ResolutionExpanded support for discrete Radeon RX 5000 and 6000 series GPUs on AMD Ryzen processor notebooks with hybrid graphics.RSR has been improved to provide a more seamless experience in borderless fullscreen mode with a performance/quality slider to personalize your gaming experience.Important NotesAMD Software Capture and Stream features and Overlay support for Clone mode and Eyefinity display configurations will be introduced at a later date.OpenGL applications that are 10-bit aware are no longer supported with HDR display capabilities. 
Enabling 10-Bit Pixel Format in advanced graphics settings is only recommended for use of 10-bit aware OpenGL applications and not required for enabling 10-Bit Color Display Capabilities.AMD is working with the game developers of Call of Duty: Warzone to resolve an issue where users may be experiencing stuttering on the Caldera map.Package ContentsThe AMD Software: Adrenalin Edition 22.9.1 installation package contains the following:AMD Software: Adrenalin Edition 22.9.1 Driver Version 22.20.19.11 for Windows 10 and Windows 11 (Windows Driver Store Version 31.0.12019.11001).Radeon Product CompatibilityNote: AMD Radeon R9 Fury, Radeon Pro Duo, Radeon RX 400 Series, Radeon RX 500 Series and Radeon RX Vega Series graphics are only supported by Radeon Software Adrenalin Edition on Windows 7/10 64-bit.Radeon Desktop Product Family Compatibility AMD Radeon RX 6900 Series GraphicsAMD Radeon RX 6800 Series GraphicsAMD Radeon RX 5700 Series GraphicsAMD Radeon R9 200 Series GraphicsAMD Radeon RX 5500 Series GraphicsAMD Radeon RX 500AMD Radeon 500X Series GraphicsAMD Radeon VIIAMD Radeon Pro DuoAMD Radeon RX Vega Series GraphicsAMD Radeon R7 200 Series GraphicsAMD Radeon RX 400 Series GraphicsAMD Radeon R5 300 Series GraphicsAMD Radeon R7 300 Series GraphicsAMD Radeon R5 200 Series GraphicsAMD Radeon R9 Fury Series GraphicsAMD Radeon HD 8500 - HD 8900 Series GraphicsAMD Radeon R9 Nano Series GraphicsAMD Radeon HD 7700 - HD 7900 Series GraphicsAMD Radeon R9 300 Series GraphicsAMD Radeon RX Vega Series GraphicsAMD Radeon RX 500 Series GraphicsAMD Radeon RX 400 Series GraphicsAMD Radeon Pro DuoAMD Radeon R7 300 Series GraphicsAMD Radeon R7 200 Series GraphicsAMD Radeon R9 Fury Series GraphicsAMD Radeon R5 300 Series GraphicsAMD Radeon R9 Nano Series GraphicsAMD Radeon R5 200 Series GraphicsAMD Radeon R9 300 Series GraphicsAMD Radeon HD 8500 - HD 8900 Series GraphicsAMD Radeon R9 200 Series GraphicsAMD Radeon HD 7700 - HD 7900 Series GraphicsMobility Radeon Family 
CompatibilityAMD Radeon R9 M300 Series GraphicsAMD Radeon R7 M200 Series GraphicsAMD Radeon R7 M300 Series GraphicsAMD Radeon R5 M200 Series GraphicsAMD Radeon R5 M300 Series GraphicsAMD Radeon HD 8500M - HD 8900M Series GraphicsAMD Radeon R9 M200 Series GraphicsAMD Radeon HD 7700M - HD 7900M Series GraphicsAMD APU Product Family CompatibilityAMD APU products codenamed "Kaveri", "Godavari" and "Carrizo" are only supported by AMD Radeon Software Crimson Edition on Windows 7 (32 & 64-bit), Windows 8.1 (64-bit) and Windows 10 (64-bit). AMD's 7th Generation APU products Radeon Graphics are only supported by AMD Radeon Software Crimson Edition on Windows 7 (32 & 64-bit) and Windows 10 (64-bit).AMD APU Product Family CompatibilityAMD A-Series APUs with Radeon R4, R5, R6, or R7 GraphicsAMD A-Series APUs with Radeon R3, R4, R5, R6, R7, or R8 GraphicsAMD Pro A-Series APUs with Radeon R5 or R7 GraphicsAMD Pro A-Series APUs with Radeon R5, R6, or R7 GraphicsAMD Athlon Series APUs with Radeon R3 GraphicsAMD FX-Series APUs with Radeon R7 GraphicsAMD Sempron Series APUs with Radeon R3 GraphicsAMD E-Series APUs with Radeon R2 GraphicsAMD Radeon HD 8180 - HD 8400 Series GraphicsCompatible Operating Systems:Radeon Software Crimson ReLive Edition is designed to support the following Microsoft Windows platforms. Operating System support may vary depending on your specific AMD Radeon product.Windows 10 (32 & 64-bit version)Windows 8.1 (32 & 64-bit version)Windows 7 (32 & 64-bit version with SP1 or higher)Notes:This driver is not intended for use on AMD Radeon products running in Apple Boot Camp platforms. Users of these platforms should contact their system manufacturer for driver support. When installing Radeon Software Crimson ReLive Editionfor the Windows operating system, the user must be logged on as Administrator, or have Administrator rights to complete the installation of Radeon Software Crimson ReLive Edition. 
- Radeon Software Crimson ReLive Edition requires Windows 7 Service Pack 1 to be installed.
- AMD terminated support for Windows 8 32-bit. We have linked version 17.1.2, which was the last version for Windows 8 32-bit.

Previous versions:
    0 Comments ·0 Shares ·38 Views
  • Asia's richest man Mukesh Ambani plots world's largest AI data center in India
    www.techspot.com
Forward-looking: Indian business magnate Mukesh Ambani is gearing up to take on global tech giants like Google, Microsoft, and OpenAI. The billionaire has announced plans to build what could potentially become the world's largest AI data center, strategically located in his home state of Gujarat.

The data center is being constructed in Jamnagar and is projected to have a staggering total capacity of three gigawatts upon completion. For perspective, the largest operational data centers today peak at under one gigawatt. This means Ambani's facility would dwarf anything currently in existence.

Building such a massive data center is no small feat. Cost estimates place the project in the range of $20-30 billion, based on pricing for similar large-scale facilities. Bloomberg notes that Reliance Industries, Ambani's flagship company, has approximately $26 billion on its balance sheet. To finance this ambitious undertaking, Ambani may need to explore creative solutions, including potential government funding.

Indian billionaire Mukesh Ambani and his wife Nita Ambani, pictured alongside their iconic 27-story residence, Antilia, a symbol of architectural marvel and unparalleled luxury in Mumbai.

Adding to the excitement, the report highlights that Ambani has already begun purchasing chips from Nvidia to power the data center's infrastructure.

If Ambani can pull it off, he aims to dominate the AI market by offering low-cost inferencing services. This strategy involves renting out compute power to companies looking to run AI models without shouldering the expense of building and maintaining their own data centers.

Ambani is no stranger to aggressive, market-disrupting tactics. His infamous price war in the telecom sector drove several Indian carriers, including Aircel and Telenor, out of business.
Now, it seems he's applying a similar playbook to the burgeoning AI compute market.

What makes this project even more intriguing is Ambani's commitment to sustainability. The massive data center will be powered, as much as possible, by renewable energy sourced from an adjacent green energy complex featuring wind, solar, and hydrogen power.

However, Ambani isn't alone in the race to expand AI compute capacity. This week, OpenAI, SoftBank, and Oracle announced a new venture called the Stargate Project, with plans to invest a staggering $500 billion in AI infrastructure across the United States.
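The reported figures imply a useful back-of-the-envelope comparison. A minimal sketch, using only the numbers from the article (3 GW capacity, a $20-30 billion cost estimate, ~$26 billion in cash); the per-gigawatt breakdown is an illustration, not a reported figure:

```python
# Rough cost-per-gigawatt estimate for the planned Jamnagar facility,
# based on the figures reported above. Illustrative only.
capacity_gw = 3.0                        # projected total capacity
cost_low_bn, cost_high_bn = 20.0, 30.0   # estimated build cost, $ billions

per_gw_low = cost_low_bn / capacity_gw    # low-end cost per gigawatt
per_gw_high = cost_high_bn / capacity_gw  # high-end cost per gigawatt

# Reliance's reported balance-sheet cash vs. the high-end estimate
cash_on_hand_bn = 26.0
shortfall_bn = max(0.0, cost_high_bn - cash_on_hand_bn)

print(f"${per_gw_low:.1f}B-${per_gw_high:.1f}B per gigawatt")
print(f"Potential funding gap at the high end: ${shortfall_bn:.0f}B")
```

The $4 billion gap at the high end of the estimate is why the report floats creative financing, including government funding.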
  • New vaccine from MIT and Caltech could prevent future coronavirus outbreaks
    www.techspot.com
The big picture: Researchers from MIT and Caltech have developed an experimental nanoparticle vaccine designed to protect against a broad range of coronaviruses. This includes not only variants of the virus responsible for Covid-19 but also other coronaviruses currently circulating in animals that could potentially jump to humans in the future. In simple terms, scientists may have discovered a way to stay ahead of the next coronavirus outbreak before it even starts.

Traditional vaccines typically focus on the most accessible parts of viruses, which are often the rapidly mutating receptor-binding domains. These RBDs enable the virus to latch onto and infect human cells. However, because these regions mutate frequently with each new replication of the virus, vaccines targeting them can quickly become ineffective unless updated promptly.

Caltech researchers decided to take a different approach. Instead of targeting the highly variable RBDs, they focused on more conserved regions of the virus, areas that remain relatively stable across different strains. They engineered a nanoparticle vaccine that displays 60 copies of RBDs from up to eight different coronaviruses. While these viruses share conserved regions, their variable regions differ, allowing the vaccine to prepare the immune system to respond to a broader range of potential threats.

When injected, the nanoparticle exposes the immune system to all the distinct RBD shapes simultaneously.
This prompts the body to produce antibodies targeting both the variable and conserved regions of the viruses.

With this approach, the immune system gains a robust line of defense capable of neutralizing entire families of viruses, making it much harder for any single strain to evolve and bypass the vaccine's protection.

In animal studies, a nanoparticle vaccine called "mosaic-7COM" outperformed earlier iterations, such as "mosaic-8," by generating strong antibody responses against seven different SARS-CoV-2 variants and four other related coronaviruses from the sarbecovirus family. The vaccine successfully prevented the virus from infecting the test animals.

Remarkably, mosaic-7COM also demonstrated near-equivalent effectiveness in animals that had previously received existing mRNA-based Covid-19 vaccines. This finding reflects a real-world scenario, where next-generation vaccines must enhance pre-existing immunity.

Efforts are already underway to advance the mosaic-8 nanoparticle into clinical trials. Researchers also plan to test mosaic-7COM soon, given its superior performance in recent studies. Additionally, they are exploring ways to adapt the vaccines for mRNA delivery, a step that could simplify large-scale manufacturing.

Image credit: Artem Podrez, Polina Tankilevitch
  • Many people left Meta after Zuckerberg's changes, but user numbers have rebounded
    www.techspot.com
A hot potato: Meta and Mark Zuckerberg have been through a lot of changes this month, many of which have upset some people. Removing fact-checkers, allowing users to say pretty much anything, and killing diversity programs led to calls for a boycott of Meta platforms. It doesn't appear to have had much of an effect on user numbers, though things might have been different had TikTok's future been more certain.

Earlier this month, Zuckerberg signaled that he wanted Meta to be more like X by removing "politically biased" third-party fact checkers in favor of community notes and focusing on free speech and political discourse. The company is also killing its DEI programs.

Some users of Meta's platforms deleted their accounts in protest at the new policies. There were also calls for a boycott from R.E.M. frontman Michael Stipe, who urged people to log out of Facebook, Instagram, Threads, Messenger, WhatsApp, Giphy, Meta Quest, and Ray-Ban Meta smart glasses for a week as part of a campaign called "Lights Out Meta."

As noted by Business Insider, several analytics firms reported a slight decline in engagement among Meta users following the announcement of the changes. However, news of TikTok's impending ban sent many of ByteDance's users over to its rival's platforms. Facebook's number of Daily Active Users (DAUs) had been down 2% for most of January, according to Apptopia, but it began showing year-on-year growth ahead of the Supreme Court's decision to uphold the TikTok ban.

Engagement on Meta's platforms has now returned to the same level it was before Zuckerberg's announcement. Instagram, which has a user demographic closer to TikTok's, saw an even more impressive rebound in DAUs than Facebook. Instagram also has the advantage of Reels, a short-form video platform that has attracted many former TikTokers.

It's not just new users that Meta might be thanking TikTok for.
If TikTok does disappear from the United States completely (the app is working but still not available to download from US app stores), it's estimated that Meta could bring in up to $3.37 billion from newly available ad revenue.

A recent survey of 1,346 Americans by CivicScience found that 36% of participants supported Meta's changes, while 32% opposed them and 32% were neutral. Gen Z users aged 18-24 were the biggest supporters, with just over half supporting the moves.

It was reported earlier today that many people are selling phones with TikTok installed on eBay in the wildly optimistic hope of making thousands of dollars.
  • Sony to cease Blu-ray production, leaving physical media fans concerned
    www.techspot.com
The big picture: Sony, LG, Panasonic, and retailers in general have been gradually withdrawing from the Blu-ray business for years as consumers increasingly favor streaming and other forms of digital distribution. However, some market indicators suggest that demand for physical media hasn't completely evaporated, with many consumers harboring concerns over quality and ownership.

Sony recently announced plans to end Blu-ray media production in February, delivering yet another heavy blow to the future of physical media. The company will also cease manufacturing recordable MiniDiscs, MD Data discs, and MiniDV cassettes.

This news follows Sony's decision last year to stop producing writable 25GB BD-REs, 50GB BD-RE DLs, 100GB BD-RE XLs, and 128GB BD-R XLs for consumers. In this week's brief announcement, Sony confirmed that "Blu-ray media production" will halt, though the exact formats affected were not specified.

Manufacturers have been steadily exiting the Blu-ray market for years, as physical media sales continue to decline in favor of digital distribution. Panasonic, for example, stopped producing Blu-rays in 2023, citing weakening demand amid the growing popularity of streaming. Similarly, Best Buy discontinued sales of Blu-rays, Ultra-HD Blu-rays, and DVDs later that year.

Sony has also acknowledged that the Blu-ray storage market never lived up to expectations. After the company announced the end of BD-R production last year, LG followed suit by exiting the Blu-ray player market a few months later. Currently, Sony and Panasonic are the last remaining major manufacturers of Blu-ray drives.

It remains unclear how Sony's decision to end Blu-ray production will affect the distribution of physical games for its PlayStation consoles. The recently launched PlayStation 5 Pro does not include an optical drive by default, but demand for one was so strong that Sony's optional $80 Blu-ray drive attachment sold out quickly after the PS5 Pro's launch last fall.
The retreat from Blu-ray manufacturing also carries significant implications for the home video market. While streaming dominates the landscape, many consumers continue to prefer physical discs for various reasons.

Although the quality of 4K streaming is likely good enough for most viewers, Ultra-HD Blu-rays offer substantially higher bitrates. Furthermore, retaining a physical disc signifies true ownership of media, as content is never permanently available on streaming services.

For now, major Hollywood films still receive physical home video releases. Meanwhile, consumer demand for discs has spurred a wave of Ultra-HD Blu-ray remasters of classic films from distributors like the Criterion Collection and others.
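The bitrate gap is easy to quantify. A rough sketch, with assumed ballpark figures rather than numbers from the article: a 4K stream commonly runs around 16 Mbps, while Ultra-HD Blu-ray video can reach roughly 100 Mbps:

```python
# Approximate video data consumed per hour at typical bitrates.
# Both rates below are assumed ballpark values for illustration.
def gb_per_hour(mbps: float) -> float:
    """Convert a bitrate in megabits per second to gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> Mb/h -> MB/h -> GB/h

streaming = gb_per_hour(16)    # typical 4K stream
uhd_bluray = gb_per_hour(100)  # near the Ultra-HD Blu-ray ceiling

print(f"4K stream:   ~{streaming:.1f} GB/hour")
print(f"UHD Blu-ray: ~{uhd_bluray:.1f} GB/hour ({uhd_bluray / streaming:.2f}x)")
```

Under these assumptions the disc carries more than six times the video data per hour, which is where the visible difference in fine detail and compression artifacts comes from.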
  • Tech giants urge EU to review Italy's anti-piracy measures amid overblocking concerns
    www.techspot.com
Facepalm: Italy's Piracy Shield anti-piracy system has become a subject of intense debate and scrutiny since its launch in early 2024. Designed to combat live sports piracy by rapidly blocking piracy-related domain names and IP addresses, the system is being criticized for its broad powers and unintended consequences.

While Piracy Shield has successfully blocked numerous pirate sources, it has also been plagued by incidents of overblocking. Reports indicate that legitimate services such as Google Drive and Cloudflare have been inadvertently blocked, causing disruptions for Italian internet users. These incidents have raised concerns about the system's accuracy and its potential impact on lawful online activities.

The tech industry has recently voiced its apprehensions about Piracy Shield. The Computer & Communications Industry Association (CCIA), representing major tech companies like Amazon, Cloudflare, and Google, expressed serious concerns in a letter to the EU Commission. While acknowledging the system's intent to protect intellectual property rights, the CCIA argues that the DNS and IP-level blocking measures employed by Piracy Shield are overly broad and potentially harmful.

One of the most significant incidents occurred on October 20, 2024, when Google Drive was mistakenly blocked by Piracy Shield. This error resulted in a three-hour blackout for all Italian users, with lingering effects for a substantial portion of users even after 12 hours.

The CCIA has also raised concerns about the lack of transparency and adequate safeguards in Piracy Shield's implementation. The fact that the system was developed by a company affiliated with soccer league Serie A, one of the few rightsholders currently authorized to use it, has led to questions about potential conflicts of interest.

In addition to blocking concerns, recent amendments to Italian copyright law have introduced new reporting obligations for intermediary providers.
These changes, which include potential criminal penalties for non-compliance, have been criticized for potentially conflicting with EU law and creating a chilling effect on online expression and innovation.

The tech industry is calling for significant reforms to Piracy Shield, including more robust verification protocols, enhanced transparency, and improved redress mechanisms for affected users. The CCIA has gone so far as to urge the EU Commission to engage with the Italian government to halt the anti-piracy measures pending a thorough review of their legality under EU law.

As Piracy Shield approaches its first anniversary, its effectiveness in reducing illegal sports streaming remains debatable. While some data suggests a potential decrease in pirate traffic, conclusive evidence of its impact on new subscriber uptake and customer retention for legal services is still lacking.
  • Old Gigabyte laser mouse bursts into flames, the company is investigating
    www.techspot.com
A hot potato: Optical laser-based mice should operate on a very small amount of current, and they should certainly not catch fire without any prior warning signs. However, as a Reddit user recently explained, a low-power peripheral can quickly turn into a safety hazard when fire unexpectedly becomes involved.

A Reddit user shared his "hot" misadventure on the PC Master Race subreddit recently. User Lommelinn reported that his Gigabyte mouse caught fire, a small incident that could have escalated into a major disaster: it almost burned down his apartment.

"I smelled smoke early this morning, so I rushed into my room and found my computer mouse engulfed in large flames," Lommelinn explained. His room quickly filled with black smoke, though he was able to extinguish the fire. Lommelinn noted that he inhaled a significant amount of smoke during the incident, and his room was left in poor condition, with black soot and particles coating the surfaces.

The burned peripheral has been identified as the M6880X, an old Gigabyte wired mouse that went on sale in July 2014. The laser mouse features a "high accuracy" laser tracking engine and on-the-fly DPI switching between three different values (800, 1200, 1600), and should pose no personal safety threat whatsoever. Lommelinn said he avoided the worst, but a mouse catching fire and burning a hole in the desk is still a shocking and unsettling experience.

Needless to say, fellow redditors are going wild with speculation. Multiple factors could have caused the incident, including a short circuit or a faulty component that finally gave up. Some speculate that a damaged solder joint, degraded wiring, or an aging component may have caused excessive current flow, ultimately leading to the fiery disaster.

The flaming M6880X mouse eventually caught Gigabyte's attention. The company acknowledged the incident shared by Lommelinn, stating that customer safety remains their top priority.
Gigabyte is looking into the case and has already contacted the unlucky Redditor to provide further support. One Redditor humorously pointed out the mystery of how a small 5V, 0.5A USB-powered mouse can suddenly go up in flames and take half a desk with it.

Lommelinn also shared additional details about his now-burned setup, confirming that the mouse does not require batteries and that there is no glass surface in his room that could have focused light onto the device. With no apparent external causes for the fire, we'll have to wait for the conclusion of Gigabyte's investigation to understand what went wrong.
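The Redditor's puzzlement comes down to basic electrical arithmetic: a standard USB 2.0 port supplies 5 V at up to 0.5 A, which caps the power available to a wired mouse. A minimal sketch:

```python
# Maximum continuous power a standard USB 2.0 port can deliver: P = V * I.
# Sustained ignition from ~2.5 W is surprising, which is why speculation
# centers on a fault (short circuit, degraded wiring, failing component)
# concentrating that power in one small spot.
voltage_v = 5.0   # USB bus voltage
current_a = 0.5   # USB 2.0 high-power port limit
max_power_w = voltage_v * current_a

print(f"Worst-case continuous draw: {max_power_w:.1f} W")
```

2.5 W is roughly the heat output of a small night light, so a fire almost certainly required a localized fault rather than normal operation.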
  • Chinese artificial sun shatters world record with 1,066-second plasma confinement
    www.techspot.com
What just happened? China's Experimental Advanced Superconducting Tokamak (EAST), nicknamed the artificial sun, has shattered its own world record for plasma confinement. On Monday, EAST maintained a steady-state high-confinement plasma for an astounding 1,066 seconds, more than doubling its previous record of 403 seconds set in 2023.

The breakthrough by the Institute of Plasma Physics (ASIPP) at the Hefei Institutes of Physical Science represents a significant step forward in the quest for fusion power generation. The ability to sustain plasma for over 1,000 seconds is considered a crucial milestone in fusion research, bringing scientists closer to the goal of replicating the sun's nuclear fusion processes for clean, limitless energy production.

For more than seven decades, scientists around the world have been on a quest to harness the power of fusion energy, facing numerous challenges along the way. Key among them has been the need to manage extreme heat, with researchers striving to achieve and maintain temperatures exceeding 100 million degrees Celsius. Scientists have also grappled with operational stability, working to confine plasma for extended periods. The fusion process itself demands exquisite control, requiring the development of sophisticated systems to regulate the intricate dance of atomic nuclei. Perhaps most crucial of all is the pursuit of energy efficiency, as researchers strive to create a fusion reaction that produces more power than it consumes.

While EAST's milestone doesn't address all of these challenges, the record-breaking run demonstrates significant progress in confining plasma for extended periods.
"A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma, which is essential for the continuous power generation of future fusion plants," explained Song Yuntao, ASIPP director and vice president of HFIPS, adding that the recent record marks a critical step toward realizing a functional fusion reactor.

EAST, which began operations in 2006, is the first tokamak to contain a deuterium plasma using superconducting niobium-titanium toroidal and poloidal magnets. The tokamak maintains plasma in the "H-mode" or high-confinement regime, a state employed by modern tokamaks that results in a sudden improvement of plasma confinement by a factor of two. This achievement is particularly significant for the development of future fusion reactors, including the International Thermonuclear Experimental Reactor (ITER) currently under construction in France.

Since joining the ITER program in 2006 as its seventh member, China has been responsible for approximately 9 percent of the project's construction and operation, with ASIPP serving as the primary institution for the Chinese mission.

Gong Xianzu, head of the EAST Physics and Experimental Operations division, said EAST has made several upgrades that helped lead to this success, including new plasma diagnostic tools. The heating system, which originally consumed energy equivalent to nearly 70,000 household microwave ovens, has now doubled its power output while preserving operational stability and continuity.

Now, a new generation of experimental fusion research facilities is under construction in Hefei, Anhui Province, home to the EAST reactor. These facilities will continue the work of accelerating the development and practical application of fusion energy technology.
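The microwave-oven comparison can be roughed out numerically. Assuming a typical household microwave draws on the order of 1 kW (my assumption; the article gives only the 70,000-oven equivalence and the "doubled" upgrade):

```python
# Ballpark of EAST's heating-system power before and after the upgrade,
# using the article's "nearly 70,000 microwave ovens" comparison and an
# assumed ~1 kW draw per oven.
ovens = 70_000
watts_per_oven = 1_000  # assumed typical microwave power draw

original_mw = ovens * watts_per_oven / 1e6  # megawatts before the upgrade
upgraded_mw = original_mw * 2               # output "doubled" per the article

print(f"Original heating power: ~{original_mw:.0f} MW")
print(f"After upgrade:          ~{upgraded_mw:.0f} MW")
```

Tens of megawatts of sustained heating for over a thousand seconds is what makes the stability and efficiency upgrades as important as the raw power figure.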
  • Doom: The Dark Ages gameplay details, release date, and system requirements revealed
    www.techspot.com
Highly anticipated: As Bethesda and id Software start the final hype cycle before the launch of Doom: The Dark Ages on May 15, the official PC system requirements reveal that ray tracing, a once-optional feature that dramatically reduces frame rates, is increasingly becoming mandatory in major game releases. Notably, the game also requires an NVMe SSD.

Doom: The Dark Ages, the upcoming prequel to id Software's 2016 revival of the classic FPS franchise, is now available for pre-order and is set to launch on May 15 on Steam, the Microsoft Store, Game Pass, Xbox Series consoles, and PlayStation 5. Customers who pre-purchase the premium edition can begin playing on May 12.

The system requirements confirm that the game requires a GPU capable of hardware-assisted ray tracing, even at the lowest graphics settings. Other examples of this trend include Star Wars Outlaws, Indiana Jones and the Great Circle, and the upcoming Assassin's Creed Shadows.

Gameplay at 1080p and 60fps with low settings requires an Nvidia RTX 20 series, AMD Radeon RX 6000 series, or newer graphics card with at least 8GB of VRAM. For higher graphics settings at 1440p, 32GB of RAM and 10GB of VRAM are recommended, while maxed-out 4K gameplay increases the VRAM requirement to 16GB.

After showcasing the game at this week's Xbox Developer Direct, id Software also confirmed that Doom: The Dark Ages supports path tracing. Titles like Indiana Jones, Alan Wake 2, and Cyberpunk 2077 have used path tracing, also known as full ray tracing, to render extremely comprehensive lighting and shadows, albeit with a substantial performance cost. Further details on Doom's implementation of the feature are forthcoming.

Another new mandate that might concern users with older PCs is 100GB of NVMe storage. High-end games now commonly require SSDs but typically also support SATA drives, and many will still launch on hard disk drives.
In a roughly 10-minute demonstration, id showcased new weapons and other features from Doom: The Dark Ages. The chainsaw appears to have been replaced with a mace, flail, electrified gauntlet, and a throwable shield. New guns include a rail spike launcher, skull chipper, and a double-barreled plasma cannon. Additionally, the new entry boasts the largest and most densely populated levels in the franchise's history.

Doom: The Dark Ages also gives players more granular control over the difficulty settings than prior titles. The options menu includes sliders for player damage, enemy damage, the parry window, overall game speed, and more.
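The published tiers can be collected into a small lookup for comparing a machine against them. A sketch under stated assumptions: the field names and the 32GB RAM figure carried over to the 4K tier are my own illustration, while the VRAM, RAM, and GPU-generation values come from the requirements described above:

```python
# Published requirement tiers for Doom: The Dark Ages (illustrative encoding).
# None means the article gives no RAM figure for that tier.
requirements = {
    "1080p_low":  {"vram_gb": 8,  "ram_gb": None},  # RTX 20 / RX 6000 or newer
    "1440p_high": {"vram_gb": 10, "ram_gb": 32},
    "4k_max":     {"vram_gb": 16, "ram_gb": 32},    # RAM assumed same as 1440p
}
STORAGE_GB, STORAGE_TYPE = 100, "NVMe SSD"  # mandatory at every tier

def meets(tier: str, vram_gb: int, ram_gb: int) -> bool:
    """Check a machine's VRAM and RAM against one of the published tiers."""
    req = requirements[tier]
    ram_ok = req["ram_gb"] is None or ram_gb >= req["ram_gb"]
    return vram_gb >= req["vram_gb"] and ram_ok

print(meets("1080p_low", vram_gb=8, ram_gb=16))  # an 8GB card clears the floor
print(meets("4k_max", vram_gb=12, ram_gb=32))    # 12GB falls short of 16GB
```

Note the hard floor: no tier is reachable at all without a ray-tracing-capable GPU and the 100GB NVMe drive.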
  • www.techspot.com
TL;DR: Parents, students, and educators across North America are reeling after what is shaping up to be the largest data breach of the new year. Hackers infiltrated a cloud-based software provider used by K-12 schools, compromising the sensitive information of millions of students and school personnel.

Based in Folsom, California, PowerSchool serves 16,000 schools globally and manages data for over 60 million students. On January 7, the company confirmed that attackers had accessed and exfiltrated personal data stored in its Student Information System.

The stolen data includes Social Security numbers, medical records, and home addresses. A report by Bleeping Computer revealed an extortion note from the attackers claiming they had stolen the records of 62.4 million students and 9.5 million teachers.

Among the hardest hit is the Toronto District School Board in Canada, which disclosed Monday that information on all students enrolled between 1985 and 2024 was exposed, equating to 1.4 million students and over 90,000 teachers. The data included names, dates of birth, health card numbers, home addresses, disciplinary notes, and even residency status. The district noted that the scope of the breach varied depending on the enrollment period but affected every student within that timeframe.

District Name                       | Students Impacted | Teachers Impacted
Toronto District School Board       | 1,484,733         | 90,023
Peel District School Board          | 943,082           | 39,693
Dallas Independent School District  | 787,212           | 79,718
Calgary Board of Education          | 593,518           | 133,677
Memphis-Shelby County School        | 485,087           | 54,501
San Diego Unified                   | 472,278           | Possibly not stolen
Charlotte-Mecklenburg Schools       | 467,974           | 57,486
Wake County Public School           | 461,005           | 92,783

California's Menlo Park City School District also reported significant fallout. All current students, staff, and anyone enrolled or employed since the 2009-2010 school year were impacted.
This breach includes nearly 10,700 students and many former staff members.

PowerSchool stated it had communicated with the hackers, who allegedly said they would not release the data, supported by a video of its purported deletion. However, experts warn that such claims are impossible to verify and that the threat actors could still post the stolen information on the dark web. Several school districts have included these assurances in their breach notifications despite the dubious deletion claims from the attackers.

PowerSchool has not confirmed the number of affected individuals or whether it paid a ransom. However, it has begun offering those impacted a free two-year credit monitoring package.

The breach illustrates the vulnerabilities of online education systems. It's not just banks, large corporations, and social media platforms that hackers target.
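A quick sanity check of the district figures reported so far puts the attackers' headline claim in perspective:

```python
# Student counts from the eight districts listed in the table above.
students_impacted = {
    "Toronto District School Board": 1_484_733,
    "Peel District School Board": 943_082,
    "Dallas Independent School District": 787_212,
    "Calgary Board of Education": 593_518,
    "Memphis-Shelby County School": 485_087,
    "San Diego Unified": 472_278,
    "Charlotte-Mecklenburg Schools": 467_974,
    "Wake County Public School": 461_005,
}

total = sum(students_impacted.values())
print(f"{total:,}")                 # 5,694,889

# Even these eight large districts account for under 10% of the
# 62.4 million student records the attackers claim to hold.
print(f"{total / 62_400_000:.1%}")  # 9.1%
```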
  • First-ever data center on the Moon set to launch next month
    www.techspot.com
In brief: The new space race has attracted multiple private ventures. From cargo delivery to facility construction in orbit and on the Moon, the burgeoning space economy has everyone racing to get in on the ground floor. If successful, an upcoming mission will establish the first lunar data center.

Florida-based startup Lonestar Data Holdings plans to launch the first Moon-based data center, dubbed the "Freedom Data Center." The compact but fully operational information hub will piggyback on an upcoming lunar lander mission by Intuitive Machines aboard a SpaceX Falcon 9 rocket in February. Lonestar says storing data on the Moon offers unique benefits.

First, it provides unmatched physical security and protection from natural disasters, cyber threats, and geopolitical conflicts that could put Earth-based data at risk. The solar-powered mini-facility is also much more environmentally friendly than energy-hungry data centers on our home planet, utilizing naturally cooled solid-state drives.

The company has already lined up some high-profile early customers for its lunar platform, including the state of Florida, the Isle of Man government, AI firm Valkyrie, and the pop rock band Imagine Dragons.

The company has been working towards this milestone for years, successfully testing data storage on the Moon in February last year and aboard the International Space Station in 2021. However, putting something as complex as a data center on the lunar surface is still an enormous technical challenge.

The harsh environment, maintenance difficulties, and astronomical costs could all pose serious problems, and there are inherent risks associated with any space launch. There is no option for equipment recovery if something goes wrong. Thankfully, the data center will have a ground-based backup at a Flexential facility in Tampa.

Lonestar has yet to release specific operational details or hardware specs.
It will be interesting to see the company's plans for communication between lunar and ground-based facilities.

Lonestar isn't the only venture planning to establish a lunar data center. Reuters reports that several other companies are eyeing similar space-based facilities, including Lumen Orbit, which recently raised $11 million at a $40 million valuation.

Masthead credit: Pavel Chagochkin
  • www.techspot.com
7-Zip is a free and open-source file archiver utility compatible with many compression formats. By default, 7-Zip creates 7z-format archives with a .7z file extension, but when used for creating ZIP and GZIP formats, 7-Zip provides a compression ratio that is 2-10% better than the ratio provided by PKZip and WinZip.

You can use 7-Zip on any computer, even in commercial organizations, free of charge. Most of the source code is under the GNU LGPL license. The unRAR code is under a mixed license: GNU LGPL + unRAR restrictions. You don't need to register or pay for 7-Zip.

Features

- High compression ratio in new 7z format with LZMA compression
- Supported formats:
  - Packing / unpacking: 7z, ZIP, GZIP, BZIP2 and TAR
  - Unpacking only: RAR, CAB, ISO, ARJ, LZH, CHM, MSI, WIM, Z, CPIO, RPM, DEB and NSIS
- For ZIP and GZIP formats, 7-Zip provides a compression ratio that is 2-10% better than the ratio provided by PKZip and WinZip
- Strong AES-256 encryption in 7z and ZIP formats
- Self-extracting capability for 7z format
- Integration with Windows Shell
- Powerful File Manager
- Powerful command line version
- Plugin for FAR Manager
- Localizations for 69 languages

7-Zip works in Windows 98/ME/NT/2000/XP/Vista/7/8/10/11.
There is a port of the command line version to Linux/Unix.

What's New

- The default dictionary size values for LZMA/LZMA2 compression methods were increased.
- 7-Zip can now calculate the following hash checksums: SHA-512, SHA-384, SHA3-256 and MD5.
- APM and HFS support was improved.
- If an archive update operation uses a temporary archive folder and the archive is moved to the destination folder, 7-Zip shows the progress of moving the archive file, as this operation can take a long time if the archive is large.
- Bug fixed: 7-Zip File Manager didn't propagate the Zone.Identifier stream for files extracted from nested archives (when an open archive sits inside another open archive).
- Some bugs were fixed.

Previous release notes:

- Bug fixed: 7-Zip could crash on some incorrect ZSTD archives.
- Bug fixed: 7-Zip could not unpack some ZSTD archives.
- New switch -myv={MMNN} to set the decoder compatibility version for 7z archive creation. {MMNN} is a 4-digit number that represents the version of 7-Zip without a dot. If the -myv={MMNN} switch is specified, 7-Zip will only use compression methods that can be decoded by the specified version {MMNN} of 7-Zip and newer versions. If it is not specified, -myv=2300 is used, and 7-Zip will only use compression methods that can be decoded by 7-Zip 23.00 and newer versions.
- New switch -myfa={FilterID} to allow 7-Zip to use the specified filter method for 7z archive creation.
- New switch -myfd={FilterID} to disallow 7-Zip from using the specified filter method for 7z archive creation.
- Some bugs were fixed.
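The 7z format's headline feature is LZMA compression, and the same algorithm is exposed by Python's standard-library `lzma` module. A quick stdlib comparison (our illustration, not part of 7-Zip) shows why LZMA typically out-compresses the Deflate algorithm used by classic ZIP tools:

```python
import lzma
import zlib

# Sample input: ~250 KB of structured, repetitive text.
# Real-world ratios vary heavily with the input data.
data = b"".join(
    b"line %d: the quick brown fox jumps over the lazy dog\n" % i
    for i in range(5000)
)

deflated = zlib.compress(data, level=9)  # Deflate, the classic ZIP method
lzma_out = lzma.compress(data)           # LZMA, the default 7z method

print(len(data), len(deflated), len(lzma_out))
```

On structured text like this, the LZMA output comes in smaller than Deflate's, consistent with the 2-10% advantage 7-Zip claims for its ZIP/GZIP output over PKZip and WinZip.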
  • www.techspot.com
What just happened? Researchers at Carnegie Mellon University's School of Computer Science have introduced a breakthrough technology that could transform the way wearable devices are powered. Called "Power-Over-Skin," this innovation allows electricity to travel through the human body, potentially eliminating the need for traditional batteries in a wide range of wearable tech.

The technology addresses a longstanding challenge in the wearable device industry. From continuous glucose monitors for diabetes management to pacemakers and fitness trackers, these devices rely on batteries that require regular charging and maintenance.

Andy Kong, a leading member of the research team, explains that Power-Over-Skin aims to remove this obstacle by enabling devices to operate seamlessly and discreetly, an essential factor for effective health monitoring.

Still in its early stages, the technology employs a single battery-powered transmitter worn on the body to deliver power to various receivers. During testing, the researchers successfully powered small objects, including LED lights, a Bluetooth joystick embedded in a ring, and a light-up earring. They found that the power received by these devices was inversely related to their distance from the transmitter, with closer devices receiving more power.

Kong compares the process to how a radio uses air as a medium to transmit signals between a broadcasting station and a car stereo, with the key difference being that Power-Over-Skin uses body tissue as the transmitting medium. The researchers focused on maximizing power transmission efficiency through the body, discovering that square waves delivered significantly more power than the sine waves used in earlier experiments.

The technology operates through capacitive coupling at 40 MHz RF energy, enabling safe and efficient power transfer through the skin. This approach allows continuous power delivery directly from a source on the body to multiple wearable devices.
While the current demonstrations focus on low-power electronics, the researchers envision future applications powering more energy-intensive devices like smart glasses and advanced wearables. To achieve this, the team is working to increase the power output by a factor of 10, with the ultimate goal of powering devices such as earbuds.
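The square-versus-sine finding tracks with basic signal math: at the same peak amplitude, a square wave's RMS value equals the peak, while a sine's is peak divided by the square root of two, so the square wave carries roughly twice the average power. A quick numerical check (our own illustration, not the team's code):

```python
import math

N = 100_000      # samples over one full period
amplitude = 1.0  # same peak amplitude for both waveforms

sine = [amplitude * math.sin(2 * math.pi * n / N) for n in range(N)]
square = [amplitude if s >= 0 else -amplitude for s in sine]

def rms(signal):
    """Root-mean-square value of a sampled signal."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

sine_rms, square_rms = rms(sine), rms(square)
print(round(sine_rms, 4))    # ~0.7071, i.e. peak / sqrt(2)
print(round(square_rms, 4))  # 1.0, i.e. equal to the peak

# Average power scales with RMS squared: the square wave delivers ~2x.
print(round((square_rms / sine_rms) ** 2, 2))
```

This is an idealized comparison; the actual gain through body tissue depends on the coupling and load, which is why the team measured it experimentally.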
  • Canon pushes the limits of 35mm with record-breaking 410-megapixel sensor
    www.techspot.com
What just happened? Canon has announced a 410-megapixel CMOS image sensor, setting a new record for the highest pixel count on a 35mm full-frame sensor. Before getting too excited, you should know that the sensor is destined for applications like surveillance, industrial imaging, and medicine that demand extreme resolution, rather than consumer-grade cameras.

The 410-megapixel (24,592 x 16,704 pixels) sensor boasts a resolution that is 198 times greater than Full HD, and 12 times higher than 8K. With it, users should be able to crop any part of an image without a significant drop in quality.

Canon said the new sensor utilizes a back-illuminated stacked formation in which the signal processing element and pixel segment are interlayered. The imaging specialist also had to redesign the circuitry pattern, which enabled a super-fast readout speed of 3,280 megapixels per second.

Super-high pixel count sensors already exist in medium or larger formats, but fitting this level of resolution into a 35mm sensor is unprecedented and will help contribute to the miniaturization of shooting equipment, Canon said.

Full-color and monochrome versions of the sensor will be available. The latter will additionally feature a four-pixel binning function that effectively treats four pixels as one, a trick we've seen smartphone image sensors use to boost performance, especially in low-light situations. Canon said that when this feature is in use, the sensor can capture 100-megapixel video at 24 frames per second. Otherwise, video is limited to eight frames per second.

Canon has been pushing the bounds of what's possible with imaging tech for a while now. Late last year, the company showed off a 250-megapixel APS-H CMOS sensor intended for industrial inspection applications. On the consumer side, we got a new EOS R series camera aimed at entry-level photographers wanting to step up from a smartphone.
Canon will showcase its new sensor at the SPIE Photonics West conference for optics and photonics. The show starts on January 28 and runs through the 30th in San Francisco.
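The headline figures all follow from the pixel dimensions Canon quotes, as a little arithmetic confirms:

```python
# Sensor dimensions quoted by Canon.
width, height = 24_592, 16_704
pixels = width * height

print(f"{pixels:,}")                     # 410,784,768 -> "410 megapixels"
print(round(pixels / (1920 * 1080)))     # 198x Full HD (1920 x 1080)
print(round(pixels / (7680 * 4320), 1))  # ~12.4x 8K (7680 x 4320)

# Four-pixel binning treats each 2x2 group as one effective pixel.
print(f"{pixels // 4:,}")                # ~100 megapixels for binned video
```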
  • www.techspot.com
WTF?! With everything that's happened to TikTok over the last few days, it's no surprise that the app and owner ByteDance have rarely been out of the news cycle. There have even been reports that YouTube's most-subscribed star, Jimmy Donaldson, aka MrBeast, is in the race to buy TikTok's US operations, though his spokesperson said discussions are ongoing.

Donaldson posted a jokey message on X on January 13 that read, "Okay fine, I'll buy Tik Tok so it doesn't get banned." A day later, he wrote that "Unironically I've had so many billionaires reach out to me since I tweeted this, let's see if we can pull this off."

Donaldson taking over TikTok might be more plausible than it sounds. There were reports earlier this month that Chinese authorities were considering selling TikTok to X owner Elon Musk. Donald Trump has said he is open to Musk or fellow tech billionaire Larry Ellison buying the app in the US.

On Tuesday, law firm Paul Hastings announced that it is advising a syndicate of investors led by Recruiter.com founder/CEO Jesse Tinsley in an all-cash offer to acquire TikTok's US operations and various assets from ByteDance. It adds that the investor group comprises institutional investors and high-net-worth individuals, including MrBeast.

The statement does not include the size of the bid, though Trump said the app was valued at $1 trillion with a "permit" to operate in the US under 50% US ownership. Donaldson's net worth is estimated to be around $1 billion.

On Tuesday evening, Donaldson replied to a tweet about his bid to buy TikTok alongside investors. He wrote: "The leading groups who are all credible bidding on Tik Tok have reached out for us to help them, I'm excited to partner/make this a reality. Big things cooking."

While all signs point to Donaldson trying to become a part-owner of US TikTok, MrBeast spokesperson Matthew Hiltzik said the YouTube star hasn't yet officially joined any bids.
"Several buyers are holding ongoing discussions with Jimmy," Hiltzik told The Associated Press on Wednesday. "He has no exclusive agreements with any of them."

After blocking access for US users on Saturday, hours before it was due to be banned, TikTok's services began returning on Sunday, for which the company thanked Trump. On his first day in office on Monday, the president issued an executive order telling the DoJ not to enforce the Protecting Americans from Foreign Adversary Controlled Applications Act or punish those who violate it for 75 days.

TikTok is still not available to download from the Apple App Store and Google Play Store after being removed over the weekend.
  • Buffalo's latest USB stick features hardware-level antivirus security
    www.techspot.com
In a nutshell: Buffalo has introduced the RUF3-KEV, a USB flash drive with a unique twist. This seemingly ordinary device is engineered to serve as the final line of defense against viral infections when transferring files between computers.

At first glance, the RUF3-KEV looks like an ordinary USB stick. However, beneath its unassuming exterior lies Buffalo's proprietary "DiXiM Security Endpoint," an embedded anti-malware system that monitors files for threats in real time. According to PC Watch, any rogue programs or virus signatures are instantly quarantined and neutralized before they can cause harm.

The drive also includes a built-in antivirus scanner that checks new file transfers, detecting and removing infected payloads on the fly. An additional "heuristic" layer analyzes program behaviors, identifying and isolating anything that exhibits suspicious traits.

For added security, the RUF3-KEV features password authentication, preventing unauthorized access to its data.

The RUF3-KEV also features a capless design to prevent dust infiltration, and its auto-retracting USB connector reduces wear and tear. The USB 3.2 Gen 1 drive is compact, measuring approximately 19.8 x 10 x 68 mm.

It's worth noting that USB 3.2 Gen 1 is far from the latest standard, as it has been surpassed by USB 3.2 Gen 2, USB 3.2 Gen 2x2, and USB4. Clearly, speed is not the primary focus of this device.

The drive is available in 64GB, 32GB, and 16GB capacities. Pricing, announced in Japanese yen, starts at ¥10,000 ($64) for the 64GB version, ¥8,300 ($53) for the 32GB model, and ¥6,600 ($42) for the 16GB option.

For now, the RUF3-KEV is only available via Amazon Japan, which may involve additional shipping costs.
While Buffalo has a US subsidiary (Buffalo Americas), it remains unclear if the device will launch stateside. For users frequently transferring sensitive files between computers, the added peace of mind may justify the extra expense.

Buffalo, best known for its LinkStation and TeraStation lines of network-attached storage devices that enable centralized data access across networks, also offers a range of portable drives and networking components.
  • World's fastest supercomputer, "El Capitan," goes online to safeguard US nuclear weapons
    www.techspot.com
What just happened? The world's fastest supercomputer has gone online at the Lawrence Livermore National Laboratory (LLNL) in California. Called "El Capitan," the machine was unveiled earlier this month after around eight years of research and development. It will be used to secure the US nuclear stockpile and for classified research.

El Capitan can reach a peak performance of 2.746 exaFLOPS, making it the National Nuclear Security Administration's first exascale supercomputer. It's the world's third exascale machine, after the Frontier supercomputer at Oak Ridge National Laboratory in Tennessee and the Aurora supercomputer at the Argonne Leadership Computing Facility in Illinois.

The world's fastest supercomputer is powered by more than 11 million CPU and GPU cores integrated into 43,000+ AMD Instinct MI300A accelerators. Each MI300A APU comprises an EPYC Genoa 24-core CPU clocked at 1.8GHz and a CDNA3 GPU integrated onto a single organic package, along with 128GB of HBM3 memory.

According to Pythagoras Watson, the team lead of the advanced technology system at LLNL, the system's peak performance is 2.79 quintillion calculations per second. As a measure of how astronomically large that number is, Watson explained to CBS News that if you went back in time 2.79 quintillion seconds, you'd arrive more than 70 billion years before the Big Bang.

Built at a cost of around $600 million, the world's newest supercomputer was primarily designed to safeguard and secure the US nuclear arsenal, but it will also perform other classified tasks related to national security, including AI and machine learning workloads. It will also solve problems in materials science and physics.

El Capitan was commissioned by the US Department of Energy as part of its CORAL-2 program to replace the Sierra supercomputer, which was deployed in 2018. While Sierra is still in service, El Capitan far exceeds it, delivering roughly 18 times the performance.
As pointed out by Live Science, Sierra is still operational and was recently ranked as the 14th most powerful supercomputer globally.

El Capitan, which shares its name with the famous granite rock formation at Yosemite National Park, became fully operational last year, achieving a score of 1.742 exaFLOPS in the High-Performance Linpack (HPL) benchmark, which is used globally to judge supercomputing speeds. According to the LLNL, it would take a million smartphones working on a single calculation at the same time to match what El Capitan can do in one second.
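Watson's time-travel illustration checks out with simple unit conversion: 2.79 quintillion seconds is on the order of 88 billion years, which is indeed more than 70 billion years before the roughly 13.8-billion-year-old universe began.

```python
quintillion_seconds = 2.79e18        # one second per calculation at peak rate
seconds_per_year = 365.25 * 24 * 3600

years = quintillion_seconds / seconds_per_year
print(f"{years:.2e}")                # ~8.84e10, i.e. ~88 billion years

# Subtracting the ~13.8-billion-year age of the universe still leaves
# more than 70 billion years before the Big Bang, as Watson described.
print(years - 13.8e9 > 70e9)         # True
```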
  • Trump announces $500 billion "Stargate" initiative that aims to build largest-ever AI infrastructure project
    www.techspot.com
What just happened? Donald Trump has announced that tech titans OpenAI, Oracle, and SoftBank are joining forces on a colossal $500 billion initiative dubbed "Project Stargate." According to the US president, the goal is to secure AI compute in the country and gain an edge against China, which has been rapidly advancing its own AI ambitions.

At a White House press conference, Trump was flanked by the companies' top brass as he hyped up Stargate by proclaiming that it will cement the future of technology in the US while creating over 100,000 new jobs.

The project's plan is for OpenAI, Oracle, and Japanese conglomerate SoftBank to plow an initial $100 billion into data center infrastructure across the country. This will balloon to a mind-boggling total investment of half a trillion dollars over the next four years. Big names like Arm, Microsoft, and Nvidia are also listed as technology partners.

Oracle CEO Larry Ellison said at the conference that the company has already begun building 10 data centers measuring half a million square feet each in Texas, but details are still hazy on where the next facilities will pop up.

Ellison added that the project will include the construction of the largest computer ever built, which could be used to invent cancer vaccines and help prevent Covid-19-like pandemics.

Not everyone seems enthusiastic about the announcement, though. Pushing back against the Stargate hype is none other than Elon Musk, who fired shots at the funding claims.

"They don't actually have the money," Musk argued on X, saying SoftBank has under $10 billion truly locked in based on his intel.

OpenAI CEO Sam Altman quickly clapped back under the same post, taunting Musk to "come visit the first site" and accusing him of putting his own companies' interests over those of the US.
The two tech billionaires have been engaged in an increasingly bitter legal battle over OpenAI, which Musk was an early investor in before exiting.

However, serious questions remain around AI's impact on jobs, safety risks, and more. Trump has already axed some of the Biden-era guardrails around AI, and experts warn that a lack of regulation could have major consequences as capabilities rapidly advance.

There's been some buzz that the Stargate project is OpenAI's attempt to reduce its reliance on Microsoft by investing in dedicated AI infrastructure. But in its blog post, OpenAI appears to shut down that speculation, mentioning that the initiative actually builds on the companies' existing partnership.

OpenAI says that it will "continue to increase its consumption of Azure as OpenAI continues its work with Microsoft with this additional compute to train leading models and deliver great products and services."
  • Bing search results in Edge are obscuring Chrome links, promoting Microsoft's browser
    www.techspot.com
WTF?! Google and Microsoft have spent years engaging in dirty-tricks campaigns designed to push people onto their respective browsers, Chrome and Edge. The latest tactic is one employed by the Windows maker: Edge hides Chrome's download links for some users when they perform a Bing search for the browser.

As noticed by Windows Latest, searches for Chrome using Edge and via Bing (when signed out of your Microsoft account) on Windows 11 result in a "promoted by Microsoft" banner appearing at the top of the search results.

The banner is a recommendation by the Redmond firm, advising users there's no need to download a new web browser and highlighting that Edge offers a fast, secure, and modern web experience that saves time and money. It also comes with the obligatory "Try now" button.

Forcing obtrusive ads for its products down people's throats isn't new territory for Microsoft, of course. But this one arguably goes a little further by hiding the Chrome download links that are beneath the banner, and the small portion of the top Google result that is visible appears mostly blurred out.

Courtesy of Windows Latest

It's easy to see the search results by clicking on the "See more" button further down the screen, and most people who do a search for Chrome likely intend to download it, no matter what Microsoft claims. However, less tech-savvy users may be persuaded by the banner's claims.

The other thing to note is that few people are likely to encounter this banner. Google has an almost 90% share of the global search engine market, whereas Bing has 4%. It's a similar story in the browser market: Chrome has a 68.3% share, while Edge has just under 5%.

It appears that not everyone is seeing the banner. I couldn't get it to show, so it might be limited to a small set of users or certain locations.

Microsoft's war against Chrome goes back a long way.
Some examples of its pushiness include the company telling people in 2021 that the rival browser was "so 2008" and Edge was better. There were also full-size Edge ads that appeared on the Chrome website, and Edge was accused of stealing data from Chrome without users' consent in January.

Google isn't a stranger to using such tactics, either. The company shows prompts to Edge users recommending Chrome, and in 2020 it showed a message that read "Google recommends switching to Chrome to use extensions securely" whenever Edge users visited the Chrome Web Store, though Google quickly removed that message.
  • Nvidia GeForce RTX 5090 Review
    www.techspot.com
Exciting times for us computer enthusiasts, as we can finally showcase the new GeForce RTX 5090 and the next generation of Nvidia GPUs, codenamed Blackwell, with the new flagship graphics card priced at $2,000.

It's been two years since Nvidia released the mighty GeForce RTX 4090, an insane $1,600 GPU that smashed the previous-generation flagship by a 60% margin; that is, it was 60% faster on average at 4K. This made it an extremely powerful and exciting option for high-end gaming, even if it was undeniably expensive.

So, what's on offer here, and how can Nvidia justify a $2,000 price tag for the RTX 5090?

Nvidia has faced some challenges this generation. While the RTX 50 series takes advantage of cutting-edge technologies such as PCI Express 5.0 and GDDR7 memory, the GPU is built using the same TSMC 4N process as the previous generation. Without improvements to the production node, significant performance gains would require an architectural overhaul, which isn't yet on the table.

RTX 4090 FE on the left, 5090 FE on the right

Spec              | RTX 5090            | RTX 4090            | RTX 5080           | RTX 4080 Super     | RTX 4080
Price (US MSRP)   | $2,000              | $1,600              | $1,000             | $1,000             | $1,200
Release Date      | January 30, 2025    | October 12, 2022    | January 30, 2025   | January 31, 2024   | November 16, 2022
Process           | TSMC 4N             | TSMC 4N             | TSMC 4N            | TSMC 4N            | TSMC 4N
Die Size          | 750 mm2             | 608.5 mm2           | 378 mm2            | 379 mm2            | 379 mm2
Core Config       | 21760 / 680 / 192   | 16384 / 512 / 176   | 10752 / 336 / 128  | 10240 / 320 / 112  | 9728 / 304 / 112
L2 Cache          | 96 MB               | 72 MB               | 64 MB              | 64 MB              | 64 MB
GPU Boost Clock   | 2407 MHz            | 2520 MHz            | 2617 MHz           | 2550 MHz           | 2505 MHz
Memory Capacity   | 32 GB               | 24 GB               | 16 GB              | 16 GB              | 16 GB
Memory Speed      | 28 Gbps             | 21 Gbps             | 30 Gbps            | 23 Gbps            | 22.4 Gbps
Memory Type       | GDDR7               | GDDR6X              | GDDR7              | GDDR6X             | GDDR6X
Bus / Bandwidth   | 512-bit / 1792 GB/s | 384-bit / 1008 GB/s | 256-bit / 960 GB/s | 256-bit / 736 GB/s | 256-bit / 717 GB/s
Total Board Power | 575 W               | 450 W               | 360 W              | 320 W              | 320 W

Therefore, Nvidia's solution was to create a bigger and more powerful GPU. The die is now 23% larger, featuring 33% more cores.
It comes equipped with 32 GB of 28Gbps GDDR7 memory on a 512-bit wide memory bus, delivering a bandwidth of 1,792 GB/s, a hefty 78% increase over the RTX 4090.

The RTX 5090 is a powerhouse, but it comes with an even steeper price tag, making it 25% more expensive than the RTX 4090. Given that price increase, we expect it to deliver performance far beyond what the specs suggest.

RTX 4090 vs RTX 5090 Thermals

Before we dive in and get into the blue bar graphs, let's take a look at how Nvidia's Founders Edition version of the RTX 5090 performs compared to the RTX 4090 FE card. For this comparison, we tested The Last of Us Part 1 at 4K with maxed-out settings.

After an hour of load inside an enclosed ATX case, the RTX 5090 reached a peak GPU temperature of 73°C, which is remarkable given how quiet and compact the card is. The fan speed peaked at 1,600 RPM and remained inaudible over our case fans, which are already very quiet.

The cores averaged a clock speed of 2,655 MHz, while GPU power averaged 492 watts. The memory temperature peaked at 88°C, with an operating frequency of 2,334 MHz, providing a transfer speed of 28 Gbps.

In comparison, the RTX 4090 FE model peaked at 68°C, with a memory temperature of 80°C, and its fans spinning just below 1,500 RPM. Clearly, the RTX 5090 runs slightly hotter and louder. However, given that the RTX 5090 consumed, on average, 35% more power during testing and is a significantly smaller card, these results are nothing short of remarkable.

We are incredibly impressed with what Nvidia has achieved here. The RTX 5090 might be the most impressive graphics card we've ever seen. You would never guess, just by looking at it, how much thermal load this cooler can handle so efficiently. It's an outstanding achievement.
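The quoted memory bandwidth follows directly from the bus width and per-pin data rate: divide the bus width by eight bits per byte and multiply by the transfer rate. A quick check of both cards' figures:

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s: (bus width / 8 bits per byte) * data rate."""
    return bus_width_bits / 8 * gbps_per_pin

rtx_5090 = bandwidth_gb_s(512, 28)  # 512-bit bus, 28 Gbps GDDR7
rtx_4090 = bandwidth_gb_s(384, 21)  # 384-bit bus, 21 Gbps GDDR6X

print(rtx_5090, rtx_4090)                # 1792.0 and 1008.0 GB/s
print(f"{rtx_5090 / rtx_4090 - 1:.0%}")  # 78% uplift, as quoted
```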
Now, let's see how it performs.

Test System Specs

CPU: AMD Ryzen 7 9800X3D
Motherboard: MSI MPG X870E Carbon WiFi (BIOS 7E49v1A23 - ReBAR enabled)
Memory: G.Skill Trident Z5 RGB DDR5-6000 [CL30-38-38-96]
Graphics Cards: GeForce RTX 4070, RTX 4070 Super, RTX 4070 Ti, RTX 4070 Ti Super, RTX 4080, RTX 4080 Super, RTX 4090, RTX 5090, Radeon RX 7700 XT, RX 7800 XT, RX 7900 GRE, RX 7900 XT, RX 7900 XTX
ATX Case: MSI MEG Maestro 700L PZ
Power Supply: MSI MPG A 1000G ATX 3.0 80 Plus Gold 1000W
Storage: MSI Spatium 1TB M470 PCIe 4.0 NVMe M.2
Operating System: Windows 11 24H2
Display Drivers: Nvidia GeForce Game Ready 566.36 WHQL, AMD Radeon Adrenalin 24.12.1

Gaming Benchmarks

Marvel Rivals

Starting with Marvel Rivals at 1440p, we see that the RTX 5090 delivers 30% more performance than the RTX 4090. While this is a decent performance improvement, factoring in the 25% price increase makes it considerably less exciting.

At 4K resolution, the margin increases slightly to 33%. This is a solid uplift, but the extreme price premium dampens the enthusiasm.

S.T.A.L.K.E.R. 2: Heart of Chornobyl

S.T.A.L.K.E.R. 2 isn't the most optimized game, and as a result, the RTX 5090 maxes out at 94 fps at 1440p. This makes it only 22% faster than the RTX 4090, offering a very mild performance gain.

At 4K, however, the RTX 5090 achieves a more reasonable 42% performance gain, rendering an average of 71 fps.

Counter-Strike 2

Next, we have Counter-Strike 2. At 1440p, the RTX 5090 is slightly slower than the RTX 4090, although the 1% lows are notably stronger. It's worth mentioning that the RTX 5090 was slower than the RTX 4090 at 1080p in multiple instances. This suggests a possible overhead issue with the Blackwell architecture, or perhaps the RTX 5090's large core count isn't being efficiently utilized at lower resolutions. Further investigation is needed here.

Even at 4K, the RTX 5090 only offers an 8% performance increase over the RTX 4090.
The issue doesn't appear to be a CPU bottleneck, given the higher frame rates observed at 1440p.

God of War Ragnarök

Performance in God of War Ragnarök is outstanding at 1440p, hitting 268 fps on the ultra preset. However, this is only 22% faster than the RTX 4090, which is disappointing given the 25% higher cost.

At 4K, the RTX 5090 scales better, achieving a 36% performance improvement with 195 fps compared to 143 fps on the RTX 4090, a much more favorable result.

Delta Force

In Delta Force, the RTX 5090 provides just 17% more performance than the RTX 4090 at 1440p. However, frame rates here are extreme and likely approaching a CPU bottleneck.

At 4K, the margin extends to 27%, rendering 160 fps. While this is an improvement, it's still not an impressive uplift, especially considering the 25% higher price and the two-year gap between releases.

Warhammer 40,000: Space Marine 2

Space Marine 2 is a very CPU-limited game, and at 1440p, we appear to be hitting the limits of the 9800X3D processor. Oddly, the RTX 5090 is 4% slower than the RTX 4090 here. As observed in other instances at 1080p, this could indicate an overhead issue or inefficiencies in workloads that limit the RTX 5090's performance.

At 4K, the RTX 5090 resolves this problem, delivering a 30% performance increase over the RTX 4090. While this is a decent uplift, it is undercut by the 25% price hike.

Star Wars Jedi: Survivor

In Star Wars Jedi: Survivor, the RTX 5090 delivers just a 14% improvement over the RTX 4090 at 1440p. However, with an average of 191 fps, performance remains impressive overall.

At 4K, the RTX 5090 crosses the 100 fps threshold with 102 fps, making it 21% faster than the RTX 4090. Still, this is a disappointing margin given the higher cost.

A Plague Tale: Requiem

In A Plague Tale: Requiem, the RTX 5090 delivers a 21% performance improvement over the RTX 4090 at 1440p.
The results are partly CPU-limited, as suggested by the similar 1% lows between the two GPUs.

At 4K, the RTX 5090 pulls ahead with a 42% performance uplift, making this one of the better margins seen in the benchmarks.

Cyberpunk 2077: Phantom Liberty

In Cyberpunk 2077: Phantom Liberty, the RTX 5090 struggles to deliver noteworthy gains at 1440p, with just a 19% improvement over the RTX 4090. The 1% lows are also similar, indicating other system limitations may be at play.

At 4K, the margin improves to 32%. While the overall performance is excellent, this result remains underwhelming. It's worth noting that the second-highest preset was used, and ray tracing was not enabled for this test.

Dying Light 2 Stay Human

Frame rates in Dying Light 2 using the high preset are extreme at 1440p, reaching 198 fps with the RTX 5090. However, this makes it only 24% faster than the RTX 4090.

Even at 4K, the performance gain remains modest at 25% over the RTX 4090, which scales directly with the 25% price increase.

Dragon Age: The Veilguard

In Dragon Age: The Veilguard, frame rates are limited to just under 130 fps at 1440p using the ultra preset, which selectively applies some ray tracing effects. While the focus of this portion of the review is on rasterization performance, ray tracing plays a role here.

When increasing the resolution to 4K, the RTX 5090 averages 96 fps, only 10% faster than the RTX 4090. This is a very disappointing result.

War Thunder

War Thunder runs at extremely high frame rates, even with the highest quality preset enabled. At 1440p, performance is clearly CPU-limited, which we confirmed by testing at 1080p.

Moving to 4K removes the CPU bottleneck, but even then the RTX 5090 is only 15% faster than the RTX 4090.
Granted, with frame rates well over 300 fps, performance is more than sufficient for gameplay, but in terms of relative performance, the RTX 5090 is underwhelming here.

Marvel's Spider-Man Remastered

Marvel's Spider-Man Remastered is heavily CPU-limited at 1440p, with both the RTX 4090 and RTX 5090 capped at 222 fps.

At 4K, the CPU bottleneck is mostly removed, but the RTX 5090 still appears slightly limited, averaging 212 fps. As a result, the RTX 5090 is just 26% faster than the RTX 4090.

Hogwarts Legacy

Hogwarts Legacy is another title that is mostly CPU-limited at 1440p, resulting in similar performance between the RTX 4090 and RTX 5090.

Increasing the resolution to 4K allows the RTX 5090 to pull ahead, delivering a 31% performance improvement. While the performance is excellent overall, the value remains questionable.

The Last of Us Part I

In The Last of Us Part I, the RTX 5090 provides a solid performance uplift at 1440p, where it is 28% faster than the RTX 4090, averaging 204 fps. This results in excellent overall performance.

At 4K, the RTX 5090 offers a 40% performance increase, averaging 125 fps. This is a strong result, especially when compared to most other titles.

Star Wars Outlaws

The RTX 5090 achieves over 100 fps in Star Wars Outlaws at 1440p using the ultra preset. With ray tracing forced on, the RTX 5090 is 22% faster than the RTX 4090.

Oddly, the margin decreases at 4K, where the RTX 5090 is just 19% faster than the RTX 4090. Typically, we expect the RTX 5090 to show greater advantages at higher resolutions, but that isn't the case here.

Starfield

Finally, in Starfield, the RTX 5090 is only 4% faster than the RTX 4090 at 1440p using ultra quality settings, limiting performance to 125 fps.

At 4K, the RTX 5090 improves slightly but is still just 7% faster than the RTX 4090.
There seems to be a limitation in this title that prevents the RTX 5090 from delivering the margins seen in other games at 4K.

Performance Summary

Although we did not include 1080p data for individual games, here are the average results across the 17 games tested. Both the RTX 4090 and RTX 5090 are heavily CPU-limited at this resolution, making it better suited for CPU benchmarking than GPU evaluation.

Even at 1440p, the RTX 5090 is often heavily limited by the CPU, resulting in just a 12% performance improvement over the RTX 4090 across the 17 games tested.

At 4K, we can finally see the potential of the GeForce RTX 5090, where it delivers an average performance improvement of 27%. That looks solid on raw numbers, but it's somewhat disappointing from a value perspective considering the card costs 25% more than the RTX 4090. This is why we've been joking internally, calling it the 4090 Ti, because that's what it really feels like.

Even if the RTX 5090 maintained the same $1,600 MSRP as the RTX 4090, it would still feel underwhelming as a next-generation flagship GPU. For comparison, the RTX 4090 was on average 60% faster than the RTX 3090 Ti while launching at a lower price, and 73% faster than the RTX 3090 with only a 7% price increase. By comparison, the RTX 5090's performance and value fall far short of expectations for a generational leap.

Power Consumption

Now, let's look at power consumption. Most of our power data was recorded at 1440p, which is not ideal for measuring the full power usage of the RTX 5090, so we supplemented it with additional tests for clarity. In Starfield at 1440p, the RTX 5090 increased power consumption by 12% compared to the RTX 4090.

In Star Wars Outlaws, we observed a 17% increase in power usage at 1440p, rising from 532 watts to 624 watts.
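To illustrate how total-system power figures like these translate into efficiency, here is a minimal Python sketch. The 532 W and 624 W values are the Star Wars Outlaws totals from the review; the fps values are hypothetical placeholders (a normalized 100 fps baseline plus the 22% uplift measured in this title at 1440p), not actual review data.

```python
# Rough perf-per-watt comparison. Wattages are the review's Star Wars
# Outlaws 1440p totals (CPU + GPU); fps values are normalized placeholders.
def perf_per_watt(avg_fps, system_watts):
    """Frames per second delivered per watt of total system power."""
    return avg_fps / system_watts

ppw_4090 = perf_per_watt(100.0, 532)  # normalized baseline
ppw_5090 = perf_per_watt(122.0, 624)  # 22% faster, 17% more power

change = ppw_5090 / ppw_4090 - 1
print(f"Efficiency change: {change:+.1%}")  # prints "Efficiency change: +4.0%"
```

Under these assumptions, the RTX 5090 comes out only marginally more efficient in this title, which matches the review's broader point that power draw scales almost as fast as performance.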
Interestingly, in Space Marine 2, where the RTX 5090 performed worse than the RTX 4090 at 1440p, power consumption decreased by 15%, demonstrating that the RTX 5090 is highly efficient when not operating at full load.

To better evaluate power usage, we re-tested the Radeon RX 7900 XTX, RTX 4090, and RTX 5090 at 4K in three games where the RTX 5090 performed well: Dying Light 2, Cyberpunk 2077, and A Plague Tale: Requiem.

In these tests, the RTX 5090 increased power consumption by 37-41%, depending on the game. These results align more closely with the performance gains seen in these titles. Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption.

Finally, we re-ran those same power tests with a 60 fps cap, which yielded some interesting results. In A Plague Tale: Requiem, power consumption for the RTX 5090 was nearly identical to the RTX 4090, with just a 2% increase. In Cyberpunk 2077, the RTX 5090 showed an 8% increase, while in Dying Light 2, it consumed 15% more power.

Ray Tracing Performance

RT - Metro Exodus Enhanced

Metro Exodus Enhanced remains one of the few games that provides a truly transformative experience with ray tracing enabled, so we felt it was important to include.

As a side note before we show the results: we've encountered issues testing Metro Exodus Enhanced with Radeon GPUs as of late. While the game has worked in the past, enabling ray tracing now causes system crashes with Radeon GPUs, regardless of whether AMD or Intel systems are used. AMD has replicated the problem and is aware of the issue, but unfortunately a fix was not available in time for this review. As a result, we decided to exclude Radeon data and focus solely on RTX 4090 and RTX 5090 performance.

At 1080p, the RTX 5090 was 21% faster than the RTX 4090, and at 1440p the margin increased to 33%.
We did not test 4K ray tracing performance, as most titles deliver poor and often unplayable performance at that resolution, even with upscaling. However, Metro Exodus Enhanced would likely perform well on both the RTX 4090 and RTX 5090.

RT - Alan Wake II

In Alan Wake II with quality upscaling enabled, the RTX 5090 was just 19% faster than the RTX 4090 at 1080p. Moving to 1440p did not significantly improve the results, with the RTX 5090 showing only an 18% performance gain.

Overall, these are weak gains for the RTX 5090, and even with ray tracing enabled, performance only just breaks the 100 fps barrier.

RT - Cyberpunk 2077: Phantom Liberty

Using the ultra ray tracing preset with quality upscaling, Cyberpunk 2077: Phantom Liberty shows the RTX 5090 performing comparably to the RTX 4090 at 1080p, likely due to CPU limitations.

At 1440p, the RTX 5090 pulls ahead slightly, offering an 11% performance increase with an average of 129 fps.

RT - Marvel's Spider-Man Remastered

In Marvel's Spider-Man Remastered, performance is heavily CPU-limited at both 1080p and 1440p. This is problematic, as frame rates are capped at 128 fps at 1440p, a limit even the RTX 4080 Super reaches.

While 4K benchmarks might provide more insight, the 128 fps cap at lower resolutions is concerning. Although this is solid performance overall, it may not be enough for those with high-refresh-rate monitors.
Furthermore, it's unlikely that many users spending $2,000 or more on a graphics card would settle for gaming at 60 fps, which is what would likely occur at 4K without upscaling.

RT - Dying Light 2 Stay Human

In Dying Light 2 using the high ray tracing preset with quality upscaling, the RTX 5090 achieved an average of 208 fps at 1080p, making it 18% faster than the RTX 4090.

At 1440p, where CPU limitations are not a factor, the RTX 5090 was only 22% faster than the RTX 4090, an underwhelming result given the price premium.

RT - Black Myth: Wukong

With the very high ray tracing preset, the RTX 5090 delivered 123 fps at 1080p with quality upscaling, a 34% performance improvement over the RTX 4090.

At 1440p, the RTX 5090 maintained a similar margin, coming in 36% faster and rendering an average of 98 fps. While this is a reasonable step forward relative to past products, the overall performance remains less impressive, especially since upscaling is required.

Ray Tracing Performance Summary

We used a five-game average for the ray tracing data, since Metro Exodus Enhanced had to be excluded due to the issues with Radeon GPUs. On average, the RTX 5090 was 14% faster than the RTX 4090 at 1080p with upscaling.

At 1440p, the RTX 5090 showed an average performance increase of just 17%. Notably, even with upscaling, the average frame rate at 1440p was just 123 fps, far from impressive for a graphics card priced at $2,000.

Cost per Frame

Here's how the current and previous-generation mid-range to high-end GPUs compare in terms of value, based on MSRP. At $2,000, the RTX 5090 offers only a 1.5% improvement in value per frame compared to the RTX 4090.

In other words, after more than two years, there's no meaningful improvement in cost per frame. The RTX 5090 is essentially just a faster RTX 40 series GPU.

If we consider the best retail pricing for mid-2024 and assume the RTX 5090 will sell for $2,000, the value proposition looks slightly better.
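To make the cost-per-frame arithmetic concrete, here is a minimal Python sketch. The MSRPs come from the review; the fps values are a normalized placeholder (RTX 4090 = 100 fps baseline, RTX 5090 = 27% faster per the 4K average), so the resulting ~1.6% only approximates the review's 1.5% figure, which is based on the full per-game data.

```python
def cost_per_frame(price_usd, avg_fps):
    """Dollars of MSRP paid per average frame per second."""
    return price_usd / avg_fps

# MSRPs from the review; fps values are normalized placeholders,
# not the review's measured averages.
cpf_4090 = cost_per_frame(1600, 100.0)  # $16.00 per fps
cpf_5090 = cost_per_frame(2000, 127.0)  # ~$15.75 per fps

improvement = 1 - cpf_5090 / cpf_4090
print(f"Value-per-frame improvement: {improvement:.1%}")  # ~1.6%
```

The takeaway is the same as the review's: when price rises almost as fast as performance, cost per frame barely moves.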
However, realistically, do we believe the RTX 5090 will actually sell for $2,000? Probably not.

If anything, the retail price is likely to climb higher, making the value situation even worse. At $2,000, the RTX 5090 already represents poor value, and anything higher would make it an even tougher sell.

What We Learned: It's the World's Fastest Gaming GPU, But...

The GeForce RTX 5090 is now the world's fastest gaming GPU, which is no surprise. What is shocking, however, is that in our testing it was on average just 27% faster than the RTX 4090 at 4K, while costing at least 25% more.

This is why we've referred to it as the RTX 4090 Ti, because, let's be honest, that's exactly what it is. Nvidia has tried to disguise this by marketing DLSS 4 multi-frame generation as a game-changing feature, akin to dangling a shiny set of keys to distract gamers.

Speaking of DLSS 4, we haven't mentioned frame generation much in this review, despite Nvidia heavily promoting it as a key feature of the GeForce 50 series. This omission might seem odd, but we believe frame generation deserves a separate, dedicated analysis, and we're already working on an in-depth DLSS 4 review that will explore the technology in greater detail soon.

The reason we tackle topics like frame generation and upscaling separately is that testing these features properly is complex. It's less about frame rates and more about image quality and, in the case of frame generation, latency.

To summarize briefly, frame generation doesn't deliver what Nvidia's marketing claims. It's not a true performance-enhancing feature; you're not genuinely going from 60 fps to 120 fps.
Instead, you're getting the appearance of smoother gameplay, albeit with potential graphical artifacts, but without the tangible benefits of higher frame rates, such as improved input latency.

That doesn't mean frame generation is useless or that it's not a good technology. It can be helpful in certain scenarios, but Nvidia has weaponized the feature to mislead consumers, making claims like the upcoming RTX 5070 being faster than the RTX 4090, which is fundamentally untrue.

We also strongly believe that showcasing frame generation performance in benchmark graphs is misleading. And while Nvidia would love for us to do just that, we see this as a slippery slope for gamers: a race to the bottom, where winning benchmarks becomes about who can spit out the most interpolated frames, input and visual quality be damned.

As it stands, DLSS 3 and DLSS 4 frame generation are best described as frame-smoothing technologies. Under the right conditions they can be effective, but they don't truly boost FPS performance. Moreover, they're entirely unsuitable for competitive shooters or fast-paced games, where the goal of high frame rates is to reduce input latency. Nvidia's narrative that all gamers will or should use frame generation couldn't be further from reality.

Notes about CPU Pairing with the RTX 5090 and Ray Tracing

Moving on to CPU performance, it's clear from the 1440p data we gathered that anyone investing in an RTX 5090 needs a high-end CPU, such as the 9800X3D.
Even with the Zen 5 3D V-Cache processor, you'll frequently encounter CPU bottlenecks, especially if you aim for high refresh rates with ray tracing enabled.

Speaking of ray tracing, you're almost certainly going to find reviews where the RT performance of the RTX 5090 relative to the RTX 4090 is more impressive than what we saw in the majority of our testing, and this will come down to the quality settings used.

Our testing focused on real-world scenarios that prioritize frame rates above 60 fps, as we believe most gamers spending $2,000 on a GPU won't settle for console-like frame rates.

To provide a bit more context: in Black Myth: Wukong, we tested at 1440p using DLSS quality upscaling, where the RTX 5090 delivered 98 fps, a 24% improvement over the RTX 4090. If we disable upscaling, which we feel most gamers using ray tracing won't do, the RTX 5090's frame rate drops to 64 fps, but it was then 45% faster than the RTX 4090, a far more impressive margin.

This is comparable to what we see at 4K using DLSS upscaling, though again we're only gaming at around 60 fps, which some gamers will find acceptable, but which I personally find less than desirable, especially when spending so much money.

Ultimately, the point is that the RTX 5090 can be 40-50% faster than the RTX 4090, depending on the game and settings. However, as demonstrated in this review, when targeting high frame rates, the difference is typically much smaller.

Bottom Line

All things considered, the GeForce RTX 5090 is an impressive performer that nonetheless falls short of expectations for a next-generation flagship GPU. It doesn't move the needle forward in terms of value or innovation and could easily fit into the GeForce 40 series lineup. If Nvidia had launched this as an RTX 4090 Ti, few would have batted an eye.

We understand that Nvidia couldn't do much given the limitations of the current process node.
However, they still could have delivered a more exciting product series. Even at $1,600, the RTX 5090 would have been far more appealing: still not amazing, but much better than it is now.

Without a process node upgrade, this release doesn't come close to the leap we saw from the RTX 3090 to the RTX 4090, which was vastly more significant. It's also clear that as Nvidia cements its position as the leader in AI hardware, GeForce seems to have taken a back seat to the big money in AI (just check out this graph, it's insane).

We still expect the RTX 5090 to age well. While today's 27% average performance gain over the RTX 4090 is underwhelming, this margin will likely increase over time, potentially reaching 40% in more games.

Unfortunately, this also means the more affordable models in the GeForce RTX 50 series will probably be underwhelming, offering only minor performance gains over the GPUs they replace. Nvidia could have addressed this by providing better VRAM configurations.

For example, 12 GB on the RTX 5070 is simply unacceptable; it should have at least 16 GB. If Nvidia had done this, the RTX 5070 might have been a worthwhile upgrade over the RTX 4070 and a much more significant step up from the RTX 3070.

For those looking for a more positive take, the good news is that the RTX 5090 is faster than the RTX 4090, pushing 4K gaming closer to high-refresh-rate experiences. If you have oodles of money to blow on a graphics card and missed out on the RTX 4090, the RTX 5090 could be a great addition to your gaming setup.

In summary, the RTX 5090 is 25% more expensive than the RTX 4090, delivers an average of 27% more performance, includes 33% more VRAM, and consumes around 30% more power. Interpret that as you like.
For now, our review is complete, with a closer look at DLSS 4 coming soon. Let us know your thoughts on Nvidia's new flagship graphics card in the comments.

Shopping Shortcuts:

Nvidia GeForce RTX 5090 on Amazon
Nvidia GeForce RTX 5080 on Amazon (soon)
AMD Radeon RX 7900 XTX on Amazon
Nvidia GeForce RTX 4070 Ti Super on Amazon
Nvidia GeForce RTX 4070 Super on Amazon
AMD Radeon RX 7800 XT on Amazon
AMD Radeon RX 7900 XT on Amazon
  • Beta or not, Apple Intelligence becomes a default iOS setting across devices
    www.techspot.com
Cutting corners: Apple Intelligence is now an "opt-out" feature across the entire Apple ecosystem, despite still being marketed as a beta product. Most users aren't keen to have incomplete features forced upon them, but shoving AI down everybody's throat is a popular trend among corporations dabbling in the technology.

After much anticipation, Apple introduced its generative AI suite last October with the release of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1. Initially, "Apple Intelligence" features were disabled by default; users had to manually enable them through system settings. Now Cupertino has decided to reverse course, likely due to slow adoption, forcing users to "enjoy" generative AI features on mobile devices and Macs whether they want them or not.

Apple Intelligence has become an opt-out feature with the release of iOS 18.3, iPadOS 18.3, and macOS Sequoia 15.3. According to the company, these GenAI features will be automatically enabled after installing the updates and during the device setup process.

Users can still disable them through the Apple Intelligence & Siri settings panel. However, proper hardware support is required, so the generative AI tools will only work on the iPhone 15 Pro series, iPhone 16 series, and iPhone 16 Pro series. Similarly, iPad and Mac devices with M1 or newer Arm processors will have the features enabled by default after updating their operating systems.

Apple's willingness to push beta and somewhat unreliable software features onto users is not unprecedented. The company took a similar approach with Siri in 2011. When users reported issues with the digital assistant, Apple brushed them off as "side effects" of the software's beta phase. Siri was finally rolled out as a full feature in 2013, shedding its "beta" label and becoming a standard part of iOS.

Apple Intelligence appears to be following a similar path, as users are discovering some troublesome side effects with this brand-new technology.
For example, the AI summaries feature has been generating fake headlines, prompting reporters to urge Apple to address the issue. As a stopgap measure, the latest OS updates will temporarily disable notifications for the entire "news and entertainment" app category until the problem is resolved.

In today's rapidly evolving Wild West of generative AI, Apple seems to be mimicking its competitors by prioritizing corporate goals over user preference. Microsoft and Google have already made similar moves, enabling AI features by default on their platforms so users can experience the "wonders" of generative AI even if they have no need for it.
  • www.techspot.com
A hot potato: Meta has responded to complaints from Facebook, Instagram, and Threads users who say they suddenly found themselves following Donald Trump, Vice President JD Vance, and First Lady Melania Trump without their knowledge. There are also reports of Democratic content being hidden. Some have pointed to Meta and CEO Mark Zuckerberg's recent shift in political stance as the cause, but the company claims otherwise.

Meta's communications director, Andy Stone, responded to the reports on Threads and X. He wrote that people were not made to automatically follow any of the official accounts for the President, Vice President, or First Lady.

Stone went on to say that the accounts are managed by the White House, so the content on the pages changes with a new administration, but the followers remain unchanged. The same procedure was carried out during the last presidential transition.

However, there are plenty of replies that cast doubt on Stone's explanation. Many people say they never followed Biden, Trump, or any political accounts, yet they are suddenly following the POTUS, VPOTUS, and FLOTUS accounts.

There are also complaints that those who suddenly found they are following Trump cannot unfollow him. Stone said that it may take some time for follow and unfollow requests to go through as the White House accounts change hands.

It's been noted that when a candidate leaves office, Meta creates archived POTUS accounts that followers are automatically signed up to. Some claim that as this happened, they automatically started following the Trump, Vance, and Melania Trump accounts.

Something else that has led to accusations of political bias against Meta is happening on Instagram. Users say that when doing hashtag searches for #Democrat or #Democrats, they see messages stating, "We've hidden these results," and "Results for the term you searched for may contain sensitive content."
Meta said the issue is caused by a technical problem affecting multiple hashtags, including some related to the Republican Party.

Separate reports that people are seeing more political content in their feeds are a result of the changes that Zuckerberg announced at the start of January. The CEO also suspended the fact-checking program and reduced the amount of censorship on Meta's platforms. He later appeared on the Joe Rogan podcast, where he said that companies need more "masculine energy." That led to a lawyer dropping Meta as a client in a copyright case, citing Zuckerberg's "toxic masculinity and neo-Nazi madness."