• VENTUREBEAT.COM
    Nvidia launches blueprint for AI agents that can analyze video
    Nvidia launched a blueprint for AI agents that can analyze video today as part of CEO Jensen Huang's CES 2025 opening keynote.
  • VENTUREBEAT.COM
    Nvidia gets key design wins to bring AI to autonomous vehicle fleets
    Nvidia has announced key design wins for autonomous vehicles with partners such as Toyota, Aurora and Continental.
  • VENTUREBEAT.COM
    Inworld AI teams with Nvidia and Streamlabs on intelligent streaming assistant
    Inworld AI announced it is collaborating with Nvidia and Streamlabs to create an intelligent streaming assistant using AI technology.
  • WWW.THEVERGE.COM
    Nvidia is bringing a native GeForce Now app to Steam Deck
    Nvidia plans to release a native GeForce Now app for Steam Deck later this year, according to a blog post. It's already relatively straightforward to get Nvidia's cloud gaming service set up on Steam Deck thanks to a special script from Nvidia, but a native app should be easier to install and will support up to 4K resolution and 60 fps with HDR when connected to a TV.

    Nvidia also plans to bring GeForce Now to some major VR headsets later this month, including the Apple Vision Pro, Meta Quest 3 and 3S, and Pico virtual- and mixed-reality devices. When GeForce Now version 2.0.70 is available, people using those headsets will be able to access an extensive library of games they can stream by visiting play.geforcenow.com in their browser.

    The company also says that two major titles from Microsoft will be available on GeForce Now when they come out this year: Avowed, which launches February 18th, and DOOM: The Dark Ages, which is set to be available sometime this year.
  • WWW.THEVERGE.COM
    Nvidia's AI NPCs are no longer chatbots: they're your new PUBG teammate
    Nvidia has spent the last two years showing off its digital ACE characters that can have in-game conversations with you using generative AI. But at CES 2025, the company is taking the ACE characters a step further by showing how they can be autonomous game characters, including, sometime this year, a teammate to help you get a chicken dinner in PUBG.

    Nvidia says that ACE characters can use AI to perceive, plan, and act like human players, per a blog post. Powered by generative AI, ACE will enable living, dynamic game worlds with companions that comprehend and support player goals, and enemies that adapt dynamically to player tactics. The characters are powered by small language models (SLMs) that are capable of planning at human-like frequencies required for realistic decision making, as well as multimodal SLMs for vision and audio that allow AI characters to hear audio cues and perceive their environment.

    As for how that will work in PUBG, you'll be able to team up with the PUBG Ally, which Nvidia and PUBG publisher Krafton are calling the world's first Co-Playable Character (CPC). The Ally will be able to communicate using game-specific lingo, provide real-time strategic recommendations, find and share loot, drive vehicles, and fight other human players using the game's extensive arsenal of weapons, Nvidia says.

    Basically, it sounds like an AI teammate you can talk with in natural language who's supposed to be as capable as a human. And a video shows the Ally indeed helping a player find specific loot, bringing over a vehicle, and attempting to flank opposing players. But the video is heavily edited and isn't live, so I'm skeptical that the Ally will work as well as is being shown here.

    AI characters built with ACE are coming to other games, too. Naraka: Bladepoint Mobile PC Version will get a local inference AI Teammate feature in March 2025, while Naraka: Bladepoint on PC will get the feature later in 2025, according to Nvidia's blog post: AI Teammates powered by NVIDIA ACE can join your party, battling alongside you, finding you specific items that you need, swapping gear, offering suggestions on skills to unlock, and making plays that'll help you achieve victory.

    Krafton's upcoming life simulation game called inZOI will also get CPCs called Smart Zoi. And Nvidia says that ACE characters can be bosses, too, and they'll be used for boss encounters in Wemade Next's MIR5.
  • WWW.THEVERGE.COM
    Asus' latest ROG Flow Z13 gaming tablet uses AMD's new integrated graphics
    Asus has a new version of its Surface Pro-like gaming tablet for CES, and it's making some sizable changes both inside and out. The Asus ROG Flow Z13 for 2025 is once again a slightly chunky, almost-half-inch-thick, 13-inch tablet with a built-in kickstand, magnetic keyboard cover, a bunch of ports, and a clear window on its rear with RGB lighting to show off its innards.

    That fun glass window is now larger, with a direct view of the motherboard, but the biggest change for the ROG Flow Z13 is its switch to integrated graphics. That may seem like a step backward for a gaming-focused tablet, since gamers covet dedicated GPUs, but Asus is outfitting it with AMD's powerful new Strix Halo processor. The ROG Flow Z13 can be configured with the Ryzen AI Max 390 for $1,999.99 or the Ryzen AI Max Plus 395 for $2,199.99. The top-end model with the Max Plus 395 has 16 CPU cores and 40 graphics cores, while the base-model Ryzen AI Max 390 (curse these names) has 12 CPU cores and 32 graphics cores. The Z13 utilizes a redesigned stainless steel vapor chamber for cooling these graphics-heavy chips, which are capable of 120W TDP.

    All that power in the Z13 is responsible for driving a 13-inch, 2560 x 1600 touchscreen display with a speedy 180Hz refresh rate (up from 165Hz on the last-gen model), which you don't often find in laptops and tablets of this size. For ports, it's got two USB 4, one USB-A 3.2 Gen 2, HDMI 2.1, a microSD card slot capable of UHS-II speeds, and a 3.5mm combination headphone / mic jack. It's also got a 5-megapixel front-facing webcam and a 13-megapixel rear-facing camera so you can flash your RGB as you awkwardly take tablet photos in public. It also has Wi-Fi 7.

    The Z13 supports USB-C Power Delivery for charging, but that won't be powerful enough to allow its full performance under load. Instead, it comes with a 200W power adapter that uses Asus' proprietary and reversible slim power jack, like on its recent laptops.

    Other quality-of-life improvements for the Z13 include a new detachable keyboard with larger keycaps and a more generously sized touchpad. And on its right side, beside the power button and volume rocker, is a new ScreenXpert button that summons a Command-Center-like widget that includes multiple-display window management controls, quick access to operating modes like Turbo mode or Silent mode, and other settings like muting your mic. It's primarily there to help control things while in tablet mode, since the keyboard contains shortcuts for most of these functions.

    I got a quick glimpse of the new ROG Flow Z13 at a preview event, and Asus sent me a preproduction model right before CES to get a little bit of hands-on time. It's what I'm writing this post on right now, and boy do I appreciate the updates to this keyboard cover. The 1.7mm key travel and bigger touchpad go a long way toward getting work done. While the Ryzen 395 chip has the potential to be power-hungry, the battery life on the Z13 shows some promise. Asus is only claiming 10 hours of battery life, but I did manage to get through a full eight-plus-hour workday of Chrome tabs, streaming music (though the speakers seem kind of bad at first listen), and writing across multiple virtual desktops the day before flying to CES with pretty much no issues.

    I definitely prefer a proper laptop to a tablet with a kickstand and keyboard cover, but being able to remove the keyboard deck for a little more flexibility and comfort when it's time to fire up a game is pretty slick. I tried out a little Helldivers 2 on the Flow Z13, and it performed quite well, especially for a tablet. Set to the Z13's native 2.5K resolution, with in-game render scale on Ultra Quality and texture details on medium, I saw 60fps or just slightly under, and it looked really nice. If I bumped it down from Ultra Quality to Quality scaling, it jumped up to an even smoother 80fps. This was, of course, while the tablet was plugged in and its fans were blasting on Turbo mode. Diving in again while unplugged dropped the Ultra Quality render scale performance down to the 45 to 50fps range, since playing on battery limits you to Performance mode instead of Turbo.

    This is preproduction hardware, but so far, it's pretty impressive for integrated graphics. AMD's new chip might have something special here for thin and light devices, but since it lacks Thunderbolt 5, the Flow Z13 can't use the full GPU bandwidth of Asus' new XG Mobile eGPU. (Previous models could use the older XG Mobile via its proprietary connector.) But of course, that would make this somewhat portable PC gaming solution a little less portable, and the new XG Mobile costs about as much as the Flow Z13 itself.

    But does a gaming tablet make much sense in 2025, when portable PC gaming is being so adequately served by the Steam Deck and a bunch of other dedicated handhelds? We'll have to see how a production model of the ROG Flow Z13 fares when it launches sometime in February.

    Photography by Antonio G. Di Benedetto / The Verge
  • WEWORKREMOTELY.COM
    Coinbase: Product Designer II - Consumer
    Time zones: EST (UTC -5), CST (UTC -6), MST (UTC -7), PST (UTC -8), AST (UTC -4), NST (UTC -3:30)

    Ready to be pushed beyond what you think you're capable of?

    At Coinbase, our mission is to increase economic freedom in the world. It's a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform and with it, the future global financial system.

    To achieve our mission, we're seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high-caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company's hardest problems.

    Our work culture is intense and isn't for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there's no better place to be.

    What you'll be doing:
    - Work in a small team to iteratively improve user experience
    - Take new features from ideation, to prototyping, to user testing, to production
    - Design web and mobile experiences that are simple and intuitive
    - Participate in regular design reviews where you'll seek out specific feedback on your designs and incorporate relevant feedback
    - Execute on the product roadmap and help define product strategy
    - Collaborate daily with Engineering, User Research, and Product

    What we look for in you:
    - 3+ years of professional product design experience
    - Bachelor's degree in a related field
    - Experience designing consumer-facing experiences for web and mobile
    - Excellence in UX thinking, visual design, and written communication
    - Experience working in a collaborative environment with engineers, user researchers, and product teams
    - Fluency in Figma and prototyping tools
    - Low ego, collaborative, and open-minded
    - Must be able to read, write, and speak English

    Nice to haves:
    - Interest in crypto or financial products
    - User of Coinbase products
    - Crypto-forward experience, including familiarity with onchain activity such as interacting with Ethereum addresses, using ENS, and engaging with dApps or blockchain-based services

    Pay Transparency Notice: The target annual salary for this position can range as detailed below. Full-time offers from Coinbase also include target bonus + target equity + benefits (including medical, dental, and vision).

    Pay Range: $149,500 - $149,500 CAD

    Commitment to Equal Opportunity: Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law.

    Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here).

    Global Data Privacy Notice for Job Candidates and Applicants: Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
  • WWW.CNET.COM
    Whale TV Rolls Out Updated TV OS With AI-Powered Recs, Voice Assistant
    The independent TV operating system maker Whale TV is starting to roll out a new iteration of its Whale OS that includes AI recommendations and a voice assistant. Whale TV's partners -- which include more than 400 brands, such as Philips, JVC and Telefunken -- will be among the first to get access to Whale OS 10, though US availability is yet to be confirmed. The Singapore-based company, formerly known as Zeasn, powers over 43 million active smart TVs globally, according to a press release.

    Initially announced in September, the updated Whale OS 10 features user profiles, AI recommendations and a ChatGPT-powered voice assistant, which "enables users to interact with their TV in a more natural, conversational way," according to a press release from the company Tuesday. The upgrades could make the TV OS more comparable to the likes of Roku and Fire TV. "Whale TV helps users effortlessly navigate to their favorite entertainment, whether it's streamed, broadcasted or played on a connected device," the company says.

    Zeasn transformed into Whale TV in September as it aimed to build consumer awareness after being marketed exclusively to TV brands and factories, according to a release at the time.

    Along with the OS update, the company is bringing its free TV streaming service Whale TV Plus to all Whale TV devices. With Whale TV Plus, which unites Whale TV's former services rlaxx TV and Whale Live, users can stream free, ad-supported TV without any extra sign-up steps.
  • WWW.CNET.COM
    Everything Announced at Nvidia's CES Event in 12 Minutes video
    Everything Announced at Nvidia's CES Event in 12 Minutes Jan 7, 2025 Tech

    At CES 2025, Nvidia CEO Jensen Huang kicks off CES, the world's largest consumer electronics show, with a new RTX gaming chip, updates on its AI chip Grace Blackwell and its future plans to dig deeper into robotics and autonomous cars.

    Here it is: our brand new GeForce RTX 50 series, Blackwell architecture. The GPU is just a beast: 92 billion transistors, 4,000 TOPS, 4 petaflops of AI, three times higher than the last generation, Ada, and we need all of it to generate those pixels that I showed you. 380 ray tracing teraflops, so that for the pixels we have to compute, we can compute the most beautiful image possible, and of course 125 shader teraflops. There are actually concurrent shader teraflops as well as an integer unit of equal performance: dual shaders, one for floating point and one for integer. GDDR7 memory from Micron, 1.8 terabytes per second, twice the performance of our last generation, and we now have the ability to intermix AI workloads with computer graphics workloads. One of the amazing things about this generation is that the programmable shader is also able to process neural networks, and as a result we invented neural texture compression and neural material shading.

    With the Blackwell family, the RTX 5070 delivers 4090 performance at $549. Impossible without artificial intelligence, impossible without the AI tensor cores, impossible without the GDDR7 memories. So: 5070, 4090 performance, $549, and here's the whole family, starting from the 5070 all the way up to the 5090 at $1,999, twice the performance of a 4090. We're producing at very large scale, with availability starting in January.

    It is incredible, but we managed to put these gigantic-performance GPUs into a laptop. This is a 5070 laptop for $1,299, and this 5070 laptop has 4090 performance. Even the 5090 will fit into a thin laptop; that last laptop was 14.9 millimeters. You've got a 5080, a 5070 Ti and a 5070 as well.

    What we basically have here is 72 Blackwell GPUs, or 144 dies. This one chip is 1.4 exaflops. The world's fastest supercomputer only recently achieved an exaflop-plus; this is 1.4 exaflops of AI floating point performance. It has 14 terabytes of memory, and here's the amazing thing: the memory bandwidth is 1.2 petabytes per second. That's basically the entire world's internet traffic right now being processed across these chips. We have 130 trillion transistors in total, 2,592 CPU cores, and a whole bunch of networking. I wish I could lift this, but I don't think I will. These are the Blackwells, these are our ConnectX networking chips, this is the NVLink (we can only pretend about the NVLink spine; that's not possible here), and these are all of the HBM memories: 14 terabytes of HBM memory. This is the miracle of the Blackwell system.

    We fine-tune models using our expertise and our capabilities and turn them into the Llama Nemotron suite of open models. There are small ones that interact with very fast response times; the Llama Nemotron Supers are basically mainstream versions of the models; and the Ultra model can be used as a teacher model for a whole bunch of other models. It could be a reward model evaluator, a judge for other models to create answers and decide whether an answer is good or not, basically giving feedback to other models.
    It could be distilled in a lot of different ways: basically a teacher model, a knowledge-distillation model, very large and very capable. All of this is now available online.

    And NVIDIA Cosmos, the world's first world foundation model, is trained on 20 million hours of video. Those 20 million hours focus on physically dynamic things: dynamic nature, humans walking, hands moving, manipulating things, fast camera movements. It's really about teaching the AI not to generate creative content, but to understand the physical world. And from this physical AI there are many downstream things we can do. We can do synthetic data generation to train models. We can distill it into effectively the beginnings of a robotics model. You can have it generate multiple physically based, physically plausible scenarios of the future, basically do a Doctor Strange. Because this model understands the physical world (you saw a whole bunch of images it generated), it can also do captioning: it can take videos and caption them incredibly well, and that captioning and video can be used to train large language models, multimodal large language models. So you can use this foundation model to train robots as well as large language models. This is NVIDIA Cosmos. The platform has an autoregressive model for real-time applications and a diffusion model for very high quality image generation.
    It has an incredible tokenizer, basically learning the vocabulary of the real world, and a data pipeline, so that if you would like to take all of this and train it on your own data, this pipeline (because there's so much data involved) has been accelerated end to end for you. This is the world's first data processing pipeline that is CUDA accelerated as well as AI accelerated. All of this is part of the Cosmos platform, and today we're announcing that Cosmos is open licensed and available on GitHub.

    Today we're also announcing that our next-generation processor for the car, our next-generation computer for the car, is called Thor. I have one right here; hang on a second. OK, this is Thor. This is a robotics computer. It takes a madness amount of sensor information: umpteen cameras, high-resolution radars, lidars, all coming into this chip, and this chip has to process all of that sensor data, turn it into tokens, put it into a transformer, and predict the next path. This AV computer is now in full production. Thor is 20 times the processing capability of our last generation, Orin, which is really the standard of autonomous vehicles today. So this is just really quite incredible. Thor is in full production. This robotics processor, by the way, also goes into a full robot, so it could be an AMR, it could be a humanoid robot, it could be the brain, it could be the manipulator; this processor is basically a universal robotics computer.

    The ChatGPT moment for general robotics is just around the corner. In fact, all of the enabling technologies I've been talking about are going to make it possible for us, in the next several years, to see very rapid, surprising breakthroughs in general robotics.
    Now, the reason general robotics is so important is that whereas robots with tracks and wheels require special environments to accommodate them, there are three robots in the world that we can make that require no greenfields; brownfield adaptation is perfect. If we could build these amazing robots, we could deploy them in exactly the world that we've built for ourselves. These three robots are: one, agentic robots and agentic AI, because they're information workers, and so long as they can accommodate the computers we have in our offices, it's going to be great. Number two, self-driving cars, and the reason for that is we've spent 100-plus years building roads and cities. And number three, humanoid robots. If we have the technology to solve these three, this will be the largest technology industry the world's ever seen.

    This is Nvidia's latest AI supercomputer. It's called Project Digits right now, and if you have a good name for it, reach out to us. Here's the amazing thing: this is an AI supercomputer. It runs the entire Nvidia AI stack; all of Nvidia's software runs on this. DGX Cloud runs on this. It sits, well, somewhere, and it's wireless or connected to your computer; it's even a workstation if you'd like it to be, and you can reach it like a cloud supercomputer. It's based on a super secret chip we've been working on called GB10, the smallest Grace Blackwell that we make, and this is the chip that's inside. It is in production. This top-secret chip we did in collaboration with MediaTek, the world's leading SoC company: the Grace CPU is built for Nvidia in collaboration with MediaTek, and they worked with us to build this CPU SoC and connect it chip-to-chip with NVLink to the Blackwell GPU. This little thing here is in full production.
    We're expecting this computer to be available around the May time frame.
  • WWW.CNET.COM
    The Wait Is Over: Nvidia's Next-Gen RTX 50-Series GPUs Are Here
    Blackwell has arrived. At CES on Monday night, Nvidia unveiled its long-awaited next generation of GeForce RTX graphics cards, based on its latest Blackwell microarchitecture, and the race will soon be on as AI outfits training large language models and PC gamers alike rush to gobble up the new cards as soon as they are released.

    Nvidia CEO Jensen Huang took the stage in Las Vegas and announced four new GeForce RTX desktop GPUs: the flagship RTX 5090 along with the RTX 5080, 5070 Ti and 5070.

    The GeForce RTX 5090 is twice as fast as the previous RTX 4090, according to Huang, thanks to the new Blackwell architecture and DLSS 4. It features 92 billion transistors capable of more than 3,352 trillion AI operations per second. DLSS 4 introduces Multi Frame Generation, which improves frame rates by using AI to generate up to three frames per rendered frame. DLSS 4 will be supported in over 75 games and applications on launch day.

    And launch day is coming soon. The RTX 5090 and 5080 will start shipping on Jan. 30, and the 5070 Ti and 5070 will follow sometime in February. Here's the pricing:
    - RTX 5090: $1,999
    - RTX 5080: $999
    - RTX 5070 Ti: $749
    - RTX 5070: $549

    In addition to Nvidia's Founders Edition cards, versions using the GPUs will be available from Nvidia's usual partners, including Asus, Gigabyte, MSI, PNY and Zotac. Laptops with mobile versions of the RTX 5090, 5080, 5070 Ti and 5070 are slated to start shipping in March.

    In this age of AI, Huang focused on the AI TOPS figures of the new GPUs rather than CUDA cores. Here are the AI TOPS counts of the four new desktop GPUs:
    - RTX 5090: 3,352 TOPS
    - RTX 5080: 1,801 TOPS
    - RTX 5070 Ti: 1,406 TOPS
    - RTX 5070: 988 TOPS

    And AI is in the new graphics cards themselves. In addition to DLSS 4, Nvidia's new RTX Neural Shaders use AI to compress textures for greater memory efficiency, and RTX Neural Faces uses AI to improve the quality of characters' faces in games.
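    The Multi Frame Generation arithmetic above is easy to sketch: if each traditionally rendered frame is followed by up to three AI-generated frames, the displayed frame rate can be up to four times the rendered rate. The function below is a back-of-the-envelope illustration, not an Nvidia API; the name and numbers are hypothetical.

    ```python
    def effective_fps(rendered_fps: float, generated_per_rendered: int) -> float:
        """Displayed frames per second when each rendered frame is followed
        by `generated_per_rendered` AI-generated frames (DLSS 4 allows up to 3)."""
        if not 0 <= generated_per_rendered <= 3:
            raise ValueError("DLSS 4 Multi Frame Generation caps at 3 frames per rendered frame")
        # One rendered frame plus N generated frames reach the display per render.
        return rendered_fps * (1 + generated_per_rendered)

    # A game rendering 30 fps natively could display up to 120 fps with 3x generation.
    print(effective_fps(30, 3))
    ```

    Note this multiplies displayed frames only; input latency still tracks the underlying rendered frame rate, which is why frame generation is usually paired with upscaling.
    
    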