• NVIDIA Unveils Mega Omniverse Blueprint for Building Industrial Robot Fleet Digital Twins
    blogs.nvidia.com
    According to Gartner, worldwide end-user spending on all IT products for 2024 was $5 trillion. This industry is built on a computing fabric of electrons; it is fully software-defined, accelerated and now generative AI-enabled. While huge, it's a fraction of the larger physical industrial market that relies on the movement of atoms.

    Today's 10 million factories, nearly 200,000 warehouses and 40 million miles of highways form the computing fabric of our physical world. But that vast network of production facilities and distribution centers is still laboriously and manually designed, operated and optimized. In warehousing and distribution, operators face highly complex decision optimization problems: matrices of variables and interdependencies across human workers, robotic and agentic systems, and equipment. Unlike the IT industry, the physical industrial market is still waiting for its own software-defined moment.

    That moment is coming.

    Choreographed integration of human workers, robotic and agentic systems and equipment in a facility digital twin. Image courtesy of Accenture, KION Group.

    NVIDIA today at CES announced Mega, an Omniverse Blueprint for developing, testing and optimizing physical AI and robot fleets at scale in a digital twin before deployment into real-world facilities.

    Advanced warehouses and factories use fleets of hundreds of autonomous mobile robots, robotic arm manipulators and humanoids working alongside people. As implementations of sensor and robot autonomy grow increasingly complex, they require coordinated training in simulation to optimize operations, help ensure safety and avoid disruptions.

    Mega offers enterprises a reference architecture of NVIDIA accelerated computing, AI, NVIDIA Isaac and NVIDIA Omniverse technologies to develop and test digital twins for the AI-powered robot brains that drive robots, video analytics AI agents, equipment and more, handling enormous complexity and scale. The new framework brings software-defined capabilities to physical facilities, enabling continuous development, testing, optimization and deployment.

    Developing AI Brains With World Simulator for Autonomous Orchestration

    With Mega-driven digital twins, including a world simulator that coordinates all robot activities and sensor data, enterprises can continuously update facility robot brains with intelligent routes and tasks for operational efficiencies.

    The blueprint uses Omniverse Cloud Sensor RTX APIs that enable robotics developers to render sensor data from any type of intelligent machine in the factory, simultaneously, for high-fidelity, large-scale sensor simulation. This allows robots to be tested in a virtually unlimited number of scenarios within the digital twin, using synthetic data in a software-in-the-loop pipeline with NVIDIA Isaac ROS.

    Operational efficiency is gained with sensor simulation. Image courtesy of Accenture, KION Group.

    Supply chain solutions company KION Group is collaborating with Accenture and NVIDIA as the first to adopt Mega for optimizing operations in retail, consumer packaged goods, parcel services and more.

    Jensen Huang, founder and CEO of NVIDIA, offered a glimpse into the future of this collaboration on stage at CES, demonstrating how enterprises can navigate a complex web of decisions using the Mega Omniverse Blueprint.

    "At KION, we leverage AI-driven solutions as an integral part of our strategy to optimize our customers' supply chains and increase their productivity," said Rob Smith, CEO of KION GROUP AG. "With NVIDIA's AI leadership and Accenture's expertise in digital technologies, we are reinventing warehouse automation. Bringing these strong partners together, we are creating a vision for future warehouses that are part of a smart, agile system, evolve with the world around them and can handle nearly any supply chain challenge."

    Creating Operational Efficiencies With Mega Omniverse Blueprint

    To create operational efficiencies, KION and Accenture are embracing the Mega Omniverse Blueprint to build next-generation supply chains for KION and its customers. KION can capture and digitalize a warehouse digital twin in Omniverse by using computer-aided design files, video, lidar, image and AI-generated data.

    KION uses the Omniverse digital twin as a virtual training and testing environment for its industrial AI's robot brains, powered by NVIDIA Isaac, tapping into smart cameras, forklifts, robotic equipment and digital humans. Integrating with the Omniverse digital twin, KION's warehouse management software can create and assign missions for robot brains, like moving a load from one place to another.

    Graphical data is easily introduced into the Omniverse viewport, showcasing productivity and throughput among other desired metrics. Image courtesy of Accenture, KION Group.

    These simulated robots can carry out tasks by perceiving and reasoning about their environments; they're capable of planning their next motions and then taking actions that are simulated in the digital twin. The robot brains perceive the results and decide the next action, and this cycle continues, with Mega precisely tracking the state and position of all the assets in the digital twin.

    Delivering Services With Mega for Facilities Everywhere

    Accenture, a global leader in professional services, is adopting Mega as part of its AI Refinery for Simulation and Robotics, built on NVIDIA AI and Omniverse, to help organizations use AI simulation to reinvent factory and warehouse design and ongoing operations.

    With the blueprint, Accenture is delivering new services, including Custom Robotics and Manufacturing Foundation Model Training and Finetuning; Intelligent Humanoid Robotics; and AI-Powered Industrial Manufacturing and Logistics Simulation and Optimization, to expand the power of physical AI and simulation to the world's factories and warehouse operators. Now, for example, an organization can explore numerous options for its warehouse before choosing and implementing the best one.

    "As organizations enter the age of industrial AI, we are helping them use AI-powered simulation and autonomous robots to reinvent the process of designing new facilities and optimizing existing operations," said Julie Sweet, chair and CEO of Accenture. "Our collaboration with NVIDIA and KION will help our clients plan their operations in digital twins, where they can run hundreds of options and quickly select the best for current or changing market conditions, such as seasonal market demand or workforce availability. This represents a new frontier of value for our clients to achieve using technology, data and AI."

    Join NVIDIA at CES. See notice regarding software product information.
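The perceive-decide-act cycle described above (a world simulator owns the ground-truth state while a robot brain observes it, plans a step and acts) can be sketched at toy scale. Everything below is illustrative, a grid-world stand-in rather than NVIDIA's Mega, Isaac or Omniverse APIs:

```python
from dataclasses import dataclass


@dataclass
class WorldSim:
    """Toy stand-in for a digital-twin world simulator: it owns the
    ground-truth state and advances it when the robot acts."""
    robot_pos: tuple = (0, 0)

    def sense(self):
        # "Render sensor data" -- here simply the true position.
        return self.robot_pos

    def apply(self, action):
        dx, dy = action
        x, y = self.robot_pos
        self.robot_pos = (x + dx, y + dy)


class RobotBrain:
    """Toy robot brain: given an observation, plan a one-step move
    toward an assigned goal (a 'mission' in the article's terms)."""
    def __init__(self, goal):
        self.goal = goal

    def decide(self, obs):
        gx, gy = self.goal
        x, y = obs
        step = lambda a, b: (b > a) - (b < a)  # -1, 0 or +1 per axis
        return (step(x, gx), step(y, gy))


def run_mission(goal, max_ticks=100):
    sim, brain = WorldSim(), RobotBrain(goal)
    for tick in range(max_ticks):
        obs = sim.sense()           # perceive
        if obs == goal:
            return tick             # mission complete
        action = brain.decide(obs)  # reason and plan
        sim.apply(action)           # act; the sim tracks resulting state
    raise TimeoutError("mission did not complete")


print(run_mission((3, 2)))  # reaches (3, 2) after 3 one-step moves
```

In the article's terms, `WorldSim` plays the digital twin's role of precisely tracking asset state each cycle, while `RobotBrain` stands in for the robot stack being exercised in simulation before any real-world deployment.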
  • Building Smarter Autonomous Machines: NVIDIA Announces Early Access for Omniverse Sensor RTX
    blogs.nvidia.com
    Generative AI and foundation models let autonomous machines generalize beyond the operational design domains on which they've been trained. Using new AI techniques such as tokenization and large language and diffusion models, developers and researchers can now address longstanding hurdles to autonomy.

    These larger models require massive amounts of diverse data for training, fine-tuning and validation. But collecting such data, including from rare edge cases and potentially hazardous scenarios, like a pedestrian crossing in front of an autonomous vehicle (AV) at night or a human entering a welding robot work cell, can be incredibly difficult and resource-intensive.

    To help developers fill this gap, NVIDIA Omniverse Cloud Sensor RTX APIs enable physically accurate sensor simulation for generating datasets at scale. The application programming interfaces (APIs) are designed to support sensors commonly used for autonomy, including cameras, radar and lidar, and can integrate seamlessly into existing workflows to accelerate the development of autonomous vehicles and robots of every kind.

    Omniverse Sensor RTX APIs are now available to select developers in early access. Organizations such as Accenture, Foretellix, MITRE and Mcity are integrating these APIs via domain-specific blueprints to provide end customers with the tools they need to deploy the next generation of industrial manufacturing robots and self-driving cars.

    Powering Industrial AI With Omniverse Blueprints

    In complex environments like factories and warehouses, robots must be orchestrated to safely and efficiently work alongside machinery and human workers. All those moving parts present a massive challenge when designing, testing or validating operations while avoiding disruptions.

    Mega is an Omniverse Blueprint that offers enterprises a reference architecture of NVIDIA accelerated computing, AI, NVIDIA Isaac and NVIDIA Omniverse technologies. Enterprises can use it to develop digital twins and test AI-powered robot brains that drive robots, cameras, equipment and more to handle enormous complexity and scale.

    Integrating Omniverse Sensor RTX, the blueprint lets robotics developers simultaneously render sensor data from any type of intelligent machine in a factory for high-fidelity, large-scale sensor simulation. With the ability to test operations and workflows in simulation, manufacturers can save considerable time and investment, and improve efficiency in entirely new ways.

    International supply chain solutions company KION Group and Accenture are using the Mega blueprint to build Omniverse digital twins that serve as virtual training and testing environments for industrial AI's robot brains, tapping into data from smart cameras, forklifts, robotic equipment and digital humans.

    The robot brains perceive the simulated environment with physically accurate sensor data rendered by the Omniverse Sensor RTX APIs. They use this data to plan and act, with each action precisely tracked by Mega, alongside the state and position of all the assets in the digital twin. With these capabilities, developers can continuously build and test new layouts before they're implemented in the physical world.

    Driving AV Development and Validation

    Autonomous vehicles have been under development for over a decade, but barriers in acquiring the right training and validation data, along with slow iteration cycles, have hindered large-scale deployment.

    To address this need for sensor data, companies are harnessing the NVIDIA Omniverse Blueprint for AV simulation, a reference workflow that enables physically accurate sensor simulation. The workflow uses Omniverse Sensor RTX APIs to render the camera, radar and lidar data necessary for AV development and validation.

    AV toolchain provider Foretellix has integrated the blueprint into its Foretify AV development toolchain to transform object-level simulation into physically accurate sensor simulation.

    The Foretify toolchain can generate any number of testing scenarios simultaneously. By adding sensor simulation capabilities to these scenarios, Foretify can now enable developers to evaluate the completeness of their AV development, as well as train and test at the levels of fidelity and scale needed to achieve large-scale, safe deployment. In addition, Foretellix will use the newly announced NVIDIA Cosmos platform to generate an even greater diversity of scenarios for verification and validation.

    Nuro, an autonomous driving technology provider with one of the largest level 4 deployments in the U.S., is using the Foretify toolchain to train, test and validate its self-driving vehicles before deployment.

    In addition, research organization MITRE is collaborating with the University of Michigan's Mcity testing facility to build a digital AV validation framework for regulatory use, including a digital twin of Mcity's 32-acre proving ground for autonomous vehicles. The project uses the AV simulation blueprint to render physically accurate sensor data at scale in the virtual environment, boosting training effectiveness.

    The future of robotics and autonomy is coming into sharp focus, thanks to the power of high-fidelity sensor simulation. Learn more about these solutions at CES by visiting Accenture in Ballroom F at the Venetian and Foretellix at booth 4016 in the West Hall of the Las Vegas Convention Center.

    Learn more about the latest in automotive and generative AI technologies by joining NVIDIA at CES. See notice regarding software product information.
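The data-gap argument above (rare, hazardous scenarios are cheap to synthesize but expensive and risky to capture) is the core of software-in-the-loop testing. A deliberately tiny sketch of that pattern follows, with a fake range sensor and a naive braking rule standing in for rendered sensor data and a real AV stack; none of the names or numbers reflect actual Omniverse Sensor RTX or Foretify interfaces:

```python
import random


def synth_scan(pedestrian_range=None, n=360, seed=0):
    """Generate a synthetic 360-beam range scan (metres). Open space
    reads ~30 m; optionally place a 'pedestrian' dead ahead -- the kind
    of rare, hazardous case that is cheap to synthesize but hard to
    capture on a real road."""
    rng = random.Random(seed)
    scan = [30.0 + rng.uniform(-0.5, 0.5) for _ in range(n)]
    if pedestrian_range is not None:
        for beam in range(-5, 6):  # a few beams around straight ahead
            scan[beam % n] = pedestrian_range
    return scan


def should_brake(scan, stop_dist=5.0, fov=10):
    """Naive planner under test: brake if anything in the forward
    field of view is closer than the stopping distance."""
    forward = scan[:fov] + scan[-fov:]
    return min(forward) < stop_dist


# Software-in-the-loop: sweep the scenario parameter instead of
# collecting real drives, and check the stack's response on each.
for rng_m in (2.0, 4.9, 25.0, None):
    print(rng_m, should_brake(synth_scan(pedestrian_range=rng_m)))
    # 2.0 True / 4.9 True / 25.0 False / None False
```

The point is the sweep at the bottom: rather than waiting to encounter a night-time pedestrian in the real world, the scenario parameter is varied freely and the behavior under test is validated on every variant.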
  • Now See This: NVIDIA Launches Blueprint for AI Agents That Can Analyze Video
    blogs.nvidia.com
    The next big moment in AI is in sight, literally.

    Today, more than 1.5 billion enterprise-level cameras deployed worldwide are generating roughly 7 trillion hours of video per year. Yet only a fraction of it gets analyzed. It's estimated that less than 1% of video from industrial cameras is watched live by humans, meaning critical operational incidents can go largely unnoticed.

    This comes at a high cost. For example, manufacturers are losing trillions of dollars annually to poor product quality or defects that they could've spotted earlier, or even predicted, by using AI agents that can perceive, analyze and help humans take action.

    Interactive AI agents with built-in visual perception capabilities can serve as always-on video analysts, helping factories run more efficiently, bolstering worker safety, keeping traffic running smoothly and even upping an athlete's game.

    To accelerate the creation of such agents, NVIDIA today announced early access to a new version of the NVIDIA AI Blueprint for video search and summarization. Built on top of the NVIDIA Metropolis platform, and now supercharged by NVIDIA Cosmos Nemotron vision language models (VLMs), NVIDIA Llama Nemotron large language models (LLMs) and NVIDIA NeMo Retriever, the blueprint provides developers with the tools to build and deploy AI agents that can analyze large quantities of video and image content.

    The blueprint integrates the NVIDIA AI Enterprise software platform, which includes NVIDIA NIM microservices for VLMs, LLMs and advanced AI frameworks for retrieval-augmented generation, to enable batch video processing that's 30x faster than watching it in real time.

    The blueprint contains several agentic AI features, such as chain-of-thought reasoning, task planning and tool calling, that can help developers streamline the creation of powerful and diverse visual agents to solve a range of problems. AI agents with video analysis abilities can be combined with other agents with different skill sets to enable even more sophisticated agentic AI services. Enterprises have the flexibility to build and deploy their AI agents from the edge to the cloud.

    How Video Analyst AI Agents Can Help Industrial Businesses

    AI agents with visual perception and analysis skills can be fine-tuned to help businesses with industrial operations by:

    - Increasing productivity and reducing waste: Agents can help ensure standard operating procedures are followed during complex industrial processes like product assembly. They can also be fine-tuned to carefully watch and understand nuanced actions, and the sequence in which they're implemented.
    - Boosting asset management efficiency through better space utilization: Agents can help optimize inventory storage in warehouses by performing 3D volume estimation and centralizing understanding across various camera streams.
    - Improving safety through auto-generation of incident reports and summaries: Agents can process huge volumes of video and summarize it into contextually informative reports of accidents. They can also help ensure personal protective equipment compliance in factories, improving worker safety in industrial settings.
    - Preventing accidents and production problems: AI agents can identify atypical activity to quickly mitigate operational and safety risks, whether in a warehouse, factory or airport, or at a traffic intersection or other municipal setting.
    - Learning from the past: Agents can search through operations video archives, find relevant information from the past and use it to solve problems or create new processes.

    Video Analysts for Sports, Entertainment and More

    Another industry where video analysis AI agents stand to make a mark is sports, a $500 billion market worldwide with hundreds of billions in projected growth over the next several years.

    Coaches, teams and leagues, whether professional or amateur, rely on video analytics to evaluate and enhance player performance, prioritize safety and boost fan engagement through player analytics platforms and data visualization. With visually perceptive AI agents, athletes now have unprecedented access to deeper insights and opportunities for improvement.

    During his CES opening keynote, NVIDIA founder and CEO Jensen Huang demonstrated an AI video analytics agent that assessed the fastball pitching skills of an amateur baseball player compared with a professional's. Using video captured from the ceremonial first pitch that Huang threw for the San Francisco Giants baseball team, the video analytics AI agent was able to suggest areas for improvement.

    https://blogs.nvidia.com/wp-content/uploads/2025/01/JHH-pitch-metropolis-trim-final.mp4

    The $3 trillion media and entertainment industry is also poised to benefit from video analyst AI agents. Through the NVIDIA Media2 initiative, these agents will help drive the creation of smarter, more tailored and more impactful content that can adapt to individual viewer preferences.

    Worldwide Adoption and Availability

    Partners from around the world are integrating the blueprint for building AI agents for video analysis into their own developer workflows, including Accenture, Centific, Deloitte, EY, Infosys, Linker Vision, Pegatron, TATA Consultancy Services (TCS), Telit Cinterion and VAST.

    Apply for early access to the NVIDIA Blueprint for video search and summarization. See notice regarding software product information.

    Editor's note: Omdia is the source for 1.5 billion enterprise-level cameras deployed.
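The scale figures quoted in the article can be sanity-checked with quick arithmetic (this assumes the 7 trillion hours describes total footage across the 1.5 billion cameras, which the article implies but does not state outright):

```python
cameras = 1.5e9       # enterprise-level cameras worldwide (Omdia, per the article)
footage_hours = 7e12  # hours of video generated per year
speedup = 30          # claimed batch-processing speedup vs. real time

hours_per_camera = footage_hours / cameras  # ~4,667 h of footage per camera per year
duty_cycle = hours_per_camera / (365 * 24)  # fraction of the year each camera records
realtime_equiv = footage_hours / speedup    # "watch-hours" of compute to process it all at 30x

print(round(hours_per_camera), round(duty_cycle, 2), f"{realtime_equiv:.1e}")
# 4667 0.53 2.3e+11
```

Roughly half of every camera's year ends up recorded, which makes the "less than 1% watched live" claim plausible, and even a 30x batch speedup still leaves an enormous volume of real-time-equivalent processing.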
  • CES 2025: HP Will Use AI to Handle Your Game Settings For You
    lifehacker.com
    Whenever I log into a game, the first thing I do is hop into its settings and toy around with my options to try to strike a good balance between performance and graphics. It's not exactly the most exciting first impression for a new title, but it can make my actual time spent playing a lot more seamless. Now, HP wants to automate that first hurdle away, so I can jump right into gameplay instead.

    Omen AI beta optimizes your settings for you

    As part of its CES 2025 announcements, the company announced its "Omen AI Beta," coming to the HP Omen Gaming Hub this week. According to HP, this AI will use machine learning to provide customized adjustments to operating system settings, hardware settings, and in-game settings in just one click. That's a lot to balance all at once, but the idea is that HP will scan your system to find out how to get max performance in your games, taking your individual setup into account, then make the needed changes for you.

    Like other AI, it'll need training data to pull that off. According to a spokesperson, Omen AI will use hardware specs, game configurations, and performance metrics from "millions of gaming systems" to come up with its optimizations, which it'll then apply across your game and certain compatible parts of your wider system. In order to avoid conflicts with other companies' products, its operating system changes will be limited to adjusting the "booster" settings already present in the company's gaming software, while hardware setting changes will initially only work with Omen PCs. In other words, your optimizations might be a little less effective if you don't have the right computer, but you also won't need to worry about the program breaking hardware it's not familiar with. (If you do encounter issues, there's an undo button to change back to your settings from before you enabled the AI.)

    Credit: HP

    Even with those limitations, the feature is only set to work with Counter-Strike 2 at launch, so expect a measured rollout. Still, promotional material does show the tool working in Valorant as well, so hopefully HP won't abandon it after just a proof of concept.

    That's really what'll make or break something like this. If it only works on a few titles, it'll be great for headlines, but gamers will quickly forget about it. Integrations with other PC makers and companies like Nvidia are also possible down the line, I'm told, which could help widen the tool's user base. A wider user base also means more data to help train on, which does raise the issue of privacy. According to HP, Omen AI doesn't use any personally identifiable information to train its AI models, and users can manage their data collection consent options from within the Omen Gaming Hub software. Still, even with the promise that data is anonymous and aggregated, it's worth double-checking your privacy settings if you have HP's Omen Gaming Hub installed, even if you don't plan on using AI.

    Credit: HP

    It's interesting seeing something like this come first from a computer manufacturer rather than Microsoft or Nvidia, although the tool will be open to anyone with Omen Gaming Hub installed, whether or not they have an HP PC. Both in and out of game, it looks like you'll be able to use Omen AI to access a quick toggle that will tell you your current fps, as well as what fps you can expect to get upon enabling the tool. Even if these are just mock-ups for now, they do show a commitment to ease of use. That said, maybe a bit of fine control would be helpful here: what if I'm OK with the program adjusting my in-game settings but would rather have my operating system and hardware left alone?

    How well Omen AI will work in reality depends on how the beta plays out and continued support. I'm rooting for it: it's also a good example of the type of AI I actually like, since it's more about eliminating tedium than replacing human creativity.

    New mice from HP

    HP also announced the HyperX Pulsefire Saga and Saga Pro mice at this year's CES, which both look pretty standard on the surface, with 8K polling, six programmable buttons, and support for up to 26,000 DPI. There are some premium features, like magnetic weights, but what really sets them apart is their 3D-printed cases. You'll get eight case parts in the box, which is enough for a full mouse, but you're also free to swap out and customize options at your leisure by downloading new open-source case parts from HP's Printables account.

    The HyperX Pulsefire Saga and Saga Pro mice will both be available in March. The former costs $80, while the latter hits $120, thanks to its added wireless functionality.
  • CES 2025: Sony Just Revealed When 'The Last of Us' Season Two Is Dropping
    lifehacker.com
    Sony may have won CES as far as I'm concerned, and the expo hasn't even officially started.

    The company made a number of announcements during its big CES media event, including pricing and availability information for its highly anticipated AFEELA car (a joint partnership with Honda). But it was the last announcement of the night that particularly caught my attention and excitement.

    Sony surprised us all by welcoming Neil Druckmann, studio head of Naughty Dog, to the stage. Druckmann mentioned the studio's upcoming space adventure game, Intergalactic, but then revealed the company had one small announcement to make.

    Credit: Jake Peterson

    The Last of Us season two drops in April

    As you might expect after a CES introduction from Neil Druckmann, Sony presented a new teaser for The Last of Us season two. Of course, there's fresh footage we didn't see in the season's first trailer, as well as intriguing snippets of performances from Pedro Pascal, Bella Ramsey, Kaitlyn Dever, and Gabriel Luna. I don't want to spoil the story for anyone who hasn't played The Last of Us Part II, but if you're a fan of season one, you're in for a ride.

    All that said, the teaser closed with the most important announcement of all of CES: Season two kicks off in April. Sony hasn't given an exact date for this April release, but it's more information than we've had before, and I'm here for it. I'll be watching this teaser on repeat until the show airs.

    Sony is turning more of its games into shows and movies

    In addition to The Last of Us' second season, Sony also announced a Ghost of Tsushima anime, a movie adaptation of Helldivers 2, and a Horizon Zero Dawn feature film; however, the latter is only in the early stages of development.

    Sony is clearly all-in on adapting its IP, and I don't blame them. The company touted the box office success of its Uncharted movie, and noted that it was proud of other adaptations like Gran Turismo, Twisted Metal, and, of course, The Last of Us.
The company also reminded the audience of its upcoming film adaptation of Until Dawn, showing off a spot filmed with actor Peter Stormare to promote it.
  • CES 2025: Eufy's New 'Three-in-One' Robot Vacuum Left Me Scratching My Head
    lifehacker.com
    Today at CES, Eufy announced a new robot vacuum with a twist: it's also a stick and a hand vacuum. It's not only the first three-in-one robot, it's also the winner of a CES Best of Innovation award. Having said that, I've had my hands on the E20 for the last week or so, and I have found it to be a bit confounding and underwhelming.

    Credit: Amanda Blum

    Eufy has made some of my favorite security cameras in the last few years; however, when it comes to vacuums, I'm often left unimpressed. Much was made of the Omni S1 last year when I tested it, and while the design was interesting and it performed well enough, I felt it was overpriced. The most complimentary I've ever been to Eufy was toward the X10 Pro Omni, which I said was pretty good for the price point.

    I was excited about the E20 from the jump. It came in a delightfully small, light box, which may not seem exciting, but most robot vacuums arrive in hulking, heavy boxes. Once I unpacked the box, I kept digging back through it looking for the missing lid to the robot. I soon realized there wasn't one. The robot has an exposed face, showing the handled, removable part of the stick vacuum. To me, it looks unfinished. When you remove the handle from the robot, it is unbalanced enough that the robot struggles to remain docked.

    I also found it odd that there is no place to dock the stick and attachments when not in use, which feels like a real missed opportunity.

    Credit: Amanda Blum

    Once I got it up and running, the E20 did a good job of mapping the space. However, it struggled to navigate it. Something about the weight of the handle makes the robot awkwardly balanced, and so it got stuck a number of times.

    As a stick vacuum, the E20 worked fine. It is a little short, and since I've spent a lot of the last year testing stick vacuums with computer displays from Dyson, Samsung, and Narwal, it was almost quaint to have such a simplistic stick in my hand. A drawback I noticed immediately was the E20's tiny capacity. The tower is tiny, too. I had to return to the dock multiple times while vacuuming the main floor of my very small house, and each time you have to take the vacuum apart, dock the head of it, wait for it to empty, and then reattach it.

    The handheld is simply the handle and any attachment, without the stick. It performs about the same. There is a multitool, which is my favorite attachment, but I'd rather have a handheld nearby that I could quickly grab and put back than deal with the E20.

    More curious: at a time when every robot vacuum practically bursts with pride boasting about astronomical suction power (we saw it as high as 22,000Pa this week), the E20 only has 8,000Pa. What's odd is that the stick vacuum has 30,000Pa. So, the robot and the stick don't share a motor, which seems to mean the robot is merely a place to store the handle. It makes very little sense to me.

    What the E20 has going for it is price. It's only $540, and it will be available for presale starting Jan. 6 for $50 off retail. Still, since it is merely a vacuum and not a robot vacuum/mop combo, I think there are better options at this price point, like the SwitchBot K10+.
  • Engadget Podcast: We've survived two days of CES 2025
    www.engadget.com
    In this bonus episode, Cherlynn and Devindra discuss the latest innovations in robot vacuums, new AI PC hardware from AMD and Intel, and Dell's decision to nuke its PC brands in favor of Apple-esque "Dell Pro" and "Dell Pro Max" branding. (Note: We recorded this episode before NVIDIA announced its new RTX 5000 GPUs, but we'll have more to say on that soon!)Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!Subscribe!iTunesSpotifyPocket CastsStitcherGoogle PodcastsCreditsHosts: Devindra Hardawar and Cherlynn LowMusic: Dale NorthTranscriptDevindra: [00:00:00] What's up everyone, this is Devindra Hardwar, Senior Editor at Engadget.Cherlynn: I'm Deputy Editor Cherlynn Low.Devindra: We are here what is this, the beginning of night one of CES officially?Cherlynn: I guess, yeah. IDevindra: guess we have already suffered through basically day minus one. MinusCherlynn: one and today's zero.Devindra: One thing I want our listeners to understand is that we have already seen a lot of things we kind of know where the CES is headed. And, I think this is a cursed show Cherlynn. How do you feel about that? For all of us. For ourCherlynn: team. Yeah, I think I mean, Devindra, I'll let you speak to your situation, but we've had team members who have fallen deathly ill.We have also, like, people who have completely had to miss their flights, international flights. It's been quite Engadget team, but we have a really, really good team of people. Everyone's got great attitudes and, like, our spirits are high. Okay. You want to just get the stuff going. So, yeah, no, and Devindra, you have been struggling a little [00:01:00] bit.Devindra: So, yeah, update here is I basically threw my back out the the day before I had to fly. 
So, I kind of was mentally just preparing how to fly without caring much and just being really easy on my back. But, you know, I survived. AndCherlynn: sitting in a plane for as long as you did couldn't help either, right?Probably didn't help.Devindra: Thankfully I did a smart thing and I bought a Comfort Plus upgrade with my points ahead of time. And I was like, I was going to be chill on the flight and it turned out that was just necessary. Yeah, so CS is officially beginning. We have seen we've just went through CS Unveiled yesterday.A lot of embargoes and news came out today too. You know, some of the biggest news we've seen. Dell's rebrand away from its own PC names. To Dell, Dell Pro, Dell Pro Max. There's some new hardware from Intel and AMD. Yay! that they showed off and, you know, just kind of a typical CS stuff. What were the weird things you saw, Cherlynn, at at Unveiled?Because you were there amidst all the weird gadgets.Cherlynn: Yeah, and to be clear, given Devindra's [00:02:00] injury, we are, we are having Devindra stay in place where he is, you know, able to recover a little bit. So, Devindra wasn't at Unveiled with me, so I'm going to tell you about all these funny things we saw at Unveiled.Somehow the most intriguing thing so far is the trend of Putting things in your mouth at CES Unveiled.Speaker 3: Okay.Cherlynn: So, we have like, at least two things that are saliva detecting devices. Uh huh, huh. Or like, you put a drop of saliva or you put your like, a stick in your mouth or something. WeDevindra: are not going to call this the Hawktwa CES, let's not do that.DanCherlynn: Cooper definitely not coming up with a story based around that. But the idea is that using your saliva. Companies can tell how much cortisol or other types like progesterone types of things, hormones are inside your Or in you, right? 
And so it's a bit to help with burnout, a bit to help with, like, stress and health. And then there is the salt spoon that everyone was licking at CES on day one.
Devindra: That doesn't seem like a good [00:03:00] idea at a, at a conference.
Cherlynn: It was, so, yeah, everyone felt like it was, initially it seemed a little icky, but the booth was so crowded I went over, and it turns out they actually had, like, individual disposable versions of this spoon, the salt spoon, per its name. It's a gadget that will mimic or simulate the flavor of umami or salt, made by a company called Kirin.
Devindra: Okay.
Cherlynn: Which I believe makes some kind of condiment.
Devindra: That's the, they're a soy sauce company.
Cherlynn: Exactly. And so it's the idea that, like, people want to live healthier, eat better, and not have such a high-sodium diet. But they still crave this taste. We love it.
Devindra: We love umami. Exactly.
Cherlynn: Why don't we, why don't we use electricity on your tongue?
Devindra: That is some dystopian, I hope the story about this is how it's made. That is very dystopian. That's very like, you know, Soylent Green or something, where we're not really eating food, but we're feeling these sort of, like, electrical impulses of food.
Cherlynn: Triggering your tongue to feel like it's tasting something. Just to feel alive. That's horrible. I know. I, it's, it's, you asked me weird. And I was like, yeah, that [00:04:00] is pretty horrifying. But I'm very intrigued. I almost, so I was kind of waiting in line, but it was so crowded always. And I had so much other stuff to check out that I didn't really get around to it. There were other things, I think, that turned up at Unveiled that were very interesting. Our team saw a stringless guitar. There were, like, about a zillion robots that all kind of look very weird. And then lots of mirrors that you can, like, stand in front of and scan yourself.
And finally, I think, the Stern pinball machine of the year is themed Dungeons & Dragons.
Devindra: Okay. Really, just really hitting the nerd market perfectly. We did see Roborock's flagship new robot vacuum, and that thing looks cool because they just added an arm to it. Like, it has an extendable arm that can pick up socks and small things from the floor. And I am really interested in seeing the race between Roomba and all these other companies. I think Roomba was one of the first to do, like, okay, self-emptying: we're going to dump your vacuum into this bigger container so the vacuum [00:05:00] can keep going. Now everybody's doing that. Then Roomba and other people started doing, like, combo mops. And now it's just like, we're getting appendages. We're getting, I think one can climb stairs. I saw news about that.
Cherlynn: So Karissa is on the robot vacuum beat for us, I guess. And she got a chance to check out the, yeah, the Roborock, I can't remember the actual name, how it's pronounced, Safi or Safu Z70, and we have a video on the article on our website as well as on our Twitter.
Devindra: It's the Saros Z70, yeah.
Cherlynn: So close, that was so close. And yeah, that video shows the robot's arm kind of coming out of its round, disc-like body and then picking up a sock that was in front of it. And not only that, I thought it would just pick it up and then, like, wipe and then move away and put it back down. No, it took it to a basket nearby, like a laundry basket almost, and placed it in there.
Devindra: So basically we're almost there. We're almost there to real robot helpers.
Cherlynn: So close. This thing is very close. Who knew Roborock of all companies would do it. I was like,
Devindra: It's a, it's really interesting to watch, because Roborock, I think, yeah, it's a [00:06:00] Chinese company, and these folks, like, especially when they're doing robotic stuff, like, they're just barreling forward because they can invest more in R&D and stuff. I want, I've said this before.
I want something that can, like, unload my dishwasher,
Cherlynn: Which is the
Devindra: process that I think, like
Cherlynn: My dishwasher-unloading therapy.
Devindra: You know, load it up, clean the kitchen, just like a real Rosie the Robot situation. That's what I need.
Cherlynn: I mean, the other robot vacuum that you're talking about, that can climb upstairs, I believe is the Dreame. And yeah, it's interesting to see, or I was like, why, why would we need a robot vacuum that can climb upstairs? But I guess there are actual functional uses for the X50 robot. I don't know, man. $1,700 just for it to, like, climb.
Devindra: I think this is a bad idea. This is a bad idea, because generally, you want your robot to be on one floor.
Cherlynn: Yeah!
Devindra: Once you have stairs in the equation, then, like, it could fall. There could be all sorts of issues. To me, that's not super useful. I've been Roomba-ing for a while, and only recently with two floors. You pick it up, and you move it to another floor when you need to. If you're super [00:07:00] bougie, you have more than one Roomba. You have a Roomba dock per floor. Or you have
Cherlynn: The cheap one on the floor that doesn't matter as much as the expensive one in the place that matters.
Devindra: And there are cheap Roombas. You could get a refurb Roomba for like 200, 300 bucks.
Cherlynn: Yeah.
Devindra: Relatively, that's relatively cheap compared to how much they used to cost.
Cherlynn: Well, this one, I mean, I guess the shtick with this Dreame robot is that it climbs up like a human. So it's not, like, sort of propelling itself up in some strange way, going up on an incline, getting its rollers, it's like, God, some kind of, like, climbing mechanic. That's like, bipedal? Is it bipedal? Is it like
Devindra: I have to take a closer look.
Cherlynn: The video looks, like, wild, and I'm in such a CES fever dream that, like, I have forgotten what it looks like.
Devindra: So much stuff.
Another thing I want to talk about: the TVs. It seems like wireless TVs are more of a thing right now. Like, LG and Samsung are fully doing it. All their flagships have wireless boxes. They say the lag is pretty good for gaming. I would have to see that to see how much it works. But I do think that's a good pain point for a lot of [00:08:00] people. People hate wires moving behind their TVs. They hate, like, if you're mounting a TV, you have to, like, figure out where all the wires are going to go. So there's that company Displace, which last year had the suction TV with the battery, which I think I called vaporware last year. I don't think they actually shipped any. This year they're back. They have a soundbar. They say they're going to actually ship stuff. I don't believe it. But they're back. They're here.
Cherlynn: Is one year enough time to see if the TV that will stick itself to your wall has fallen off yet? You know what I mean? Like, is it time to call it safe if it hasn't fallen off in a year? Or should we give it another year? I don't know.
Devindra: It's my whole thing about trusting gadgets and trusting devices. I will not trust a multi-thousand-dollar device that is just hanging on my wall by suction cup.
Cherlynn: It's, like, one thing if it falls off and hurts itself, it's another if it, like, takes down my wall with it, right? Like, and my bed and my glass table or something. So yeah, there's a lot of stuff to be concerned about, I think.
Devindra: Boy, AI PCs are still a running theme this year. AMD was really big on a whole bunch of [00:09:00] new chips. They announced the Ryzen AI Max chip, which they say is going to be in halo products, halo Copilot+ PCs. It's supposed to be really powerful. It has more graphics cores than their other ones. They also say it does better rendering, like 3D rendering, better than Intel's chips, because AMD's graphics tend to be better. They have, like, built-in Radeon cores.
So, you know, they're kind of killing it. Intel was just like, hey, we have AI chips too. They're coming to gaming desktops. They're coming to other things.
Speaker 3: Yeah.
Devindra: They have Core AI laptop chips that will be coming to gaming laptops as well. So that's a thing. You know, the good thing about CES is that you can see people and talk to people. So I had a good chat with Pavan Davuluri, who is, like, the head of Windows and Surface devices at Microsoft. That was an off-the-record chat, but I can say it was good to have that conversation, to see what they're thinking about AI PCs. Hopefully we'll have him on for another episode of the Engadget Podcast, [00:10:00] but I guess, like, CES is happening. Like, when news is happening, these companies are taking it seriously. We're talking to high-level people. So it feels like a CES of yore, I'd say, despite being so cursed early on.
Speaker 3: I guess, yeah.
Devindra: Like, stuff, it feels legitimate and real in a way that it hasn't for the past couple of years.
Cherlynn: But I mean, for you, maybe, part of it is the return to the physical.
Devindra: Yes.
Cherlynn: Right, because it's been a while. And I think that my general sense is that interest in CES might have waned. I think this year, too, you know, we've had different observations about shows from the recent years, and this year feels even more like it is something you could, it's like commoditizing things for the sake of commoditizing things a little bit, and more than ever, actually. And it's very much like the Radio Shack show a little bit. But, you know, I would say, I don't want to give away what we're working on. So I would say, like, we're, come to Engadget.com, come to our social media channels, where we've got a lot of videos going up. We're actually bringing back our YouTube channel for a little bit. And the live blog, we, I am in [00:11:00] live blog hell every day for a little bit, but it is a fun time, because live blogs allow me to be a bit more, I think, personal with our audience, which is fun, like this podcast. But I do want to shout out, like, to your point, like, AMD and Intel both have made their announcements as of the time we're recording this, but we still don't know technically what NVIDIA is going to announce. And NVIDIA has one of the, I want to say the most hyped keynotes or speeches this CES.
Devindra: Mm-hmm.
Cherlynn: What are you thinking that they'll do for CES?
Devindra: I mean, for the keynote, they typically hype up their AI projects or robotics projects. And honestly, things that we don't typically report news on, because it's kind of pie-in-the-sky stuff that will only exist for car manufacturers or something. They don't really touch consumers. We will eventually hear, most likely, about the new GeForce RTX GPUs. Maybe not tonight, but I have a good sense, like, sometime this week NVIDIA will make that announcement. And that is the thing people are really waiting to see. And I think AMD sensed that a bit too. They briefly teased some information about the RDNA 4 [00:12:00] GPUs. Yeah, AMD also teased their RX 9070 GPUs. And that's interesting too, just the name is interesting, because, you know, AMD used to follow a fully different Radeon naming scheme. Now they're kind of aligning with what NVIDIA's doing. So this Radeon RX 9070 will be comparable to whatever NVIDIA announces as a 5070 video card. Okay. So it should make shopping a little easier. So there's that. The RDNA 4 technology is going to have AI upscaling, which is a thing we've knocked AMD on before, because their FidelityFX Super Resolution just couldn't compete with NVIDIA's DLSS, so they're gonna have an answer to that.
But again, just, like, brief teases. The news post I wrote is, like, the bare minimum we can even write, because they didn't have much information. They're just like, yeah, we will have new video cards, we will have new
Cherlynn: Graphics. It's claiming a spot, right? Like, kinda?
Devindra: Yeah. Basically. Whereas I think NVIDIA's gonna come here and show off new hardware, new actual things. So we shall see.
Cherlynn: Yeah.
Devindra: And I want to do maybe one or two [00:13:00] more of these episodes, just, like, recapping where we are.
Cherlynn: Oh, throughout the show?
Devindra: Yeah, yeah, yeah. Yeah, we are using the DJI Mic 2, or Mic Mini, so we can, like, sit down and record anywhere. Yep. We're at our breakfast nook right now.
Cherlynn: In the hotel.
Devindra: Yeah. And maybe you'll hear more ambient noise than normal, but this is a really good way to have conversations. Yeah, it's fast. We hope to have some interviews from folks up soon too.
Cherlynn: With other members of the team. You'll hear more than just the two of us. I think, I promise you, they all sound amazing and lovely. I'm trying to think of, like, whether there's anything else that's of note in the news that we've seen so far, because, to your point, right, CES is in full swing, really.
Devindra: And we've been, like, headstabbed. Just, like, so much stuff. Cherlynn's been managing so much of the, like, practical stuff and the scheduling stuff. Yeah, I've had, like, a pile of embargoes.
Cherlynn: All of us, like, all the team has had piles of embargoes, which is, like, to your point, kind of a return to form in that sense, but also feels like we've been covering this endlessly every CES. We saw a few, I don't know, lots of AI that [00:14:00] doesn't really need to be AI. We saw a lot of pet tech. We saw a lot of smart home. Man, send us your thoughts, really, so far, as we are chugging along the show. podcast@engadget.com would be a great place to drop them.
Oh my gosh, LG's got all these weird products that I think we talked about even ahead of coming to CES. Where, like, yes, it slapped a 27-inch screen on a microwave, but then recently we found out, what, it was a projector that looks like a stand fan or something?
Devindra: That's actually kind of cool. Yeah, we gotta get some video of that stuff.
Cherlynn: Yeah, so plenty, plenty to look out for.
Devindra: Of the stories we've produced, I do want to shout out the stuff Sam and I did around Dell's rebranding. I wrote about Dell rebranding all of its PCs to sound more like Apple, so check out that post. But Sam had a really good rant called "Dell killing the XPS name is an unforced error." And that whole story is wild, because Dell's basically obliterating all of its brand names. They're just going to be Dell, Dell Pro and Dell Pro Max. And to both of us, that sounds very Apple-y.
Cherlynn: Wait, can I insert myself a little bit here? Because it's not just Dell, Dell Pro, Dell Pro Max. [00:15:00] After I read both of your posts, it is the sub-tiers that make no sense. Like, if they simplified it truly, it would just be Dell, Dell Pro, Dell Pro Max. Fine. But no, it would be Dell, Dell Pro, and then under each there might be the Premium label and the Plus label. So it could be the Dell Pro Plus.
Devindra: Yep.
Cherlynn: But versus the Dell non-Pro Premium. So the Dell Premium is still worse than the Dell Pro Plus?
Devindra: Yes.
Cherlynn: My goodness. What? And then you throw in the numbers. There's numbers. They're coming back.
Devindra: Some of the numbers are coming back. The desktops are kind of ridiculous, because at the event Sam and I were at, they showed off the Dell Pro Max Micro and the Dell Pro Max Mini, which, you have "Max" and "Micro" in the same name. You are contradicting what this device actually is, and I find that to be completely ridiculous. So check out Sam and my rant about that thing. I also did a video up on YouTube, and for once, the YouTube commentators seem to be on our side.
Yes, they're right.
Cherlynn: They are right. We are right, and Dell [00:16:00] is not right. And so I am glad you pointed it out. It seemed like a lot of people resonated with that story on our side as well.
Devindra: It's a whole thing. And I will say, I don't miss, like, I don't miss a lot of the brands, like Inspiron and whatever, but it's more like XPS. Getting rid of XPS seems like a mistake. Following in the footsteps of Apple seems like a really weak move.
Cherlynn: Of all the things to do. Because, look, I covered HP's pivot to one brand as well when that happened last year. And HP had the good sense to, when they say simplify, they mean simplify to their own brand. So they did OmniBook, right? Which is not Pro Max. Fine. It's their own name. For Dell to tell you that they're not copying Apple, and I'm not saying they did say that to you, but, like, they more or less suggested that these are industry terms.
Speaker 3: They did say that to me.
Cherlynn: But, like, if HP can do so without invoking the terms Pro and Max, why can't you, Dell?
Devindra: That's basically what I asked Michael Dell, the CEO of Dell, at this event. He was there to announce this whole thing, and they were taking [00:17:00] questions from the audience. So yeah, I shot my hand up, and I was like, my direct question to him was, what does Dell have to gain by copying Apple? And Michael Dell did not look too pleased.
Cherlynn: Of course he did. He
Devindra: I mean, for him too, it's like, oh, now my names are all Dell, Dell, Dell. So it's like, better for him and his ego.
Cherlynn: Yeah, yeah, his name, yeah, yeah.
Devindra: And I feel like that may be part of it, but I've talked to a lot of people at Dell, like, other people, people working within the PC design stuff, and nobody was excited about this change.
Speaker 3: Of course not. Why?
Devindra: Because their babies are all gone.
Like, the people who work on Inspiron and Precision and everything, the brands they devote their lives to are gone. And now they have to live with these new brands, and I don't know if people are going to be as excited. So anyway, that's going to be a long, ongoing story. Check out our coverage in all of its many forms. I think that's going to be one of the big takeaways from this CES: Dell sort of just shooting itself in the foot here. And nobody seems to like it except Dell, except Michael Dell.
Cherlynn: Yeah. I want to quickly shout out that the Samsung press conference just wrapped, and we learned two things of note. One, that the [00:18:00] Ballie rolling robot is going to actually retail this year, they say. But they did say that last year too. And we don't know a price yet. We just know it's going to be the first half of the year, is what they said on stage. And then the second thing is they announced the dates of Galaxy Unpacked. It will happen January 22nd. So thanks a lot, Samsung, because right after CES, some of us will be heading straight into preparation for the Samsung Galaxy S25.
Devindra: Let Cherlynn take a break. That's the message of this year.
Cherlynn: Never, never happening.
Devindra: Alright, we will, we'll be back with more updates about CES. Drop us an email, folks, podcast@engadget.com. No livestream this week, because we are here, but you'll get a bunch of episodes from us. And check out our social channels, too. A lot of fun videos are going up.
Cherlynn: Send us music recommendations!
Devindra: Oh yeah, maybe we should just open up a playlist and have people add songs to it. Anyway, we're out, folks. Thank you.
Cherlynn: Bye!
This article originally appeared on Engadget at https://www.engadget.com/computing/engadget-podcast-weve-survived-two-days-of-ces-2025-052543789.html?src=rss
  • Everything you missed on Day One of CES 2025
    www.engadget.com
CES 2025 has begun, which means a whole fleet of new gadgets has been unleashed onto the world. As usual, team Engadget has battled jet lag, sleep deprivation and the static shocks of those horrible casino carpets to bring you all the news that's fit to print. But if you're too busy to keep your browser locked on the site (or our handy dandy liveblog) then here's a recap. This may not be everything we covered, but it's a rundown of the biggest, most important and generally interesting news for your delectation.
There was a strong showing from the biggest names in the PC space, with Intel showing off its latest crop of Arrow Lake chips. These are AI and gaming-friendly slices of silicon that should pop up in PCs and laptops from major manufacturers in the next three months.
Speaking of which, Dell turned up to the show to announce it was killing off the bulk of its brands in favor of copying Apple's naming strategy. Rather than XPS, Inspiron and Latitude, you'll have Dell, Dell Pro and Dell Pro Max, which in Sam and Devindra's minds is a massive unforced error.
On AMD's side of the chip war, it announced the new Ryzen Z2, which will power the next crop of gaming handhelds. The rumor mill was suggesting the Z2 would sit at the heart of Valve's next Steam Deck, which Valve moved quickly to kibosh.
But on the subject of handhelds, Acer wanted to show off its supersized Steam Deck rival, the Nitro Blaze 11. As the name implies, it's packing an 11-inch display, kickstand and detachable controllers, like a Switch that got out of its cage and found your secret stash of human growth hormone.
Samsung rocked up at the show to flaunt the Galaxy Book5 Pro with Intel's new Arrow Lake chips. But its real focus was on its new range of home entertainment gear, including its new soundbars and 8K Neo QLED screens, which is also what you'll find inside its new Frame Pro TVs.
CES isn't a mobile-friendly show, but Samsung did announce that its first Unpacked keynote of 2025 will drop on January 22.
But, psh, whatever: The real Samsung mobile device news we care about is that its ball-shaped robot, Ballie, will go on sale later this year.
On the subject of things scuttling around your floors, plenty of companies are trying to find a way to make their robovacs stand out. Dreame's X50 can avoid getting stuck on tricky door thresholds since it can vault obstacles as tall as 6cm via its ProLeap System. Given most robovacs can run aground on a threshold between one room and another, it's a useful feature.
Roborock's Saros Z70, meanwhile, has a little robotic arm in its lid that can pick up and move small objects found in its way. As a parent whose kids have some sort of obsession with leaving their socks in obtuse places, I already want one.
Speaking of things I want, despite my longstanding hatred of AI, I'm quite partial to the idea of Halliday's AI Glasses. They're designed to help you navigate life, proactively answering your questions, helping you remember key information and generally giving your tired brain a rest.
Yukai Engineering is also looking to tend to your tired brain with its Mirumi robot, designed to make you smile. The theory being, if you're feeling low, it'll stare at you until you have a brief moment of bemused joy that'll kick you out of your funk.
It wouldn't be CES without an appearance by will.i.am, who LG recently appointed as its new Chief Being will.i.am Officer. The company was showing off its new TVs and soundbars, as well as its new will.i.am-infused xboom speakers with built-in boom, boom and pow.
Moving onto the bodily fluids part of our presentation: two different companies turned up to Las Vegas with saliva-testing gadgets asking consumers to spit on that thing to monitor their stress. cortiSense and Hormometer are two products that'll monitor the cortisol (the stress hormone) levels in your saliva.
Day one rounded out with press conferences from a couple of heavy hitters: Sony and NVIDIA.
Sony showed off very little in the way of consumer electronics, instead giving us an (eye-wateringly expensive) price for the car it's making with Honda and then talking about broadcast stuff for an hour. Hey, at least we have a date for The Last of Us season two. As for NVIDIA, CEO Jensen Huang talked about AI for 30 minutes, then announced some (eye-wateringly expensive) new GPUs, then talked about AI for another 30 minutes. Thrilling stuff!
This article originally appeared on Engadget at https://www.engadget.com/everything-you-missed-on-day-one-of-ces-2025-050018086.html?src=rss
  • NVIDIA DLSS 4 is coming to all RTX GPUs
    www.engadget.com
NVIDIA has introduced DLSS 4, the latest version of its real-time image upscaling technology, at CES 2025. It is coming to all RTX GPUs, including the RTX 20 series that was discontinued back in 2020, but the older models aren't getting all its features. In the new GeForce RTX 50 series models, DLSS 4 will enable Multi Frame Generation. The feature generates up to three additional frames for every traditionally rendered one and can help multiply frame rates by up to eight times over traditional brute-force rendering. NVIDIA says the improvements brought by Multi Frame Generation on the GeForce RTX 5090 graphics card, its new $1,999 flagship GPU arriving this month, will enable 4K 240 FPS fully ray-traced gaming.
In addition, DLSS 4 represents what the company is calling the "biggest upgrade to its AI models" since the release of DLSS 2. DLSS Ray Reconstruction, DLSS Super Resolution and DLAA will now be powered by the same transformer architecture that powers AI models such as ChatGPT and Google's Gemini. The company says that translates to improved temporal stability, less ghosting and higher detail of objects in motion.
A total of 75 games and apps will support DLSS 4 from day zero. When the new RTX 50 cards come out, games like Alan Wake 2 and Cyberpunk 2077 will be updated with the ability to take advantage of the technology's Multi Frame Generation feature. More titles will be updated with support for Multi Frame Generation in the future, including Black Myth: Wukong, while upcoming ones like Doom: The Dark Ages and Dune: Awakening will support the feature at launch.
The GeForce RTX 40 series GPUs aren't getting Multi Frame Generation, but they are getting DLSS 4's enhanced frame generation, enhanced ray reconstruction, super resolution and deep learning anti-aliasing capabilities.
Meanwhile, GeForce RTX 30 series and RTX 20 series GPUs are getting the last three.
This article originally appeared on Engadget at https://www.engadget.com/computing/nvidia-dlss-4-is-coming-to-all-rtx-gpus-044835216.html?src=rss
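As a rough back-of-the-envelope reading of the numbers above (not NVIDIA's actual pipeline), the "up to eight times" figure can be decomposed as roughly 4x from Multi Frame Generation (one rendered frame plus up to three generated ones) multiplied by an additional speedup from rendering at a lower internal resolution and upscaling. The 2x upscaling factor in this sketch is an assumed illustrative value, not a published NVIDIA figure:

```python
def effective_fps(rendered_fps, generated_per_rendered=3, upscaling_speedup=2.0):
    """Sketch of how DLSS-style multipliers compound.

    generated_per_rendered: extra AI-generated frames per rendered frame
        (up to 3 with DLSS 4 Multi Frame Generation).
    upscaling_speedup: assumed FPS gain from rendering at a lower internal
        resolution and upscaling (illustrative value).
    """
    frame_gen_factor = 1 + generated_per_rendered  # frames displayed per frame rendered
    return rendered_fps * frame_gen_factor * upscaling_speedup

# 30 FPS of brute-force rendering would land around 240 FPS under these
# assumptions, which lines up with the "up to eight times" framing.
print(effective_fps(30))
```

This also makes clear why the claim is "up to": games that use fewer generated frames or a smaller upscaling ratio will see a smaller multiplier.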
  • AI jobs are among the fastest-growing in the UK
    www.techradar.com
Many of LinkedIn's fastest-growing jobs in the UK didn't even exist 25 years ago, and AI features more than once.