AMD Believes “Inference” on Mobile Phones and Laptops Is the Future; Claims It Is An Opportunity to Challenge NVIDIA’s AI Dominance
Muhammad Zuhair •
Apr 20, 2025 at 02:31pm EDT
AMD is betting that inference at data centers will eventually become "obsolete," with the workload shifting to consumer devices like phones and laptops.
AMD's CTO Says Tech Giants Are Moving Toward AI Inference Workloads as Consumers Prefer Edge AI Devices
The "AI frenzy" began with a craze for model training, as companies built up huge computational capacity to train their respective LLMs. Now the market is shifting toward inference, and AMD believes it is at the very forefront of this transition. In an interview with Business Insider, AMD's CTO, Mark Papermaster, said that inference is moving toward "edge devices," and that Team Red believes it can give NVIDIA a tough time in this segment.
Question: OK, say it's 2030 — how much inference is done at the edge?
AMD's CTO: Over time, it'll be a majority. I can't say when the switch over is because it's driven by the applications — the development of the killer apps that can run on edge devices. We're just seeing the tip of the spear now, but I think this moves rapidly.
Papermaster believes that the soaring cost of AI compute at data centers will ultimately force tech giants like Microsoft, Meta, and Google to rethink their approach, making edge AI much more common. He sees this as one of the major reasons AMD takes the "AI PC" push more seriously than competitors like Intel and Qualcomm. Team Red's latest APU lineups, such as Strix Point and Strix Halo, are a testament to the company's commitment to bringing AI compute into small form factors, and at a fraction of the cost.
When asked how the build-up of compute resources will play out, AMD's CTO responded that there is a strong focus on increasing the accuracy and efficiency of existing models, and that with the release of DeepSeek, tech giants are turning to optimized alternatives. Over a longer time span, devices will become capable of running sophisticated AI models locally, letting users tap the full range of AI capabilities on their own hardware.
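One of the main reasons "optimized" models can move onto phones and laptops is weight quantization, which shrinks a model's memory footprint. As a minimal illustrative sketch (a generic technique, not AMD's or DeepSeek's specific method), symmetric int8 quantization stores each float32 weight in a single byte, cutting weight memory to a quarter:

```python
import numpy as np

# Hypothetical float32 weight matrix standing in for one layer of a model.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)

# Symmetric int8 quantization: map [-max|w|, +max|w|] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)

# At inference time, weights are approximately recovered by rescaling.
deq = q.astype(np.float32) * scale

print(weights.nbytes // q.nbytes)   # 4x smaller in memory
print(float(np.abs(weights - deq).max()))   # small rounding error, bounded by scale/2
```

The memory saving is exactly 4x (4-byte floats become 1-byte ints), at the cost of a small per-weight rounding error; production toolchains add calibration and per-channel scales on top of this basic idea.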
Papermaster's remarks are reminiscent of when Intel's former CEO Pat Gelsinger claimed that inference is the way forward. They also suggest that NVIDIA's competitors cannot compete in the AI training market, which Team Green has dominated by staying ahead of the curve. Their best shot at challenging NVIDIA is to compete in emerging markets like AI inference, and it seems AMD has already started by introducing processors capable of edge AI.
© 2025 WCCF TECH INC. 700 - 401 West Georgia Street, Vancouver, BC, Canada