How Dell’s AI Infrastructure Updates Deliver Choice, Control And Scale
Dell Technologies World focuses its keynote on Inventing the Future with AI. (Image: Dell Technologies)
Dell Technologies unveiled a significant expansion of its Dell AI Factory platform at its annual Dell Technologies World conference today, announcing over 40 product enhancements designed to help enterprises deploy artificial intelligence workloads more efficiently across both on-premises environments and cloud systems.
The Dell AI Factory is not a physical manufacturing facility but a comprehensive framework combining advanced infrastructure, validated solutions, services, and an open ecosystem to help businesses harness the full potential of artificial intelligence across diverse environments—from data centers and cloud to edge locations and AI PCs.
The company has attracted over 3,000 AI Factory customers since launching the platform last year. In an earlier call with industry analysts, Dell shared research stating that 79% of production AI workloads are running outside of public cloud environments—a trend driven by cost, security, and data governance concerns. During the keynote, Michael Dell provided more color on the value of Dell’s AI Factory concept. He said, "The Dell AI factory is up to 60% more cost effective than the public cloud, and recent studies indicate that about three-fourths of AI initiatives are meeting or exceeding expectations. That means organizations are driving ROI and productivity gains from 20% to 40% in some cases."
Making AI Easier to Deploy
Organizations need the freedom to run AI workloads wherever it makes the most sense for their business, without sacrificing performance or control. While IT leaders embraced the public cloud for their initial AI services, many organizations now want a more nuanced approach in which they retain control over their most critical AI assets while keeping the flexibility to use cloud resources when appropriate. Over 80 percent of the companies Lopez Research interviewed said they struggled to find the budget and technical talent to deploy AI. These deployment challenges have only increased as more AI models and AI infrastructure services have launched.
Silicon Diversity and Customer Choice
A central theme of Dell's AI Factory message is making AI easier to deploy while delivering choice. Dell offers customers choice not only through silicon diversity in its designs but also through the ISV models it supports. The company announced it has added Intel to its AI Factory portfolio with Intel Gaudi 3 AI accelerators and Intel Xeon processors, with a strong focus on inferencing workloads.
Dell also announced its fourth update to the Dell AI Platform with AMD, rolling out two new PowerEdge servers—the XE9785 and the XE9785L—equipped with the latest AMD Instinct MI350 series GPUs. The Dell AI Factory with NVIDIA combines Dell's infrastructure with NVIDIA's AI software and GPU technologies to deliver end-to-end solutions that can reduce setup time by up to 86% compared to traditional approaches. The company also continues to strengthen its partnership with NVIDIA, announcing products that leverage NVIDIA's Blackwell family and other updates launched at NVIDIA GTC. As of today, Dell supports choice by delivering AI solutions with all of the primary GPU and AI accelerator infrastructure providers.
Client-Side AI Advancements
At the edge of the AI Factory ecosystem, Dell announced enhancements to the Dell Pro Max in a mobile form factor, leveraging Qualcomm’s AI 100 discrete NPUs designed for AI engineers and data scientists who need fast inferencing capabilities. With up to 288 TOPS at 16-bit floating-point precision, these devices can power up to a 70-billion-parameter model, delivering 7x the inferencing speed and 4x the accuracy of a 40 TOPS NPU. According to Dell, the Pro Max Plus line can run a 109-billion-parameter AI model.
The Dell Pro Max Plus targets AI engineers and data scientists with the ability to run a 109-billion-parameter model. (Image: Dell Technologies)
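The parameter counts Dell cites track with simple memory arithmetic. As a rough, generic sizing sketch (the function and figures below are illustrative back-of-envelope math, not Dell or Qualcomm specifications), the memory needed just to hold a model's weights is its parameter count times the bytes per parameter:

```python
# Rough sizing arithmetic (generic, not vendor specs): approximate memory
# needed just to hold a model's weights at common numeric precisions.

def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Weights-only memory footprint in GB (decimal), ignoring activations
    and KV cache, which add further overhead at inference time."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for bits in (16, 8, 4):
    print(f"70B @ {bits}-bit: {weight_memory_gb(70, bits):.0f} GB")
# → 70B @ 16-bit: 140 GB
# → 70B @ 8-bit: 70 GB
# → 70B @ 4-bit: 35 GB
```

At 16-bit precision, a 70-billion-parameter model needs roughly 140 GB for weights alone, which is why on-device inferencing of models this size generally relies on 8-bit or 4-bit quantization.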
The Pro Max and Pro Max Plus launches follow Dell's previous announcement of AI PCs featuring Dell Pro Max with GB10 and GB300 processors powered by NVIDIA's Grace Blackwell architecture. Overall, Dell has simplified its PC portfolio and made it easier for customers to choose the right system for their workloads by providing the latest chips from AMD, Intel, NVIDIA, and Qualcomm.
On-Premises AI Deployment Gains Ecosystem Momentum
Following the theme of choice, organizations need the flexibility to run AI workloads on-premises and in the cloud. Dell is making significant strides in enabling on-premises AI deployments with major software partners. The company announced it is the first provider to bring Cohere capabilities on-premises, combining Cohere's generative AI models with Dell's secure, scalable infrastructure for turnkey enterprise solutions.
Similar partnerships with Mistral and Glean were also announced, with Dell facilitating their first on-premises deployments. Additionally, Dell is supporting Google's Gemini on-premises with Google Distributed Cloud.
To simplify model deployment, Dell now offers customers the ability to choose models on Hugging Face and deploy them in an automated fashion using containers and scripts. Enterprises increasingly recognize that while public cloud AI has its place, a hybrid AI infrastructure approach could deliver better economics and security for production workloads.
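Dell has not published the details of that automation, but the general pattern is familiar: pick a model ID on Hugging Face and hand it to a containerized inference server. Below is a minimal sketch using Hugging Face's open text-generation-inference (TGI) container; the model ID, port, and cache path are example values, and Dell's own tooling may differ:

```python
# Illustrative sketch only: Dell's deployment automation is not public, so
# this shows the generic container-plus-script pattern with Hugging Face's
# text-generation-inference (TGI) server image.

def build_tgi_command(model_id: str, host_port: int = 8080,
                      cache_dir: str = "/opt/hf-cache") -> str:
    """Compose a `docker run` command that serves a Hugging Face model."""
    return (
        "docker run --gpus all "
        f"-p {host_port}:80 "            # expose the server on the host
        f"-v {cache_dir}:/data "         # persist downloaded model weights
        "ghcr.io/huggingface/text-generation-inference:latest "
        f"--model-id {model_id}"         # any text-generation model on the Hub
    )

if __name__ == "__main__":
    print(build_tgi_command("mistralai/Mistral-7B-Instruct-v0.3"))
```

Because the model ID is the only per-deployment variable, this pattern scripts cleanly into CI pipelines or orchestration tooling, which is presumably what Dell's automated workflow wraps.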
The imperative for scalable yet efficient AI infrastructure at the edge is a growing need. As Michael Dell said during his Dell Technologies World keynote, “Over 75% of enterprise data will soon be created and processed at the edge, and AI will follow that data; it's not the other way around. The future of AI will be decentralized, low latency, and hyper-efficient.”
Dell's ability to offer robust hybrid and fully on-premises solutions for AI is proving to be a significant advantage as companies increasingly seek on-premises support and even potentially air-gapped solutions for their most sensitive AI workloads. Key industries adopting the Dell AI Factory include finance, retail, energy, and healthcare providers.
Scaling AI Requires a Focus on Energy Efficiency
Simplifying AI also requires product innovations that deliver cost-effective, energy-efficient technology. As AI workloads drive unprecedented power consumption, Dell has prioritized energy efficiency in its latest offerings. The company introduced the Dell PowerCool Enclosed Rear Door Heat Exchanger with Dell Integrated Rack Controller. This new cooling solution captures nearly 100% of the heat coming from GPU-intensive workloads. This innovation lowers cooling energy requirements for a rack by 60%, allowing customers to deploy 16% more racks with the same power infrastructure.
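Those two figures are consistent under a simple power-budget model. Assuming cooling overhead is roughly 30% of IT load before the upgrade (an assumption chosen for illustration; Dell has not published this baseline), a 60% cut in cooling energy frees about 16% of facility power for additional racks:

```python
# Back-of-envelope check of the rack-capacity claim. The 30% cooling
# overhead baseline is an assumption, not a published Dell figure.

def extra_rack_capacity(cooling_overhead: float, cooling_reduction: float) -> float:
    """Fraction of additional racks deployable at fixed facility power.

    cooling_overhead: cooling power as a fraction of IT power (e.g. 0.30)
    cooling_reduction: fractional cut in cooling energy (e.g. 0.60)
    """
    power_per_rack_before = 1.0 + cooling_overhead
    power_per_rack_after = 1.0 + cooling_overhead * (1.0 - cooling_reduction)
    return power_per_rack_before / power_per_rack_after - 1.0

gain = extra_rack_capacity(0.30, 0.60)
print(f"{gain:.1%}")  # → 16.1%
```

The same function makes it easy to see how the benefit scales: the denser and hotter the racks (higher cooling overhead), the more capacity a given cooling-efficiency gain unlocks.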
Dell's new systems are rated to operate at 32 to 37 degrees Celsius, supporting significantly warmer temperatures than traditional air-cooled or water-chilled systems, further reducing power consumption for cooling. The PowerEdge XE9785L now offers Dell liquid cooling for flexible power management. Even if a company isn't aiming for a specific sustainability goal, every organization wants to improve energy utilization.
Early Adopter Use Cases Highlight AI's Opportunity
With over 200 product enhancements to its AI Factory in just one year, Dell Technologies is positioning itself as a central player in the rapidly evolving enterprise AI infrastructure market. It offers the breadth of solutions and expertise organizations require to successfully implement production-grade AI systems in a secure and scalable fashion. However, none of this technology matters if enterprises can't find a way to create business value by adopting it. Fortunately, examples from the first wave of enterprise early adopters highlight ways AI can deliver meaningful returns in productivity and customer experience. Let's look at two use cases presented at Dell Tech World.
Michael Dell, CEO of Dell Technologies, interviews Larry Feinsmith, the Managing Director and Head of Global Tech Strategy, Innovation & Partnerships at JPMorgan Chase, at Dell Technologies World. (Image: Dell Technologies)
The Power of LLMs in Finance at JPMorgan Chase
JPMorgan Chase took the stage to make AI real from a customer’s perspective. The financial firm uses Dell's compute hardware, software-defined storage, client, and peripheral solutions. Larry Feinsmith, the Managing Director and Head of Global Tech Strategy, Innovation & Partnerships at JPMorgan Chase, said, "We have a hybrid, multi-cloud, multi-provider strategy. Our private cloud is an incredibly strategic asset for us. We still have many applications and data on-premises for resiliency, latency, and a variety of other benefits."
Feinsmith also spoke of the company's large language model (LLM) strategy. He said, “Our strategy is to use a constellation of models, both foundational and open, which requires a tremendous amount of compute in our data centers, in the public cloud, and, of course, at the edge. The one constant thing, whether you're training models, fine-tuning models, or finding a great use case that has large-scale inferencing, is that they all will drive compute. We think Dell is incredibly well positioned to help JPMorgan Chase and other companies in their AI journey.”
Feinsmith noted that using AI isn't new for JPMorgan Chase. For over a decade, the firm has leveraged various types of AI, such as machine learning models for fraud detection, personalization, and marketing operations. The company uses what Feinsmith called its LLM suite, which over 200,000 people at JPMorgan Chase use today. The generative AI application is used for Q&A, summarization, and content generation using JPMorgan Chase's data in a highly secure way. Next, it has used the LLM suite architecture to build applications for its financial advisors, contact center agents, and any employee interacting with its clients. Its third use case highlighted changes in software development. JPMorgan Chase rolled out code generation AI capabilities to over 40,000 engineers and has achieved productivity gains of as much as 20% in code creation; it expects to leverage AI throughout the software development life cycle. Going forward, the financial firm expects to use AI agents and reasoning models to execute complex business processes end-to-end.
Seemantini Godbole, EVP and Chief Digital and Information Officer at Lowe's, shares the retailer's AI strategy at Dell Technologies World. (Image: Dell Technologies)
How AI Makes It Easier For Employees to Serve Customers at Lowe's
Lowe's Home Improvement Stores provided another example of how companies are leveraging Dell Technology and AI to transform the customer and employee experience. Seemantini Godbole, EVP and Chief Digital and Information Officer at Lowe's, shared insights on designing the strategy for AI when she said, "How should we deploy AI? We wanted to do impactful and meaningful things. We did not want to die a death of 1000 pilots, and we organized our efforts across how we sell, how we shop, and how we work. How we sell was for our associates. How we shop is for our customers, and how we work is for our headquarters employees. For whatever reason, most companies have begun with their workforce in the headquarters. We said, No, we are going to put AI in the hands of 300,000 associates."
For example, she described a generative AI companion app for store associates. "Every store associate now has on his or her Zebra device a ChatGPT-like experience for home improvement," said Godbole. Lowe's is also deploying computer vision algorithms at the edge to detect issues such as whether a customer in a particular aisle is waiting for help. The system then sends notifications to the associates in that department. Customers can also ask home improvement questions, such as what paint finish to use in a bathroom, at Lowes.com/AI.
Designing A World Where AI Infrastructure Delivers Human Opportunity
Michael Dell said, "We are entering the age of ubiquitous intelligence, where AI becomes as essential as electricity. With AI, you can distill years of experience into instant insights, speeding up decisions and uncovering patterns in massive data. But it’s not here to replace humans. AI is a collaborator that frees your teams to do what they do best, to innovate, to imagine, and to solve the world's toughest problems."
While there are many AI deployment challenges ahead, the customer examples shared at Dell Technologies World provide a glimpse into a world where AI infrastructure and services benefit both customers and employees. The challenge now is to do this sustainably and ethically at scale.