Microsoft Build 2025: From chatbots to digital coworkers, and the "agentic" web
Something to look forward to: As exciting as the world of large language models may be, it's clear that the tech industry is moving fast, pushing beyond the capabilities of current GenAI tools into a new phase focused on AI-powered agents. At the developer-focused Microsoft Build event, the company made this next stage in AI evolution evident through a wide range of announcements that point to how software agents can extend LLM capabilities into more sophisticated and far-reaching applications.
The buzzword Microsoft used at Build was the "agentic web." However, the agent-based opportunities the company described aren't limited to the web or cloud-based applications – they also extend to Windows and other client-based environments.
In building out its vision for agents, Microsoft introduced a variety of developer tools to easily create agents and unveiled several new prebuilt ones. The company also discussed capabilities for organizing and orchestrating the actions of multiple agents. Most notably, Microsoft introduced mechanisms for treating agents as "digital employees" – complete with identities and access rights managed through the company's Entra digital identity and authentication framework.
On the development front, Microsoft debuted the GitHub Copilot coding agent, designed to streamline the creation of AI applications and agents. Described as "an agentic partner," the Copilot coding agent was likened to a coworker who can assist with parts of a development project, such as refactoring old code or fixing bugs.
For non-programmers, Microsoft also showcased a set of low-code/no-code tools for agent creation, including Copilot Studio. Additionally, the company introduced the concept of Computer Use Agents (CUAs), which can perform actions across a computer screen as a human would. CUAs are capable of interacting with websites and applications in ways not possible through traditional APIs alone.
With the launch of Copilot Tuning, Microsoft is making it easier for users to fine-tune existing LLMs using their own content, enabling the creation of personalized agents tailored to specific tasks. For example, an agent could learn to write in an individual's style or incorporate an organization's specialized knowledge into content generation. This capability opens up new possibilities for a broader range of users.
Conceptually, this is similar to the idea of a personal RAG tool, a concept that garnered attention over the past year but never quite went mainstream. Microsoft's agent-based approach through Copilot Tuning simplifies the process by allowing users to select documents to augment the model's training set – potentially making a bigger impact.
One of the key themes Microsoft emphasized at Build was how coordinating multiple agents can unlock even more powerful capabilities. The company showcased orchestration mechanisms for linking and synchronizing different agents' actions. In Copilot Studio, for instance, developers can connect several agents to handle more complex tasks collaboratively.
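The orchestration idea can be illustrated with a minimal sketch. This is not Copilot Studio's actual API – the agent functions and orchestrator below are hypothetical stand-ins that show the basic pattern of chaining agents so each one's output feeds the next:

```python
# Conceptual sketch of sequential agent orchestration: each "agent" is a
# callable that transforms its input; the orchestrator chains them so the
# output of one becomes the input of the next.
from typing import Callable

Agent = Callable[[str], str]

def research_agent(task: str) -> str:
    # Stand-in for an agent that gathers information for the task.
    return f"notes on: {task}"

def writer_agent(notes: str) -> str:
    # Stand-in for an agent that drafts content from the gathered notes.
    return f"draft based on ({notes})"

def orchestrate(task: str, pipeline: list[Agent]) -> str:
    """Pass the task through each agent in order."""
    result = task
    for agent in pipeline:
        result = agent(result)
    return result

print(orchestrate("Q3 summary", [research_agent, writer_agent]))
# -> draft based on (notes on: Q3 summary)
```

Real orchestration frameworks add error handling, branching, and parallel execution on top of this basic relay, but the core idea – agents composed into a pipeline to tackle tasks no single agent handles well – is the same.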
Perhaps the most striking announcement was the ability to register agents within Entra. This seemingly minor detail carries significant implications – it effectively elevates autonomous software into the role of a digital employee.
While the real-world deployment and limitations of these "digital employees" remain to be seen, the fact that this concept is under serious consideration underscores just how groundbreaking – and potentially disruptive – agent-based AI could become. Notably, Nvidia CEO Jensen Huang also spoke about digital agents as employees in his keynote at Computex, highlighting the broader industry momentum behind the idea.
Microsoft also made several announcements around developing standards. The company strongly endorsed both the Model Context Protocol (MCP) and Agent-to-Agent (A2A) standards. MCP provides a unified method for interacting with LLMs across different models and environments, while A2A defines a common protocol for agent communication and collaboration.
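To give a sense of what MCP looks like on the wire: the protocol is built on JSON-RPC 2.0, and its spec defines methods such as "tools/call" for invoking a tool exposed by an MCP server. The tool name and arguments in this sketch are hypothetical, used only to show the message shape:

```python
import json

# Illustrative MCP-style request. MCP messages are JSON-RPC 2.0 objects;
# "tools/call" asks a server to run one of its advertised tools. The
# "get_weather" tool and its arguments here are made up for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",              # hypothetical tool name
        "arguments": {"city": "Seattle"},   # hypothetical arguments
    },
}

print(json.dumps(request, indent=2))
```

Because every MCP server speaks this same request/response format, an agent can discover and call tools from any compliant server without custom integration code – which is what makes the standard attractive for cross-vendor interoperability.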
In keeping with this open approach, Microsoft announced broader model support across most of its development tools. While the company hasn't formally released its own LLMs yet – aside from the Phi family of Small Language Models – the inclusion of hundreds of models in Azure AI Foundry suggests Microsoft is moving away from its initial reliance on OpenAI and embracing greater model diversity. It wouldn't be surprising if Microsoft introduces its own family of LLMs in the near future.
For Windows developers, Microsoft introduced several new features to simplify building and running AI agents and applications on PCs. These tools are designed to leverage the diverse silicon now available in Copilot+ PCs. Windows Foundry – the successor to Windows ML Runtime – addresses a key challenge: supporting the varied NPU and GPU architectures from Qualcomm, Intel, AMD, and Nvidia. By providing a translation layer that optimizes app code for the available hardware, Windows Foundry should encourage more development of AI-accelerated Windows apps.
Microsoft also introduced Local Foundry, which expands the range of models developers can use and supports integration with external platforms such as Nvidia NIMs. Thanks to Nvidia's newly announced TensorRT for RTX PCs, developers can now run CUDA applications on PCs with Nvidia RTX GPUs, opening up yet another mechanism for bringing AI-accelerated applications to PCs.
Finally, with MCP support in Windows 11, AI agents can now serve as intermediaries across different applications registered as MCP servers. This opens the door to automating complex, multi-step workflows across multiple applications. While this will likely start on a single PC, MCP also enables distributing tasks across various environments – paving the way for advanced, hybrid AI applications.
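Such a cross-application workflow might look like the following sketch. Everything here is hypothetical – the `send_request` helper, the server names, and the tool names stand in for real MCP transport and real applications – but it shows the pattern of an agent chaining tool calls across separately registered servers:

```python
# Conceptual sketch of a multi-step workflow across MCP servers: an agent
# calls a tool on one server, then feeds the result into a tool on another.
# send_request, the server names, and the tool names are all hypothetical.
def send_request(server: str, tool: str, arguments: dict) -> dict:
    # Stand-in for sending a JSON-RPC "tools/call" request to the named
    # MCP server and returning its result.
    return {"server": server, "tool": tool, "arguments": arguments}

# Step 1: read figures from a (hypothetical) spreadsheet app's MCP server.
figures = send_request("spreadsheet-mcp", "read_range", {"range": "A1:B10"})

# Step 2: hand the figures to a (hypothetical) mail app to draft a report.
draft = send_request("mail-mcp", "compose", {"body": str(figures)})

print(draft["server"], draft["tool"])
# -> mail-mcp compose
```

The key point is that the agent, not the applications, owns the workflow logic – each app only needs to expose its capabilities as MCP tools, and the agent composes them.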
As with most Microsoft Build events, the sheer volume of announcements can be overwhelming.
What's becoming increasingly clear is that agents – and the tools and protocols enabling them – are ushering in a new era of AI development. These next-gen agents move beyond chatbots and toward more powerful, structured AI applications. They're even laying the groundwork for digital "coworkers" who could dramatically reshape how organizations operate and how work gets done.
Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on X @bobodtech.