

Microsoft’s AI security chief accidentally reveals Walmart’s AI plans after protest

Microsoft’s head of security for AI, Neta Haiby, accidentally revealed confidential messages about Walmart’s use of Microsoft’s AI tools during a Build talk that was disrupted by protesters. The Build livestream was muted and the camera pointed down, but the session resumed moments later after the protesters were escorted out. In the aftermath, Haiby accidentally switched to Microsoft Teams while sharing her screen, revealing confidential internal messages about Walmart’s upcoming use of Microsoft’s Entra and AI gateway services.

Haiby was co-hosting a Build session on best security practices for AI, alongside Sarah Bird, Microsoft’s head of responsible AI, when two former Microsoft employees disrupted the talk to protest against the company’s cloud contracts with the Israeli government.

“Sarah, you are whitewashing the crimes of Microsoft in Palestine, how dare you talk about responsible AI when Microsoft is fueling the genocide in Palestine,” shouted Hossam Nasr, an organizer with the protest group No Azure for Apartheid and a former Microsoft employee who was fired for holding a vigil outside Microsoft’s headquarters for Palestinians killed in Gaza.

Walmart is one of Microsoft’s biggest corporate customers, and it already uses the company’s Azure OpenAI service for some of its AI work. “Walmart is ready to rock and roll with Entra Web and AI Gateway,” says one of Microsoft’s cloud solution architects in the Teams messages. The chat session also quoted a Walmart AI engineer, saying: “Microsoft is WAY ahead of Google with AI security. We are excited to go down this path with you.”

We asked Microsoft to comment on this protest and the Teams messages, but the company did not respond in time for publication.

The private Microsoft Teams messages shown during the disrupted Build session. Image: Microsoft

Both of the protesters involved in this latest Microsoft Build disruption were former Microsoft employees, with Vaniya Agrawal appearing alongside Nasr. Agrawal also interrupted Microsoft co-founder Bill Gates, former CEO Steve Ballmer, and CEO Satya Nadella during the company’s 50th anniversary event last month. Agrawal was dismissed shortly after putting in her two weeks’ notice at Microsoft before the protest, according to an email seen by The Verge.

This is the third interruption of Microsoft Build by protesters, after a Palestinian tech worker disrupted Microsoft’s head of CoreAI on Tuesday, and a Microsoft employee interrupted the opening keynote of Build while CEO Satya Nadella was talking on stage.

This latest protest comes days after Microsoft announced last week that it had conducted an internal review and used an unnamed external firm to assess how its technology is used in the war in Gaza. Microsoft says that its relationship with Israel’s Ministry of Defense (IMOD) is “structured as a standard commercial relationship” and that it has “found no evidence that Microsoft’s Azure and AI technologies, or any of our other software, have been used to harm people or that IMOD has failed to comply with our terms of service or our AI Code of Conduct.”
WWW.THEVERGE.COM