Tech »  Topic »  Aim Labs uncovers EchoLeak, a zero-click AI flaw in Microsoft 365 Copilot that allows data theft via email. Learn how this vulnerability enables sensitive information exfiltration without user interaction and its implications for AI security.

Aim Labs uncovers EchoLeak, a zero-click AI flaw in Microsoft 365 Copilot that allows data theft via email. Learn how this vulnerability enables sensitive information exfiltration without user interaction and its implications for AI security.


Cybersecurity firm Aim Labs has uncovered a serious new security problem, named EchoLeak, affecting Microsoft 365 (M365) Copilot, a popular AI assistant. This flaw is a zero-click vulnerability, meaning attackers can steal sensitive company information without user interaction.

Aim Labs has shared details of this vulnerability and how it can be exploited with Microsoft’s security team, and so far, it is not aware of any customers being affected by this new threat.

How “EchoLeak” Works: A New Kind of AI Attack

For your information, M365 Copilot is a RAG-based chatbot, which means it gathers information from a user’s company environment like emails, files on OneDrive, SharePoint sites, and Teams chats to answer questions. While Copilot is designed to only access files the user has permission for, these files can still hold private or secret company data.

The main issue with EchoLeak is a new type of attack Aim ...


Copyright of this story solely belongs to hackread.com . To see the full text click HERE