EchoLeak is a security vulnerability in Microsoft 365 Copilot that allowed attackers to access sensitive company data. No action on the part of the Copilot user was required. Rather, a malicious instruction to Copilot was smuggled into an email that the victim only had to receive to be exposed to the security problem.
Introduction
Microsoft's Copilot is a general-purpose AI that appears useful in some aspects, but often must be considered a failure. A simple test with a very simple task revealed the blatant failure of Copilot on a functional level.
It is well-known that Microsoft extensively collects data from customers without any shame, in fact, Microsoft even publicly announces this (probably for legal reasons).
Recently, there is a great danger for companies and private users from Copilot. Copilot accesses almost all data on computers where this AI thing is installed. Through the Microsoft 365 platform, into which Copilot is “seamlessly” integrated, data is sent around the world, for example via the Teams app.

EchoLeak takes advantage of this cleverly. A Copilot user didn't have to make a mistake to be affected. It was enough to be a Copilot user to be attacked and exposed to data losses. Microsoft quietly hushed up the security flaw and closed it only four months after being reported by a “whistleblower”.
What is EchoLeak?
EchoLeak is a Zero-Click Security Vulnerability. That means: You're the fool, even if you've done nothing wrong and haven't clicked anywhere where you shouldn't have clicked. Fault lies not with you, but Microsoft 365 Copilot.
The result of EchoLeak is exfiltrated data from your system. Through this exfiltration, unauthorized persons obtain your data, because of Copilot and the great Microsoft 365 ecosystem. No user interaction is necessary for the attacker to be successful.

Microsoft 365 Copilot systems in all configurations were affected. Data access was possible via Word, Excel, PowerPoint, Outlook, Teams, SharePoint sites, OneDrive content and Microsoft Graph.
Technical analysis
The EchoLeak attack was successful because the Copilot language model (LLM) did not work with sufficient restrictions. Instead, all possible data from workstations running Copilot was simply fed into the Microsoft AI and analyzed. In this way, trustworthy internal data was mixed with untrustworthy external input in the same “thought process”.

Thus, an email received from an attacker (even a spam email) could cause Copilot to access internal data and forward this data to the attacker.
There is a database that lists security vulnerabilities of relevant software products. This is called CVE (Common Vulnerabilities and Exposures). Each vulnerability is evaluated for its criticality, with 10 being the highest value.

The Microsoft 365 Copilot vulnerability called EchoLeak received a score of 9.3, which shows how critical the problem was that Microsoft has kept quiet about.
Even if Microsoft had informed all customers, its actions would still be highly questionable. The very fact that the issue existed is a sign of Microsoft's strange understanding of data protection and data security. Full access to all data on a system should at least not happen by default. Perhaps a proper, transparent and easily understandable request to Microsoft-dependent customers would have been better?
That Microsoft worked on the problem for a whole four months, until it was hopefully resolved, shows another facet of the disaster. Microsoft cannot or will not solve extreme large security holes quickly enough that can threaten the existence of companies.
Conclusion
One can turn off Copilot. That would be best for many users. There are other and above all better systems that are significantly safer and also functionally better.
Many federal states and countries are moving away from Microsoft. Convenience is standing in the way for many of them. But there are increasingly more who recognize the structural danger that comes with Microsoft's cloud products.
Nowadays, great cloud products can not only be purchased from your own country, but can also be developed with much less effort than in the past. The effort required, compared to three years ago, is probably 10 times less.
There will definitely be no more security problems with Microsoft products in the future. I promise!




My name is Klaus Meffert. I have a doctorate in computer science and have been working professionally and practically with information technology for over 30 years. I also work as an expert in IT & data protection. I achieve my results by looking at technology and law. This seems absolutely essential to me when it comes to digital data protection. My company, IT Logic GmbH, also offers consulting and development of optimized and secure AI solutions.
