Drücke „Enter”, um zum Inhalt zu springen.
Hinweis zu diesem Datenschutz-Blog:
Anscheinend verwenden Sie einen Werbeblocker wie uBlock Origin oder Ghostery, oder einen Browser, der bestimmte Dienste blockiert.
Leider wird dadurch auch der Dienst von VG Wort blockiert. Online-Autoren haben einen gesetzlichen Anspruch auf eine Vergütung, wenn ihre Beiträge oft genug aufgerufen wurden. Um dies zu messen, muss vom Autor ein Dienst der VG Wort eingebunden werden. Ohne diesen Dienst geht der gesetzliche Anspruch für den Autor verloren.

Ich wäre Ihnen sehr verbunden, wenn Sie sich bei der VG Wort darüber beschweren, dass deren Dienst anscheinend so ausgeprägt ist, dass er von manchen als blockierungswürdig eingestuft wird. Dies führt ggf. dazu, dass ich Beiträge kostenpflichtig gestalten muss.

Durch Klick auf folgenden Button wird eine Mailvorlage geladen, die Sie inhaltlich gerne anpassen und an die VG Wort abschicken können.

Nachricht an VG WortMailtext anzeigen

Betreff: Datenschutzprobleme mit dem VG Wort Dienst(METIS)
Guten Tag,

als Besucher des Datenschutz-Blogs Dr. DSGVO ist mir aufgefallen, dass der VG Wort Dienst durch datenschutzfreundliche Browser (Brave, Mullvad...) sowie Werbeblocker (uBlock, Ghostery...) blockiert wird.
Damit gehen dem Autor der Online-Texte Einnahmen verloren, die ihm aber gesetzlich zustehen.

Bitte beheben Sie dieses Problem!

Diese Nachricht wurde von mir persönlich abgeschickt und lediglich aus einer Vorlage generiert.
Wenn der Klick auf den Button keine Mail öffnet, schreiben Sie bitte eine Mail an info@vgwort.de und weisen darauf hin, dass der VG Wort Dienst von datenschutzfreundlichen Browser blockiert wird und dass Online Autoren daher die gesetzlich garantierten Einnahmen verloren gehen.
Vielen Dank,

Ihr Klaus Meffert - Dr. DSGVO Datenschutz-Blog.

PS: Wenn Sie meine Beiträge oder meinen Online Website-Check gut finden, freue ich mich auch über Ihre Spende.
Ausprobieren Online Webseiten-Check sofort das Ergebnis sehen

Microsoft Copilot as a danger: EchoLeak reveals vulnerability

0
Dr. DSGVO Newsletter detected: Extended functionality available
More articles · Website-Checks · Live Offline-AI
📄 Article as PDF (only for newsletter subscribers)
🔒 Premium-Funktion
Der aktuelle Beitrag kann in PDF-Form angesehen und heruntergeladen werden

📊 Download freischalten
Der Download ist nur für Abonnenten des Dr. DSGVO-Newsletters möglich

EchoLeak is a security vulnerability in Microsoft 365 Copilot that allowed attackers to access sensitive company data. No action on the part of the Copilot user was required. Rather, a malicious instruction to Copilot was smuggled into an email that the victim only had to receive to be exposed to the security problem.

Introduction

Microsoft's Copilot is a general-purpose AI that appears useful in some aspects, but often must be considered a failure. A simple test with a very simple task revealed the blatant failure of Copilot on a functional level.

It is well-known that Microsoft extensively collects data from customers without any shame, in fact, Microsoft even publicly announces this (probably for legal reasons).

Recently, there is a great danger for companies and private users from Copilot. Copilot accesses almost all data on computers where this AI thing is installed. Through the Microsoft 365 platform, into which Copilot is “seamlessly” integrated, data is sent around the world, for example via the Teams app.

The timeline of the vulnerability, from discovery to fix and announcement of the alleged solution.

EchoLeak takes advantage of this cleverly. A Copilot user didn't have to make a mistake to be affected. It was enough to be a Copilot user to be attacked and exposed to data losses. Microsoft quietly hushed up the security flaw and closed it only four months after being reported by a “whistleblower”.

What is EchoLeak?

EchoLeak is a Zero-Click Security Vulnerability. That means: You're the fool, even if you've done nothing wrong and haven't clicked anywhere where you shouldn't have clicked. Fault lies not with you, but Microsoft 365 Copilot.

The result of EchoLeak is exfiltrated data from your system. Through this exfiltration, unauthorized persons obtain your data, because of Copilot and the great Microsoft 365 ecosystem. No user interaction is necessary for the attacker to be successful.

A large amount of data was affected by the security breach.

Microsoft 365 Copilot systems in all configurations were affected. Data access was possible via Word, Excel, PowerPoint, Outlook, Teams, SharePoint sites, OneDrive content and Microsoft Graph.

Technical analysis

The EchoLeak attack was successful because the Copilot language model (LLM) did not work with sufficient restrictions. Instead, all possible data from workstations running Copilot was simply fed into the Microsoft AI and analyzed. In this way, trustworthy internal data was mixed with untrustworthy external input in the same “thought process”.

This is how EchoLeak proceeded.

Thus, an email received from an attacker (even a spam email) could cause Copilot to access internal data and forward this data to the attacker.

There is a database that lists security vulnerabilities of relevant software products. This is called CVE (Common Vulnerabilities and Exposures). Each vulnerability is evaluated for its criticality, with 10 being the highest value.

EchoLeak: The AI security vulnerability in Copilot.

The Microsoft 365 Copilot vulnerability called EchoLeak received a score of 9.3, which shows how critical the problem was that Microsoft has kept quiet about.

Even if Microsoft had informed all customers, its actions would still be highly questionable. The very fact that the issue existed is a sign of Microsoft's strange understanding of data protection and data security. Full access to all data on a system should at least not happen by default. Perhaps a proper, transparent and easily understandable request to Microsoft-dependent customers would have been better?

That Microsoft worked on the problem for a whole four months, until it was hopefully resolved, shows another facet of the disaster. Microsoft cannot or will not solve extreme large security holes quickly enough that can threaten the existence of companies.

Conclusion

One can turn off Copilot. That would be best for many users. There are other and above all better systems that are significantly safer and also functionally better.

Many federal states and countries are moving away from Microsoft. Convenience is standing in the way for many of them. But there are increasingly more who recognize the structural danger that comes with Microsoft's cloud products.

Nowadays, great cloud products can not only be purchased from your own country, but can also be developed with much less effort than in the past. The effort required, compared to three years ago, is probably 10 times less.

There will definitely be no more security problems with Microsoft products in the future. I promise!

About the author on dr-dsgvo.de
My name is Klaus Meffert. I have a doctorate in computer science and have been working professionally and practically with information technology for over 30 years. I also work as an expert in IT & data protection. I achieve my results by looking at technology and law. This seems absolutely essential to me when it comes to digital data protection. My company, IT Logic GmbH, also offers consulting and development of optimized and secure AI solutions.

Understanding AI hallucinations: Causes & examples of artificial fiction