Microsoft’s Copilot was secretly reading confidential emails for weeks, and what it did with them is every company’s worst nightmare

Your "confidential" emails weren't so confidential.

Microsoft’s Copilot was recently caught summarizing confidential emails without the proper permissions, completely bypassing the security policies designed to keep that sensitive information protected. This is a significant concern for companies that rely on AI assistants to handle their data.

According to Mashable, the issue specifically affected Copilot Chat for some Microsoft 365 enterprise users. Copilot Chat rolled out to Microsoft 365 apps like Word, Excel, Outlook, and PowerPoint for business customers last fall, and it is marketed as a content-aware AI assistant that helps users create documents and process information.

The bug, which Microsoft tracked internally as CW1226324, caused emails labeled as confidential to be “incorrectly processed by Microsoft 365 Copilot chat.” Copilot Chat was pulling in and summarizing emails from users’ Sent Items and Drafts folders, even though these messages had sensitivity labels specifically designed to block automated access.
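Sensitivity labels are meant to act as a hard gate in front of any automated processing: the label check runs before a message is ever added to the model’s context. As a rough illustration, here is a minimal sketch in Python of what that gate looks like. Every name in it (MailItem, BLOCKED_LABELS, summarize_mail) is hypothetical; this is not Microsoft’s actual code or API.

```python
# A minimal sketch (assumed names, not Microsoft's actual implementation)
# of the label check an assistant should perform before ingesting mail.
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical policy: labels that must block automated processing.
BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

@dataclass
class MailItem:
    subject: str
    body: str
    sensitivity_label: Optional[str]  # e.g. "Confidential", or None if unlabeled

def eligible_for_ai_processing(item: MailItem) -> bool:
    """Allow automation only for items whose label is not blocked."""
    return item.sensitivity_label not in BLOCKED_LABELS

def summarize_mail(items: list[MailItem], summarize: Callable[[str], str]) -> list[str]:
    """Summarize only eligible items, so labeled mail never reaches the model.
    The reported bug behaved as if this filter were skipped for messages
    in Sent Items and Drafts."""
    return [summarize(item.body) for item in items if eligible_for_ai_processing(item)]
```

The design point is that the gate sits upstream of the model: if the check is skipped, as appears to have happened here, everything behind it is silently exposed.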

Integrating AI into the workplace comes with serious security risks that businesses cannot afford to ignore

This kind of incident highlights the risks of using AI in the workplace. Businesses that use AI assistants could face serious problems, including prompt injection vulnerabilities and major data compliance violations, when these tools access information they were never supposed to see. Microsoft has been expanding its AI tools across multiple industries, which makes security issues like this one even more consequential.

Microsoft confirmed that a code issue caused the bug. The company began rolling out a fix in early February and is actively monitoring the patch’s deployment. It is also reaching out to some of the affected users to confirm that the fix is working. Microsoft has not disclosed how many organizations were impacted by the security bypass, but noted that the scope may change as its investigation continues.

The incident is a reminder that even large tech companies face unforeseen challenges when integrating powerful new technologies into their products. Microsoft has also drawn attention recently for quietly pushing software updates to smart TVs, raising further questions about how the company handles user consent. 

Companies that depend on these systems to handle sensitive data take on real risk when the tools do not behave as expected. For businesses using Microsoft 365, this serves as a clear example of why data security policies need to be tested and verified regularly, especially when AI tools are involved in handling confidential communications.
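For teams that would rather verify this behavior than trust it, a simple regression test can assert that labeled mail never reaches the assistant. The sketch below reuses the hypothetical MailItem and summarize_mail helpers from the earlier example; it is illustrative, not a real Microsoft 365 test harness.

```python
# A hypothetical regression test, reusing the MailItem/summarize_mail sketch
# above. It asserts that labeled mail is never passed to the summarizer.
def test_confidential_mail_is_skipped():
    items = [
        MailItem("Q3 plan", "internal revenue figures", "Confidential"),
        MailItem("Lunch", "pizza on Friday?", None),
    ]
    seen = []  # records every body the fake model actually receives

    def fake_model(body: str) -> str:
        seen.append(body)
        return body[:10]

    summaries = summarize_mail(items, fake_model)
    assert "internal revenue figures" not in seen  # labeled item never reached the model
    assert len(summaries) == 1                     # only the unlabeled item was summarized
```

Running a check like this against each release would catch a regression of this kind before confidential content ever reached a model in production.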

