Microsoft has confirmed a bug in Microsoft 365 Copilot that caused the AI assistant to read and summarize paying customers' confidential emails since late January, bypassing the data loss prevention (DLP) policies that organizations rely on to keep sensitive content away from AI processing. M365 Copilot Chat summarized users' emails whether or not they had granted it access, and the bug affected the Sent and Drafts folders as well.
The company says it has addressed the issue and that the bug "did not provide anyone access to information they weren't already ..." It is working on deploying a fix for impacted users, though a complete fix is reportedly not yet available.
Microsoft has also confirmed that the bug allowed Copilot to surface private email content from users' Exchange Online accounts, even when data loss prevention policies were in place.
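To make the DLP-bypass concrete, here is a minimal, purely illustrative sketch of the kind of sensitivity-label gate that is supposed to run before an AI assistant summarizes a message. This is not Microsoft's implementation: the `Email` type, the label names, and `copilot_summarize` are all hypothetical, and the reported bug amounts to a check like this being skipped.

```python
from dataclasses import dataclass

# Assumed label taxonomy for illustration; real DLP policies are far richer.
BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

@dataclass
class Email:
    subject: str
    body: str
    sensitivity_label: str  # e.g. applied by a DLP/sensitivity policy

def summarize(text: str) -> str:
    # Stand-in for a model call: just truncate the body.
    return text[:60] + ("..." if len(text) > 60 else "")

def copilot_summarize(email: Email) -> str:
    # The gate the reported bug effectively bypassed: refuse to
    # summarize messages carrying a protected sensitivity label.
    if email.sensitivity_label in BLOCKED_LABELS:
        return "[Summary unavailable: message is protected by a DLP policy]"
    return summarize(email.body)
```

With a "Confidential" label, `copilot_summarize` returns the blocked-summary placeholder instead of passing the body to the summarizer; with an unprotected label, it summarizes normally.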