Microsoft has acknowledged a bug that allowed its Copilot AI to summarize customers' confidential emails without consent for several weeks.
The issue, first highlighted by industry sources, let Copilot Chat access and summarize email contents since January, bypassing data loss prevention policies intended to keep sensitive information from being processed by Microsoft's large language model.
Copilot Chat is a feature available to Microsoft 365 subscribers that lets them use AI capabilities within Office applications such as Word, Excel, and PowerPoint.
The company said the bug, which administrators can track under the identifier CW1226324, caused draft and sent emails carrying a confidential label to be incorrectly processed in the Microsoft 365 Copilot chat interface.
Microsoft said it began rolling out a fix earlier in February, but it has not disclosed how many customers were affected.
In a related development, the European Parliament's IT department recently told lawmakers it had disabled built-in AI features on their work devices, citing concerns that the tools could inadvertently upload confidential communications to the cloud.