Copilot turned on, and suddenly everyone sees everything

40% of IT leaders have delayed their Copilot rollout by three months or more, according to Gartner. Not because Copilot failed, but because it did exactly what it was built to do: find information.

The pattern is always the same: Copilot is activated for everyone, across the entire data estate. Employees ask questions, and Copilot answers, drawing on everything the user technically has permission to access. In most companies, that is significantly more than anyone realizes.

Imagine an office building with hundreds of filing cabinets, many of them unlocked, some containing salary data, contracts, and strategy papers. Until now, this was rarely a problem: to exploit it, an employee would have to walk through three floors, flip through hundreds of folders, and hope nobody noticed. Nobody does that. Copilot does it in seconds, silently, from the desk.

How real this is became clear in an incident affecting Microsoft 365 customers in early 2026: a code error in Copilot's retrieval pipeline bypassed sensitivity labels and processed emails marked as confidential, including contract negotiations and legal documents, despite active protection policies.

Microsoft now takes this issue seriously. The official rollout guidance recommends auditing the 100 most active SharePoint sites for oversharing before broad deployment. That this step is necessary at all shows the scale of the problem.
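That audit can be operationalized: Microsoft Graph usage reports (for example, `getSharePointSiteUsageDetail`) export per-site activity as CSV, which can be ranked to produce the top-100 review list. Below is a minimal sketch; the column headers and the sample data are assumptions for illustration, since the exact headers depend on the report version your tenant returns.

```python
import csv
import io

def top_active_sites(report_csv: str, n: int = 100) -> list[dict]:
    """Rank SharePoint sites by page views so the busiest ones
    can be reviewed for oversharing first."""
    reader = csv.DictReader(io.StringIO(report_csv))
    rows = list(reader)
    # Assumed column name; verify against your actual report export.
    rows.sort(key=lambda r: int(r["Page View Count"]), reverse=True)
    return rows[:n]

# Hypothetical excerpt of a usage report export (not real tenant data):
SAMPLE = """Site URL,Page View Count,Active File Count
https://contoso.sharepoint.com/sites/hr,9400,120
https://contoso.sharepoint.com/sites/eng,31000,87
https://contoso.sharepoint.com/sites/legal,2200,14
"""

for site in top_active_sites(SAMPLE, n=2):
    print(site["Site URL"], site["Page View Count"])
```

The resulting list is only the starting point: each site on it still needs a manual review of its sharing links and permission inheritance.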

This isn't an AI problem. It's a governance problem that AI makes visible. And honestly, that visibility is the best thing that can happen: whoever cleans this up now doesn't just solve the Copilot problem, they lay the foundation for AI to actually work in the enterprise.
