Nearly one in three organizations using Microsoft 365 Copilot report sensitive data exposures, survey finds

29% of organizations have had AI tools expose sensitive data they shouldn't have accessed, per a ShareGate survey of 850+ IT leaders. HR records, customer data, and financial files were among the most common exposures.

Published on: Apr 23, 2026
Nearly one-third of organizations report AI-driven data exposure incidents

Twenty-nine percent of organizations across the U.S., Canada and Europe have experienced incidents where AI tools surfaced sensitive data they should not have accessed, according to a ShareGate survey of more than 850 IT and security leaders. Another 8% said they did not know whether such incidents had occurred.

The exposed information includes HR records, customer data, financial information, and proprietary intellectual property. These are not edge cases; they are the documents organizations typically lock down.

The confidence gap

Ninety-three percent of IT leaders say their Microsoft 365 governance framework is ready to support AI responsibly. Yet 29% of those same organizations have already reported data exposure incidents.

The disconnect reveals a visibility problem, not a competence problem. Most teams are confident because they completed the work they know about. The problem is the work they do not know about: forgotten shares, inherited permissions, and content that is technically accessible but practically invisible.

How the incidents happen

In most cases, nobody broke any rules to access the data. The permissions were already there. Copilot simply followed them.

Benjamin Niaulin, vice-president of product at ShareGate, said the introduction of AI is revealing existing weaknesses. "Every oversharing group and forgotten permission is one Copilot prompt away from becoming a real incident," he said. "You can't govern what you can't see, and right now, most teams can't see it."
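The mechanism is easy to model. A Copilot-style assistant does not bypass permissions; it simply surfaces whatever the asking user can already read, including content reachable only through a forgotten broad share. The toy sketch below (all folder names, files, and groups are invented for illustration, not real tenant data or a real API) shows how an inherited, overly broad folder ACL quietly puts a finance file in one employee's reach:

```python
# Toy model of permission inheritance. An AI assistant that honors
# existing ACLs will surface anything the asking user can already read.
# All names here are illustrative placeholders, not a real tenant's data.

# Folder-level ACLs; files inherit from their folder unless they set their own.
folder_acl = {
    "/hr": {"hr-team"},
    "/finance": {"finance-team", "all-staff"},  # forgotten broad share
}

files = {
    "/hr/salaries.xlsx": None,          # None = inherit from the folder
    "/finance/budget-2026.xlsx": None,
}

user_groups = {"alice": {"all-staff"}}  # Alice is not on the finance team

def can_read(user, path):
    """A user can read a file if any of their groups is in the effective ACL."""
    folder = "/" + path.strip("/").split("/")[0]
    effective_acl = files[path] or folder_acl[folder]
    return bool(user_groups[user] & effective_acl)

def assistant_scope(user):
    """Everything an ACL-respecting assistant could surface for this user."""
    return [path for path in files if can_read(user, path)]

print(assistant_scope("alice"))  # → ['/finance/budget-2026.xlsx']
```

No rule is broken at any step: the `all-staff` entry on the finance folder was granted long ago, and the assistant merely follows it. This is why audits of effective permissions, not just explicit file-level grants, matter before an AI rollout.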

What data exposure looks like

The survey found the following types of sensitive information were exposed:

  • Customer records (36%)
  • Sensitive internal documents (31%)
  • Personal data and personally identifiable information (30%)
  • HR records (30%)
  • Financial data (25%)
  • Proprietary intellectual property (21%)

Preparation falls short

Eighty-six percent of organizations conducted a content cleanup in preparation for AI deployment. Only about half completed an organization-wide review of content and permissions.

Partial cleanups leave gaps. AI does not respect rollout phases: it indexes everything it can see.

Governance workload and automation

Seventy-one percent of respondents said their governance workload has grown since enabling AI tools. Nearly a quarter reported significant increases.

Only 37% described their governance as highly automated and continuously monitored. Twenty-six percent have operationalized governance with consistent enforcement. The rest rely largely on manual or reactive processes. Manual governance does not scale when AI is surfacing risks faster than teams can review them.

Governance affects ROI and confidence

Seventy-eight percent of organizations said governance activities directly affect their confidence in AI investments. Fifty-one percent cited cost visibility and 47% cited governance complexity as barriers to measuring AI return on investment.

Forty-nine percent of organizations say AI-related costs account for 11% or more of their IT budget. Without governance clarity, that money lacks measurable return.

Ownership and accountability

Forty-eight percent of organizations have a clearly defined AI governance owner with formal policies and consistent enforcement. Twenty-six percent have an owner but varying enforcement by department. The rest have shared ownership or none.

No owner means no accountability. No accountability means no one to champion the business case for AI investments.

HR leaders should review their organization's permission settings and content accessibility now. The data exposure incidents documented in this survey suggest that inherited permissions and forgotten shares pose real risks to employee records and sensitive HR information.

For HR professionals responsible for data governance, AI for Human Resources covers how to manage these risks responsibly. For HR executives developing AI strategy, the AI Learning Path for CHROs addresses governance, data protection, and responsible implementation.
