- Microsoft released tools to address security issues with its AI assistant Copilot.
- Copilot’s indexing of internal data led to oversharing of sensitive company information.
- Some corporate customers delayed Copilot deployment due to security and oversharing concerns.
You know when a colleague overshares at work? It’s awkward at best.
Microsoft’s Copilot has been doing an AI version of the same thing, unnerving corporate customers so much that some have delayed deploying the product, as Business Insider first reported last week.
Now, the software giant is trying to fix the problem. On Tuesday, Microsoft released new tools and a guide to help customers mitigate a Copilot security issue that inadvertently let employees access sensitive information, such as CEO emails and HR documents.
These updates are designed “to identify and mitigate oversharing and ongoing governance concerns,” the company explained in a new blueprint for Microsoft’s 365 productivity software suite.
“Many data governance challenges in the context of AI were not caused by AI’s arrival,” a Microsoft spokesperson told BI on Wednesday.
AI is simply the latest call to action for enterprises to take proactive management of their internal documents and other information, the spokesperson added.
Those access decisions depend on each company’s situation: factors such as industry-specific regulations and varying risk tolerance should determine which employees can access which files, workspaces, and other resources, according to the Microsoft spokesperson.
“Microsoft is helping customers enhance their central governance of identities and permissions, to help organizations continuously update and manage these fundamental controls,” the spokesperson said.
Copilot’s magic — its ability to create a 10-slide road-mapping presentation, or to summon up a list of your company’s most profitable products — works by browsing and indexing all of your company’s internal information, like the web crawlers used by search engines.
Historically, IT departments at some companies have set up lax permissions for who can access internal documents — selecting “allow all,” say, for the company’s HR software, rather than going to the trouble of selecting specific users.
That never created much of a problem, because there wasn’t a tool that an average employee could use to identify and retrieve sensitive company documents — until Copilot.
As a result, some customers have deployed Copilot, only to discover that it can enable employees to read an executive’s inbox or access sensitive HR documents.
“Now, when Joe Blow logs into an account and kicks off Copilot, they can see everything,” said one Microsoft employee familiar with customer complaints. “All of a sudden Joe Blow can see the CEO’s emails.”
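The dynamic described above can be sketched in a few lines. This is a hypothetical illustration, not Microsoft’s actual access-control code: the document names, ACL format, and `accessible_docs` function are all invented to show why an assistant that indexes everything a user is technically permitted to read will surface documents left on an “allow all” setting.

```python
# Hypothetical sketch: an AI assistant that indexes everything a user
# *can* access will surface any document whose ACL was left as "allow all".
# All names and structures here are illustrative, not a real API.

SENSITIVE_DOCS = {
    "ceo_inbox.eml": {"allowed": {"ceo"}},          # properly restricted
    "hr_salaries.xlsx": {"allowed": "allow_all"},   # lax "allow all" setting
    "q3_roadmap.pptx": {"allowed": "allow_all"},
}

def accessible_docs(user: str) -> list[str]:
    """Return every document the given user is permitted to read --
    exactly the set a Copilot-style assistant would index and search."""
    results = []
    for name, acl in SENSITIVE_DOCS.items():
        if acl["allowed"] == "allow_all" or user in acl["allowed"]:
            results.append(name)
    return sorted(results)

# An ordinary employee sees the HR spreadsheet because of the lax ACL:
print(accessible_docs("joe_blow"))   # ['hr_salaries.xlsx', 'q3_roadmap.pptx']

# Tightening the ACL to specific users removes it from the index:
SENSITIVE_DOCS["hr_salaries.xlsx"]["allowed"] = {"hr_team"}
print(accessible_docs("joe_blow"))   # ['q3_roadmap.pptx']
```

The point of the sketch: the assistant introduces no new permissions of its own. It simply makes whatever the existing ACLs allow instantly searchable, which is why Microsoft’s guidance centers on auditing and tightening those permissions rather than changing Copilot itself.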
Are you a Microsoft employee or someone else with insight to share?
Contact Ashley Stewart via the encrypted messaging app Signal (+1-425-344-8242) or email ([email protected]). Use a nonwork device.