Microsoft clarified this week that it does not use Microsoft 365 users’ documents to train AI models. The company addressed the issue after a wave of social media posts misinterpreted its “connected experiences” feature, causing concern among users about potential misuse of their personal or business data.
The controversy stemmed from a Microsoft 365 feature called “optional connected experiences,” which is nestled within multiple layers of privacy settings. It provides cloud-powered capabilities, such as searching for online images or looking up definitions. The feature is enabled by default, though users can turn it off by unchecking a box and restarting the app. Although Microsoft’s documentation on connected experiences makes no mention of AI, machine learning, or large language models, many users feared that leaving it enabled would expose their data to AI training. The concern gained traction on social media, with IT professionals and lawyers advising others to disable the setting. After a Linux blog with 374,000 followers amplified the issue on X (formerly Twitter), Microsoft stepped in to clarify.
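For administrators who would rather script the change than hunt through menus, a minimal Python sketch (Windows-only) is below. It assumes the registry-backed policy value that Microsoft documents for managing Office privacy controls, controllerconnectedservicesenabled, where 2 means disabled; the key path and value name are drawn from that documentation and should be verified against the current version before use.

    # Minimal sketch (Windows-only, Python 3): write the per-user policy
    # value associated with "optional connected experiences" in Office.
    # KEY_PATH and VALUE_NAME are assumptions based on Microsoft's privacy
    # policy documentation; confirm them before relying on this script.
    import winreg

    KEY_PATH = r"Software\Policies\Microsoft\office\16.0\common\privacy"
    VALUE_NAME = "controllerconnectedservicesenabled"
    DISABLED = 2  # documented convention: 1 = enabled, 2 = disabled

    def disable_optional_connected_experiences() -> None:
        # Create the policy key if it does not exist, then set the flag.
        with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, DISABLED)

    if __name__ == "__main__":
        disable_optional_connected_experiences()
        print("Policy written; restart Office apps for it to take effect.")

Office reads policy settings like this at launch, which is also why the in-app checkbox requires a restart to take effect.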
“In the M365 apps, we do not use customer data to train LLMs,” the official Microsoft 365 account responded on X. The company explained that the setting merely activates cloud-connected features, such as document co-authoring. Frank Shaw, Microsoft’s head of communications, also rebutted the claims on Bluesky, pointing users to the documentation for further clarification.
This incident highlights how quickly misinformation can spread, even among well-meaning individuals. Users’ fears are not unfounded: many tech companies have made AI training on customer data the default. X, for instance, uses posts to train its chatbot Grok, Google draws on user data for its Document AI service, and Meta auto-enrolls users of Facebook, Instagram, and Threads in AI training unless they manually opt out.
Microsoft’s proactive clarification may ease some concerns, but the broader debate over data privacy in the AI age remains unresolved.