
Somewhere between Satya Nadella's earnings calls and the product pages promising to transform how you work, Microsoft inserted a sentence into Copilot's Terms of Use that reads rather differently from the rest of its AI pitch.
Under a section labeled in bold capital letters "IMPORTANT DISCLOSURES & WARNINGS," updated October 24, 2025 and surfacing widely this week, Microsoft's Copilot terms state: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk."
The same document states that Microsoft makes "no warranty or representation of any kind about Copilot," that users are solely responsible if they publish or share the AI's responses, and that users must indemnify Microsoft against claims arising from their use.
The Gap Between Marketing and Legal
This is not subtle hedging. "Entertainment purposes only" is a phrase typically reserved for horoscopes and personality quizzes - not a product Microsoft has positioned as the cornerstone of enterprise productivity, integrated across Windows 11, Microsoft 365, Word, Excel, and Teams, and priced at a significant premium for corporate customers.
A Microsoft spokesperson told PCMag the language is "legacy" and "no longer reflective of how Copilot is used today," with an update planned for the next revision. No timeline was given.
In context, the disclaimer looks like more than a legal technicality. In August 2024, Copilot falsely identified German court reporter Martin Bernklau as a convicted child abuser and fraudster and provided his home address. In January 2026, it generated false claims about football-related violence. Recon Analytics found Copilot's accuracy Net Promoter Score deteriorated from -3.5 in July 2025 to -24.1 by September 2025 - and that 44.2% of lapsed users cited distrust of answers as their primary reason for abandoning the product.
The Adoption Problem
The reliability concerns connect directly to a commercial challenge. As of Microsoft's FY2026 Q2 earnings, only 15 million of its approximately 450 million paid Microsoft 365 seats have upgraded to paid Copilot - a 3.3% conversion rate. US paid subscriber market share contracted 39% in six months, from 18.8% in July 2025 to 11.5% in January 2026. When given a free choice among Copilot, ChatGPT, and Gemini, only 8% of workers select the Microsoft product.
Microsoft is not alone in hedging through fine print. OpenAI warns users not to rely on its outputs as a sole source of truth and caps liability at $100 or the amount paid in the prior 12 months. Google's Gemini terms caution against relying on it for medical, legal, or financial advice. Neither, however, has applied the phrase "entertainment purposes only" to a product it sells to enterprises at scale.
In my experience advising executives on AI adoption, the gap between a vendor's marketing and its legal terms is always worth reading carefully before deployment. Microsoft is telling enterprise customers in its contracts what its marketing does not say out loud: verify everything, trust nothing critical, and the responsibility is yours.



