Microsoft put the same disclaimer on Copilot that a psychic uses to avoid getting sued

- Microsoft’s Copilot terms of use explicitly state, “Copilot is for entertainment purposes only.”
- While other AI companies warn users to double-check AI output, this Copilot disclaimer goes quite a bit further.
- Microsoft has been heavily promoting Copilot’s business uses despite the entertainment-only message.
For all the complaints people make about AI replacing human skills, there’s another side to it: The rise of AI has also forced humans to develop new skills, specifically the ability to sort useful AI output from incorrect, hallucinated garbage. Over the past couple of years, many of us have gotten pretty good at this, and have learned to make the most of AI agents despite their many limitations. While the companies behind these projects are similarly aware of the limitations we’re up against, one of them seems to be overcompensating a bit in the legal department, as Copilot users notice some concerning language in Microsoft’s terms of service.
Anyone using AI for anything even remotely serious should know by now to sanity check the program’s output — AI will confidently share mistakes as truths, and users need to be vigilant to not take its output at face value. Correspondingly, all the major players make disclaimers to this effect, trying to promote their products’ benefits while also acknowledging their limitations. Google’s Gemini overview is a good example of this, explaining how Gemini does what it does, while also drawing attention to places where it still needs improvement.
And then there’s Microsoft. Like many other companies in the AI space, it likes to advertise all the important tasks Copilot can help you with, like coming up with new strategies for your business:
On its face, there’s nothing unusual there — pretty par for the course with your modern AI platform. We just hope that the business customers Microsoft’s going after with spots like this aren’t reading the full Copilot terms of use (via Tom’s Hardware). Because if they did, they’d see this concerning disclaimer:
Copilot is for entertainment purposes only.
That “only” is doing a whole lot of work there. Microsoft continues:
It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.
Now, that bit actually sounds relatively in line with the disclaimers we’ve seen from other AI firms: Be careful, and verify the correctness of any AI output before acting on it. On its own, we wouldn’t be stopping and thinking twice about that popping up in Copilot’s terms.
But for whatever reason, Microsoft was compelled to go back and add in what’s basically an “LOL JK” to the entire document.
Honestly, this is probably just a lawyer feeling the need to overcorrect and cover Microsoft’s liabilities — they just happened to go much too far in the process, inviting ridicule. If only someone had asked Copilot to scan over the terms and spot any potential embarrassments — except, as we now know, that’s not what it’s made for!