Worth Reading – Microsoft’s Satya Nadella wants you to stop calling AI “slop” in 2026
I don’t find this very convincing:
“We need to get beyond the arguments of slop vs sophistication,” Nadella laments, emphasizing hopes that society will become more accepting of AI, or what Nadella describes as “cognitive amplifier tools.” “…and develop a new equilibrium in terms of our ‘theory of the mind’ that accounts for humans being equipped with these new cognitive amplifier tools as we relate to each other.”
The problem of AI slop is core to the industry. You can’t just wave it away and tell people to “get beyond” it when one of the main interactions everyday customers have with AI is the utter crap that they are exposed to online every day. When I read that statement from Satya, what I hear is that he thinks AI should be recognized as a tool that amplifies our cognition, and in his mind, he’s imagining a world where you and I can have a conversation while also having access to all the information we could ever need to be part of that conversation.
The problem is, the “amplifier” is spitting out junk. Our conversation may be enhanced by bringing in AI as a third party, but that third party is just as likely to be an ignorant participant spouting blatant untruths loudly and confidently. Why do we want that? The slop problem is the kind of PR problem that kills industries. It’s the Ford Pinto’s exploding gas tank, New Coke, and the iTunes U2 album fiasco all rolled into one. People see all the hype about AI, then experience entire websites and tens of thousands of social media accounts spewing AI-generated content no one wants.
That’s a perception problem that won’t go away just because you want customers to stop talking about it. You have to build a tool that doesn’t create this problem.
Microsoft hasn’t done that. Satya knows it, too. If it had, the words “AI-generated content may be incorrect” would not have been at the top of all my meeting notes. The warning is there, of course, because we know AI is going to get things wrong, and that the one thing we shouldn’t do is copy and paste its output as “truth” when it clearly isn’t.
Microsoft is not seeing the kind of AI adoption it wants precisely because of this issue. Too much of what we get from Copilot is slop. I don’t need AI to write poorly, leave me a ton of editing to do, or create cringe-worthy video content. The internet is full of that, and it’s not making anything better. We can’t talk about using AI to make our work and lives better until it stops making things worse.
I’m not going to get beyond that.
