You’ve probably noticed it. Your Facebook feed full of images that look slightly wrong. A six-fingered Jesus made of shrimp. A YouTube Shorts scroll that feels like wading through an ocean of animated monkeys, AI cat soap operas, and zombie football stars. Books on Amazon written by authors who don’t exist. Google itself suggesting you eat rocks.
There’s a word for all of this now. AI slop.
The term exploded across the internet in 2024, got named Word of the Year by three separate language authorities in 2025, and has become the default way people describe the flood of low-quality AI content clogging up the internet. If you’ve seen it used and wondered what exactly it means, where it came from, and why it struck such a nerve, here’s the full story.
What Does “AI Slop” Mean?
AI slop is low-quality content generated by artificial intelligence and published in large quantities, usually without meaningful human oversight or creative intent. Merriam-Webster’s official definition: “digital content of low quality that is produced usually in quantity by means of artificial intelligence.”
But the definition only gets you halfway there. The word carries a specific vibe.
Not all AI-generated content is slop. If someone uses AI as a tool with care and creative direction, that’s just using a tool. Slop is what happens when someone points an AI at a platform and lets it spray content everywhere with zero concern for quality. The distinction matters.
Tech blogger Simon Willison, who helped push the term into mainstream discourse with a May 2024 blog post, put it this way: if it’s “mindlessly generated and thrust upon someone who didn’t ask for it,” that’s slop. He compared it to spam. Not all email is spam. But unsolicited, mass-produced junk email is. Same logic. Not all AI content is slop. But unsolicited, mass-produced junk AI content absolutely is.
The food metaphor is doing a lot of heavy lifting here, and that’s the point. Nobody wants to be served slop. The word is visceral, gross, and dismissive. It was chosen for exactly those reasons.
Where Did the Term “AI Slop” Come From?
The word “slop” is old. It dates back to the 1700s, when it meant soft mud. By the 1800s it had shifted to mean food waste (as in pig slop), and from there it became general slang for anything cheap and worthless.
What’s new is applying it to AI.
References to AI-generated content as “slop” started appearing in online spaces around 2022 and 2023, but the term didn’t have a single breakthrough moment. In 2024, a poet and technologist writing under the name “deepfates” described it as “the term for unwanted AI generated content” in a widely shared post on X. That framing clicked.
Then came Willison’s blog post in May 2024, titled “Slop is the new name for unwanted AI-generated content.” Willison has been clear that he didn’t invent the term. He amplified it. But his spam analogy gave the word a framework that made it stick. After that post, “slop” spread from tech circles into mainstream media within months.
Why “Slop” Became Word of the Year (Three Times)
This is the part that signals just how deeply the term has embedded itself in the culture.
Merriam-Webster named “slop” its 2025 Word of the Year. Dictionary president Greg Barlow said people “want things that are real, they want things that are genuine. It’s almost a defiant word when it comes to AI.”
The American Dialect Society selected “slop” as its 2025 Word of the Year on January 9, 2026, in its 36th annual vote with over 300 attendees. The society noted that by 2025, “slop” could stand alone without the “AI” prefix. The context was understood.
Macquarie Dictionary in Australia also named “AI slop” its 2025 Word of the Year, defining it as “low-quality content created by generative AI, often containing errors, and not requested by the user.”
Three major language institutions, independently, on three different continents, all landed on the same word. That doesn’t happen unless a term has captured something real about how people are experiencing the internet.
What Does AI Slop Actually Look Like?
The examples are everywhere. That’s kind of the problem.
Social Media Slop
The most iconic example is Shrimp Jesus. In March 2024, Facebook was flooded with bizarre AI-generated images of Jesus Christ fused with sea creatures. Crabs, shrimp, lobsters. The images were generated using tools like Midjourney and Leonardo, based on prompts that spammers had ChatGPT write for maximum engagement. Captions read things like “Say Amen to Shrimp Jesus for seven years of luck.”
Facebook’s recommendation algorithm promoted this content aggressively to anyone who interacted with it, creating a feedback loop. Shrimp Jesus became the poster child for AI slop on social media, but it wasn’t alone. AI-generated images of injured soldiers, sick children, and impossible animals flooded the platform throughout 2024.
YouTube Slop
A November 2025 study by Kapwing found that 21% of YouTube Shorts shown to new users were AI-generated slop. Among the 100 most-watched trending channels per country, 278 channels published exclusively AI-generated content. Those channels had accumulated 63 billion views, 221 million subscribers, and an estimated $117 million in annual ad revenue.
The content itself is surreal: animated monkeys in absurd scenarios, AI cat soap operas, zombie football stars, babies trapped in space. India’s Bandar Apna Dost channel alone earns an estimated $4.25 million yearly from AI-generated monkey videos.
YouTube CEO Neal Mohan declared managing AI slop a top priority for 2026 in his January annual letter. When the CEO of YouTube calls your content category a problem, the problem is real.
Books and Publishing
Amazon has been flooded with AI-generated books. Author Jane Friedman discovered multiple AI-generated books listed under her name and likeness on Amazon, which she had to fight to get removed. Scammers take public domain works, slap AI-generated summaries on them, and relist them with new covers. Even worse, AI-generated cookbooks and foraging guides have appeared with hallucinated information that poses genuine safety risks. Following a recipe that doesn’t exist is annoying. Following a foraging guide that hallucinates which mushrooms are safe to eat could be dangerous.
Search Results
Google’s AI Overviews launched in May 2024 and immediately produced viral failures. The system suggested users eat rocks for minerals and add glue to pizza, pulling from old Reddit jokes it had mistaken for real advice. Critics pointed out the irony: Google was adding its own layer of AI slop directly on top of search results, the very place people go to find reliable information.
Academic Publishing
Nature reported that AI slop is “causing a crisis in computer science.” NeurIPS submissions jumped from under 10,000 in 2020 to over 21,500, with a growing share AI-generated. A December 2025 study in Science analyzing approximately 2 million papers found that researchers using LLMs published roughly 36-60% more output depending on field, but quality metrics dropped. More papers, less impact.
“Your AI Slop Bores Me” and the Meme Backlash
The frustration with AI slop hasn’t just produced a vocabulary. It’s produced a whole meme culture.
The biggest one: “Your AI Slop Bores Me.” The phrase comes from a riff on the “Your Politics Bore Me” reaction image (a kid sitting on a throne of Pepsi cases). On October 17, 2025, the Artists Against Generative AI Facebook page posted the AI slop version. It racked up 4,300+ reactions and 3,200+ shares, and became a go-to reply any time someone posted AI-generated content.
By early 2026, the phrase had spread across Reddit, X, Facebook, and Instagram as a reaction meme. In March 2026, developer Mihir Maroju built an interactive web game inspired by the catchphrase. The concept: you submit a question you’d normally ask ChatGPT, and a real human “LARPs” as an AI to answer it. It went viral on Hacker News.
Then there’s “AI;DR” (AI, didn’t read), which emerged in early 2026 as a riff on TL;DR. Coined on Threads by developer David Minnigerode, it’s used to dismiss anything that reads like AI-generated text.
And then there’s “Microslop.” In late December 2025, Microsoft CEO Satya Nadella published a blog post urging the tech industry to move beyond “arguments of slop vs sophistication.” The internet responded exactly how you’d expect. “Microslop” trended immediately, a portmanteau of Microsoft and slop mocking the company’s aggressive AI integration. Classic Streisand effect. Ask people to stop using a word, and they’ll use it twice as hard.
The frustration with AI slop has also produced new slang beyond these memes. Clanker, the internet’s go-to insult for AI and robots, emerged from the same cultural moment of anti-AI sentiment.
Why “Slop” Resonated So Deeply
Before “slop” existed as a label, people had the feeling but not the word. You’d scroll past a weird AI image on Facebook and think “this is garbage,” but there wasn’t a shared term for the specific experience of encountering low-quality AI content you never asked for.
“Slop” gave that feeling a name. And the food waste metaphor made it intuitive. You don’t need anyone to explain why slop is bad. The word does the work on its own.
It also draws a useful line. Using AI as a tool? Fine. Dumping AI-generated content on people without care or craft? That’s slop. The word doesn’t condemn AI itself. It condemns laziness, carelessness, and the flood of junk that results from treating content as something to be produced rather than created.
If brain rot (Oxford’s 2024 Word of the Year) is the effect of consuming too much low-quality content, AI slop is one of its biggest fuel sources. And slop joins rage bait (Oxford’s 2025 pick) in a growing vocabulary for describing how the internet manipulates attention and erodes quality.
Some have drawn comparisons between AI slop and human-created low-effort content like NPC streaming, though the scale and automation of AI slop puts it in a different category entirely. A human NPC streamer is one person doing one thing. An AI slop operation can generate thousands of pieces of content per day.
The word isn’t going anywhere. If anything, it’s growing. Linguists have already tracked an expanding family of related terms: “sloppers” (people who produce slop), “slopocalypse” (the feared future where slop overwhelms everything), “sloptimized” (slop engineered for maximum engagement). A whole vocabulary is building around a single frustration.
The Bottom Line
AI slop is the defining internet phenomenon of 2024-2026. It names the exact moment when AI content generation outpaced AI content quality, and the internet noticed. The word captured a shared frustration so precisely that three language authorities independently named it Word of the Year.
The slop isn’t slowing down. YouTube is scrambling to manage it. Amazon can’t keep up with fake books. Google accidentally created its own slop with AI Overviews. Academic journals are drowning in AI-generated submissions. And the meme backlash, from “Your AI Slop Bores Me” to “Microslop,” shows that people aren’t just annoyed. They’re fighting back, one dismissive reply at a time.
If you’ve looked at something online and thought “a robot made this and nobody checked,” you’ve encountered AI slop. Now you have the word for it.
Frequently Asked Questions
Is all AI-generated content considered slop?
No. “Slop” specifically refers to AI-generated content that is low-quality, mass-produced, and pushed on people who didn’t ask for it. AI used thoughtfully as a tool, with human oversight and creative intent, isn’t slop. Simon Willison’s analogy is the clearest framework: not all email is spam, but unsolicited mass-produced junk email is. Same principle applies here.
Why was “slop” named Word of the Year in 2025?
Three major language institutions independently named it: Merriam-Webster, the American Dialect Society, and Australia’s Macquarie Dictionary. The term captured a widely shared frustration with the flood of low-quality AI content across social media, search results, publishing, and the web. Merriam-Webster’s president said people “want things that are real” and called it “almost a defiant word when it comes to AI.”
What is the “Your AI Slop Bores Me” meme?
It’s a reaction meme and catchphrase used to dismiss AI-generated content. It originated from an October 2025 post by the Artists Against Generative AI Facebook page, based on the “Your Politics Bore Me” Pepsi throne image. By early 2026, it became a widespread reaction across Reddit, X, and Instagram. Developer Mihir Maroju turned it into a viral web game where humans answer questions instead of AI.
What are common examples of AI slop?
The biggest examples include: “Shrimp Jesus” and other bizarre AI-generated images flooding Facebook in 2024; AI-generated books and fake author profiles on Amazon; YouTube channels producing exclusively AI-generated Shorts that have accumulated 63 billion views globally; Google AI Overviews suggesting users eat rocks and add glue to pizza (May 2024); and AI-written academic papers overwhelming peer review at major conferences like NeurIPS.
What does “Microslop” mean?
“Microslop” is a portmanteau of “Microsoft” and “slop.” It trended on social media in January 2026 after Microsoft CEO Satya Nadella published a blog post asking the tech industry to move beyond “arguments of slop vs sophistication.” The internet responded by coining “Microslop” to mock Microsoft’s aggressive push of AI into its products. It’s a textbook Streisand effect: asking people not to use a word guaranteed they’d use it more.