AI EDUCATION: What Is Enshittification?


Each week we find a new topic for our readers to learn about in our AI Education column. 

I’ve never been a big fan of platforms, whether they be for trains, games or businesses. Sooner or later, without proper upkeep and attention, platforms turn to crap.  

Sometimes literally. Just go to New York’s lovely subway system and enjoy the odors. 

And that, ladies and gentlemen, is what will bring us to this week’s AI Education topic, enshittification, a lovely, profane little neologism that we’re coincidentally talking about during the holiest week in the Western Christian calendar. 

Enshittification, sometimes called crapification or platform decay, is the process by which online platforms and two-sided products and services tend to decline in quality over time. The term was coined in 2022 by Cory Doctorow to describe the gradual rot of the big legacy social media platforms like Twitter (now X) and Facebook. Doctorow believed these platforms had devolved from something useful and interesting into something shitty—hence “enshittification.” 

Two things: One, we could have called this article “What Is Platform Decay” and spared everyone the profanity, but enshittification is by leaps and bounds the more entertaining term. Thank you, Cory Doctorow. Two, it’s actually kind of hilarious how many articles about or citing enshittification are stuck behind paywalls, or plastered with popup and chumbox advertisements. The internet of 2026 is blind to its own ironies, and yes, the media has been “enshittified.” Especially the financial media. 

Everything Is Getting Worse, Isn’t It? 

Here’s how Doctorow summarized enshittification at the top of an oft-quoted January 2023 blog post: 

“Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die.” 

Doctorow and others have pointed to some familiar online platforms as examples—Facebook, once a place to connect with classmates, then friends (and then family), was invaded first by news and advertisers, and shortly thereafter by bots. Changing algorithms and issues with associated apps (does anyone legitimately use Messenger anymore?) have exacerbated user frustrations—but no other platform really offers what Facebook offers to users.  

Behind the scenes, Facebook was also making revenue-generating decisions seemingly at users’ expense, while making itself nearly indispensable to publishers and media outlets. When Facebook revises its arrangements, content creators and publishers don’t really have any choice but to comply. Twitter, once a useful place for collecting and disseminating fast-moving information and opinions, degenerated in a similar process that started long before it changed ownership and names. 

Amazon, which once operated at a loss to provide users with goods at low prices and free shipping, eventually became a platform provider for third parties offering their products, generating revenue through ever-growing layers of fees and promotion schemes and overwhelming users with sponsored search results, according to Doctorow. Google Search has declined in a similar manner. 

From a user perspective, we don’t think these examples are wrong—all of these platforms have declined in usefulness over time. Search is overwhelmed by ads and sponsored responses. Social networks aren’t really a place to network anymore, and social media is just an overwhelming glut of repetitive low-quality content.  

From an investor perspective, however, these are some of the most powerful and successful companies in history—and companies with significant footprints in AI. Is AI inevitably going to be, uh, enshittified? 

AI Is Connected to Enshittification 

Doctorow himself is publishing a book about AI’s connections to platform decay, but we’ll outline here what we’ve seen that suggests AI will inevitably succumb. 

We’ve covered the growing aversion to AI-generated content among some quarters of society in our columns before, but we again have to mention here that for some people, the proliferation of AI images and documents is itself an element of enshittification. We agree to an extent—content generated by or with the help of AI doesn’t have to make the internet or the world itself worse, but let’s face it, there’s already too much content out there, and AI is only going to make that problem worse. We don’t need more content, we need more exceptional content. 

Some of the early AI chatbots that were released into the wild—on social media networks, for example—were victims of a kind of mass social enshittification, as users force-fed the bots prompts to evoke racist or lewd behavior. 

But more importantly, look at the business models by which AI is being offered to users and businesses—AI as a service is pretty much how all of us are accessing the technology. A lot of the generative AI tools we first used were originally offered freely, and it was only as they were developed and refined that they were placed behind paywalls. Recent history with technology suggests that the user experience on all of these platforms will decay over time as AI developers also seek to develop ongoing revenue streams. At the same time, we’re going to become more dependent on these platforms in our business and our personal lives—which will enable the platform providers to squeeze revenue by degrading services, if they so choose. 

Doctorow’s Ways Out 

In a 2024 edition of the annual Marshall McLuhan Lecture at the Transmediale festival in Berlin, Doctorow offered four constraints that could potentially limit the process of enshittification: competition, regulation, self-help and labor. 

  • Competition works because companies care more about keeping prices reasonable and quality high when consumers can take their business elsewhere. Doctorow points out that today’s technology giants do not really have much in the way of competition. Similarly, ever-more powerful AI seems to be concentrating into fewer hands, and is not necessarily being made accessible to more people. 
  • Regulation helps avoid enshittification because companies will avoid cheating if there are well-written and well-enforced guidelines to keep them in check. 
  • Self-help works when users avail themselves of the power of technology to avoid or combat enshittification. Think of pop-up and advertisement blockers on your web browser. If the experience gets bad enough, users might find a way to “disenshittify” their technology. 
  • Labor, or workers themselves. Technology workers might just refuse to participate in the enshittification of their products and services. Of course, as more of the technology industry becomes automated by artificial intelligence… well, you hopefully get the picture. Another intersection of AI and enshittification.