TikTokification and Enshittification

Apr 19, 2025

Work in progress

TikTokification after Enshittification

I. Introduction

My most successful post on LinkedIn is one of my shortest: a one-liner, "Is it just me, or is LinkedIn now like 80% garbage AI posts?" Engagement-wise, it did very well. But beyond a quick smirk in an endless sea of corporate-friendly sales posts, I couldn't discern what value I was providing to society. This (followed by a Hacker News discussion thread) got me thinking: why have all platforms converged on promoting short-form content? Why is the quantity of user content rapidly increasing without an equivalent increase in quality? Why do I feel like I'm doom-scrolling on every app? Why is everything TikTokifying?

TikTokification is the denouement of Enshittification, where engagement-metric-driven growth models and an unregulated dopamine market force companies and platforms to offer algorithmically driven, endless doom-scrolling as their core product instead of socially beneficial and innovative ones.

Enshittification, famously analyzed by Cory Doctorow, describes the broader multi-stage process of platform decay. After locking in users, platforms inevitably pivot to squeezing value first from those users, then from their business customers, ultimately leaving behind a degraded service and user experience. A key mechanism driving this decline is relentlessly catering to the fleeting attention and minimal agency of the 'marginal user': prioritizing growth metrics by optimizing for the simplest, most passively engaged user and stripping away complexity and user control. These conditions pave the way for TikTokification to emerge as the dominant, engagement-maximizing endpoint. TikTokification, then, is the specific, convergent form this decay now takes: a landscape dominated by addictive, algorithmically controlled feeds that prioritize fleeting engagement over substance and user agency.

TikTokification is a problem. First, when every platform TikTokifies, we as a society lose out on innovative and socially beneficial products and are instead fed cheap, short-form content. Where do we go for actual utility? Second, short-form content has been shown to be addictive and harmful; we risk drowning on Adderall in a few years, unable to read a book. And with AI becoming more common, the barrier to entry for making content has never been lower, which means engagement-driven product choices will force even more TikTokification. It is a vicious circle.

II. Body

A. Defining the Terms

Enshittification and TikTokification are related terms. To understand them, let's follow the typical user journey through the lifecycle of a company. As a startup, the company creates an innovative product that attracts enthusiasts. Over time, more people join the club. The company then starts locking people in, be it via hard-to-cancel subscriptions, subtle psychological tricks, or outright vendor lock-in. As more consumers are locked into the product or platform, the company starts to maximize value extraction.

Netflix went from offering only ad-free streaming to adding an ad-supported tier, pricing the new tier lower while raising the price of the ad-free one. As consumer value extraction plateaus, the platform starts to decay; for Netflix, this looked like competitors pulling key titles back for their own services. As the platform decays, innovation at the company stops. Product quality stagnates (e.g., Adobe Photoshop), which the company gets away with thanks to a lack of competition.

The "Tyranny of the Marginal User" is a key mechanism here. To keep growth metrics going up, platforms must appeal to wider and wider populations, eventually targeting users who barely care about the service at all. But if you can hook them with dopamine hits from 5-second videos or mysterious content that disappears from their feeds, you've gained one more user for your metrics, and the VCs give you a treat. These marginal users are the worst to cater to, yet everyone caters to them, because they're the only ones left on the growth frontier.

TikTokification is the current form of enshittification. Somewhere between the value-extraction and decay stages, the company starts optimizing for engagement metrics. Engagement metrics (DAU/MAU, average session duration, conversion rate, customer retention rate, NPS, LTV/CLTV, etc.) lead to prioritizing the marginal user. The pressure is often top-down: investors, VCs, board members, founders, and managers all chasing growth. Growth needs to be measured, and it is far easier for employees down the food chain to say they increased time spent on the platform by 15% over last month than to say they shipped a product that raised user satisfaction by 20%.
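To make that asymmetry concrete, here is a minimal sketch, using a made-up event log rather than any real platform's data, of how cheap an engagement headline is to produce: a handful of lines turns a log of sessions into DAU/MAU "stickiness" and average session duration. There is no equally convenient query for "satisfaction."

```python
# A toy sketch, not any platform's real pipeline: a tiny event log and the
# few lines it takes to turn it into a headline engagement number.
from collections import defaultdict
from datetime import date

# Hypothetical event log: (user_id, day, seconds spent in that session)
events = [
    ("alice", date(2025, 4, 1), 300),
    ("alice", date(2025, 4, 1), 120),
    ("bob",   date(2025, 4, 2), 45),
    ("carol", date(2025, 4, 2), 900),
    ("alice", date(2025, 4, 15), 60),
]

def stickiness(events, month=(2025, 4)):
    """DAU/MAU: average daily active users divided by monthly active users."""
    monthly_users = set()
    daily_users = defaultdict(set)
    for user, day, _ in events:
        if (day.year, day.month) == month:
            monthly_users.add(user)
            daily_users[day].add(user)
    avg_dau = sum(len(users) for users in daily_users.values()) / len(daily_users)
    return avg_dau / len(monthly_users)

def avg_session_duration(events):
    """Average seconds per logged session."""
    return sum(seconds for _, _, seconds in events) / len(events)

print(f"DAU/MAU stickiness: {stickiness(events):.2f}")                # 0.44
print(f"Avg session duration: {avg_session_duration(events):.0f}s")  # 285s
```

That a junior PM can produce these numbers in an afternoon, while "satisfaction" needs surveys and interpretation, is exactly why the former ends up on the slide deck.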

B. How We Got Here: From Enshittification to TikTokification

To boost engagement metrics, companies build recommendation algorithms, add infinite scroll, introduce short-form content, and push users towards a TikTok-like experience. All social media converges on one model - a model perfected by TikTok. A race to the bottom for the top growth metrics creates a fundamental shift in platform goals, from consumer utility to monetization. Yes, the social responsibility of a business is to make a profit. What about the moral responsibility?

We see this everywhere. LinkedIn used to be a professional network and job-posting site. Now it is a corporate, professional TikTok, not only housing short-form videos but also shifting its entire business model from making money on job postings to making money primarily off advertisements.

TikTok's algorithm seems to have hit the sweet spot in the unregulated dopamine market competing for our attention. The human mind clearly cannot self-regulate when it comes to dopamine hits. Tech companies know this; hence the blatant exploitation. The result is an arms race for attention. Just look at the major platforms recently:

  • Twitter (now X) has shifted from chronological text posts to algorithm-driven video content.
  • Instagram pivoted from a photography-sharing app to a short-form video platform mimicking TikTok's features.
  • YouTube has heavily promoted 'Shorts' while its algorithm increasingly favors shorter content.
  • Even Amazon now features TikTok-like product videos in its shopping experience.

This isn't mere coincidence - it's competitive convergence. As TikTok demonstrated unprecedented user growth through its algorithm-driven, short-form content model, other platforms have frantically followed suit, believing it's the only way to survive in the engagement economy. This algorithmic supremacy represents a fundamental shift from user control to passive consumption - we're no longer choosing what we see; it's being chosen for us.
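To put the shift in the plainest possible terms, here is a toy sketch, entirely my own illustration with invented posts and numbers rather than any platform's actual ranking code, of the same three items ordered two ways: the chronological feed the user asked for, and the engagement-ranked feed the metrics ask for.

```python
# A toy illustration (invented posts and numbers, not any platform's actual
# ranking code) of the same feed ordered two ways.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    posted_at: datetime
    predicted_watch_seconds: float  # what an engagement model might estimate

posts = [
    Post("friend_essay",   datetime(2025, 4, 18, 9, 0),  40.0),
    Post("niche_tutorial", datetime(2025, 4, 18, 12, 0), 55.0),
    Post("viral_clip",     datetime(2025, 4, 17, 20, 0), 180.0),
]

# The feed the user asked for: newest first, fully predictable.
chronological = sorted(posts, key=lambda p: p.posted_at, reverse=True)

# The feed the metrics ask for: whatever maximizes predicted time-on-app,
# regardless of recency or of whom the user chose to follow.
engagement_ranked = sorted(posts, key=lambda p: p.predicted_watch_seconds, reverse=True)

print([p.author for p in chronological])      # ['niche_tutorial', 'friend_essay', 'viral_clip']
print([p.author for p in engagement_ranked])  # ['viral_clip', 'niche_tutorial', 'friend_essay']
```

Same posts, same user - but in the second ordering, the user's own choices (recency, whom they follow) no longer decide what they see first.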

C. Why Are They a Problem? The Harms and Consequences

I firmly believe that TikTok and TikTokification are a net public harm. TikTokification degrades quality, utility, and innovation. Catering to the average user leads to "slop." Crucially, when companies aggregate over the average, they cannot innovate; they lose their core differentiation. As one Hacker News user elegantly puts it:

"For any given thing or category of thing, a tiny minority of the human population will be enthusiasts of that thing, but those enthusiasts will have an outsize effect in determining everyone else's taste for that thing...If you're designing a consumer-oriented web service like Netflix or Spotify or Instagram, you will probably add in some user analytics service, and use the insights from that analysis to inform future development. However, that analysis will aggregate its results over all your users, and won't pick out the enthusiasts, who will shape discourse and public opinion about your service. Consequently, your results will be dominated by people who don't really have an opinion, and just take whatever they're given...So if you develop your product by following your analytics, you'll inevitably converge on something that just shoves content into the faces of an indiscriminating userbase, because that's what the median user of any given service wants."

This metric-focused growth stifles innovation. Is it possible that some things are just beyond metrics? I believe so.
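A back-of-the-envelope calculation, with numbers I have made up purely for illustration, shows how the averaging the commenter describes plays out: a small group of enthusiasts with strong preferences is simply drowned out by a large, mildly indifferent majority.

```python
# Made-up numbers, purely for illustration: preference for "in-depth content"
# on a -1..+1 scale (+1 = strongly prefers depth, negative = prefers clips).
n_enthusiasts, enthusiast_pref = 50, +0.9     # 5% of users, strong preference
n_indifferent, indifferent_pref = 950, -0.1   # 95% of users, mild lean to clips

total_users = n_enthusiasts + n_indifferent
avg_pref = (n_enthusiasts * enthusiast_pref + n_indifferent * indifferent_pref) / total_users

# The dashboard sees only the aggregate and concludes "ship more clips".
print(f"Average preference for depth: {avg_pref:+.2f}")  # -0.05
```

Half a point of genuine enthusiasm from 5% of users vanishes into a rounding error; all the dashboard ever sees is the -0.05.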

The loss of purpose and utility is evident across platforms. LinkedIn’s job board has become so ineffective that I've yet to hear of anyone in 2024 who applied to a job posted there and actually heard back. Netflix and Spotify have transformed from searchable libraries into opaque "experiences" where user control is minimized. OKCupid abandoned detailed matching for mindless swipes.

The documented harms go beyond my personal grievances. Platforms' internal documents reveal intentional design for addiction: TikTok's internal memos admit to "compulsive usage," coercive design patterns targeting teens, and deliberate limits on user agency. Research has consistently correlated usage with anxiety, depression, loss of analytical skills, and diminished empathy. High "leakage rates" for harmful content expose vulnerable users to inappropriate material, while safety features are known internally to be ineffective. Leaders prioritize metrics over safety, treating the latter as mere PR "talking points."

The corrosive role of metrics and AI amplifies these problems. When DAUs and engagement dictate design decisions, they override considerations of user well-being. As one internal document revealed, "When we make changes, we make sure core metrics aren't affected" - even when those changes might address serious safety concerns.

AI serves as a perfect amplifier for this crisis. Large language models excel at "predicting the next word," making them ideal for creating the endless stream of mediocre content that feeds these engagement machines. As AI lowers the barrier to content creation, platforms will flood with algorithmically optimized bait, further degrading the quality of our digital experiences. Of all the transformative possibilities AI offers, using it primarily to generate engagement-optimized content feels like a profound misuse of our civilization's technical capabilities.

D. My Duality: The Participant-Critic Paradox

I find myself in a contradictory position. On one hand, I hate social media and the TikTokification of everything. On the other hand, I need social media to run a business and build professional credibility. Even I am guilty of TikTokification - writing content in the hope of more reach, sales, and engagement, and tracking the metrics to understand what performs well and what drives views and impressions.

This internal conflict mirrors larger tensions in our digital economy. In capitalism, it is often better to be a producer than a consumer - to harness rather than be harnessed by these systems. Yet I can't escape the feeling that there's something profoundly unsatisfying about this arrangement.

The AI dilemma intensifies this paradox. I can't imagine that 80% of AI applications should be devoted to content generation, yet I recognize the need to leverage some of these technologies to remain relevant. I want AI to solve substantive problems, not just generate more noise in an already oversaturated attention economy.

E. The Alternative: Reclaiming Agency & Utility

So, what is the alternative? Where market forces and profit motives reign, agents will always find ways to make money by satisfying demand - and unfortunately, there seems to be excess demand for brain-rotting content. The human mind struggles to self-regulate dopamine exposure, and tech companies staffed with brilliant people know this perfectly well.

If the free market isn't willing or able to regulate itself when it comes to addictive, attention-destroying platforms, perhaps government intervention becomes necessary. Without some form of regulation, we risk the future I described above: all of us drowning on Adderall, unable to read a book.

On an individual level, we can take steps to reclaim our digital agency:

  • Own your media where possible (like using Plex/Jellyfin instead of streaming platforms)
  • Practice conscious consumption: delete social media apps from your phone and only engage through the browser. Once you notice how deliberately degraded the web experience is, you'll naturally consume less
  • Seek real utility: Find platforms that still value function over addiction. As one commenter noted, sometimes what we need is the functional "horse," not the flashy but ultimately less useful alternative

On a systemic level, regulation focused on algorithmic transparency, data usage, and engagement optimization practices may be necessary, alongside potential antitrust measures to prevent the monopolistic tendencies that enable these behaviors.

III. Conclusion

TikTokification represents the harmful endpoint of platform decay - the final form of enshittification, where metrics and the dopamine economy triumph over genuine utility and innovation. When every platform converges on the same model of algorithmically driven, endless scrolling through short-form content, we lose both innovation and agency.

The stakes are high - not just individual attention spans, but our collective ability to create and benefit from truly useful technologies. The documented psychological and social harms, combined with the potential misuse of powerful technologies like AI, impact not just individuals but the entire potential of our digital sphere.

We stand at a crossroads. Will we accept a digital landscape optimized for addiction rather than utility? Or will we demand better, both from ourselves through conscious consumption and from our systems through appropriate regulation? I believe we must do both if we want to preserve the true promise of technology: enhancing human capability rather than exploiting human vulnerability.

The irony isn't lost on me that I'm writing this lengthy analysis in an age defined by shrinking attention spans. But perhaps that's precisely the point - some ideas require space to develop, some thoughts deserve more than a scroll-through. In a world racing to the shortest possible content format, sometimes the real innovation is remembering the value of depth.


Acknowledgements

Many of the ideas explored in this essay were sparked and refined by the insightful and lively discussions within the Hacker News community, particularly in this thread, which resonated with my own observations. Further inspiration and critical framing came from Cory Doctorow's concept of Enshittification and Ivan Vendrov's essay, The Tyranny of the Marginal User. The evidence regarding platform harms draws significantly from the compilation presented by Jonathan Haidt and Zach Rausch in their Substack post analyzing Attorney General briefs.

Nitesh Pant