Imagine Europe were to decriminalize cocaine, remove its stigma, and distribute it for free to everyone aged 16 and over. The societal and health consequences would be catastrophic: a drastic rise in addiction, widespread neurological harm, and the collapse of preventive healthcare structures. The substance hijacks our reward system and creates artificial dopaminergic activity without real achievement - context-free neurochemical stimulation that sends the brain into overdrive without anything substantial having happened.
This thought experiment is not mere speculation. A comparable mechanism already exists in our reality, is neither criminalized nor stigmatized, and is freely accessible to anyone with internet access: the recommendation algorithms of major social media platforms. A glance at the screen-time statistics on our smartphones is enough to grasp the scale.
The business model of the attention economy
Instagram, TikTok, YouTube & Co. are optimized down to the last detail to keep users in the app at any cost. The underlying business model is strikingly simple: free usage financed by advertising. The longer the retention time, the higher the ad revenue. Profit maximization through attention capitalization.
“So Instagram just shows me the content I like,” some may object. But this is precisely where the fatal misconception lies. It is not about content we like, but content that triggers us, hooks us, and doesn’t let go. Whistleblower Frances Haugen revealed in leaked internal documents in 2021 that Facebook fundamentally changed its algorithm in 2018: from then on, polarizing content was preferentially promoted because it demonstrably generated more engagement. Ahead of the 2020 U.S. election, these mechanisms were temporarily disabled - a remarkable admission of their democracy-threatening impact. Immediately after the election: reactivated.
The consequences go far beyond political polarization. Meta’s internal studies on the health of children and adolescents uncovered alarming results: 13.5 percent of teenage girls surveyed reported that Instagram intensified their suicidal thoughts. 17 percent said the platform worsened their eating disorders. Meta’s response? None. Growth before health.
A band-aid on a gaping wound
Returning to the dystopia introduced at the beginning: If the only measure against freely accessible cocaine were a ban for those under 16, would we have solved the problem? It might remove the most vulnerable from the immediate line of fire, but the systemic problem would remain unchanged. The drug would remain available, its use socially normalized, its production profitable.
This is precisely what the European Parliament has now called for with regard to toxic social-media algorithms: a usage ban for those under 16. It is better than nothing - undeniably. Yet it is like trying to cover a deep flesh wound with a band-aid while the bleeding continues.
Worse still: the measure could be counterproductive. By relieving platforms of formal responsibility for minors, it creates an incentive for even less moderation, fact-checking, and content curation. In the future, social-media giants will claim their apps are “16+,” while simultaneously shifting responsibility to overwhelmed parents. Minors who nonetheless gain access - and many will - will be confronted with even more toxic, even less regulated content.
Smoking ban 2.0
Confronting an addictive phenomenon with the same measures used against other addictive substances may appear logical at first glance. Brussels likely hopes for a success similar to that achieved when Germany raised the minimum age for tobacco purchases from 16 to 18 in 2007. Back then, the share of adolescent smokers roughly halved in the following years - a remarkable public-health achievement.
But this comparison overlooks essential differences. Tobacco is a regulated, paid product with social stigma and declining usage among adults. Social media, by contrast, is used by about 75% of 18- to 64-year-olds at least once a week, is free, ubiquitous, and not merely socially accepted, but often indispensable for social and professional participation.
If 75% of adults smoked and cigarettes were freely available on every corner, an age limit of 18 would hardly have succeeded. The structural conditions would be fundamentally different, and policymakers would - hopefully - have resorted to fundamentally different measures.
The surrender to Big Tech
Which brings us to the decisive question: why are we able to strictly regulate drugs and rein in tobacco corporations, while appearing to capitulate to Meta, TikTok, and Google? These companies endanger not only the mental health of an entire generation, but also the foundations of our democracy and social cohesion.
The answer is uncomfortable: we do not regulate the algorithms themselves, nor the business models, nor the systemic incentive structures. Instead, we shift the burden onto individuals and families, while the true beneficiaries continue undisturbed.
Effective youth protection is desirable. But what we really need is the courage to pursue structural solutions:
Regulation of recommendation algorithms
Mandatory chronological feeds as the default
Transparency of ranking mechanisms
Liability for demonstrable harm
Measures that protect not only minors, but all of us. Because digital cocaine is freely available to everyone. And as long as that remains the case, a minimum age is merely a cosmetic correction to a toxic system.