You Are the Product

The Transaction You Did Not Agree To

Every time you open a social media app, you enter an economic transaction. You receive content — posts, videos, news, arguments, entertainment. In return, you provide something more valuable than money: your attention. Your attention is then packaged, quantified, and sold to advertisers who pay the platform for the privilege of placing their messages in front of your eyes.

This is not a secret. It is the explicit business model of every major social media platform, every free news website, and every ad-supported media company. But the implications of this model for the quality of information that reaches you are rarely stated plainly.

The platform's customer is the advertiser. The platform's product is your attention. The content you see is not selected because it is true, important, or useful. It is selected because it is likely to keep you on the platform long enough to see the next advertisement. Truth, importance, and usefulness are sometimes compatible with engagement. They are not optimised for.

The Mechanics of Engagement

The algorithms that determine what appears in your feed are not designed to inform you. They are designed to predict, with increasing precision, what you will click on, react to, share, or spend time viewing. They are trained on billions of data points from billions of users, and they are extremely good at their job.

Research has consistently shown that the content most likely to generate engagement is content that provokes emotional reactions — particularly anger, outrage, fear, and tribal identification. A headline that makes you angry is more likely to be clicked than a headline that makes you think. A post that confirms your existing beliefs is more likely to be shared than a post that challenges them. A video that shocks you is more likely to be watched to completion than a video that methodically explains a complex situation.

This is not a design flaw. It is the design. The algorithm does not have a bias toward misinformation. It has a bias toward engagement. The problem is that misinformation is often more engaging than truth, because truth is complicated, nuanced, and frequently unsatisfying, while misinformation can be crafted to be simple, emotionally resonant, and perfectly aligned with what the audience wants to hear.
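Real ranking systems are vastly more complex than anything that fits on a page, but the incentive structure described above can be illustrated with a toy sketch. Every name, weight, and number below is invented for illustration; the point is only what the objective function contains, and what it does not:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    p_click: float           # predicted probability of a click
    p_share: float           # predicted probability of a share
    expected_seconds: float  # predicted time spent on the post
    accurate: bool           # known to us for the example; invisible to the ranker

def engagement_score(post: Post) -> float:
    # A toy ranking objective: a weighted mix of predicted interactions.
    # Note what is absent: there is no term for accuracy, importance,
    # or usefulness, because the ranker never sees those properties.
    return 1.0 * post.p_click + 2.0 * post.p_share + 0.01 * post.expected_seconds

feed = [
    Post("Nuanced explainer on a complex policy", 0.02, 0.01, 40.0, accurate=True),
    Post("Outrage headline confirming your side", 0.20, 0.15, 25.0, accurate=False),
]

# The feed is ordered purely by predicted engagement.
ranked = sorted(feed, key=engagement_score, reverse=True)
for post in ranked:
    print(f"{engagement_score(post):.2f}  {post.title}")
```

With these invented numbers, the inaccurate outrage headline outranks the accurate explainer, not because the system prefers falsehood, but because falsehood can be tuned for clicks and shares while the objective contains no penalty for it.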

The Attention Market

The numbers involved are staggering. Global digital advertising revenue exceeded $600 billion in 2024. Meta's advertising revenue alone was approximately $160 billion. Google's was roughly $265 billion. These are not technology companies in any meaningful sense. They are advertising companies that happen to use technology as their delivery mechanism.

The competition for attention is not between platforms and users. It is between platforms. Facebook competes with TikTok competes with YouTube competes with Instagram competes with X for the same finite resource: hours of human attention per day. The platform that captures more attention sells more advertising. The platform that sells more advertising generates more revenue. The platform that generates more revenue invests more in the algorithms that capture more attention.

The result is an arms race in which the weapons are psychological manipulation techniques refined at a scale and speed that no previous advertising industry could have imagined. Every scroll, every pause, every click is a data point that makes the next prediction more accurate and the next piece of content more precisely calibrated to hold your attention.

What This Costs

The cost is not primarily financial, though the financial dimension is real — people buy things they see advertised, and the advertising is increasingly targeted to moments of vulnerability and impulse.

The deeper cost is informational. When the primary mechanism for distributing news is a system optimised for engagement rather than accuracy, the news that reaches you is systematically distorted. Not by a conspiracy, but by an incentive structure.

Stories about conflict are amplified over stories about cooperation. Stories that provoke outrage are amplified over stories that explain context. Stories that confirm partisan identity are amplified over stories that complicate it. Stories that are simple and wrong are amplified over stories that are complex and right.

Over time, this distortion changes not just what people know but how they think. The expectation of being outraged becomes the default emotional state for consuming news. The expectation of complexity — of stories that require patience, that resist easy conclusions, that demand the reader hold two contradictory ideas simultaneously — erodes. The audience trained by engagement algorithms expects information to feel a certain way: urgent, simple, and emotionally satisfying. When it does not, they scroll past.

The Doom Scroll

The term "doom-scrolling" entered common usage during the COVID-19 pandemic, but the behaviour it describes is older than the term. It is the compulsive consumption of negative news content driven by an anxiety loop: you feel anxious, you seek information to reduce the anxiety, the information increases the anxiety, you seek more information.

Platforms are not neutral conduits for this behaviour. They are accelerants. The algorithm detects that you are engaged with negative content and provides more of it. The infinite scroll mechanism removes natural stopping points. The notification system pulls you back when you leave. The design is not incidental. It is the result of billions of dollars of research and development aimed at maximising the time you spend on the platform.

The health consequences are documented. Increased anxiety. Reduced attention span. Disrupted sleep. A distorted sense of how dangerous the world is, because the content that reaches you is disproportionately alarming — not because the world is more alarming than it used to be, but because alarming content generates more engagement than reassuring content.

The Alternative

The attention economy is not inevitable. It is a business model. Other models exist and function.

Subscription-funded journalism removes the advertiser from the relationship between the newsroom and the reader. The incentive shifts from maximising engagement to maximising the perceived value of the journalism. This does not eliminate all distortions — subscription models can create incentives to serve the preferences of the subscriber base — but it removes the most destructive one: the algorithmic amplification of outrage for profit.

Public broadcasting, funded by licence fees or government allocation, removes both the advertiser and the subscriber from the equation. Its incentive is, in theory, to serve the public interest. In practice, public broadcasters face their own distortions — political pressure, institutional conservatism, the temptation to compete with commercial media for ratings. But they are not structurally incentivised to make you angry in order to show you advertisements.

Curation models — in which editorial judgment rather than algorithmic prediction determines what reaches the reader — offer a different trade-off. The curator's biases are human and visible. The algorithm's biases are computational and invisible. Both distort. But the human distortion is at least legible, accountable, and subject to correction.

The Choice

You cannot opt out of the attention economy entirely. It is the infrastructure of modern information. But you can make choices about how you interact with it.

You can choose to get your news from sources whose business model does not depend on your emotional engagement. You can choose to notice when you are being manipulated β€” when the content you are consuming is designed to make you feel rather than think. You can choose to stop scrolling.

These are small choices. They do not change the system. But they change what the system does to you. And in an economy that runs on your attention, the decision about where to direct it is the most consequential economic choice you make every day.