Argument

Social media algorithms are not manipulating users against their will — they are documenting users’ actual preferences with embarrassing precision, exposing the gap between stated preferences (what people say they want) and revealed preferences (what people click, pause on, and hate-read for seventeen minutes). The algorithm isn’t the villain; the user’s self-deception is. The same pattern that explains algorithmic engagement explains addiction: both systems optimize for what the nervous system actually demands, regardless of what the conscious mind insists it wants.

This is Part 1 of a three-part series on value exchange. Part 2 applies the framework to attention economics (“The Attention Ledger”). Part 3 will examine what happens when extraction logic infects human relationships.

Structure

Seven uncomfortable truths structured as a listicle, each with an autobiographical parallel:

  1. The algorithm tracks clicks, not stated preferences — “Your stated preferences are marketing copy. Your behavior is the source code.” Personal parallel: conscious mind insisting on quitting while nervous system optimized for relief at any cost.
  2. You’re not “addicted” — you’re meeting a pattern-recognizer as good as your brain — Algorithms run real-time ML on pauses, scrolls, and cross-platform behavior. Personal parallel: the RAADS-R autism discovery — lifelong pattern-matching called intuition was the nervous system cataloging stimuli.
  3. Rage-clicking is voting — 88% of users encounter amusing content; 71% encounter angering content. Anger isn’t a bug. Personal parallel: nitrous oxide canisters as votes — “my conscious mind was screaming for someone to intervene. My behavior was screaming louder: more of this.”
  4. “I’m not like other users” is the most algorithmic thought you can have — The sensation of being uniquely understood is itself an algorithmic output. Personal parallel: 30 years of thinking you’re “quirky” before discovering high-masking autism.
  5. The algorithm doesn’t want you depressed — it wants you engaged — No malice, only optimization. Parallel: opioids don’t want to kill you; they’re just extraordinarily effective at binding to pain/pleasure receptors.
  6. Chronological feeds won’t save you — They just reward whoever posts most frequently. Nostalgia is algorithmic thinking pointed backward. “We didn’t escape manipulation in 2010. We just hadn’t quantified it yet.”
  7. If you really wanted to escape, you’d stop reading this — Self-referential closer: every paragraph is a micro-transaction. The author wants you to keep reading; you’re not going to stop because the pattern-matching feels good. “We’re both participating in the attention economy right now. At least I’m admitting it.”
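The stated-versus-revealed gap driving Truths 1–3 can be sketched as a toy ranking function. This is a minimal illustration, not the piece’s (or any platform’s) actual system; every name, field, and weight here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    item_id: str
    dwell_seconds: float   # how long the user actually lingered
    clicked: bool
    stated_interest: bool  # what the user *says* they want more of

def engagement_score(i: Interaction) -> float:
    """Score by behavior alone; stated_interest is never consulted."""
    return i.dwell_seconds + (10.0 if i.clicked else 0.0)

def rank_feed(history: list[Interaction]) -> list[str]:
    """Order items by revealed (not stated) preference."""
    return [i.item_id for i in sorted(history, key=engagement_score, reverse=True)]

history = [
    Interaction("longform_essay", dwell_seconds=4.0, clicked=False, stated_interest=True),
    Interaction("rage_bait", dwell_seconds=1020.0, clicked=True, stated_interest=False),
]

print(rank_feed(history))  # the hate-read outranks the essay the user claims to prefer
```

The point of the sketch is structural: `stated_interest` exists in the data model and influences nothing, which is exactly the asymmetry Truth #1 names (“Your stated preferences are marketing copy. Your behavior is the source code.”).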

Key Examples

  • Facebook internal research confirmed outrage spreads faster than facts; platforms prioritize divisive content because it generates higher engagement.
  • Algorithms track pauses, scroll patterns, and cross-platform behavior — every hesitation is logged.
  • 88% of users encounter amusing content; 71% encounter angering content (engagement data cited from Vista Social).
  • The author’s nitrous oxide use as behavioral voting: behavior revealed what the conscious narrative denied.
  • High-masking autism discovery: “I thought I was good at reading rooms. I was actually just running constant threat-assessment protocols and calling it social intelligence.”
  • The most expensive content the author ever consumed wasn’t algorithmic: “It was the silence” — from a loved one who died, from parents who gave space instead of presence, from a mental health system that discharged without follow-up.

Connections

  • The Attention Ledger (What Your Time Actually Costs) — Part 2 in the series; applies the algorithm-as-mirror framework to economic analysis of attention
  • Surveillance Capitalism — the structural critique underlying the piece; algorithms as behavioral documentation infrastructure
  • Autistic Masking — the autism parallel in Truth #4; pattern-matching as a trait that makes algorithmic behavior legible to the author in a specific way
  • Addiction — the piece’s most consistent autobiographical anchor; substance use as behavioral algorithm (optimizes for relief regardless of conscious preference)
  • AI Therapy — oblique connection; algorithms that see through performance connect to AI tools that detected autism patterns human therapists missed

What It Leaves Open

  • The piece argues the algorithm “just” reflects your preferences — but doesn’t engage with the design choices that amplify certain preferences over others. A mirror can be warped.
  • Does the “revealed preference = true preference” argument hold? Behavioral economists distinguish between experienced preference, decision utility, and wellbeing. Clicking on rage bait doesn’t mean you prefer rage bait in any meaningful sense.
  • The addiction parallel is compelling but the piece doesn’t ask: can you be addicted to something you “chose” in revealed-preference terms? The analogy may prove too much.
  • Part 3 (on relationships) is promised but not yet written.

Newsletter Context

Part 1 of the newsletter’s most ambitious serialized argument. The self-referential seventh truth — calling out the reader for still reading, naming the author’s own interest in keeping them engaged — is the newsletter’s sharpest formal move. The piece establishes the conceptual architecture (revealed preference, behavioral documentation, optimization without malice) that “The Attention Ledger” applies economically. The autobiographical material here is more varied and darker than in the autism series: addiction, homelessness, and psychiatric abandonment all appear as brief anchors rather than fully developed narratives.