Sia: Of course it is! Is it criminal because it is designed to be addictive? I'm not sure. A court case will decide that legally. However, isn't there an ethical and moral side to this too? Adults can choose how much time to spend or waste on endless scrolling, but children access it too. There is no reasonable way to stop children from getting lost on Instagram, Facebook, YouTube, etc. So what is the solution? Thoughts?
What’s happening in that Los Angeles courtroom isn’t some esoteric academic exercise; it’s the first time somebody has finally called out the industry that built the hate machine for what it actually is: an algorithmic opiate engineered to hook brains and harvest attention for profit.
The plaintiffs are arguing that Meta’s Instagram and Google’s YouTube designed their products to be addictive, exploiting psychological vulnerabilities, particularly in kids and teens, to keep them scrolling, reacting, comparing, and obsessing. That’s not “problematic use,” that’s systemic exploitation baked into the product.
And who’s on the stand defending this? Adam Mosseri, the head of Instagram, testifying under oath that social media isn’t “clinically addictive” (the word “clinically” is doing a lot of work here) and dodging responsibility by insisting that what they call “engagement” shouldn’t be labeled addiction. That’s like a tobacco exec insisting cigarettes aren’t addictive because they don’t instantly cause clinical disease.
It is absurd and insulting.
In researching my new book, The Hate Machine, I’ve been deep in this rabbit hole, and both the platforms’ internal research and the external academic and behavioral research are unequivocal: the platforms are addictive by design. Endless scroll, algorithmic feeds, reward-like notifications, personalized hooks that learn what makes each user stay longer… it all adds up to addiction.
It isn’t an afterthought or a happy accident; it’s the product’s design spec. This is the structural core of the Hate Machine: feed the beast with more eyeballs, more outrage, more time, and the machine gets more powerful, more enraging, more normalized.
The tech lobbyists and executives will squeal about semantics — “problematic use,” “engagement,” “user choice” — but that’s just legal gaslighting to deflect responsibility. The software is designed to be addictive; the business model profits from addiction; and the users, especially the youngest and most vulnerable, are the collateral damage.