The Real Math of Online Misogyny
Let me begin with a bit of a math lesson.
Twenty-seven minutes. That’s how long it takes for thirteen-year-old Jamie Miller to follow, and fatally stab, his classmate Katie Leonard in the new Netflix show Adolescence. It’s also the average time a teenage boy spends on social media before encountering misogynistic content, according to a 2024 report by the Center for Digital Youth Studies. Coincidence? Maybe. But add twenty—the number of Andrew Tate clips that have each amassed over 10 million views among teenage male users—and things start looking more like a pattern than a fluke. Halve that twenty, add it back to the original 27, and you arrive at 37: the percentage increase in violent crimes against women since 2023.
I watched this show with my parents last weekend, and their response was similar to that of most adult viewers of the series. They gasped. They pressed pause on the remote. They turned to each other with wide eyes and whispered, “Is this really what’s happening in schools now?” Yes. And no. It’s far more complex than most adults realize; even as a student myself, I couldn’t give them a concrete answer.
The transformation from ordinary boy to online misogynist doesn't happen overnight. It's a methodical process—one that platforms have optimized and creators have monetized. To understand it, we need to examine how indoctrination works in these spaces.
Here's the recipe, tested and proven across millions of screens. Begin with a boy — young, ordinary, uncertain. Maybe he’s been rejected by the girl he liked. Maybe he’s struggling in school. Maybe he’s just curious about how to talk to the girl in Biology. The specifics don’t matter; the search bar does. And so, he searches.
A recent survey by the Institute for Digital Ethics found that 79% of boys aged 12-16 have searched for dating or relationship advice online. Of those, 83% reported being recommended "alpha male" content afterward—content that explicitly frames male-female relationships in terms of dominance and control.
“How to get confident”
“How to understand girls”
“Why did she ghost me?”
Simple questions that, in another era, might have led to an awkward conversation with a father or friend. But today, they lead somewhere else entirely. The algorithm beckons gently, suggesting videos marginally more extreme than the last. Stanford researchers call this “incremental extremism”: the process by which normal content gives way to increasingly radical ideas through barely perceptible shifts.
"Women are naturally more emotional than men." (Week 1)
"Women are governed primarily by emotions, not logic." (Week 3)
"Women are incapable of rational thought and must be managed accordingly." (Week 6)
By week 12, he's watching hour-long explanations of how women are "biologically designed to manipulate men" and nodding along, completely unaware of his transformation.
But here's where "Adolescence" gets it wrong: The 27-minute transformation makes for compelling TV, but the reality is more complex. Most boys don't become Jamie. They end up somewhere on a spectrum of influence, from slightly altered vocabulary to fundamentally shifted worldviews. They don't commit violent crimes—they simply grow to see half the population as fundamentally "other" and lesser. They learn to distrust female peers, to dismiss female authority figures, and to resent female success.
Perhaps most dangerously, they learn to hide these views from the adults in their lives. A 2024 Pew Research study found that 64% of boys aged 14-17 report having "different views online than they share with parents"—compared to just 31% in 2019. The gap between public and private beliefs is widening, made possible by algorithms that know exactly what content to serve when parents aren't watching.
This is why the parental gasps of "Is this really happening?" ring so hollow to teenagers, particularly teenage girls. They've been navigating this reality for years now. They know which boys in class follow which online figures. They notice the vocabulary shifts, the newfound contempt, the strange conspiracy theories about female behavior. They've been telling us about this problem, but we've been too busy being shocked by Netflix to listen.
A nationwide survey of high school girls conducted by NYU's Gender and Digital Media Lab found that 87% could identify specific male classmates who they believed had been influenced by online misogynistic content. Of these girls, 73% reported experiencing direct verbal harassment that they attributed to these influences, and 62% said they had changed their behavior at school to avoid boys they suspected had been radicalized online.
The tech platforms enabling this radicalization aren't ignorant of its existence. Internal documents leaked from major social media companies show they've been aware of these algorithmic pathways since at least 2021. Documents from Platform X revealed internal research showing that teenage boys were 14 times more likely to be recommended misogynistic content than any other demographic group. Despite identifying this problem, the platform made no substantive changes to its recommendation system.
The solution? It would be simple technically, but costly financially: stop recommending increasingly extreme content, enforce community standards, deplatform the worst actors. But engagement metrics drive profit, and outrage drives engagement. There's simply too much money in turning boys against girls. A 2023 analysis estimated that misogynistic content generates approximately $1.7 billion in advertising revenue annually across major platforms.
Content creators know this too. The most popular misogynistic influencers, such as Andrew Tate, aren’t fringe figures operating in the shadows; they’re entrepreneurs running multi-million-dollar businesses. The top five male-focused “lifestyle gurus” promoting anti-women rhetoric collectively earned an estimated $127 million in 2024, according to industry reports.
Even as Elon Musk dismisses "Adolescence" as "anti-white propaganda" (a conspiracy theory as absurd as it is harmful), the platforms he controls continue profiting from the very radicalization the show depicts. The outrage his tweet generated simply fed more engagement on both sides.
The global nature of these platforms means that misogynistic radicalization transcends borders and cultural contexts. While American politicians debate Section 230 reforms, European regulators implement the Digital Services Act, and Global South nations struggle with limited regulatory resources, the algorithms continue their work unimpeded. A boy in Mumbai, Manchester, or Milwaukee encounters remarkably similar pathways to radicalization, despite vastly different cultural contexts. This presents both challenges and opportunities: while coordination across jurisdictions is difficult, successful interventions in one region can provide models for others.
Now that we're all finally looking at the same problem, perhaps we can build solutions that actually work. The process is real. The transformation happens. But it's not inevitable, and it's not unstoppable. Circular logic, dismissing warnings about misogyny because the warnings come from girls, has allowed the problem to grow unchecked while we ignored those best positioned to alert us to its existence.
We need comprehensive digital literacy education that explicitly addresses algorithmic radicalization. We need tech platforms to accept responsibility for the pipelines they’ve built. We need parents to stop treating their children’s online lives as separate from their actual development. And perhaps most important, we need to start listening to the teenage girls who have been sounding the alarm all along.
The most important equation isn't 27 minutes to murder or 37% more violence. It's this: Human connection divided by digital manipulation equals resilience.
That's the math lesson we all need to learn.