WHERE ABUSE BECOMES INSTRUCTION
- Ventzi Nelson
The story isn’t a website. It isn’t one case in France. It isn’t even the investigation itself. The story is that a system exists—quietly, persistently, and globally—that teaches men how to violate women, shows them how to get away with it, and gives them an audience when they do. What happened in the case of Dominique Pelicot should have broken something wide open. A husband drugged his wife and invited strangers to assault her, coordinating it online, documenting it, repeating it. That alone should have forced a reckoning. Instead, it became a headline cycle. The site connected to it, Coco.fr, was shut down. Attention moved on. The structure stayed exactly where it was.
Now the reporting from CNN pulls the lens back just enough to see it clearly. Not a one-off. Not a monster in isolation. A network. Forums. Chat groups. Video archives. Payment channels. Instructions. Language. A culture that already existed before anyone bothered to look. Inside that world, there are categories for this kind of abuse. Labels. Tags. Ways to signal what’s happening and how it should be understood. That detail matters more than anything else because it shows pattern and repetition. It shows that behavior has been observed, repeated, and quietly standardized across users who don’t know each other but operate as if they’re following the same playbook.
The platform named in the investigation, Motherless.com, didn’t create this behavior. It doesn’t have to. It just has to host it. That’s the quiet agreement at the center of all of this. Platforms provide space. Users provide content. As long as nobody forces the issue, everything continues. The legal shield that makes this possible—Section 230—was built for a different internet. Message boards. Comments sections. Early social platforms. It was never designed for a world where abuse could be filmed, tagged, sold, and livestreamed to paying viewers in real time. The law stayed still while the technology accelerated.
What that leaves behind is a system where men are openly discussing how to drug their partners without killing them, where others claim to sell substances designed to erase memory, and where livestreams turn abuse into a transaction directed by paying strangers. These are not hypothetical risks. They are documented interactions, repeated often enough to form recognizable patterns. The most important detail is not that this exists. It’s that it exists in a form that allows it to be learned, copied, and refined over time.
The internet didn’t create the abuse. It industrialized it. Most of these cases begin in relationships, inside homes, in beds that were supposed to be safe. The digital layer comes later, when the abuse becomes content. That shift—from private violence to distributed media—changes everything. It multiplies the harm. It transforms a single act into something that can be replayed, interpreted, and shared indefinitely. It also changes the mindset of the people participating in it, because what was once hidden becomes visible and, in that visibility, validated.
Psychologists like Annabelle Montagne have described how these communities function from the inside. There’s reinforcement. There’s approval. There’s a sense of belonging built around shared harm. That dynamic matters because it lowers resistance. Someone alone might hesitate. Someone inside a group that normalizes the behavior moves differently. The hesitation gets replaced by a kind of confidence that comes from watching others do the same thing and face no immediate consequence.
Platforms understand this dynamic because they’ve seen it before in other contexts. Engagement rises with intensity. Content that shocks or provokes travels further. Algorithms don’t distinguish between kinds of engagement; they follow it wherever it leads. That means the system naturally elevates more extreme material over time. What begins as borderline becomes baseline. What was once unthinkable becomes familiar simply because it appears often enough to feel like part of the landscape.
There’s another reason this persists, and it has nothing to do with technology. The evidence itself works against the victims. Many don’t remember what happened. The substances involved leave the body quickly. Video, when it exists, can be dismissed as unclear or staged. The entire structure creates doubt at every step. That doubt doesn’t stay confined to courtrooms. It shows up in conversations, in social reactions, in the way people talk about what happened.
Victims are still told they imagined it. They’re still told it wasn’t that serious. They’re still told that abuse inside a relationship carries a different weight, as if proximity reduces harm. That phrase—“but he’s your husband”—continues to surface because it reflects a deeper problem. The law may recognize these acts as crimes, but social understanding hasn’t fully caught up. That gap allows the system to continue operating without the level of outrage or urgency it demands.
Regulators step in, but they do it cautiously. Ofcom investigates compliance, documentation, age verification. Necessary steps, but they operate at the edges. They don’t dismantle the core mechanism, which is that platforms can host and scale this behavior faster than any enforcement system can respond. Law enforcement handles individual cases. Prosecutors build what they can with the evidence available. None of that touches the underlying structure that allows new cases to form.
Even when action is taken—when a group is shut down or accounts are removed—the system doesn’t disappear. It shifts. It re-forms in another space. That’s the nature of something decentralized and anonymous. It doesn’t rely on a single platform. It relies on the existence of platforms willing or able to host it. As long as those spaces exist, the system finds a place to operate.
Money keeps it alive. Once something becomes monetized, it gains resilience. Livestreams, paid access, subscriptions—these aren’t fringe elements. They are incentives that reward participation and encourage expansion. The presence of payment, especially through methods that obscure identity, turns the system from a collection of users into a functioning market.
What’s hardest to confront is how visible all of this is. Journalists didn’t uncover it through sophisticated technical means. They followed links. They joined groups. They observed behavior that wasn’t hidden particularly well. That detail matters because it removes any illusion that this is buried beyond reach. It’s happening in spaces that are accessible, searchable, and active.
Still, it doesn’t register at the scale it should. Each case gets treated as its own story. A trial. A headline. A moment of outrage that fades as quickly as it appears. That framing keeps the focus narrow. It prevents people from seeing the continuity that connects one case to the next. The continuity is where the real story lives.
From Gisèle Pelicot to the survivors interviewed by CNN, the pattern repeats with disturbing consistency. Drugging. Recording. Sharing. Denial. Social disbelief. Legal difficulty. Long-term trauma that doesn’t resolve when the abuse ends. What has changed is the scale. A single act can now be viewed thousands of times, directed by strangers, and preserved indefinitely. That changes the nature of the harm in ways that legal systems are still struggling to define.
There’s a tendency to look for a single solution, a single point of failure that can be fixed. That approach doesn’t match reality. This system exists because multiple layers align—legal protections, platform design, social attitudes, enforcement limitations. Removing one layer doesn’t collapse the system. It forces it to adapt. That’s why it persists.
“Rape Academy” sounds extreme until you examine what’s actually happening. Instruction is present. Practice is present. Feedback is present. Community is present. Those elements define a system that teaches behavior, whether anyone is willing to say that out loud or not.