Why did Twitter Allow Child Sexual Abuse Materials (Child Porn) on its Platform?




Child sexual abuse material (CSAM, also known as child pornography) surged more than 106% during COVID, according to the National Center for Missing & Exploited Children. And tragically, CSAM isn't confined to the Dark Web; it is also flourishing on mainstream social media platforms like Twitter.

Survivor John Doe was only 16 when he discovered that child sexual abuse material of himself, recorded when he was 13, had been posted on Twitter. The video accrued over 160,000 views before Twitter finally took it down, despite multiple reports from both John Doe and his mother verifying that he was a minor.

Lisa Haba, Esq. and Peter Gentala, Esq. joined this episode of the Ending Sexploitation podcast to share the story of John Doe and another male survivor, who are suing Twitter for facilitating the distribution of their child sexual abuse material. The discussion covers the legal challenges of the case and why Twitter claims it should be immune from any liability, despite fostering an environment that appears to allow child sexual abuse material to flourish.

Take Action: If you or someone you know has been harmed by sexual exploitation via Twitter, please contact the Haba Law Firm and the NCOSE Law Center. Learn more about this case and help spread the word about how Twitter is complicit in the distribution of child sexual abuse material (CSAM).