The National Center on Sexual Exploitation is calling on YouTube to remove all pornography from its platform, following yet another disturbing account of apparently monetized child erotica on YouTube. This is one of the reasons Google has been placed on NCOSE’s 2019 Dirty Dozen List, which names 12 mainstream facilitators of sexual exploitation.
In the description of the video documenting this account, its creator writes:
Over the past 48 hours I have discovered a wormhole into a soft-core pedophilia ring on YouTube. YouTube’s recommendation algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual CP in the comments. I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, sometimes in less than five clicks. Additionally, I have video evidence that these videos are being monetized by YouTube, with ads from brands like McDonald’s, Lysol, Disney, Reese’s, and more… YouTube is facilitating this problem. It doesn’t matter that they flag videos and turn off the comments; these videos are still being monetized, and more importantly they are still available for users to watch.
This is not the first time YouTube has come under fire for hosting sexually exploitative content. In April 2018 it was criticized for allowing a pornographic ad to appear on a trending YouTube video, and in November 2017 it was revealed that YouTube’s flagging system to prevent child victimization on its platform had reportedly been malfunctioning for a year. Even the YouTube Kids app has been infiltrated with disturbing and often sexual content.
It’s an open secret that Google’s YouTube is a hub for child erotica and is used by pedophiles to network. It’s time for YouTube to make solving this issue its number one corporate priority. Too often, YouTube waits for users or the media to flag degrading and exploitative content on its platform. Then, once the media buzz dies down, YouTube reverts to its whack-a-mole approach instead of making sustained improvements.
Technological solutions exist that could prevent this material from being posted in the first place, saving the countless hours YouTube currently spends on human review.
For instance, Dr. Michael Holm, Chief Data Scientist at Picnix, Inc., asserts, “Our team is fully capable of delivering an effective, scalable AI solution for pornographic video detection, building on our seminal patent-pending Iris Program (www.meetiris.ai).”
We implore Google and YouTube to collaborate with this company, and others, to find real solutions instead of applying a band-aid and waiting for the next blow-up.