VICTORY! Microsoft’s GitHub Finally Fights Image-Based Sexual Abuse
One spring day, Jodie* received an anonymous message with a link to a pornography website. When she clicked on it, her world imploded.
There, right in front of Jodie’s eyes, were multiple pornographic images and a video depicting her having sex with various men.
Yet Jodie had never taken these photos! She had never even had sex with these men!
“I was screaming and crying and violently scrolling through my phone to work out what I was reading and what I was looking at,” Jodie says. “I knew this could genuinely ruin my life.”
As a devastated Jodie continued to investigate further, she found that the images and video were forged using AI. Someone had posted photos of her on a pornography site, saying she made them “so horny” and soliciting help in making AI-generated pornography of her.
And then came the realization that shook Jodie to her core: That person was her best friend.
*Name changed
Microsoft’s GitHub & the Epidemic of AI-generated Image-Based Sexual Abuse
Sexually explicit images created with AI are commonly called “deepfake pornography.” But the more apt term to describe this horrific violation is AI-generated image-based sexual abuse (IBSA).
AI-generated IBSA is being used to victimize women all around the world. And the impacts on their mental health, reputation, and overall wellbeing are profound.
One corporation has been at the root of nearly all AI-generated IBSA/“deepfake pornography”: Microsoft’s GitHub.
Microsoft’s GitHub Takes Steps to Combat IBSA
Microsoft’s GitHub is the world’s leading AI-powered developer platform and arguably the most prolific space for artificial intelligence development. Yet it has also been a breeding ground for software code aiding the creation and rapid dissemination of AI-generated IBSA.
That’s why The National Center on Sexual Exploitation (NCOSE) named GitHub to the Dirty Dozen List for two years in a row. You joined us in taking action, demanding GitHub stop fueling the epidemic of IBSA.
And GitHub has finally listened to you!
What Changes Has Microsoft’s GitHub Made?
In September, the White House made an announcement about AI model developers’ voluntary commitments to combat image-based sexual abuse (IBSA), which included a public pledge from Microsoft’s GitHub to address the issue. The announcement drew our attention to a policy change GitHub had quietly made in May, prohibiting projects that are “designed for, encourage, promote, support, or suggest in any way” the creation of AI-generated IBSA/“deepfake pornography.”
Although NCOSE repeatedly reached out to multiple Microsoft and GitHub contacts asking if they had made any of the changes requested through the Dirty Dozen List campaign, they all neglected to mention this new policy, or even acknowledge our emails or the campaign.
Nonetheless, we joyfully celebrate this new victory, which YOUR voice and action made possible!
GitHub began considering this policy change only a week after the 2024 Dirty Dozen List was revealed and you began taking action. Although the company is bent on pretending the List doesn’t exist, this timing is highly unlikely to be a coincidence.
As a result of this new policy, GitHub removed several repositories that hosted code to generate IBSA, including DeepFaceLab, which hosted the code used to create 95% of all deepfakes and sent users directly to the most prolific sexual deepfake website.
AUTHORS
Lily Moric
EDITOR’S NOTE: This National Center on Sexual Exploitation column is republished with permission. All rights reserved.