TAKE ACTION: The Many Dangers of Discord for Children

Discord, a popular communication service, has been named to NCOSE’s Dirty Dozen List for the past two years due to its failure to adequately address the sexually exploitative content and activity on its platform. While Discord made some important safety changes following the 2021 Dirty Dozen List, problems persist on the platform and the company can do much more to improve. Discord’s lack of meaningful age verification, insufficient moderation procedures, and inadequate safety settings have caused harm to numerous children who were sexually groomed by predators or exposed to harmful content such as pornography. Following the launch of the Dirty Dozen List, new evidence of children being harmed on Discord has continued to emerge and make headlines.

What is Discord and how does it harm? 

Discord was initially launched in 2015 as a platform for people to connect online while playing video games. Since then, it has expanded well beyond its original gaming niche to become a widely used text, video, and audio chat app for myriad contexts. The company now states that approximately 80% of its active users report they either use Discord primarily for non-gaming purposes, or equally for gaming and non-gaming purposes. Discord has grown exponentially since the COVID-19 pandemic channeled the world’s energy into digital spaces, going from just 56 million monthly active users in 2019 to 150 million in 2021.

The main feature of Discord is its servers, which are chat rooms based around a particular interest or activity. Servers can be open to the public, though users more commonly set them to private, requiring invites and special passwords to join. Moderation within these servers has historically been left mostly to the members themselves, with Discord simply relying on user reports and private server owners to catch bad behavior. Discord’s own Safety Center confirms as much: “We do not monitor every server or every conversation.” This laissez-faire approach to moderation has fostered an environment where sexual abuse and exploitation can flourish unhindered.

Unlike most popular social media platforms where users post photos of themselves and tend to use real names, it is an almost universal practice on Discord to employ arbitrary usernames and profile pictures. While it is still certainly possible to fake identities on the former type of social media platform, it is not even necessary to fake one on Discord, since profiles are not connected to identities. It is considered normal to interact with strangers on Discord without having any sense of who they are, what age they are, what their name is, or any other identifying traits. This online culture makes it very easy for predators to groom children on Discord without arousing suspicion.

Discord is extremely popular with children and teens. While the app is intended for ages 13+, there is no meaningful age verification to enforce this age limit, or to ensure that the few child protection measures Discord offers are actually applied.

Until mid-2020, users were not even asked for their date of birth upon signup. Currently, asking this question remains the only form of “age verification” Discord employs. Children can and do easily enter a false date of birth to bypass the age restriction for the platform, or bypass age restrictions for specific servers and features on the platform. For example, servers and individual channels within servers can be age-gated with a “NSFW” tag, but that protection is easily circumvented due to the lack of meaningful age verification. Such servers often contain pornography, which children are exposed to, and may even include child sexual abuse material (the more apt term for “child pornography”) or image-based sexual abuse (pornographic images that have been created or distributed without the consent of the person in the image).

Elsewhere, entire servers on Discord are dedicated to users finding and sharing image-based sexual abuse of girls and women. Discord made international news in 2020 when one server was found to have widely shared and distributed over 140,000 images of women and minors.

Recent News about Children Being Groomed on Discord

Upon naming Discord to the 2022 Dirty Dozen List, we highlighted extensive evidence of children being sexually groomed on the platform. Since the launch of this campaign in early March 2022, even more evidence has surfaced and made headlines.

CNN recently published a news article titled “The dark side of Discord for teens.” This article covered the stories of numerous parents whose children were sexually groomed on Discord. One mother discovered that a stranger who appeared to be an older man had convinced her 16-year-old daughter to send him nude pictures. As the girl was a minor, these nudes are an example of Discord being used to distribute child sexual abuse material. The man also requested and obtained a picture of the girl’s house, pictures of her friends, and a photo of her bus, which the mother feared he would use to track the girl and physically exploit her.

Another mother discovered that a man had been sending BDSM pornography to her 10-year-old daughter (recall that Discord’s age limit is 13+, but that the company has no meaningful age verification to enforce this). When the mother reported this incident to Discord, she received an automated response requesting the message links. She provided the links more than a year ago, but never received another response from the company.

Yet another mother said Discord failed to help when an older man was sexually grooming her 13-year-old son by asking him to masturbate and talk to him about it afterward. “Discord told me I couldn’t do anything unless I had specific links to the text thread that showed my son verifying his age — such as typing ‘I am 13,’ which was shared through a voice [chat] — and the other person verifying his age before an incident happened,” the mother said. “It was just awful; there was no help at all.” CNN spoke to almost a dozen parents whose children had similar damaging experiences on Discord.

Of note, NCOSE has received more contacts this year from parents whose children have been harmed by predators through Discord than through any other online platform.

The Inadequacy of Discord’s Safety Settings and the Importance of Defaulting to Safety

Many of the parents CNN spoke to reported that they feel Discord is not doing enough to protect children. We at NCOSE are in agreement. Discord currently does not have any parental controls and has made no announcements or moves to develop them. Although Discord does have some available safety settings, explained in their Safety Center, NCOSE researchers found these to be far from adequate for properly protecting the millions of children who use the platform.

Importantly, many of the parents CNN spoke to stated that they had not enabled any of Discord’s safety settings on their child’s account due to not understanding how the platform works. This highlights the need for safety settings to be defaulted, at the very least for minor-aged accounts if not across the entire platform. By not having the safety settings defaulted, only children who have highly involved, tech-savvy parents have a chance at being protected, and the rest are left completely vulnerable. NCOSE has long been advocating for Discord and other platforms frequented by children to “default to safety” in this way.

In Discord’s guidelines for “adult content,” they claim the highest levels of safety are on by default, but we found this not to be true. Discord claims:

“If you do not want to be exposed to adult content on Discord, or if you are under 18 years old, we recommend turning on the explicit media filter in your privacy settings. In your User Settings, under Privacy & Safety, choose ‘Keep me safe.‘ This setting is on by default and will ensure that images and videos in all direct messages are scanned by Discord and explicit content is blocked.”

Yet when NCOSE researchers made a Discord account with the birth date of a 13-year-old, they found these safety settings were not actually on by default and that, instead, they still needed to be tracked down and toggled on within the account’s settings.

In its official “Parent Guide,” Discord fails to include advice on how to recognize or respond to grooming tactics and predatory behavior. The guide acknowledges that explicit content exists on Discord, but fails to recognize or warn against the harms of pornography. Instead, it provides generic platitudes such as: “Have a conversation with your teen about explicit content, what they may or may not be comfortable looking at, and whether they feel pressured to look at this content.”

Discord Hires Its First Lobbyists

Another recent development is that Discord has hired its first federal lobbyists, who will reportedly be lobbying on “privacy and content moderation issues for the platform.” While it is too early to say for certain what specific approach these lobbyists will take to these issues, the track record of Big Tech’s lobbyists gives reason to be wary.

Generally speaking, Big Tech’s primary goal is to protect and bolster its bottom line, and this priority is reflected in its lobbying. For example, Big Tech’s lobbyists have lately been making misleading claims about the intent and impact of the EARN IT Act, a bill which would hold online platforms accountable for knowingly facilitating the distribution of child sexual abuse material. The lobbyists have been incorrectly painting the EARN IT Act as an attack on user privacy which threatens to bring on mass surveillance. We hope that Discord’s lobbyists, in their work on privacy and content moderation issues, will not jump on the bandwagon of this false dichotomy between user privacy and child protection.

Our Requests for Improvements to Discord’s Platform

We are calling on Discord to make the following safety changes:

  • Develop and implement parental controls so parents can monitor and streamline their children’s experience on Discord and ensure basic safety standards are being met.
  • Automatically default minor-aged accounts to the highest level of safety and privacy available on the Discord platform. While Discord claims to already do this, NCOSE research revealed this is still not standard across all Discord accounts.
  • Automatically block any and all minor-aged accounts from joining servers that contain NSFW content on both the mobile app and desktop versions of Discord. Discord currently has age-gating for individual channels within a server, but the fake 13-year-old Discord account used by the NCOSE research team was able to join a voice chat on the official Pornhub Discord channel.
  • Develop and implement moderation strategies that proactively detect and remove pornography—especially for servers that are dedicated to trading hardcore and non-consensual material.
  • Provide meaningful education to all users and parents on the potential harms and risks associated with exploitation and abuse on Discord, and prominently feature and highlight reporting processes on all forms of Discord’s interface.

In response to the cases of sexual grooming uncovered by CNN, Discord’s Vice President of Trust and Safety, John Redgrave, said, “this behavior is appalling, unacceptable, and has no place on Discord.” Redgrave further stated, “It’s our highest priority for our communities to have a safe experience on the service.” Discord must put its money where its mouth is and prove these words true with concrete action.

Demand Discord Improve Child Safety Features and Content Moderation

Go to the Bottom of the Page and Send an email to Discord Executives!

EDITOR’S NOTE: This NCOSE column is republished with permission. ©All rights reserved.
