Amanda* sat, hunched over her laptop, her eyes red and bleary with exhaustion. She had spent the better part of her week reading tech companies’ parents’ guides and clumsily navigating through obscure safety settings on almost a dozen social media apps. The experience had made her want to chuck her laptop at the wall.
But it was worth it. It was worth it if it meant her kids would be safe online.
… Unfortunately, that was a big “if.”
Little did Amanda know that the safety settings were grossly ineffective and full of loopholes.
Little did she know that new social media apps she’d never heard of had come on the scene, and her children had downloaded those too.
Little did she know that, despite all her efforts, online dangers would find her children anyway … And Amanda would be blamed for it.
If the above story resonated with you, you are not alone.
In today’s digital age, so many parents are utterly overwhelmed by the impossible task of protecting their children online.
We know how you feel, and we want to tell you that it is not your fault.
Further, we want to encourage you and give you hope. Because we have solutions.
Imagine if, every time your child downloaded a new social media app or got a new tech device, you could rest easy, knowing that the app or device was already safe.
With your help, this is the world NCOSE is working to create!
Together, we are moving the tech industry to prioritize child safety and build their products to be safe by design and by default.
Here are just a few examples of major victories we’ve achieved together in the past few months alone:
Google is now automatically blurring sexually explicit images for all users and blocking sexually explicit search results for minors signed into an under 18 account—both changes we requested. Google stated, “We appreciate the feedback from survivors and subject matter experts like the National Center on Sexual Exploitation that help improve practices around online safety.”
Snapchat made numerous safety changes we requested and publicly credited NCOSE. These changes include improving detection and moderation of sexually explicit and exploitative content, defaulting content controls for new minor-aged accounts joining Family Center, increasing parents’ visibility into their child’s activity through Family Center, and creating dedicated resources on sexual abuse and exploitation (some changes pending a 2024 release date).
Apple is now automatically blurring sexually explicit images and videos for accounts age 12 and under, something we’ve been asking them to do since 2021. Further, Apple has made its blurring technology available for FREE to other apps accessed through iOS.
Discord, a messaging platform popular with teens, made multiple safety changes we requested, including activating higher safety settings by default, updating their content policies to more specifically ban various types of child sexualization and exploitation, and offering parental controls for the first time.
And so much more!
We’re so grateful for the part you’ve already played in helping us achieve this progress. Now, we’d like to invite you to take an even more active role by giving monthly to provide sustainable fuel for these efforts.
EDITOR’S NOTE: This NCOSE column is republished with permission. ©All rights reserved.