
2023 Impact

Together, we’re a force for good

Letter from the CEO

In 2023, your unwavering support was pivotal as we faced rapid advancements in powerful technologies, new dangers to children online, and the continued spread of child sexual abuse material (CSAM) across the internet. Through it all, you remained at the heart of Thorn’s mission to defend children from sexual abuse.

With your generosity, Thorn stayed nimble and resolute, responding to emerging threats to children. Your role in protecting kids has been profound, and today I’m proud to share the impact we’ve achieved together.

Because of your support:

  • Youth and parents feel more empowered to discuss online safety and how to prevent abuse.
  • Investigators can identify child victims and remove them from harm faster.
  • CSAM is being removed from the internet at greater scale, helping to end the cycle of trauma for survivors.
  • Child safety advocates are armed with the latest research on current and emerging threats, including the misuse of generative AI.
  • U.S. and EU legislators are better informed to protect children in policies and regulations.
  • And generative AI companies have solutions and services to make their platforms safer and reduce harm.

As you read this report, I hope you feel our gratitude for your help in driving these incredible accomplishments.

The reality is that child sexual abuse is happening everywhere, in our communities and to children we know — real kids with big hopes and dreams.

But you and I, along with our incredible staff at Thorn, are improving the way the world responds to child sexual abuse. Thank you for your dedication to creating a much brighter future — a world where every child is free to simply be a kid.


Julie Cordua, CEO

2023 IMPACT AT A GLANCE

3,833,792

Total child sexual abuse material (CSAM) files detected

71.4 billion

Total files processed

32%

Average triage time saved by investigators using the CSAM Classifier

5

Research reports published

1.4 million

NoFiltr social engagements

2,839

Parents signed up for conversation tips


KEY INITIATIVES

Empowering parents and youth to prevent abuse

For kids today, the internet is a place for self-discovery, socializing, and building meaningful connections. But it also brings significant risks. Talking about these risks with youth — early and often — can make a tremendous difference in keeping kids safe online. Thorn’s youth and parent education programs do exactly that, fostering vital conversations in safe, open, nonjudgmental ways.

NoFiltr

In 2023, our youth-centered program, NoFiltr, received more than 1.4 million engagements across its social media channels — reaching youth with critical prevention and support messaging in a fun and informative way. Additionally, NoFiltr’s peer-to-peer conversations continued to elevate youth voices around the issues they face online every single day.

100

Prevention content pieces published

1.4 million

Engagements across NoFiltr social channels

664,000

Prevention-related livestream views

3,000

Educational quizzes submitted

NoFiltr was such a transformative experience for me! It’s unique in that it empowers youth to advocate for digital safety. […] I’m so grateful for the networking, professional development, and internet safety education I got.

NoFiltr youth member

Key Moment

NoFiltr + Discord: A Youth-Empowerment Dream Team

Last year, the NoFiltr Youth Innovation Council got the chance to connect with tens of millions of youth on the community app Discord. Partnering for Safer Internet Day, they discussed how to build safe and healthy digital habits.

“My experience collaborating with Discord not only grew my knowledge about digital literacy, [but] it was very fulfilling to know that such a large company actually wants to hear and implement the ideas of younger generations.” —Cayden, 16

Find out what youth are saying

Thorn for Parents

Discussing online risks — like sharing nudes — can be pretty awkward and tough for many parents and caregivers. Fortunately, Thorn for Parents is there to help. And in 2023, more than 100,000 visitors came to our resource hub for tips on conversation starters, as well as other tools to help them have judgment-free — and potentially lifesaving — conversations with their kids to prevent abuse before it starts.

Your support helps us create these invaluable resources.

Explore the guides

2,839

Parents signed up for tips

104,518

Visitors to our parent resource hub

Once I learned about the reality of abuse so many children face every day, I couldn’t turn away from it. I’m honored to support Thorn and to be part of the life-changing work they’re doing. I hope everyone who knows about this issue stands with me.

Lauren, marketing professional, TX

Beyond supporting youth and parents navigating these everyday challenges, in 2023 Thorn also helped others on the front lines of defending children from sexual abuse: law enforcement officers.

Finding children faster

When investigating child sexual abuse cases, law enforcement officers face daunting challenges, including the overwhelming task of sifting through mountains of digital evidence. The forensic review of those files can be time-consuming and traumatic for officers exposed to such content. That’s where Thorn’s CSAM Classifier is a game changer. In 2023, our technology helped investigators significantly reduce the time it took to identify victims, so they could remove those children from harm.

32%

Average triage time saved by users of our CSAM Classifier

350+

Law enforcement agencies using Thorn tools

Thorn’s CSAM Classifier Speeds Victim ID

Manual review of digital evidence in child sexual abuse cases can take months and span phones, laptops, and hard drives. Our CSAM Classifier automates and accelerates this forensic review. Using predictive AI, the classifier detects new and previously unreported CSAM — material that has not previously been identified and reported. Finding new CSAM helps officers identify child victims and solve cases faster.

The classifier also helps protect the well-being of officers by reducing their exposure to harmful content.

Learn More

I learned how prevalent child sexual abuse is in the US and I felt called to defend all children (including my own) from predators. I do that by donating to Thorn – an organization that even law enforcement relies on to rescue and protect victims.

Eric, anti-fraud and compliance professional, PA

Key Moment

Beta launch for Griffeye, Blue Bear, and Magnet Forensics

Getting the CSAM Classifier into as many law enforcement agencies as possible is critical to accelerating their investigations. In 2023, we proudly launched a beta partnership with Griffeye, the Sweden-based world leader in digital media forensics for child sexual abuse investigations. Our CSAM Classifier is now available in Griffeye Analyze, a platform used by law enforcement worldwide.

Learn more about solutions for Victim ID

The viral spread of CSAM has compounding effects, from revictimizing survivors to normalizing this horrific behavior. That’s why Thorn has an audacious goal: eliminate CSAM from the internet.

Making the internet safer and stopping revictimization

Even after a child is rescued, images and videos of their abuse can circulate online for years, continuing the cycle of trauma. At Thorn, we equip tech platforms with advanced tools, insights, and connections to halt the spread of abusive content and prevent revictimization. In 2023, more companies than ever partnered with us in these efforts.

71.4 billion

Total files processed
Up 70%+ from 2022

45+

Companies using Thorn’s CSAM detection products

3,833,792

CSAM files detected
Up 365%+ from 2022

1,546,097

Files classified as potential CSAM

2,287,695

CSAM files matched

Safer

Last year, our comprehensive CSAM detection solution empowered tech platforms to detect more CSAM files than ever before.

Child safety consulting services

Our team of experts offers guidance to platforms on developing child safety policies, on-platform interventions, new safety features, and prevention strategies — and in 2023, we began child safety red teaming for AI models.

I’m so thankful for Thorn’s team. It really does restore a bit of hope in this world to know there are people dedicated to defending children who can’t defend themselves.

John, KY

Case Study

Flickr Adopts Safer’s CSAM Image Classifier

Despite the huge volume of content uploaded to Flickr daily, the company has always prioritized safety.

By deploying Safer’s CSAM Image Classifier, Flickr’s team could detect new and previously unreported CSAM images they likely wouldn’t have discovered otherwise.

One classifier hit led to the discovery of 2,000 previously unverified images of CSAM and an investigation by law enforcement.

Thorn’s unique position at the intersection of technology and child sexual abuse is supported by our groundbreaking research, which sheds light on this dark and highly complex issue.

Connecting and moving the entire child safety ecosystem with original research

Since 2019, Thorn has undertaken research initiatives that deepen our understanding of the experiences and risks young people face online. Through this work, we gain powerful insights that allow us to stay nimble in the ever-accelerating digital landscape and respond to emerging threats quickly. Our findings inform the programs we create and provide critical insights for the broader child safety ecosystem.

5

Original research reports published

39,594,234

Report media impressions

In 2023, we published five groundbreaking research reports.

Thorn’s deep expertise in the ever-changing digital threats to youth means we also take our efforts to Capitol Hill, Brussels, and beyond, defending children from sexual abuse and exploitation at the policy level.

Influencing policy

Thorn works with politicians globally — especially in the U.S. and EU — to help them understand why it’s critical to protect children from harm, and why technology must be part of the solution.

71

Policy discussions

14

Technical briefings

8

Speaking engagements at summits

Last year, we were instrumental in EU child safety policy and regulation discussions, advising key members of government, industry, and nongovernmental organizations (NGOs). Our coalition, the European Child Sexual Abuse Legislation Advocacy Group (ECLAG), hosted an event in Brussels to discuss the proposed EU Child Sexual Abuse Regulation. Thorn’s director, Cathal Delaney, spoke on a panel highlighting the importance of this regulation for children’s safety online. Here at home, we advanced U.S. legislative efforts to bring an end to sexual harms against children online.

KEY MOMENTS

Leading up to the AI Safety Summit, Thorn participated in multiple UK government panels alongside Policing Minister Chris Philp.


We discussed child safety policy in industry roundtable discussions with leaders from Discord and WhatsApp.


Our teams conducted technical briefings for members of the European Parliament, 27 EU states, and NGOs.


In the U.S., we partnered with the End Online Sexual Exploitation and Abuse of Children (OSEAC) Coalition to impact critical pieces of legislation.

Thorn is doing some of the most important work in the world. This issue affects every person, whether directly or indirectly, hidden in the shadows where we work and live. I’m beyond grateful to help Thorn build a brighter, safer future for all kids.

Garrett, writer, CA

Thorn in Action

In 2023, we continued to deepen our understanding of emerging technologies, influence the global conversation on child safety, and collaborate with other leaders to strengthen our collective ability to create a safer world for kids.

Leading AI-generated CSAM research

Already, the era of generative AI is introducing threats to children. To truly understand its implications, Thorn partnered with researchers at Stanford Internet Observatory, and in 2023, released Generative ML and CSAM: Implications and Mitigations, a leading-edge report detailing how AI can be misused to further sexual harms against children.

Read the Report

Sharing the keynote at AWS re:Invent

Thorn took the main stage at the AWS re:Invent conference, a significant moment that connected cutting-edge technology with its role in social responsibility.

Watch the Keynote

Shaping public discourse: Media spotlights

In 2023, Thorn staff were interviewed as thought leaders by seven national publications.

Join Us

Everything we achieve is only possible because of our generous donors. By believing in our work and supporting it financially, our Thorn community propels our mission to defend children. In this way, our supporters — like you — are true heroes, helping us build a world filled with childhood joy.

We need your help to defend children from sexual abuse. When you make a gift to Thorn, you take a stand to protect kids and help create a safer, brighter future for them.

Together, we are unstoppable.

Donate today

Financial transparency

Your support enables us to make a difference for children every day.

Financial data is unaudited.

We envision a world where kids experience the joy of childhood free from sexual abuse. And where child sexual abuse material is eliminated from the internet. 

A world where every child is free to simply be a kid.

Thorn won’t stop until we get there. And we’re grateful to have you by our side. Thank you for your support in 2023 and 2024, and for the years to come.