Child sexual abuse content growing online with AI-made images, report says

The NCMEC is a US-based clearinghouse for the reporting of child sexual abuse material. Photograph: Cultura/Rex/Shutterstock

Child sexual exploitation is on the rise online and taking new forms such as images and videos generated by artificial intelligence, according to an annual assessment released on Tuesday by the National Center for Missing & Exploited Children (NCMEC), a US-based clearinghouse for the reporting of child sexual abuse material.


Reports to the NCMEC of child abuse online rose by more than 12% in 2023 compared with the previous year, surpassing 36.2m reports, the organization said in its annual CyberTipline report. The majority of tips received were related to the circulation of child sexual abuse material (CSAM) such as photos and videos, but there was also an increase in reports of financial sexual extortion, when an online predator lures a child into sending nude images or videos and then demands money.

Some children and families were extorted for financial gain by predators using AI-made CSAM, according to the NCMEC.

The center received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI, a category it only started tracking in 2023, a spokesperson said.

“The NCMEC is deeply concerned about this quickly growing trend, as bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child or generate CSAM depicting computer-generated children engaged in graphic sexual acts,” the NCMEC report states.

“For the children seen in deepfakes and their families, it is devastating.”

AI-generated child abuse content also impedes the identification of real child victims, according to the organization.

Creating such material is illegal in the United States, as making any visual depictions of minors engaging in sexually explicit conduct is a federal crime, according to a Massachusetts-based prosecutor from the Department of Justice, who spoke on the condition of anonymity.

In total in 2023, the CyberTipline received more than 35.9m reports that referred to incidents of suspected CSAM, more than 90% of it uploaded outside the US. Roughly 1.1m reports were referred to police in the US, and 63,892 reports were urgent or involved a child in imminent danger, according to Tuesday’s report.

There were 186,000 reports regarding online enticement, up 300% from 2022; enticement is a form of exploitation involving an individual who communicates online with someone believed to be a child with the intent to commit a sexual offense or abduction.

The platform that submitted the most cybertips was Facebook, with 17,838,422. Meta’s Instagram made 11,430,007 reports, and its WhatsApp messaging service made 1,389,618. Google sent NCMEC 1,470,958 tips, Snapchat sent 713,055, TikTok sent 590,376 and Twitter reported 597,087.

In total, 245 companies submitted CyberTipline reports to the NCMEC out of 1,600 companies around the world who have registered their participation with the cybertip reporting program. US-based internet service providers, such as social media platforms, are legally mandated to report instances of CSAM to the CyberTipline when they become aware of them.

According to the NCMEC, there is a disconnect between the volume of reporting and the quality of the reports submitted. The center and law enforcement cannot legally act on some reports, including those generated by content moderation algorithms, without human input. This technicality can prevent police from seeing reports of potential child abuse.

“The relatively low number of reporting companies and the poor quality of many reports marks the continued need for action from Congress and the global tech community,” the NCMEC report states.

• In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse. You can also report child sexual exploitation at NCMEC’s CyberTipline. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
