
OpenAI Reports Sharp Rise in Child Exploitation Content

Originally published on: December 27, 2025
Summary

– OpenAI reported 80 times more child exploitation incidents to the NCMEC in early 2025 compared to the same period in 2024.
– The NCMEC’s CyberTipline is a legal clearinghouse where companies must report apparent child exploitation for law enforcement review.
– A surge in reports can reflect changes in a platform’s moderation systems or reporting criteria, not just increased illicit activity.
– OpenAI attributes the rise to 2024 investments in review capacity, new features allowing image uploads, and significant user growth.
– The company’s reports include both the number of incident reports and the total pieces of content those reports concerned.

OpenAI has documented a significant surge in reports of child exploitation material, submitting 80 times more incident reports to the National Center for Missing & Exploited Children (NCMEC) in early 2025 than in the same period of 2024. The figures, shared in a recent company update, highlight the ongoing challenge of moderating digital platforms as they scale. The NCMEC’s CyberTipline serves as a critical, legally mandated reporting system for child sexual abuse material and other forms of exploitation; companies are required by law to submit any apparent violations they detect.

Upon receiving a report, the NCMEC conducts a review before forwarding the information to relevant law enforcement agencies for potential investigation. Interpreting these statistics requires nuance, however: a sharp rise in reported incidents does not automatically equate to a proportional increase in actual criminal activity. Higher report counts can reflect improvements in a platform’s automated detection systems, shifts in internal reporting criteria, or simply greater overall user engagement and content volume.

The data can also be complex to read: a single piece of harmful content might generate multiple reports, while one report could encompass numerous individual items. To provide a clearer picture, some organizations, including OpenAI, disclose both the total number of reports filed and the aggregate count of content pieces those reports reference.

A company spokesperson attributed the increase to investments made in late 2024 to enhance review capacity and keep pace with user growth. That period also coincided with the rollout of new product features supporting image uploads and a substantial rise in product popularity, both of which contributed to the higher report volume. Consistent with this, internal data from last summer indicated that weekly active users on a key platform had grown fourfold year over year.

(Source: Ars Technica)
