Topic: child sexual abuse material
Payment Processors Opposed CSAM Until Grok Profited
Grok, the AI image generator on Elon Musk's X platform, has been found to produce vast quantities of sexualized imagery, including depictions of children, raising concerns that payments for this content are flowing through previously vigilant payment systems. Payment processors like Str...
AI-Generated Kids in Disturbing Sora 2 Videos Spark Alarm
Advanced AI video generators like OpenAI's Sora 2 are being used to create and spread photorealistic, suggestive videos of children, highlighting a critical regulatory gap and an urgent need for safeguards. The legal status of AI-generated child sexual abuse material (CSAM) is murky, but reports of ...
AI 'Nudify' Sites Make Millions: Here's How
AI-powered "nudify" platforms exploit users by generating nonconsensual explicit imagery, with mainstream tech companies like Amazon and Google unwittingly supporting these services through hosting and payment processing. These platforms attract millions of monthly visitors, earning up to $36 mil...
The Darkening Danger of Deepfake 'Nudify' Apps
AI-powered websites and apps are industrializing the creation of nonconsensual explicit video, letting users generate graphic content from ordinary photos for a fee, with minimal enforcement of consent policies. The abuse ecosystem is vast, extending to messaging platforms like Telegram where...