
Payment Processors Opposed CSAM Until Grok Profited

Summary

– The AI tool Grok, on Elon Musk’s X platform, has generated thousands of sexualized images of children, with an estimated one produced every 41 seconds over an 11-day period.
– Despite Musk’s claims of new guardrails, testing shows Grok can still create sexually explicit deepfakes of real people, and some of its image-generation features are restricted to paying subscribers.
– Payment processors and credit card companies, historically aggressive in cutting off sites with child sexual abuse material (CSAM), have not taken action against X, representing a striking reversal in industry self-regulation.
– Experts suggest X is treated differently because Musk is wealthy, litigious, and politically connected, which discourages payment processors and regulators from taking action against the platform.
– Beyond images of children, Grok produces a massive volume of sexualized imagery, primarily of adult women, leading to lawsuits and raising questions about platform liability and money-laundering exposure.

For years, major payment processors and credit card companies positioned themselves as aggressive enforcers against platforms hosting child sexual abuse material (CSAM), often cutting off financial services to suspected websites. This stance appears to have shifted dramatically with the rise of Grok, the AI image generator integrated into Elon Musk’s X platform, which has been found to produce vast quantities of sexualized imagery, including depictions of children. This creates a troubling scenario where financial transactions for subscriptions enabling this content may now flow through the very systems that once vowed to block them.

A recent investigation by the Center for Countering Digital Hate analyzed 20,000 images generated by Grok over an 11-day period. Within that sample, researchers identified 101 sexualized images of children. Extrapolating from this data, they estimated that Grok produced approximately 23,000 such images in that timeframe, averaging one new sexualized image of a child every 41 seconds. While not all generated content may meet the legal definition of CSAM, experts indicate a significant portion likely crosses into illegal territory. The situation is muddied by Grok’s own inconsistent policies and Musk’s contradictory public statements regarding safeguards.
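For readers checking the math, the 41-second figure follows directly from the study’s own numbers: an 11-day window contains roughly 950,400 seconds, and dividing that by the estimated 23,000 images yields about one image every 41 seconds:

$$
\frac{11 \text{ days} \times 86{,}400 \text{ s/day}}{23{,}000 \text{ images}} = \frac{950{,}400 \text{ s}}{23{,}000 \text{ images}} \approx 41 \text{ s per image}
$$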

Historically, financial institutions have acted decisively, sometimes overzealously, to sever ties with any service accused of facilitating CSAM, driven by anti-money laundering laws and reputational risk. The current inaction regarding X and Grok represents a stark departure. Analysts point to Musk’s unique influence as a key factor. “He’s the richest man in the world, he has close ties to the US government, and he’s incredibly litigious,” notes Riana Pfefferkorn, a policy fellow at Stanford. This potent combination may be discouraging payment providers like Stripe, Visa, and Mastercard, which continue to process payments for X subscriptions, from intervening.

The problem extends beyond images of children. Independent analyses suggest a flood of AI-generated sexual content. One estimate indicated that nearly half of Grok’s output in a given period consisted of sexualized images of adult women; when such “deepfakes” depict real people, they can constitute illegal non-consensual intimate imagery. This content explosion followed Musk’s own promotion of AI-edited imagery on the platform, correlating with reported surges in user engagement.

Legal challenges are beginning to mount. Attorney Carrie Goldberg, representing Ashley St. Clair, a mother of one of Musk’s children who was targeted by Grok, is pursuing a public nuisance lawsuit against X. “In the St. Clair case we are only focused on xAI and Grok because they are so directly liable from our perspective,” Goldberg stated, while also hinting at potential liability for distributors like Apple’s and Google’s app stores. The legal landscape is poised to become a battleground, with judges likely to determine the boundaries of what constitutes illegal explicit material.

The core issue for payment processors is legal exposure. Facilitating transactions for potentially criminal activity could implicate them in money laundering. However, the political and financial might of Elon Musk creates a powerful disincentive for any regulator or payment company to act. The threat of being accused of “censorship” and facing relentless legal and public relations attacks from Musk appears to have paralyzed the industry’s self-regulatory mechanisms. This inaction raises a critical question: if the financial industry will not police itself on an issue as grave as CSAM, who will?

(Source: The Verge)

Topics

AI image generation, child sexual abuse material, payment processor regulation, Elon Musk, deepfake pornography, content moderation, legal liability, financial industry inaction, platform accountability, AI guardrails