
Men Profit From Teaching Others to Make AI Porn

Summary

– MG, a twentysomething from Arizona, discovered that AI-generated nude images of her were being circulated online without her consent.
– The doctored images were used to advertise AI ModelForge, a platform teaching men how to create AI influencers using real women’s photos.
– MG is one of three plaintiffs in a lawsuit against three Phoenix men who allegedly scraped women’s photos to train AI models and sell explicit content on Fanvue.
– The men allegedly sold courses on Whop for $24.95/month, teaching others how to generate AI influencers and remove clothes from images using CreatorCore software.
– The scheme reportedly generated over $50,000 in one month, with CreatorCore having more than 8,000 subscribers producing over 500,000 AI images and videos.

Just over a year ago, MG was living what most would consider a typical twentysomething life in Scottsdale, Arizona. Between shifts waiting tables on weekends and working as a personal assistant, she posted the occasional Instagram Story: matcha lattes, pool hangs with friends, a Pilates class. Nothing about her online presence suggested fame was the goal. “I never really cared to pop off and become popular on social media,” says MG, who is identified only by her initials in a lawsuit to protect her privacy. “I just used it the way most people did when it first came out, to share their lives with the people closest to them.” Her follower count hovered around 9,000: respectable, but far from influencer status.

Then last summer, a follower slid into her DMs with an unsettling question. Did she know that photos and videos of a woman who looked exactly like her were circulating on Instagram? MG clicked the link and found multiple Reels: her face, seemingly pasted onto a body that matched her own proportions, tattoos in the exact same places, but wearing next to nothing. “If you didn’t know me well, you could very well think they were images of me,” she recalls. “It was kind of like this reality check that I don’t have any control over my own image.”

The horror deepened when she learned those doctored nude and semi-nude images weren’t just floating around the internet. According to a recently filed complaint, they were being used to advertise AI ModelForge, a platform that teaches men how to create their own AI-generated influencers. Through online classes and tutorials, the men allegedly showed subscribers how to use software called CreatorCore to train AI models on photos of unsuspecting young women, then post the resulting content on Instagram and TikTok. “They provided a whole playbook, including instructions on how to pick the right person so that it’s not someone who can defend themselves,” MG claims. “It was disgusting on every single level.”

MG is one of three plaintiffs in a January lawsuit filed in Arizona against three Phoenix men: Jackson Webb, Lucas Webb, and Beau Schultz, along with 50 other John Does. The suit alleges the Webbs and Schultz scoured social media for photos of unwitting women, used AI to generate explicit images and videos of fictional models who look exactly like them, and sold that content on the subscription platform Fanvue. For $24.95 a month on Whop, the men allegedly sold courses teaching other men, including the John Does, how to replicate the process. The so-called “Blueprints” instructed subscribers on scraping images from women’s social media accounts, feeding them into the generative AI on CreatorCore, and using a separate app to remove clothing and produce sexually explicit material. The suit claims this content generated millions of views and more than $50,000 in income during a single month. (The Webbs and Schultz did not respond to requests for comment.)

The complaint describes the operation as a moneymaking scheme that preyed on a “harem of indistinguishable AI copies of unsuspecting women and girls,” while instructing “predators seeking to prey on” women active on social media. By 2025, the suit alleges, CreatorCore had more than 8,000 subscribers cranking out their own AI influencers, resulting in over 500,000 images and videos.

(Source: Wired)
