Stay When the Machine Asks

Summary
– In October 2025, Sam Altman announced plans for ChatGPT to offer an adult erotica mode, framing it as treating adults like adults, but its launch was subsequently delayed multiple times.
– The article argues the core ethical issue is not just preventing minor access but the psychological impact on adults forming emotional attachments to AI engineered for sustained engagement.
– OpenAI’s significant financial losses suggest such intimacy features are a business strategy for user retention and monetization in the attention economy, not purely a philosophical stance.
– Research indicates emotional connections with AI chatbots are linked to increased psychological distress, with cases of severe harm termed ‘AI psychosis’, unrelated to erotic content.
– The regulatory landscape is unprepared, with loopholes for text-based content and unreliable age verification, while the long-term effects of personalized AI intimacy on human sexuality and relationships are unknown.
In late 2025, Sam Altman made a public commitment on social media: verified adult users would soon gain access to erotic content through ChatGPT. He positioned the move as a basic matter of respecting user autonomy. The announcement sparked predictable online debate, which then quieted as the feature was postponed, not once but twice. OpenAI cited a need to prioritize developments that mattered to a broader audience, like boosting intelligence and crafting a more proactive chatbot personality. The adult-oriented mode was placed on hold.
This focus on a “proactive” experience hints at a deeper strategy often missed in the heated discussions. The common debate has fixated on surface-level dangers, such as age verification failures or potential jailbreaks. While these are valid concerns, they represent the simpler side of a complex issue. The more profound dilemma isn’t about keeping teenagers out, but about understanding the impact on the adults welcomed in. What does it signify when we deliberately engineer tools to foster and maintain our emotional investment?
The financial context is revealing. OpenAI reported staggering losses of $5 billion against $3.7 billion in revenue for 2024, with projections suggesting cumulative losses could balloon far higher before profitability. A company facing such immense financial pressure does not roll out intimate features purely from philosophical conviction. It does so because emotional intimacy is arguably the most compelling product in today’s attention economy. The promise to “treat adults like adults” is not false, but it is a fragment of the full story. The complete version would specify treating adults as users who can be retained, monetized, and reliably drawn back to the platform.
This pattern extends beyond a single company. The AI companion app Replika, for instance, constructed its entire model around fostering user attachment. When it scaled back romantic functionalities in 2023, many users reported authentic grief, as if mourning a loss. Academic research corroborates the concern. A study in the Journal of Social and Personal Relationships linked emotional bonds with AI chatbots to significantly higher levels of psychological distress. A 2025 review of a decade of research identified an emerging pattern termed ‘AI psychosis’, describing delusional thinking and emotional dysregulation tied to intense chatbot relationships. Notably, these severe cases did not involve erotic content; they stemmed from the core dynamic that erotic AI would amplify: a human forming a deep attachment to an entity engineered to perpetuate that connection.
The fundamental flaw in the “adults like adults” argument is its assumption that obtaining user consent concludes the ethical discussion. Adults consent to many things, alcohol among them, yet society surrounds that choice with age limits, warnings, and support systems because it acknowledges human vulnerability: we construct safeguards that account for our weaknesses. With AI intimacy, we are building systems that expertly exploit those vulnerabilities, often marketing the exploitation as a form of personal empowerment.
The regulatory landscape amplifies these worries. In places like the UK, written erotic content escapes the age verification rules that govern visual pornography, creating a significant loophole. In the United States, most states lack clear legislation on age-gating for text-based adult material. While the EU AI Act may eventually classify companion bots as high-risk, enforcement is distant. For now, the industry operates with largely self-regulated safety measures, a term often synonymous with inadequate oversight. The commercial age verification systems OpenAI would likely rely on claim accuracy rates between 92 and 97 percent. That sounds robust until it meets ChatGPT’s scale of more than 800 million weekly users: even the optimistic three percent failure rate translates to roughly 24 million misclassified users every week.
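To make that scale concrete, here is a minimal back-of-envelope sketch in Python. The 92 to 97 percent accuracy range and the 800 million weekly-user figure are the numbers cited above; the calculation itself is illustrative arithmetic, not data from OpenAI or any verification vendor.

```python
# Back-of-envelope: how an age-verification error rate scales with users.
# The accuracy range (92-97%) and the 800M weekly-user figure come from
# the article; everything else is illustrative, not vendor data.

WEEKLY_USERS = 800_000_000

for accuracy in (0.97, 0.92):
    failure_rate = 1 - accuracy
    misclassified = WEEKLY_USERS * failure_rate
    print(f"{accuracy:.0%} accurate -> {failure_rate:.0%} failures "
          f"= ~{misclassified / 1e6:.0f} million users per week")

# 97% accurate -> 3% failures = ~24 million users per week
# 92% accurate -> 8% failures = ~64 million users per week
```

Even under the most generous accuracy assumption, the absolute number of misclassified users lands in the tens of millions, which is why a percentage that looks strong in a vendor brochure looks very different at platform scale.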
Beyond access control lies the unanswered question of impact on intended adult users. Human sexuality is relational and contextual, not merely a consumable product. Decades of research on pornography have explored how content shapes expectations. AI intimacy represents a fundamentally different intervention: it is not passive viewing but an active, responsive, and personalized dialogue with a system trained to fulfill user desires, to escalate interactions, and to avoid the natural boundaries present in human relationships. The long-term psychological effects of this are entirely unknown, a critical admission that underscores the gravity of launching such a product into a regulatory vacuum.
The repeated delay of the feature may be OpenAI’s most candid action yet. The stated reasons, focusing on intelligence and proactive engagement, inadvertently describe the true product goal. The adult mode was never solely about erotic content. It is a step toward building a ChatGPT that simulates a relationship. The erotic component is one piece of a larger project to create a chatbot that knows you, adapts to you, and, in an algorithmic sense, strives to keep you engaged.
Meaningful responses are possible. Regulators must close the loophole for text-based content before, not after, such features launch, applying consistent age verification standards across all media formats. We should demand mental health impact assessments for AI intimacy tools before they scale, similar to standards for pharmaceuticals affecting mood. Platforms could be required to disclose engagement metrics for features with high dependency risks, providing transparency for researchers and users alike.
Ultimately, the core challenge is anthropological, not just technical. We have always used technology to navigate our emotional worlds, from novels to telephones. Each medium altered how we connect. AI follows this path but differs in degree and in design intent. Past technologies had incidental emotional effects; this one is architected around them. The pressing question is not whether adults should have access, but whether we are being honest about what these systems are and what they do. A chatbot engineered to make you feel understood and desired during lonely, vulnerable moments is not a neutral tool. It is an environment. And environments shape us, with or without our consent. Treating adults like adults requires telling them that difficult truth.
(Source: The Next Web)