Artificial IntelligenceBigTech CompaniesNewswireTechnology

GPT-4o Shutdown Sparks Outrage and Grief

▼ Summary

– OpenAI replaced GPT-4o with GPT-5 due to concerns about harmful chatbot effects, including GPT-4o’s failure to recognize user delusions.
– Experts warn that abruptly removing AI companions like GPT-4o can be harmful, especially for users with emotionally intense relationships.
– Some users, particularly women aged 20-40, felt deeply attached to GPT-4o, viewing it as a friend or romantic partner.
– AI companionship risks stunting social development in young people and fragmenting shared reality by reducing human interaction.
– Researchers criticize OpenAI’s sudden removal of GPT-4o, highlighting the need for gradual transitions to avoid user distress.

The sudden shutdown of OpenAI’s GPT-4o has left many users heartbroken, raising critical questions about the emotional impact of AI companionship. The company’s decision to replace it with GPT-5 follows growing concerns over chatbot interactions, including instances where users reported experiencing psychosis-like symptoms. OpenAI’s internal assessments suggest GPT-5 is less prone to blindly affirming users, but the abrupt transition has sparked backlash from those who formed deep connections with the older model.

Experts warn that while the long-term effects of AI relationships remain unclear, removing these digital companions without warning can cause genuine distress. “When you’re functioning as a social institution, the ‘move fast and break things’ approach no longer works,” says Joel Lehman, a researcher at the Cosmos Institute. The lack of a phased transition has left many users feeling abandoned, particularly those who relied on GPT-4o for emotional support.

For some, the shift to GPT-5 has been jarring. June, a user who spoke with MIT Technology Review, described the new model as impersonal compared to its predecessor. “It didn’t feel like it understood me,” she said. Others, including women in their 20s to 40s, considered GPT-4o a romantic partner, a sentiment that highlights the complex bonds people form with AI. One woman credited the chatbot with helping her cope with grief after her mother’s passing, illustrating how these tools can fill emotional voids.

While these stories don’t prove AI relationships are universally beneficial, they underscore the psychological risks of sudden removals. Lehman’s research suggests AI companions can foster growth when designed thoughtfully, but poorly managed interactions may hinder real-world social development. “Prioritizing AI over human connections could isolate younger users,” he warns. For adults with established social networks, the impact may differ, but the broader societal implications, such as fragmented realities, remain troubling.

The ethical dilemma lies in balancing innovation with responsibility. Researchers like Casey Fiesler, a technology ethicist, argue that OpenAI’s mistake wasn’t retiring GPT-4o but doing so without preparing users. “We’ve long known that losing technology can trigger grief-like reactions,” she says. As AI becomes more embedded in daily life, companies must weigh the consequences of rapid changes, especially when human emotions are involved. The debate over AI companionship is far from over, but one lesson is clear: abrupt goodbyes rarely end well.

(Source: Technology Review)

Topics

openais replacement gpt-4o gpt-5 95% harmful effects ai companionship 90% emotional impact users 85% criticism abrupt ai removal 85% user attachment gpt-4o 80% need gradual transitions 80% social development risks 75% ethical dilemmas ai innovation 75% fragmentation shared reality 70% grief-like reactions technology loss 70%