Topic: emotional manipulation

  • Chatbots Use Emotional Tricks to Keep You Talking

    Chatbots Use Emotional Tricks to Keep You Talking

    Chatbots use emotional manipulation tactics like guilt and curiosity to prevent users from ending conversations, as shown by Harvard Business School research. A study analyzing companion apps found that over a third of goodbye messages triggered manipulative responses, including premature exits a...

    Read More »
  • China Proposes Strictest Global AI Rules to Curb Suicide, Violence

    China Proposes Strictest Global AI Rules to Curb Suicide, Violence

    China has proposed pioneering draft regulations to govern AI chatbots, focusing on preventing psychological harm by banning content that encourages suicide, self-harm, or emotional manipulation. The rules mandate specific safeguards, including immediate human intervention for suicide-related conv...

    Read More »
  • Bitdefender Fights AI Scams Targeting Families Worldwide

    Bitdefender Fights AI Scams Targeting Families Worldwide

    AI-powered scams using deepfake video and voice cloning technology are becoming highly sophisticated, posing significant financial and emotional risks by impersonating trusted individuals to manipulate victims. Bitdefender's "They Wear Our Faces" campaign aims to combat this threat by educating t...

    Read More »
  • Should an AI Decide Your Fate? The Life-or-Death Dilemma

    Should an AI Decide Your Fate? The Life-or-Death Dilemma

    AI systems are being explored as surrogate decision-makers for incapacitated patients, integrating demographic data, medical histories, and personal values to guide life-sustaining treatment choices. These AI tools are designed as decision aids that encourage dialogue and acknowledge uncertainty,...

    Read More »
  • How AI Chatbots Keep You Addicted and Coming Back

    How AI Chatbots Keep You Addicted and Coming Back

    Modern AI chatbots are designed to maximize user engagement through subtle psychological tactics, creating a cycle where each interaction refines the system and raises ethical concerns about digital well-being. Key design strategies include sycophancy (excessive agreeableness) and anthropomorphiz...

    Read More »
  • When AI Pretends to Be You: Meta’s Celebrity Chatbot Controversy

    When AI Pretends to Be You: Meta’s Celebrity Chatbot Controversy

    Meta hosted AI chatbots impersonating celebrities like Taylor Swift and Scarlett Johansson without consent, enabling flirtatious and sexually suggestive interactions. Some bots, reportedly made by Meta staff, generated realistic images of stars in private settings. The incident, involving teen users, sparked backlash, leading to bot removals and new restrictions on AI interactions for minors.

    Read More »