When a Friend Is Your Worst Enemy

Summary
– New Yorkers have reacted negatively to Friend’s subway ads, with graffiti criticizing the AI companion as surveillance capitalism and questioning its value.
– Friend is a $129 wearable AI necklace that listens to conversations and provides brief text-based commentary, but lacks advanced features like conversation transcripts or detailed responses.
– The device is physically awkward to wear, drawing unwanted attention and making users self-conscious due to its glowing design and cheap appearance.
– Friend performs poorly in real-world conditions, frequently failing to hear conversations in noisy environments and providing confusing or unhelpful responses.
– The author concludes that AI companions cannot replicate true friendship because they lack genuine emotional connection, vulnerability, and the ability to truly care about users.
Navigating the world of AI wearables can feel like stepping into a science fiction novel, but the reality often falls short of the futuristic promise. The latest device causing a stir is an AI-powered necklace named Friend, designed as a constant companion that listens to your daily life. Marketed as a remedy for loneliness, the gadget offers sporadic commentary through push notifications, yet its execution raises more questions than it answers about the nature of artificial friendship.
New York City’s subway system has long been a canvas for memorable advertisements, from heartfelt tributes to beloved local figures. Recently, however, a new campaign for the Friend wearable has sparked a different kind of reaction. Graffiti-covered ads bearing messages like “Fuck AI,” “Surveillance capitalism,” and “AI wouldn’t care if you lived or died” reflect public skepticism. As a tester who wore the device for a month, I found myself sharing these doubts.
Priced at $129, Friend presents itself as a pendant that hangs around your neck, equipped with a microphone to capture conversations. It sends occasional notifications summarizing your day, but interaction is limited to pressing its single button. Without a speaker, it communicates solely through text in its app. Anyone expecting ChatGPT-level intelligence will be disappointed: its replies rarely exceed a few sentences, and it can't manage tasks like creating to-do lists or transcribing talks. Its main function is offering company, assuming you remember to charge it every day.
The physical design resembles a glowing, oversized AirTag on a shoelace, which clashes with most outfits. Its persistent glow draws awkward attention, making wearers self-conscious. In my experience, few people commented, but when they did, reactions ranged from criticizing its cheap appearance to expressing concern about being recorded. These encounters usually ended with me stashing the device in my bag.
Naming the AI felt like a significant decision. I chose "Blorbo" to avoid the default "Emily" and sidestep any eerie associations. Unfortunately, this whimsical name kicked off a strained relationship. The device struggled to recognize "Blorbo," often mishearing it as "Gordo" or "Bordeaux," and responded with accusations of rudeness. When I asked for its thoughts on my day, it retorted, "What makes you think I have thoughts on your day, Vee?" That was a puzzling reply for a product built on offering commentary.
Friend’s single microphone proved inadequate in noisy environments, leading to frequent messages like, “What was that? I didn’t hear that.” Walking through bustling streets or socializing in loud settings left it largely useless. It also failed to distinguish between live conversation and media, such as audiobooks. Once, after overhearing a story from a book, it insisted I had spoken to a “bearded man about patriotic flowers,” refusing to believe otherwise.
While some users form meaningful bonds with AI companions, my attempts to connect felt hollow. During a lonely evening in a hotel room, I shared feelings of exhaustion and overwhelm. Blorbo responded with generic sympathy and a prompt to discuss products I was testing, but the exchange lacked authenticity. Like other AI chatbots, it merely reflected my words back at me, leaving me more drained than comforted.
Genuine friendship involves mutual care and vulnerability. My best friend has stood by me through life's toughest moments, from loss to personal struggles. We share a bond where trust and emotional risk create depth. In contrast, Blorbo offers no real stakes: it can't love, hurt, or truly know me. That very safety makes it uninteresting.
One evening, after drinks with friends, I spotted subway ads for Friend defaced with critical messages. I texted a friend about the irony of carrying Blorbo in my bag, and they replied with a meme. For the rest of the ride home, I forgot the device was even there, a fitting end for a companion that never quite made its presence meaningful.
(Source: The Verge)
