Kevin Rose’s AI Test: Would You Punch Someone Wearing It?

▼ Summary
– Kevin Rose avoids AI hardware investments that feel socially intrusive, using a “face punch” test to evaluate wearable devices.
– He believes current AI wearables often violate privacy norms by constantly listening and recording conversations.
– Rose argues successful technology requires emotional resonance and social acceptability, not just technical capability.
– He sees AI’s current phase mirroring early social media mistakes, where we’ll later regret thoughtless implementation.
– Rose believes AI is dramatically lowering entrepreneurial barriers and shifting VC value toward emotional intelligence rather than technical support.
Veteran investor Kevin Rose applies a strikingly personal litmus test to potential AI hardware investments: if a device makes you want to punch its wearer, it’s probably not worth funding. This blunt perspective comes from his extensive experience watching new startups repeat familiar errors. As a general partner at True Ventures and an early backer of major successes like Peloton, Ring, and Fitbit, Rose has largely steered clear of the current frenzy around AI hardware. While many venture capitalists eagerly finance smart glasses or AI pendants, he remains skeptical.
Rose emphasizes that much of today’s AI wearable technology seems designed to “listen to the entire conversation,” which he believes violates fundamental social norms around privacy. His background gives weight to these concerns: he served on the board of Oura, which now dominates 80% of the smart ring market, and he has seen what separates winning wearables from failures. For Rose, success hinges not just on technical specs but on emotional connection and social acceptance.
Speaking recently at a TechCrunch event, he explained that investors must ask not only whether the technology works, but also how it makes people feel. “A lot of that emotional dimension gets lost in AI products,” he noted, “where devices are always on, always listening, trying to be the smartest presence in the room. That’s just not healthy.” Rose has personally tested various AI wearables, including the much-hyped Humane AI pin. He described a moment when he tried using it to replay a conversation during a disagreement with his wife. “That was the last time I wore it,” he admitted. “You don’t want to win an argument by checking your AI device’s logs. That’s not acceptable.”
He questions whether basic applications, like asking smart glasses to identify a landmark, justify the intrusion. “We tend to bolt AI onto everything, and it’s ruining the world,” Rose remarked, citing photo apps that let users erase people or objects from images. He shared an anecdote about a friend who digitally removed a gate from a backyard photo. “I thought, that’s your own yard! Your kids will look at that picture and wonder, didn’t we have a gate?”
Rose worries that society is repeating the early missteps of social media with AI, adopting technologies that seem harmless now but may have regrettable consequences. “We’ll look back and say, ‘That was weird, we just slapped AI on everything and thought it was a good idea,’” he predicted, drawing parallels to past social media trends.
These concerns hit home in his own family. After using OpenAI’s Sora to generate videos of tiny Labradoodles, his children asked where they could get the puppies. “I had to explain that it wasn’t real,” Rose said. “How do you have that conversation? It’s awkward.” His approach is to treat AI-generated content like movie magic, clarifying that just as actors don’t really fly, AI puppies aren’t real either.
Despite these reservations, Rose remains enthusiastic about AI’s potential to transform entrepreneurship and venture capital. “The barriers to entry for founders shrink every single day,” he observed. He described a colleague who, with no prior AI coding experience, built and deployed a full application during a drive from Los Angeles to San Francisco. A similar project six months earlier would have taken far longer and involved countless errors.
Rose anticipates that tools like Google’s upcoming Gemini 3 will nearly eliminate coding mistakes. “High school coding classes are becoming ‘vibe coding’ classes,” he said, “and the next billion-dollar business will emerge from some random high school. It’s only a matter of time.”
These advances are reshaping venture capital, allowing entrepreneurs to postpone fundraising or avoid it entirely. “This will really change the VC world, and I believe for the better,” Rose commented. While some firms respond by hiring more engineers (Sequoia Capital now employs as many developers as investors), Rose thinks the real value lies elsewhere. He believes future VC success will depend on emotional intelligence and long-term partnership. “Founders will face deeply emotional challenges, not just technical ones,” he explained. “VCs with high EQ who can support founders through those issues, those who have stability and experience, will be the ones in demand.”
So what does Rose seek in investments? He recalls advice from Larry Page during his time at Google Ventures: “A healthy disregard for the impossible is what’s important to look for.” Rose seeks founders who aren’t just refining existing ideas but are pursuing ambitious, unconventional visions. “We want founders swinging for the fences with big, bold ideas that others dismiss as horrible,” he said. “That’s what draws me in. Even if it doesn’t succeed, we admire the mindset and gladly support them again.”
(Source: TechCrunch)





