
I Tried Meta’s New Ray-Ban Smart Glasses With Mark Zuckerberg

Summary

– Mark Zuckerberg uses Meta Ray-Ban Display glasses extensively for sending text messages, running the company through frequent WhatsApp pings.
– The glasses pair with a neural wristband that enables typing via subtle gestures by detecting electrical signals from the muscles in the arm.
– Zuckerberg believes glasses will become the next major computing platform, ideal for AI integration because they can see, hear, and interact with users throughout the day.
– Sales of Meta’s Ray-Bans have reached the single-digit millions with triple-digit growth, and the broader tech eyewear market is projected to expand significantly.
– The glasses offer features like text messaging, AI recognition, live captions, and translation, with the long-term goal of reducing reliance on smartphones.

Mark Zuckerberg has found his favorite new productivity tool: Meta’s latest Ray-Ban smart glasses. He uses them constantly throughout the workday, firing off rapid text messages to his team via WhatsApp. “I run the company through text messages,” he revealed in a recent conversation. According to Alex Himel, Meta’s head of wearables, Zuckerberg is the product’s heaviest user. His typing style even gives him away: messages sent through the glasses arrive faster and are noticeably shorter than his usual lengthy, multi-paragraph texts.

Zuckerberg claims he’s already typing around 30 words per minute using the glasses’ control system. Unlike earlier attempts at smart eyewear, these glasses rely on a neural wristband that interprets subtle muscular signals from the wearer’s arm. This allows for typing and navigation without visible hand movements, whether your hands are in your pockets, behind your back, or resting at your side. After testing it myself, I can confirm the experience feels like something out of science fiction.

Zuckerberg firmly believes glasses represent the next major computing platform, especially for artificial intelligence. “It’s the only device where you can basically let an AI see what you see, hear what you hear, talk to you throughout the day,” he explains. Once a display is integrated, the AI can generate a user interface right in front of your eyes. This vision is beginning to gain traction beyond mere concept demos. Sales of Meta’s earlier Ray-Ban models have reached the single-digit millions, with triple-digit growth over the past year. The tech-enabled eyewear market as a whole is projected to expand into the tens of millions soon.

Competition is heating up, too. Google plans to release AI glasses next year, Snap is preparing consumer AR glasses, and Apple is rumored to be targeting 2027 for its own entry, the same year Meta aims to launch its high-end AR glasses. The potential user base is enormous. As Zuckerberg points out, up to two billion people worldwide wear glasses daily for vision correction. He draws a parallel to the shift from flip phones to smartphones, suggesting it’s only a matter of time before ordinary glasses become smart glasses.

Meta’s collaboration with EssilorLuxottica almost didn’t happen. Initially, the company thought display technology would be too bulky for the Ray-Ban brand. But last year’s prototype won them over. The resulting design, while still somewhat chunky, is far less conspicuous than earlier AR prototypes. With transition lenses included, the display-enabled Ray-Bans start at $800 before prescription lenses. Andrew Bosworth, Meta’s CTO, says they’re aimed at “optimizers” and productivity-focused users.

Production numbers are intentionally limited, reportedly just a few hundred thousand units, but Bosworth expects every pair to sell. When asked about profitability, Zuckerberg hints that the real revenue will come not from hardware margins, but from long-term AI and service usage.

During my hour-long test, the hardware impressed with its refinement. The geometric waveguide display sits discreetly beside the right lens, offering a clear 20-degree field of view even in sunlight. The neural wristband allows intuitive control through simple pinching gestures, and the display remains invisible to others. Battery life lasts up to six hours, with additional charges available from the carrying case.

Software functionality still depends on a paired smartphone, but it goes far beyond mirroring notifications. Users can send messages, take calls, share music, get walking directions, view camera footage, and use Meta AI for real-time object recognition. Bosworth emphasizes the importance of crisp text rendering for AI utility. “If the AI has to read it back to you verbally, you’re not getting the most information,” he notes. “Asking a question and seeing the answer is much better, and more private.”

Though AI played a supporting role in my demo, I used it to identify a painting and even generate a virtual table setting. The system handled off-script queries smoothly, and the display offered AI-suggested follow-up prompts selectable via the wristband.

One standout feature was live captions. In a noisy environment, I could look at someone across the room and see their speech transcribed in real time right in front of me, like having super hearing. Language translation is already in the works, with Spanish and several other languages supported at launch. A teleprompter function is also under development.

According to Bosworth, the glasses already address five or six of the top ten reasons people pull out their phones. The long-term goal is to eventually make smartphones unnecessary. The neural wristband may prove even more significant in the near future. Initially developed for the Orion AR prototype, it’s advancing faster than anticipated. A handwriting mode, once thought years away, is already functional. Zuckerberg envisions even subtler control: “The future version involves firing opposing muscles with no visible movement at all.”

Beyond typing, the neural interface could enable personalized autocomplete, smart home control, or even operate as a standalone platform. For now, these first-generation smart glasses are clearly for early adopters. AI capabilities are still limited, the wristband requires practice, and the software needs refinement. But after trying them, it’s easy to see why Zuckerberg is so convinced. “We have this incredibly rich digital world,” he says, “and you access it through a five-inch screen in your pocket. I just think it’s a little crazy.”

If adoption follows the pattern of earlier models, where the second generation sold five times more than the first, Meta may finally be on the verge of realizing Zuckerberg’s long-held vision of glasses as the next great digital platform.

(Source: The Verge)
