
Samsung and Google Reveal the Future of Smart Glasses

Summary

– The Samsung Galaxy XR is a mixed reality headset developed by Samsung and Google, serving as the first phase of their plan before transitioning to everyday smart glasses.
– Gemini AI in the Galaxy XR can see both the real world through cameras and virtual screens, with future integration into inconspicuous glasses being the goal.
– Contextual AI, which understands user activities and surroundings, is a key focus for these devices, with smart glasses expected to be the most accessible form for this technology.
– Future smart glasses will connect with other devices like phones, watches, and rings for processing power and control, enhancing their functionality and integration.
– Health and fitness applications, such as biking and nutritional tracking, are anticipated to be major features in upcoming smart glasses from Google and Samsung.

Wearing the new Samsung Galaxy XR headset feels like stepping into a hybrid world, blending elements of the Meta Quest and Apple’s Vision Pro. This substantial mixed reality device, born from a collaboration between Samsung and Google, marks the initial stage of a much broader strategy. The ultimate goal extends far beyond this headset, aiming to introduce smart glasses suitable for everyday use both indoors and outdoors.

What sets the Galaxy XR apart from other virtual reality headsets is its deeply integrated artificial intelligence. Gemini AI doesn’t just observe the physical environment through the headset’s multiple cameras; it also perceives the virtual screens and content you interact with. This capability to act as a second set of eyes hints at a future where such powerful AI could reside discreetly within ordinary-looking eyewear.

Senior executives from both companies confirm this direction. Samsung’s Mobile Experiences COO Won-Joon Choi and Google’s Android lead Sameer Samat have revealed that subsequent wearable technology is already in development. While specific launch dates and pricing remain undisclosed, the partnership’s trajectory is clear. They’re collaborating with eyewear specialists Warby Parker and Gentle Monster to create AI-enhanced glasses that will compete directly with the Ray-Ban and Oakley products Meta makes with EssilorLuxottica. The current Galaxy XR headset serves as a preview of how artificial intelligence will eventually operate across glasses, phones, watches, and other connected devices.

The core innovation lies in developing contextual AI systems that comprehend your surroundings. Both Meta and Google envision AI assistants that understand not just voice commands but also visual and auditory context: what you’re viewing, which applications you’re using, and your physical environment. Meta’s existing glasses already incorporate Live AI features using cameras and microphones, while Google’s Gemini AI on the Galaxy XR expands this awareness to include virtual experiences and real-world settings.

Samsung’s Choi emphasizes that “XR headsets and glasses represent the most intuitive platforms for implementing multimodal AI,” establishing the groundwork for forthcoming wearable technologies that people will incorporate into their daily routines. In the immediate future, smart glasses are positioned to become the most economically viable and widely adopted form of contextual AI, avoiding the bulk and expense of full-scale VR systems.

Google’s Samat notes that the Android XR operating system wasn’t developed exclusively for headsets but will also support glasses, which he describes as “a critically important form factor for AI’s evolution.” The Galaxy XR’s ability to interpret both the physical world and digital displays suggests how future glasses might comprehend everyday environments and nearby screen-based devices.

Samat elaborates on the interconnected relationship between devices: “You’ll view other computing surfaces through your AI glasses.” This synergy requires significant development, particularly regarding how glasses collaborate with other computing services, especially smartphones. He envisions phones handling the computational workload for glasses, while wearables filter and present information from phones, similar to smartwatch functionality but with greater depth and interactivity.

Choi confirms that glasses will leverage phone processing power, aligning with Google’s existing explorations through Project Aura with Xreal and Qualcomm’s Snapdragon Spaces platform. This software bridge facilitates phone-to-glasses communication and integrates with the Android XR ecosystem.

Interface design presents another challenge. Unlike smartphones with touchscreens, smart glasses lack obvious control mechanisms, so connected accessories could provide more natural interaction methods. While Meta’s Ray-Ban Display glasses use a specialized neural band for gesture recognition, Samsung and Google possess extensive experience with watches and rings, suggesting these will likely become integrated control systems for their smart glasses.

Choi confirms that Android XR will integrate seamlessly within the Galaxy ecosystem, ensuring connectivity across smartphones, watches, glasses, and headsets. He notes that watches with displays would naturally complement display-free glasses, allowing users to utilize wearable screens when needed.

Health and fitness applications represent another significant focus area. Surprisingly, the Galaxy XR doesn’t currently sync with WearOS watches or Fitbits for exercise tracking, but future Android XR glasses are expected to emphasize these functions. Meta has already pioneered fitness integrations through Garmin watches, Strava, and sports-oriented Oakley Meta Vanguard glasses. Both Samat and Choi see similar potential for Google and Samsung products.

Choi observes that while VR headsets face challenges incorporating fitness features, glasses are better suited for health applications. He mentions potential uses such as cycling and nutrition tracking, where glasses could help log calorie data or scan food items.

Samat acknowledges that the $1,799 Galaxy XR targets developers and businesses alongside consumers. He describes it as “a development platform that provides the same inputs you’d need for building AR glasses applications,” encouraging developers to utilize it accordingly.

Regarding AI model diversity, Samat indicates that while Gemini currently serves as the primary AI option, Android’s open nature suggests other AI technologies will likely emerge within the XR space. He draws parallels to the smartphone environment, where multiple AI models already coexist, and expects similar diversity in extended reality platforms, emphasizing that “we’re still at the beginning of this journey.”

(Source: CNET)
