Snap OS 2.0 Adds In-Lens Payments, Raw Camera Access, and Automatic Colocation for Spectacles

Summary
– Snap OS is an Android-based operating system for AR glasses that runs sandboxed “Lenses” apps built with JavaScript/TypeScript in Lens Studio, not native code or third-party engines.
– Snap Spectacles are $99/month AR glasses for developers with a 46° field of view, 45-minute battery life, and 226g weight, but consumer Specs launching in 2026 will be lighter and more capable.
– Snap OS 2.0 introduces monetization via Commerce Kit for in-app payments, a UI Kit for consistent design, and permissions for raw sensor data with bystander indicators.
– New developer features include EyeConnect for automatic colocation by looking at other users, Semantic Hit Testing for object placement, and Mobile Kit for phone app integration.
– Snap Cloud provides back-end services like authentication and databases, while recent updates added AI features, GPS navigation, and first-party app improvements for consumer readiness.

Snap has rolled out a substantial set of enhancements for its augmented reality platform, introducing several new tools designed to empower developers and improve user interaction. These updates include an in-app payment system called Commerce Kit, a dedicated UI Kit, a permissions framework for raw camera access, and an automated colocation feature named EyeConnect.
For those just getting acquainted, Snap Spectacles represent a developer-focused AR glasses initiative priced at $99 per month, with a student discount bringing the cost down to $50 monthly. These glasses serve as a testing ground for applications destined for the consumer version of Specs, anticipated to launch in 2026. The current developer model offers a 46-degree diagonal field of view and an angular resolution on par with premium devices like the Apple Vision Pro. However, they come with constraints, including a battery life of roughly 45 minutes and a substantial weight of 226 grams. Snap’s CEO, Evan Spiegel, has promised that the final consumer product will be significantly lighter and more capable, while maintaining compatibility with all existing applications.
The true centerpiece of this ecosystem is Snap OS, a distinctive Android-derived operating system that does not support traditional APK installations. Developers create sandboxed applications, known as Lenses, utilizing the Lens Studio software on Windows or macOS. By employing JavaScript or TypeScript, they engage with high-level APIs, while the OS manages fundamental tasks like rendering. This approach yields benefits similar to those of other advanced platforms, such as nearly instantaneous app launches, uniform interaction patterns, and straightforward multi-user experiences. The system even permits the Spectacles mobile app to function as a spectator screen for any Lens. Although multitasking is not currently supported, this is likely a hardware limitation rather than a software one.
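To make that model concrete, here is a minimal sketch of what a Lens script can look like, assuming Lens Studio's TypeScript component conventions; the SpinningObject class and its input are illustrative, not taken from Snap's documentation.

```typescript
// Minimal Lens component sketch, assuming Lens Studio's TypeScript conventions.
// The class name and input are illustrative; Snap OS owns rendering and the
// update loop, so the script only declares behavior.
@component
export class SpinningObject extends BaseScriptComponent {
    // Editable in the Lens Studio Inspector.
    @input
    degreesPerSecond: number = 45;

    private angleDegrees: number = 0;

    onAwake() {
        // Register a per-frame callback; the OS drives the update loop.
        this.createEvent("UpdateEvent").bind(() => {
            this.angleDegrees += this.degreesPerSecond * getDeltaTime();
            const radians = this.angleDegrees * (Math.PI / 180);
            this.getSceneObject()
                .getTransform()
                .setLocalRotation(quat.angleAxis(radians, vec3.up()));
        });
    }
}
```

Because the OS owns rendering and scheduling, a script like this only declares behavior; there is no window management or native rendering code for the developer to write.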
Since its debut, Snap OS has seen a steady stream of new functionalities. Earlier additions allowed developers to incorporate GPS and compass data for outdoor navigation, detect when a user is holding a phone, and summon a system-level floating keyboard. More recently, a suite of AI tools was integrated, offering speech-to-text in over forty languages, on-the-fly 3D model generation, and connections to AI services from Google and OpenAI. The latest iteration, Snap OS 2.0, not only refines the developer experience but also enriches first-party applications like Browser, Gallery, and Spotlight, and introduces a Travel Mode, paving the way for the consumer launch.
At the recent Lens Fest event, Snap detailed the newest developer capabilities in OS 2.0, with a major focus on monetization. Commerce Kit provides a built-in payment system, enabling developers to incorporate microtransactions within their Lenses. Users configure a payment method and a four-digit PIN via the Spectacles smartphone app. Since Lenses must be free to download, this represents the first direct revenue stream for creators. The feature is presently in a closed beta, available to US-based developers.
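Commerce Kit's API is still in closed beta and has not been published, so the following is only a hypothetical sketch of what an in-Lens microtransaction flow could look like; the purchaseService object, its requestPurchase method, and the product ID are illustrative stand-ins.

```typescript
// Hypothetical in-Lens purchase flow. Commerce Kit is in closed beta and its
// real API is not public; purchaseService, requestPurchase, and the product ID
// below are illustrative stand-ins only.
interface PurchaseResult {
    success: boolean;
    transactionId?: string;
}

// Stand-in for whatever Commerce Kit exposes to a Lens.
declare const purchaseService: {
    requestPurchase(productId: string, priceUsdCents: number): Promise<PurchaseResult>;
};

async function unlockPremiumBrush(): Promise<void> {
    // Snap OS would handle the PIN prompt and the payment method configured in
    // the Spectacles phone app; the Lens only sees the outcome.
    const result = await purchaseService.requestPurchase("premium_brush", 199);
    if (result.success) {
        print("Purchase complete: " + result.transactionId);
        // Enable the purchased content here.
    } else {
        print("Purchase cancelled or declined.");
    }
}
```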
A new permissions framework now allows experimental Lenses to simultaneously access the internet and raw sensor data, such as camera feeds, microphone audio, or GPS coordinates. To address privacy, a permissions prompt appears each time the Lens starts, and a forward-facing LED on the glasses pulses during use, serving as a bystander indicator.
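As a rough illustration of how an experimental Lens might read the raw camera feed once the user accepts the prompt, the sketch below is modeled on the experimental Camera Module; the module name, request fields, and enum values should be treated as assumptions rather than a confirmed API.

```typescript
// Sketch of reading the raw camera feed in an experimental Lens. Module,
// request, and enum names are modeled on the experimental Camera Module and
// should be treated as assumptions, not a confirmed API surface.
@component
export class CameraFrameReader extends BaseScriptComponent {
    onAwake() {
        const cameraModule = require("LensStudio:CameraModule");
        const request = CameraModule.createCameraRequest();
        request.cameraId = CameraModule.CameraId.Default_Color;

        // Snap OS shows the per-launch permission prompt and drives the
        // forward-facing LED; the Lens only receives a live texture.
        const cameraTexture = cameraModule.requestCamera(request);
        print("Camera feed: " + cameraTexture.getWidth() + "x" + cameraTexture.getHeight());
    }
}
```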
The Spectacles UI Kit (SUIK) gives developers access to the same interface components used by the Snap OS system, ensuring a cohesive user experience. Meanwhile, the new Mobile Kit enables iOS and Android applications to communicate with Spectacles Lenses over Bluetooth. This allows for the creation of custom companion apps or the addition of AR features to existing mobile applications, supporting data transfer from health tracking, navigation, or gaming apps to enable hands-free, Wi-Fi-independent augmented reality.
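Mobile Kit's Lens-side interface has not been detailed publicly, so the following sketch only illustrates the pattern it enables, a companion phone app streaming data to a Lens over Bluetooth; the mobileBridge object and its onMessage callback are hypothetical stand-ins.

```typescript
// Hypothetical Lens side of a Mobile Kit link: a companion phone app streams
// heart-rate samples over Bluetooth and the Lens displays the latest value.
// mobileBridge and its onMessage callback are illustrative stand-ins.
interface HeartRateSample {
    bpm: number;
    timestampMs: number;
}

declare const mobileBridge: {
    // Assumed: subscribe to JSON messages sent by the companion phone app.
    onMessage(callback: (json: string) => void): void;
};

@component
export class HeartRateOverlay extends BaseScriptComponent {
    // A Text component in the scene that shows the latest reading.
    @input
    label: Text;

    onAwake() {
        mobileBridge.onMessage((json) => {
            const sample = JSON.parse(json) as HeartRateSample;
            this.label.text = sample.bpm + " bpm";
        });
    }
}
```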
Another new tool, Semantic Hit Testing, allows Lenses to cast a ray, for instance, from a user’s hand, and determine if it intersects with a valid ground surface, enabling instant placement of virtual objects. This functionality is becoming commonplace across extended reality platforms.
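A hedged sketch of the placement pattern this enables follows; the semanticHitTest function and SurfaceType values are illustrative stand-ins rather than the actual API, but the ground-only check reflects the use case described above.

```typescript
// Sketch of the placement pattern Semantic Hit Testing enables: cast a ray from
// the user's hand and only place an object if it lands on a ground surface.
// semanticHitTest and SurfaceType are illustrative stand-ins, not the real API.
type SurfaceType = "ground" | "wall" | "ceiling" | "unknown";

interface SemanticHit {
    position: vec3;
    normal: vec3;
    surface: SurfaceType;
}

declare function semanticHitTest(rayOrigin: vec3, rayDirection: vec3): SemanticHit | null;

function tryPlaceObject(target: SceneObject, handPosition: vec3, handDirection: vec3): boolean {
    const hit = semanticHitTest(handPosition, handDirection);
    // Reject walls and ceilings so virtual furniture only lands on the floor.
    if (hit !== null && hit.surface === "ground") {
        target.getTransform().setWorldPosition(hit.position);
        return true;
    }
    return false;
}
```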
Through a partnership with Supabase, Snap Cloud offers a back-end-as-a-service for developers, providing authentication, databases, serverless functions, real-time data sync, and content delivery network storage. This service is currently in an alpha release, with access granted on a case-by-case basis.
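Since Snap Cloud is built with Supabase, a Lens back end can presumably be driven through the standard Supabase client API; the sketch below uses supabase-js with a placeholder project URL, key, and a hypothetical high_scores table, and whether Lenses call supabase-js directly or go through a Snap-provided wrapper is an assumption here.

```typescript
// Sketch of calling a Snap Cloud back end with the standard Supabase client
// API. The project URL, anon key, and high_scores table are placeholders, and
// whether a Lens uses supabase-js directly or a Snap-provided wrapper is an
// assumption here.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
    "https://your-project.supabase.co", // placeholder Snap Cloud project URL
    "public-anon-key"                   // placeholder anon key
);

// Store a player's score, then read back the top ten entries.
async function submitScore(player: string, score: number): Promise<void> {
    const { error } = await supabase.from("high_scores").insert({ player, score });
    if (error) {
        print("Insert failed: " + error.message);
        return;
    }

    const { data } = await supabase
        .from("high_scores")
        .select("player, score")
        .order("score", { ascending: false })
        .limit(10);
    print("Top scores: " + JSON.stringify(data));
}
```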
Finally, the new EyeConnect feature builds upon the platform’s existing colocation support, which already simplifies joining local multiplayer sessions. EyeConnect takes this a step further by allowing Spectacles wearers to simply look at one another; using device and face tracking, the system automatically colocates all users in a session without any manual mapping. It’s important to note that for Lenses placing virtual objects at specific real-world coordinates, a separate Custom Locations system is used, which does not support EyeConnect.
(Source: UploadVR)