
Block Meta’s Tracking with This New Android App

Originally published on: February 27, 2026
Summary

– A new Android app called Nearby Glasses can detect nearby smart glasses, like Ray-Ban Meta AI Glasses, by scanning for their manufacturer identifiers in Bluetooth signals.
– The app’s creator warns it may produce false positives from other devices by the same manufacturer and cautions users not to harass people based on its alerts.
– There is social discomfort and reported incidents of conflict over non-consensual recording with smart glasses, including their use to create misogynistic content.
– While Meta states its glasses have a recording indicator light and require responsible use, critics note the light can be disabled and many people don’t recognize the glasses as recording devices.
– Legal and privacy concerns are rising as these glasses can collect biometric data, potentially violating privacy or wiretapping laws, especially with features like facial recognition.

For Android users concerned about privacy in public spaces, a new application offers a potential alert system. The app, called Nearby Glasses, scans for Bluetooth signals emitted by specific smart glasses models, including the Ray-Ban Meta AI Glasses, and notifies the user if such a device is detected nearby. Developed by an academic researcher, the tool aims to provide individuals with awareness in an era of increasingly discreet wearable recording technology.

The software operates by monitoring Bluetooth Low Energy advertising packets. These broadcasts can carry company identifiers assigned by the Bluetooth SIG, which the app uses to recognize devices from manufacturers like Meta. The developer, Yves Jeanrenaud, clarified that while a device's MAC address can be randomized, these company IDs embedded in the Bluetooth frames are fixed and cannot be changed, which is what makes detection possible.
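The detection principle described above can be sketched in a few lines. The snippet below is a minimal illustration, not the app's actual code: it walks the length-prefixed AD structures of a raw BLE advertising payload and pulls the company ID out of any Manufacturer Specific Data field (AD type 0xFF). The example company ID 0x1234 is a made-up placeholder; real identifiers would have to be looked up in the Bluetooth SIG assigned-numbers registry.

```python
MANUFACTURER_SPECIFIC_DATA = 0xFF  # AD type defined in the Bluetooth Core Specification

def extract_company_ids(adv_payload: bytes) -> list:
    """Walk the length-prefixed AD structures in a BLE advertising payload
    and return every company ID found in Manufacturer Specific Data fields."""
    ids = []
    i = 0
    while i < len(adv_payload):
        length = adv_payload[i]  # covers the type byte plus the data
        if length == 0:
            break  # early termination of the payload
        ad_type = adv_payload[i + 1]
        data = adv_payload[i + 2 : i + 1 + length]
        if ad_type == MANUFACTURER_SPECIFIC_DATA and len(data) >= 2:
            # The company ID is the first two octets, little-endian.
            ids.append(int.from_bytes(data[:2], "little"))
        i += 1 + length
    return ids

# Example payload: a Flags structure followed by Manufacturer Specific Data
# carrying the placeholder company ID 0x1234.
payload = bytes([0x02, 0x01, 0x06,                      # Flags: LE General Discoverable
                 0x05, 0xFF, 0x34, 0x12, 0xAA, 0xBB])   # Mfr data, ID 0x1234
print([hex(c) for c in extract_company_ids(payload)])   # → ['0x1234']
```

On Android, an app like Nearby Glasses would get these payloads from the platform's BLE scan results rather than raw bytes, but the matching step is the same: compare the advertised company ID against a watchlist, which is also why sibling products from the same manufacturer trigger false positives.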

However, Jeanrenaud emphasizes the tool is not flawless. It may generate false positives by identifying other Bluetooth hardware from the same manufacturer, such as virtual reality headsets. He strongly cautions users against confronting individuals based solely on an app notification, as the person might simply be wearing ordinary eyewear. The project’s repository includes a prominent warning advising users not to harass anyone due to suspicions raised by the software.

This development occurs against a backdrop of social tension over covert recording. Incidents, such as a reported altercation on the New York subway where a woman allegedly destroyed a TikToker’s Meta glasses, highlight public discomfort. Other reports describe individuals using smart glasses to secretly record interactions for creating inappropriate online content.

In response to inquiries, a Meta spokesperson highlighted that their glasses feature a lit LED to indicate recording and stated that users must comply with laws and terms of service, which prohibit harassment or privacy infringement. Jeanrenaud counters that the indicator light can be disabled, a process demonstrated in online videos, and that many people do not immediately recognize such glasses as recording devices.

The legal landscape surrounding recording in public is complex. While video capture is generally permissible, audio recording can violate wiretapping laws in numerous states that require all-party consent. Furthermore, the collection of biometric data, such as through facial or voice recognition, introduces significant privacy concerns and potential legal liabilities, especially if the behavior constitutes stalking or harassment.

Recent events underscore the ongoing debate. A California judge recently admonished members of Mark Zuckerberg’s legal team for wearing the smart glasses in court, violating courtroom rules. Jeanrenaud acknowledges his app is an imperfect solution but hopes it offers some utility and a greater sense of safety for users until broader issues of consent and privacy in wearable technology are adequately addressed.

(Source: The Register)

Topics

smart glasses (95%), privacy concerns (93%), bluetooth detection (88%), surveillance app (85%), legal risks (82%), biometric data (80%), public recording (78%), harassment prevention (75%), meta response (73%), social discomfort (70%)