Flock Surveillance AI Built by Overseas Gig Workers

Summary
– Flock uses overseas workers, often hired from the Philippines via Upwork, who train its AI algorithms by reviewing and categorizing footage from its US surveillance cameras.
– The company’s pervasive cameras continuously scan and record vehicle details, creating a nationwide tracking system that law enforcement frequently searches without a warrant, raising significant privacy concerns.
– Training tasks include annotating vehicle makes, colors, and license plates, as well as labeling attributes such as pedestrian clothing and sounds like screaming, based on footage showing US locations.
– An accidentally exposed internal company panel revealed the scale of this work, with workers completing thousands of annotations over short periods.
– The practice raises sensitive questions about who accesses the surveillance footage and where those people are based, given Flock’s role in monitoring residents’ movements.
The widespread use of Flock’s AI-powered surveillance cameras across American communities raises significant privacy concerns, particularly regarding who has access to the sensitive footage they collect. Recent reporting reveals that the company relies on overseas gig workers, often hired through platforms like Upwork, to train its machine learning algorithms. These workers are tasked with reviewing and annotating video footage captured within the United States, a practice that highlights the globalized and often opaque nature of building modern surveillance technology.
Training materials, accidentally exposed by the company, instruct these contractors on how to categorize images of people and vehicles. The work involves detailed annotations, such as identifying vehicle makes, colors, and types, transcribing license plate numbers, and handling audio tasks. Flock’s system continuously scans and records license plates, vehicle details, and even pedestrian clothing, creating a vast database of movement patterns. Law enforcement agencies routinely access this data, frequently without obtaining a warrant, to track vehicles nationwide.
The location of the workers reviewing this footage is a critical issue. Profiles linked to annotators named in the exposed company panel indicate that many are based in the Philippines. While outsourcing AI training labor is common due to lower costs, the sensitive nature of constant public surveillance makes this arrangement particularly controversial. The footage used for training includes clear images from various US states, such as New York, Michigan, and California, with visible road signs and local advertisements confirming its domestic origin.
This practice intensifies existing debates over data security and oversight. Police departments have used Flock’s system to perform searches for immigration authorities, and civil liberties groups have filed lawsuits against municipalities deploying hundreds of these cameras. Furthermore, a company patent references the camera system’s ability to detect “race,” adding another layer of concern about biometric surveillance and potential bias.
The exposed operational panel showed workers completing thousands of annotation tasks over short periods, underscoring the scale of data processing required to refine Flock’s algorithms. As the company advertises new features like detecting “screaming,” the role of overseas contractors in shaping these capabilities remains largely invisible to the public and to the communities under constant camera watch. The result is a complex ethical and legal landscape in which advanced surveillance tools are built through global labor networks, far removed from the neighborhoods they monitor.
(Source: Wired)