
This article was produced with support from WIRED.
Flock, the automatic license plate reader (ALPR) and AI-powered camera company, uses overseas workers from Upwork to train its machine learning algorithms, with training material telling workers how to review and categorize footage, including images of people and vehicles in the U.S., according to material reviewed by 404 Media that was accidentally exposed by the company.
The findings raise questions about who exactly has access to footage collected by Flock surveillance cameras and where the people reviewing that footage may be based. Flock has become a pervasive technology in the U.S., with its cameras present in thousands of communities, where police use them every day to investigate crimes such as carjackings. Local police have also performed numerous lookups for ICE in the system.
Companies that use AI or machine learning regularly turn to overseas workers to train their algorithms, often because the labor is cheaper than hiring domestically. But the nature of Flock’s business—creating a surveillance system that constantly monitors U.S. residents’ movements—means that footage might be more sensitive than other AI training jobs.
Flock’s cameras continuously scan the license plate, color, brand, and model of every vehicle that drives by. Law enforcement agencies are then able to search cameras nationwide to see where else a vehicle has driven. Authorities typically dig through this data without a warrant, which led the American Civil Liberties Union (ACLU) and Electronic Frontier Foundation (EFF) to recently sue a city blanketed in nearly 500 Flock cameras.
Broadly, Flock uses AI or machine learning to automatically detect license plates, vehicles, and people, including what clothes they are wearing, from camera footage. A Flock patent also mentions cameras detecting “race.”
Screenshots from the exposed material. Redactions by 404 Media.
Multiple tipsters pointed 404 Media to an exposed online panel which showed various metrics associated with Flock’s AI training.