Flock's Aggressive Expansions Go Far Beyond Simple Driver Surveillance


The cloud Automatic License Plate Reader (ALPR or LPR) company Flock is building a dangerous nationwide mass-surveillance infrastructure, as we have been pointing out for several years now. The problem with mass surveillance is that it always expands beyond the uses for which it is initially justified, and sure enough, Flock's system is undergoing insidious expansion across multiple dimensions. If your community adopts this technology, you need to know it's doing more than just recording what car is driving where and at what time. It's worth stepping back and looking at an overview of what's going on.
The company's surveillance data is being used by ICE.
First, as has received wide attention, this system is being used by ICE to help carry out the Trump Administration's abusive removal program.
Flock sells their cloud-connected cameras to police departments and private customers across the nation, pulls the license plate readings they collect into their own servers, and allows police to do nationwide searches of the resulting database, giving even the smallest-town police chief access to an enormously powerful driver-surveillance tool. The tech news outlet 404Media obtained records of nationwide searches, which include a field in which officers list the purpose of their search. These records revealed that many of the searches were carried out by local officers on behalf of ICE for immigration purposes, including its notorious Enforcement and Removal Operations division. Emails from police departments in Oregon also shed light on how local police are providing data to ICE.
It's safe to say that even many who support the use of ALPR programs by their local police to catch local criminals do not support funneling the data that is collected to the Trump Administration and those carrying out its abusive and often unlawful immigration program.
A search for a recipient of an illegal abortion
The same kinds of police department logs that revealed ICE's access to Flock's dragnet also revealed that a police officer used the system to search nationwide for a woman who'd had a self-administered abortion, which is illegal in her state. An abortion rights group told 404Media that, based on calls to their hotline, already "there is an overwhelming fear" among women that they're "being watched and tracked by the state," and such reports are hardly going to help. This mass surveillance tool is creating fear among those targeted by immigration, anti-abortion, and other regressive actions, but eventually everyone will become aware that their movements are being tracked. That's no way to live in a democratic society.
Plugging in to data brokers
Meanwhile, as police around the nation expand their uses of this surveillance machinery, Flock is expanding the power of the system itself. For example, the company is planning to plug in to data brokers that offer services such as "people lookup." Flock has long claimed that their LPRs don't collect personally identifiable information, as if license plates can't easily be connected to specific people. That claim was always bogus, but with this new product the company makes the falsity explicit, boasting that it will let police "jump from LPR to person."
In the 1970s, after some government agencies were found to be building dossiers, in the manner of the East German Stasi, on people not suspected of involvement in any crime, Congress enacted the Privacy Act banning agencies from such recordkeeping. Yet the ethically shady and frequently inaccurate data broker industry does basically the same thing, and when law enforcement becomes a customer of those data brokers, it represents an end run around the law. By tying its LPR data together with data broker records, Flock is effectively automating and scaling the end run around our checks and balances that law enforcement data broker purchases represent. (A proposal called the Fourth Amendment Is Not For Sale Act, which would ban this practice, was passed by the U.S. House in 2024 but was blocked in the Senate.)
From still to video, and with AI
In another major expansion, Flock is turning its plate readers into surveillance cameras. The company has announced that police departments will soon be able to obtain not just still photos from ALPR cameras but also video, with the ability to request live feeds or 15-second clips of cars passing by the cameras. And Flock is using AI to let law enforcement search that footage using natural language. The company uses the example of searching for "landscaping trailer with a ladder," but we have to assume searches could encompass descriptions of anything captured by one of their cameras, including vehicle occupants and bystanders.
We recently wrote about how generative AI is turbo-charging video search and surveillance, and this is an example of the trend. Imagine that a police officer stood on your street writing detailed notes about you every time you drove or walked by: all the details about what your car looks like (make, model, color, distinguishing characteristics, bumper stickers, etc.), as well as details about visible occupants and pedestrians, how many, at what time, their activities, demographic data, what they are wearing, attributes they may have such as a beard, hat, tattoo, or t-shirt, and what that hat, t-shirt, or tattoo might say. Now imagine that there is an army of police officers doing this on every block.
This is the surveillance world that Flock is building.
Creating an infrastructure for corporate blacklisting and surveillance
In June, Flock also announced the launch of a "Flock Business Network," a "collaborative hub designed to help private sector organizations work together to solve and prevent crime."
This will sound ominous to anyone familiar with the history of private companies and government agencies working together to create watch lists, blacklists, and databases about people in the United States. In the heyday of the labor movement (and perhaps today), organizers were commonly put on blacklists as "troublemakers," and could have trouble getting a job as their name was shared among employers. During the civil rights, antiwar, and other social justice movements of the 20th century, there were a number of private databases created by shady collections of right-wing vigilantes and super-patriots who took it upon themselves to compile dossiers on activists they disagreed with. These private databases were shared with police and government security agencies and took on quasi-official roles in the efforts of police "intelligence" arms to combat those progressive movements, while remaining outside the normal checks and balances of government.
Today, face recognition technology threatens to make these lists easier than ever to create and administer, and so does license plate surveillance. In its announcement, Flock boasted that its service would allow companies to "add vehicles to Flock Hotlists – so any user subscribed to that Hotlist is alerted the next time that vehicle is detected by a Flock LPR," giving the private sector "the power of a shared network to identify threats." Elsewhere Flock says, "By sharing insights and intelligence, companies can identify patterns, suspects, and criminal networks that might not be apparent to a single security team."
Investigating criminals should be the job of law enforcement, not big companies that have strong incentives to use these infrastructures against labor activists and others: to use them not to fight crime, but to protect the bottom line.
Generating suspicion
Finally, as I recently wrote about, Flock has also introduced AI analytics products that shift the company from providing tools that officials use to investigate existing suspicions to generating suspicion itself. Because the company funnels plate reads from customers across the nation into its own centralized database, it is able to run analytics on that dataset. One such analysis service that it has begun selling attempts to identify "large-scale criminal activities" by scanning the movement patterns of all vehicles contained in their dataset and alerting law enforcement to those that their algorithm decides are "suspect."
Overall, this explosion of new uses is what happens when you build an authoritarian tracking infrastructure: it expands in more and more ways. State legislatures and local governments around the nation need to enact strong, meaningful protections of our privacy and way of life against this kind of AI surveillance machinery.