
Surveillance Company Flock Now Using AI to Report Us to Police if it Thinks Our Movement Patterns Are “Suspicious”

Police car with license plate readers on it
Company crosses a dangerous line by beginning to offer AI suspicion-generation functions
Jay Stanley,
Senior Policy Analyst,
Speech, Privacy, and Technology Project
August 7, 2025

The police surveillance company Flock has built an enormous nationwide license plate tracking system, which streams records of Americans’ comings and goings into a private national database that it makes available to police officers around the country. The system allows police to search the nationwide movement records of any vehicle that comes to their attention. That’s bad enough on its own, but the company is also now apparently analyzing our driving patterns to determine if we’re “suspicious.” That means that if your local police start using Flock, they could target you just because some algorithm has decided your movement patterns suggest criminality.

There has been a lot of reporting lately about Flock, but I haven’t seen anyone focus on this feature. It’s a significant expansion in the use of the company’s surveillance infrastructure — from allowing police to find out more about specific vehicles of interest, to using the system to generate suspicion in the first place. The company’s cameras are no longer just recording our comings and goings — now, using AI in ways we have long warned against, the system is actively evaluating each of us to decide whether we should be reported to law enforcement as potential participants in organized crime.

In a February 13 announcement touting an “Expansive AI and Data Analysis Toolset for Law Enforcement,” the company unveiled several new capabilities, including something called “Multi-State Insights”:

Many large-scale criminal activities—such as human and narcotics trafficking and Organized Retail Crime (ORC)—involve movement across state lines. With our new Multi-State Insights feature, law enforcement is alerted when suspect vehicles have been detected in multiple states, helping investigators uncover networks and trends linked to major crime organizations.

Flock appears to offer this capability as part of a larger analytics offering, which urges police departments to “Maximize your LPR data to detect patterns of suspicious activity across cities and states.” The company also offers a “Linked Vehicles” or “Convoy Search” feature, allowing police to “uncover vehicles frequently seen together,” putting it squarely in the business of tracking people’s associations, and a “Multiple locations search,” which promises to “Uncover vehicles seen in multiple locations.” All these are variants on the same theme: using the camera network not just to investigate based on suspicion, but to generate suspicion itself.

In a democracy, the government shouldn’t be watching its citizens all the time just in case we do something wrong. It’s one thing if a police officer out on a street sees something suspicious in public and reacts. But this is an entirely different matter.

First, the police should not be collecting and storing data on people’s movements and travel across space and time in the first place, or contracting to use a private company’s technology to accomplish the same thing. Second, they shouldn’t be taking that data and running it through AI algorithms to potentially swing the government’s eye of suspicion toward random, innocent civilians whose travel patterns just happen to fit what that algorithm thinks is worth bringing to the attention of the police.

And of course, because Flock is a private company not subject to checks and balances such as open records laws and oversight by elected officials, we know nothing about the nature of the algorithm or algorithms it uses — the logic they may be based upon, the data they were trained on, or the frequency and nature of their errors. Does anyone actually know whether there are movement patterns characteristic of criminal behavior that won’t sweep in vastly larger numbers of innocent people?

We also don’t know what kind of biases the company’s algorithms might exhibit; it’s very easy to imagine an algorithm trained on past criminal histories in which low-income neighborhoods and communities of color are highly over-represented because of the well-established, top-to-bottom biases in our criminal justice system. That could mean that just living in such a neighborhood could make you inherently suspicious in the eyes of this system in a way that someone living in a wealthier place would never be. Among other problems, that’s just plain unfair.

The bottom line is that Flock, having built its giant surveillance infrastructure, is now expanding its uses — validating all our warnings about how such systems inevitably undergo mission creep, and providing all the more reason why communities should refuse to allow the police departments that serve them to participate in this mass surveillance system.
