The Institute for Justice published an analysis on April 28 documenting 14 US cases of police officers allegedly abusing automated license plate reader (ALPR) data — most of them on Flock Safety's network — to stalk romantic partners, ex-partners, or strangers. The majority of cases are post-2024, the year Flock launched a major expansion into 4,000-plus US cities. Today Flock operates 76,000-plus license plate readers across 6,000-plus cities. Only a few of the 14 cases were caught by internal police investigations; most surfaced because victims filed stalking complaints, or because they looked themselves up on HaveIBeenFlocked.com — a public audit site that lets people see when their own plates have been queried.
The named cases show what the misuse looks like in practice. In Milwaukee, eight-year veteran officer Josue Ayala used the department's Flock system to track a woman he was dating and her ex-partner roughly 180 times over two months; he resigned in 2026 after being charged with misconduct in public office. In Costa Mesa, California, officer Robert Josett pleaded guilty in April 2026 to using Flock to track his mistress and her other partners. In Jerome County, Idaho, Sheriff George Oppedyk searched for his wife's vehicle hundreds of times and retired two years early. In Kenosha County, Wisconsin, sheriff's deputy Frank McGrath resigned with severance pay after using the system on a fellow deputy he was romantically involved with. The mechanics are the same in each case: log into Flock, query a plate, get a movement history. No warrant requirement, no per-query justification, and, at most departments, no automated abuse detection.
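The missing "automated abuse detection" is not exotic. A minimal sketch, assuming only that the system keeps a log of (user, plate, timestamp) queries: flag any account that looks up the same plate repeatedly in a short window, the signature of cases like the ~180 Milwaukee queries. The threshold and window here are illustrative assumptions, not any vendor's or department's standard.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative thresholds, not an industry standard: 10 queries of the
# same plate by the same user inside a 30-day window is suspicious.
REPEAT_THRESHOLD = 10
WINDOW = timedelta(days=30)

def flag_repeat_queries(query_log):
    """query_log: iterable of (user_id, plate, timestamp) tuples.
    Returns the set of (user_id, plate) pairs where one user queried
    one plate REPEAT_THRESHOLD or more times within WINDOW."""
    by_pair = defaultdict(list)
    for user, plate, ts in query_log:
        by_pair[(user, plate)].append(ts)

    flagged = set()
    for pair, stamps in by_pair.items():
        stamps.sort()
        left = 0
        # Sliding window over the sorted timestamps for this pair.
        for right in range(len(stamps)):
            while stamps[right] - stamps[left] > WINDOW:
                left += 1
            if right - left + 1 >= REPEAT_THRESHOLD:
                flagged.add(pair)
                break
    return flagged
```

Anything this coarse would have caught the Milwaukee and Jerome County patterns; the point is that even a trivial heuristic over logs the systems already keep goes unrun by default.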
This is the second-order story of an AI-enabled surveillance product. Flock's ALPR network does object recognition (license plates, vehicle types) and stores movement patterns over time. The technical capability is unremarkable — plate OCR is a solved problem. What is new is the scale and the price. Flock has driven the deployment cost low enough that 6,000-plus small departments can afford camera networks that previously only major federal agencies could build. The Electronic Frontier Foundation has documented Flock cameras being used to surveil protesters and activists; a federal class action filed in San Jose this month alleges the city's ALPR system is unconstitutional mass surveillance; per NBC News, Virginia police used Flock to track a single driver 526 times in four months. The 14 stalking cases are the most visceral data points in a much wider pattern of ALPR misuse.
For builders working on AI-enabled surveillance, automated infrastructure, or any product with movement or biometric data, three takeaways are concrete. First, the Flock pattern — cheap deployment, weak per-query controls, no warrant requirement, no audit by default — is what regulatory backlash will look like when it lands. If your product aggregates movement, financial, or biometric data, design audit logs and per-query justifications into the system now, not after a victim's lawsuit. Second, HaveIBeenFlocked.com is a useful precedent: third-party audit interfaces over private surveillance systems now exist, and they uncover misuse the companies themselves do not. Expect more of these to appear, and expect them to be cited in litigation. Third, civilian ALPR capability is rising — current Flock deployments add vehicle-attribute matching ("white pickup truck with ladder rack") and behavior signatures. The 14 stalking cases happened on today's capability stack. Tomorrow's misuse pattern, on richer data, will be worse.
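"Audit logs and per-query justifications designed in" can be sketched concretely. The names below (`PlateQuery`, `AuditedReader`) are hypothetical, not any vendor's API; the sketch assumes a backend callable that maps a plate to a movement history. The load-bearing design choice is that a lookup cannot return data without first writing an audit record, and cannot proceed at all without a case number and a stated reason.

```python
import time
import uuid
from dataclasses import dataclass, field, asdict

@dataclass
class PlateQuery:
    """One lookup request. case_number and reason are mandatory inputs,
    not optional metadata."""
    officer_id: str
    plate: str
    case_number: str   # ties the query to an investigation
    reason: str        # free-text justification, reviewable later
    query_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: float = field(default_factory=time.time)

class AuditedReader:
    def __init__(self, backend):
        self.backend = backend   # callable: plate -> movement history
        self.audit_log = []      # append-only; WORM storage in a real system

    def lookup(self, query: PlateQuery):
        # Reject unjustified queries, and log the rejection too.
        if not query.case_number.strip() or not query.reason.strip():
            self.audit_log.append({**asdict(query), "outcome": "rejected"})
            raise PermissionError("case number and justification required")
        # Log BEFORE serving: no code path returns data without a record.
        self.audit_log.append({**asdict(query), "outcome": "served"})
        return self.backend(query.plate)
```

A sketch under stated assumptions, not a compliance recipe — but a system shaped like this makes the Milwaukee pattern (180 unjustified queries) either impossible to execute silently or trivially visible in the log it leaves behind.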
