Tracking Your Travel and Predicting Your Next Move: The AI Surveillance Tech Police Departments Are Installing in Your Neighborhood

Introduction

You drove past one this morning. Probably didn't even notice it—just another camera on a pole, another piece of street furniture you've learned to ignore. But that box is watching you. Not just reading your license plate. Building a profile of your life.

Flock Safety, a $7.5 billion surveillance company founded by three Georgia Tech alumni, has installed cameras in over 5,000 communities across 49 states. Every month, those cameras perform over 20 billion scans of vehicles in the U.S. Every trip to the grocery store. Every drive to your partner's house. Every visit to the shooting range, the mosque, the abortion clinic, the protest. All of it logged, timestamped, and fed into a database that law enforcement can search without a warrant.

This isn't some dystopian future we're sliding toward. This is happening right now, in your neighborhood, and most people have no idea these cameras even exist. The company markets itself as a tool to catch stolen cars and solve crimes. But the internal documents, the lawsuits, and the leaked employee communications tell a different story. They've built something far more dangerous: a nationwide surveillance dragnet that tracks where you go and who you spend time with, and uses AI to predict what you might do next. And they're sharing that data with thousands of police departments, federal agencies, and even immigration enforcement—regardless of what your city's privacy laws say.

The Algorithm Doesn't Just Watch—It Predicts Who You Are

Here's what Flock doesn't put in the brochures they show city councils. Their system doesn't just capture license plates anymore. It creates what they call a "fingerprint" of your vehicle: make, model, color, bumper stickers, roof racks, distinguishing marks. Then it feeds that into their Investigations Manager platform, which urges police to "Maximize your LPR data to detect patterns of suspicious activity across cities and states."

Read that phrase again. Not "investigate crimes." Not "locate stolen vehicles." Detect patterns of suspicious activity. The AI is generating suspicion where none existed before.

Flock offers something called "Linked Vehicles" or "Convoy Search" that allows police to "uncover vehicles frequently seen together." Translation: the system tracks your associations. If your car is repeatedly photographed near someone else's car, the algorithm flags that relationship; a sketch below shows how simple that matching logic can be. You're not being investigated because you committed a crime. You're being flagged because a machine decided your movement patterns look weird. They also offer "Multiple locations search," which promises to "Uncover vehicles seen in multiple locations." You drove to three different places in one day? Suspicious. You visit the same coffee shop every morning? Pattern detected. The ACLU's analysis is blunt: if your police department uses Flock, you could be targeted just because some algorithm decided your life looks like criminality.

Lee Schmidt, a Navy veteran in Norfolk, Virginia, knows exactly what this feels like. According to his lawsuit against the city, there are four Flock cameras just outside his neighborhood. He drives past them almost every day. If the cameras catch him turning right, police can infer he's going to the shooting range. If he turns left, they know he's heading to the grocery store.

As the lawsuit states: "The Flock Cameras capture the start of nearly every trip Lee makes in his car, so he effectively cannot leave his neighborhood without the NPD knowing about it."

Crystal Arrington, a healthcare worker in Norfolk, has a different nightmare. According to her lawsuit, it would be trivial for the government to identify her clients based on which vehicles show up at certain locations. She's not doing anything wrong. But the surveillance creates a map of relationships, visits, and patterns that could expose the private medical decisions of the people she serves. This is what "pattern of life" profiling looks like. Not investigation. Automated suspicion at scale.
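
Flock hasn't published the logic behind "Linked Vehicles" or "Convoy Search," but the co-occurrence idea its marketing describes is simple to sketch. The Python below is a hypothetical illustration, not Flock's code: the data layout, the two-minute window, and the three-sighting threshold are all assumptions chosen for readability.

```python
from collections import Counter
from itertools import combinations

# Hypothetical plate-read log as (plate, camera_id, unix_timestamp).
# The schema, window, and threshold are assumptions for illustration only.
reads = [
    ("ABC1234", "cam_03", 1_700_000_000),
    ("XYZ9876", "cam_03", 1_700_000_045),
    ("ABC1234", "cam_17", 1_700_003_600),
    ("XYZ9876", "cam_17", 1_700_003_660),
    ("ABC1234", "cam_22", 1_700_090_000),
    ("XYZ9876", "cam_22", 1_700_090_030),
]

WINDOW_SECONDS = 120     # "seen together" = same camera within two minutes
MIN_CO_SIGHTINGS = 3     # how many co-sightings before a pair gets flagged

def linked_vehicles(reads):
    """Flag pairs of plates that repeatedly pass the same camera close in time."""
    by_camera = {}
    for plate, cam, ts in reads:
        by_camera.setdefault(cam, []).append((ts, plate))

    pair_counts = Counter()
    for sightings in by_camera.values():
        for (t1, p1), (t2, p2) in combinations(sorted(sightings), 2):
            if p1 != p2 and abs(t2 - t1) <= WINDOW_SECONDS:
                pair_counts[tuple(sorted((p1, p2)))] += 1

    return [pair for pair, count in pair_counts.items() if count >= MIN_CO_SIGHTINGS]

print(linked_vehicles(reads))   # [('ABC1234', 'XYZ9876')]
```

Notice what this kind of logic cannot distinguish: a carpool, a spouse, a home health aide, and a getaway driver all look identical to a co-occurrence counter. That's the core of the ACLU's objection.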

Your Data Gets Shared Everywhere—Even Where It's Illegal

You might live in a sanctuary city. You might live in a state with strict data privacy laws. Doesn't matter. Flock built a nationwide sharing network that overrides local protections, and most city councils had no idea what they were signing.

The Massachusetts ACLU discovered something that should trigger investigations: Flock's data-sharing settings allow for the automatic sharing of information collected about local drivers with thousands of police departments nationwide. Even when those settings are supposedly restrictive, standard Flock contracts include language that may supersede the settings entirely, giving the company expansive permission to share local data across the country. According to public records obtained by the ACLU, police departments in Massachusetts are currently sharing data with thousands of agencies nationwide. Not just neighboring towns. Thousands.

In California, state law explicitly prohibits sharing license plate reader data with federal agencies. But in 2025, multiple state and municipal law enforcement agencies did it anyway. In May 2025, 404 Media reported that Flock data had been queried for use in immigration enforcement. Flock later admitted it had been running a "pilot program" with Customs and Border Protection and Homeland Security Investigations. After public outcry, Flock announced they were ending that program. Sounds like progress, right?

The Institute for Justice wasn't fooled. IJ Attorney Michael Soyfer issued a statement making the reality clear: "Although the announcement means federal law enforcement cannot directly access this trove of information, they can just ask other Flock customers to run searches or share log-in information, as we've seen happen repeatedly." In other words, Flock closed the front door and left every window open.

Federal agents can't log in directly anymore, but they can ask a local cop to do it for them. The data sharing continues. The surveillance continues. And because Flock is a private company, it's not subject to open records laws or oversight by elected officials. You can't FOIA their algorithms. You can't audit their error rates. You can't see who's accessing your data or why. The company has steadfastly refused to allow IPVM, an independent security technology testing outlet, to obtain one of its readers for testing—even though IPVM has tested all of Flock's major competitors. They don't want scrutiny. They want market dominance.

They Used This System to Hunt a Woman for Having an Abortion

In Texas, a sheriff's office searched data from more than 83,000 automated license plate reader cameras to track down a woman suspected of self-managing an abortion. Not 83 cameras. Eighty-three thousand. This wasn't a kidnapping. This wasn't a violent crime. This was a healthcare decision that someone made about their own body, and law enforcement used a nationwide surveillance dragnet to hunt her down.

The Electronic Frontier Foundation documented the case after investigative reporting from 404 Media revealed the scope of the search. The officer who ran it had access to a network spanning entire regions, and he used it to track someone's medical choices. The EFF noted that this scenario might have been avoided if Flock had taken action when they were first warned about this threat three years ago. They didn't. Records show that police in Florida and Texas have searched license plate reader data through the Flock nationwide database for queries related to immigration enforcement and at least one abortion-related investigation. Not theoretical risks. Documented searches.

If you've driven to a Planned Parenthood, an abortion clinic, or even a hospital that provides reproductive care, that trip is in the database. If you've driven someone else to one of those locations, that's in there too. The "Linked Vehicles" feature means the system can connect your car to theirs, mapping out who's helping whom.

And it's not just abortion. Think about every sensitive location you've ever driven to. Therapist's office. Psychiatrist. Addiction treatment center. HIV clinic. Mosque. Synagogue. Gun range. Political rally. Your ex's house. Your lawyer's office. Every single one of those trips is a data point. Every one of them can be queried, analyzed, and used to build a narrative about who you are and what you might be doing.

And because this is all happening through a private company's platform, there's no warrant requirement. No judicial oversight. Just a search bar and a cop with a hunch.
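
That search bar is worth taking literally. Strip away the interface and a "who visited this location" lookup is a trivial filter over the stored reads. The sketch below is hypothetical (the schema, camera names, and data are invented), but it shows how little stands between a plate-read log and a list of everyone who drove past a clinic in a given month.

```python
from datetime import datetime

# Hypothetical plate-read log. Field names and camera IDs are invented;
# real ALPR platforms expose this kind of lookup as a search form.
reads = [
    {"plate": "ABC1234", "camera": "cam_near_clinic", "time": datetime(2025, 5, 1, 8, 14)},
    {"plate": "XYZ9876", "camera": "cam_downtown",    "time": datetime(2025, 5, 1, 9, 2)},
    {"plate": "ABC1234", "camera": "cam_near_clinic", "time": datetime(2025, 5, 8, 8, 10)},
]

def plates_seen_near(reads, cameras, start, end):
    """Return every distinct plate captured by the given cameras in a time range."""
    return {
        r["plate"]
        for r in reads
        if r["camera"] in cameras and start <= r["time"] <= end
    }

print(plates_seen_near(reads, {"cam_near_clinic"},
                       datetime(2025, 5, 1), datetime(2025, 5, 31)))
# {'ABC1234'} -- one query, no warrant, and a list of who drove past
```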

Nova: When License Plate Tracking Meets Hacked Data

In May 2025, 404 Media broke a story that made even Flock employees uncomfortable. The company was developing a new product called Nova, described internally as a "public safety data platform." Nova doesn't just track your car. It connects that tracking data to information from data breaches, public records, and commercially available databases to track specific individuals without a warrant. Let that sink in. Your license plate gets scanned. That scan gets cross-referenced with databases built from breached data. Suddenly you're not just a plate number—you're a full profile.

According to 404 Media's reporting, leaks from a company meeting indicated that some of the data came from a hacked parking meter app. Flock employees raised concerns because the platform was accessing information from breaches, not just above-board public records. As of May 2025, Nova was already in use by law enforcement through an Early Access program.

This is the nightmare scenario privacy advocates have been warning about for years: the fusion of location tracking with personal data, all accessible without judicial oversight. Law enforcement can now know not just where your car has been, but who you are, where you live, who you're connected to, and potentially even your financial information—all pulled from databases you never consented to share. The Fourth Amendment was supposed to protect against unreasonable searches. But when a private company collects the data and sells access to police, the constitutional protections evaporate. You're not being searched by the government. You're being tracked by a vendor. The courts are still trying to figure out if that distinction matters. Spoiler: it shouldn't. A search is a search, regardless of who's holding the camera.
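
404 Media's reporting describes what Nova connects, not how the joins are implemented. The sketch below is a generic record-linkage illustration with invented datasets and field names; the point is only that a plate read becomes a person-level profile the moment it can be joined against registration data, commercial databases, or breached records.

```python
# Hypothetical illustration of plate-to-identity fusion. Every dataset and
# field name here is invented for the example; none of this is Flock's schema.

plate_reads = [
    {"plate": "ABC1234", "camera": "cam_03", "time": "2025-05-01T08:14:00"},
]

# e.g. commercially purchased vehicle-registration records (assumed layout)
registration_db = {"ABC1234": {"name": "J. Doe", "address": "12 Elm St"}}

# e.g. records leaked from a breached parking app (assumed layout)
breached_parking_app = {"J. Doe": {"email": "jdoe@example.com", "phone": "555-0142"}}

def build_profile(read):
    """Join one plate read against the other datasets to produce a profile."""
    profile = dict(read)
    registration = registration_db.get(read["plate"])
    if registration:
        profile.update(registration)
        leaked = breached_parking_app.get(registration["name"])
        if leaked:
            profile.update(leaked)
    return profile

for read in plate_reads:
    print(build_profile(read))
# {'plate': 'ABC1234', 'camera': 'cam_03', 'time': '2025-05-01T08:14:00',
#  'name': 'J. Doe', 'address': '12 Elm St',
#  'email': 'jdoe@example.com', 'phone': '555-0142'}
```

Each additional dataset is just another lookup and another `update` call. Nothing in the pattern requires a warrant, because nothing in it touches a government system.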

The System Is Already Failing—and Hurting People

Last July in Española, New Mexico, 21-year-old Jaclynn Gonzales was driving with her 12-year-old sister when a Flock camera misread her license plate. The camera mistook the number 2 on her plate for a 7—possibly because a license plate cover was partially blocking it—and flagged her car as stolen. An officer pulled them over at gunpoint. He handcuffed both sisters and put them in the back of his patrol car. The officer never verified the plate before escalating the stop into a traumatic incident that has since led to lawsuits against the city.

This wasn't an edge case. This is what happens when you automate suspicion and train officers to trust the algorithm more than their own judgment. In 2022, a lieutenant from the Kechi Police Department in Kansas was arrested for illegally using the Wichita Police Department's Flock system to track his estranged wife. The misuse was discovered after another officer raised concerns and an audit confirmed he'd accessed the system inappropriately. The technology makes stalking trivial for anyone with access.

And here's the part that should make every city council member who approved a Flock contract deeply uncomfortable: according to an Illinois village trustee, the local Civilian Police Oversight Commission found that over 99% of Flock alerts do not result in any police action. Over 99%. That means the system is generating a tidal wave of false positives, pulling over innocent people, flagging cars that aren't stolen, creating alerts that lead nowhere. But every one of those alerts is a data point that stays in the system. Every false flag is still a record of where you were and when.

Because Flock is a private company, we know nothing about the nature of the algorithms it uses. What logic are they based on? What data were they trained on? What are the error rates?

Does anyone actually know whether there are movement patterns characteristic of criminal behavior that won't sweep up vastly larger numbers of innocent people? We also don't know what kind of biases the algorithms might have. It's very easy to imagine an algorithm trained on past criminal records—where low-income neighborhoods and communities of color are vastly over-represented because of well-established biases in the criminal justice system—learning to treat those neighborhoods as inherently suspicious. If you live there, you're flagged. If you drive there, you're flagged. The algorithm doesn't see injustice. It sees patterns. And it reproduces them at scale.
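
Two figures already in this article, Flock's 20 billion scans a month and the finding that over 99% of alerts lead to no police action, are enough for a back-of-the-envelope look at how much noise the system produces. The hot-list alert rate below is an assumption chosen only to make the arithmetic concrete.

```python
# Back-of-the-envelope base-rate arithmetic. The scan volume and the 99%
# no-action figure come from this article; the alert rate is assumed.

scans_per_month = 20_000_000_000      # Flock's own scan-volume claim
alert_rate = 1 / 100_000              # assumed: one hot-list alert per 100,000 scans
no_action_fraction = 0.99             # Illinois oversight finding: 99%+ of alerts go nowhere

alerts = scans_per_month * alert_rate
dead_end_alerts = alerts * no_action_fraction

print(f"{alerts:,.0f} alerts per month, roughly {dead_end_alerts:,.0f} of them dead ends")
# 200,000 alerts per month, roughly 198,000 of them dead ends
```

Change the assumed alert rate and the absolute numbers move, but the ratio doesn't: whatever volume of alerts the network generates, the oversight finding implies that roughly 99 out of every 100 point at nothing.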

Cities Are Starting to Say No

The legal dam is breaking. In June 2024, a judge in Norfolk, Virginia, ruled that collecting location data from the city's 172 Flock cameras constitutes a search under the Fourth Amendment, and that data collected without a warrant cannot be used as evidence in a criminal case. The ruling compared license plate reader databases to the GPS tracking devices whose warrantless use by police was found unconstitutional in United States v. Jones.

In October 2024, the Institute for Justice filed a federal lawsuit against the Norfolk Police Department asserting that the department's use of Flock cameras constitutes illegal surveillance in violation of the Fourth Amendment. The lawsuit states plainly: "The City of Norfolk, Virginia, has installed a network of cameras that make it functionally impossible for people to drive anywhere without having their movements tracked, photographed, and stored in an AI-assisted database that enables the warrantless surveillance of their every move."

When Flock Safety tried to intervene in the Norfolk case to defend its product, Judge Davis shut them down. His opinion notes that Flock sat on the sidelines hoping the court would dismiss the case and only tried to intervene once the lawsuit was well underway. He wrote: "Flock made a conscious gamble to not show up to the platform on time; it is not this Court's fault that the train had already left the station by the time Flock arrived. Allowing Flock to intervene at this late juncture would throw this case off the rails."

Meanwhile, cities are rejecting Flock contracts outright. Denver's city council unanimously rejected a $666,000 extension of the city's Flock pilot program in May 2025, citing "new community concerns" about constant surveillance. Oakland's Public Safety Committee rejected a proposed $2.25 million contract after hours of emotional public testimony.

Councilmembers Carroll Fife and Kevin Jenkins Brown voted against the proposal following widespread concerns from community members about privacy violations, data selling, and collaboration with federal immigration agencies. In March, Gig Harbor, Washington, rejected a motion that would have installed 10 Flock cameras. Ferndale ended its contract with Flock after a pilot period, with the police department citing concerns about how data is used and accessed.

And then there's the resistance movement. A privacy activist named Will Freeman created DeFlock.me, a crowdsourced map showing the exact locations of Flock cameras. The company sent him a cease-and-desist letter claiming the project dilutes its trademark. The Electronic Frontier Foundation, representing Freeman, sent Flock a letter rejecting the demand and pointing out that the grassroots project is well within its First Amendment rights. Through open-source research, DeFlock.me is shining a light on the surveillance network one camera at a time. Flock doesn't want you to know where the cameras are. That should tell you everything.

They're Not Stopping at Cameras—Now They're Adding Microphones

As if tracking every vehicle in America wasn't enough, Flock is expanding into audio surveillance. In October 2025, the company announced that their Raven devices would begin listening for "human distress," with advertisements showing police being alerted when the device picks up screams. These aren't just cameras anymore. They're high-powered microphones mounted above densely populated city streets, running algorithms that decide what sounds count as distress. Who defines distress? What does the algorithm think it's hearing? What gets recorded? How long is it stored? Who has access? Flock isn't saying. They're just rolling it out.

This is the pattern with surveillance technology: start with a narrow use case (catching stolen cars), expand the capabilities (tracking movement patterns, predicting behavior), then add new sensors (audio) while the public is still trying to understand the previous expansion. By the time people realize what's been built, the infrastructure is already everywhere. The Raven devices are being marketed as public safety tools. But there's no public oversight. No transparency about what's being recorded or how the audio analysis works. Just another sensor in the growing network of machines watching and listening to everything we do.

Frequently Asked Questions

How do I know if my city has Flock cameras?

Check DeFlock.me, a crowdsourced map showing reported Flock camera locations. You can also submit public records requests to your local police department asking about any contracts or agreements with Flock Safety. Look for the cameras themselves—they're typically mounted on poles at intersections and neighborhood entrances, often near stop signs or traffic lights. The cameras are usually small, rectangular boxes, sometimes with solar panels attached.

Can I opt out of being tracked by Flock cameras?

No. There is no opt-out mechanism. If you drive past a Flock camera, your vehicle gets scanned and the data gets stored. The only way to avoid the surveillance is to not drive in areas where the cameras are installed—which is increasingly difficult as the network expands to over 5,000 communities. Some people use license plate covers, but these can be illegal in many states and, as the Española case showed, can actually trigger false alerts that lead to dangerous police stops.

Is this legal? Don't they need a warrant?

That's currently being fought in courts. In June 2024, a Norfolk, Virginia judge ruled that collecting location data from Flock cameras constitutes a search under the Fourth Amendment and requires a warrant. But that ruling only applies in that jurisdiction. Federal lawsuits are ongoing. The legal question is whether mass surveillance by a private company that sells access to police counts as a government search requiring constitutional protections. So far, Flock has operated by claiming they're just a vendor, not a government actor, which has allowed them to avoid warrant requirements in most places.

What happens to my data? How long is it stored?

Flock stores vehicle location data for varying periods depending on the contract—often 30 days to a year or longer. But here's the problem: Flock contracts allow data sharing with thousands of police departments nationwide, and once your data is shared, you have no control over how long those other agencies keep it or who they share it with. The Nova platform reportedly supplements this data with information from breaches and commercial databases, creating profiles that persist indefinitely. Because Flock is a private company, it's not subject to public records laws, so there's no way to audit what they're actually doing with the data.

My city council is considering a Flock contract. What should I do?

Show up to the meeting. Bring printed copies of the Norfolk lawsuit, the Texas abortion tracking case, and the Massachusetts ACLU findings about data sharing. Ask specific questions: Will our data be shared with other jurisdictions? With federal agencies? With ICE? What happens if the company is breached? What are the error rates? Can we see the algorithm? Can residents opt out? Demand that any contract include strict limitations on data sharing, retention periods, and third-party access. Better yet, ask your council to reject the contract entirely and point to cities like Denver, Oakland, and Gig Harbor that have said no. Organize with neighbors—these decisions are being made in local meetings that most people don't know about.

If I haven't done anything wrong, why should I care about being tracked?

Because 'wrong' is defined by whoever has access to the database. Today it's your local police looking for stolen cars. Tomorrow it's a cop tracking his ex-wife (this already happened in Kansas). Next week it's a sheriff hunting someone for having an abortion (this already happened in Texas). The data doesn't care about your intentions. It just shows where you've been—and algorithms decide what that means. You might be flagged as suspicious just for driving through the 'wrong' neighborhood or because your car is frequently seen near someone else's. Over 99% of Flock alerts lead to no police action, meaning the system is mostly generating false suspicion about innocent people. And once you're in the database, that record persists. You don't get to explain yourself to the algorithm.

Conclusion

So what do you do about it? First, check if your city has Flock cameras—go to DeFlock.me and see what's already watching you. If your city council is considering a contract, show up and fight it. Bring evidence. Bring neighbors. Make them explain why they think mass surveillance without warrants is acceptable. If the cameras are already installed, file public records requests demanding to know who has access to the data, how it's being shared, and what the error rates are. Support the legal challenges happening in Norfolk and other cities. These lawsuits are setting precedents that could dismantle the entire network.

This surveillance infrastructure wasn't built overnight, and it won't be dismantled overnight either. But every city that says no makes it harder for Flock to claim this technology is inevitable. Every public records request creates a paper trail. Every lawsuit forces them to defend their practices in court. The machine wants you to feel powerless, like this is all too big and too technical to fight. That's a lie. You have more power than you think—but only if you use it. Check your city. Tell someone else. Make this a problem Flock can't ignore. Because right now, they're counting on you not noticing the cameras at all.

**If you want tools that actually protect your privacy instead of selling it—tools that work offline, that don't phone home, that can't be subpoenaed because the data never leaves your device—join the waitlist for SurvivalBrain at https://survivalbrain.ai/#waitlist. We're building technology that answers to you, not to surveillance companies or their government clients.**

Get Early Access to Uncensored Offline AI

Join the waitlist for SurvivalBrain launching Q1 2026. Early supporters lock in $149 lifetime pricing (save $50).
