I think the person requesting to access the data was doing the right thing and I agree with the judge’s ruling.
The fact that they’re going to shut it down implies both the indiscriminate nature of the data capture and the sheer volume of data being captured.
These cameras are popping up all over the nation. If people realized how much data is being captured, where that data is going (or who it’s being sold to), and how it’s being used by government and private entities, they would be appalled.
There have been exposés about these cameras, covering everything from AI misidentification of “stolen” (not) vehicles leading to erroneous arrests and police encounters, to analysis of shopping patterns being sold back to private entities for better ad targeting. It’s wild.
The laws need to be updated. CCTV in public used to be fine because no one was actually watching it unless there was an incident. Now it’s possible to have AI watch every camera and correlate everything everywhere. We need new privacy laws to reflect this capability.
I don't mind a local AI on an airgapped security camera network monitoring a camera and issuing an alert to a security guard. The issues are internet connectivity, data retention/mining/sale, and non-local processing (i.e., handing stuff off to a third party that does who knows what and probably doesn't take security seriously).
Just as two trivial examples, even though neither affects me personally:
The estimated number of heroin users in the UK exceeds the total prison population. The number of class-A drug users in the UK is estimated to be so high that if the minimum sentencing guidelines for possession were actually followed, it would cause a catastrophic economic disaster, both from all the people no longer working and from having to build enough new prisons to hold them. I'm not interested in drugs (and I don't live in the UK, but I assume the UK isn't abnormal in this regard).
Another example is road traffic law. Take just speeding offences: perfect enforcement would probably catch everyone who actually drives in the UK often enough that after a month the only people left allowed to drive would be people like me, who don't even own a car.
The entire legal system would have to be radically changed, with far lighter punishments for almost everything, if you had perfect surveillance, or even surveillance that was 30% of the way to perfect.
I was a bit unclear. I agree, I don't want the government using AI to identify all violations of the law. That sounds like a very straightforward dystopia.
What I don't mind is private companies using AI analysis to support their security guards. I object to any sharing of the data with third parties though. It should be illegal for the data to leave their internal network and it should be illegal to retain it for more than a few days.
I don't care if grocery store loss prevention has eyes on every aisle. My concern is data warehousing and subsequent misuse.
No. Everyone should be restricted from buying (or better: collecting) it; otherwise you've just created the business model for an evil corp that does the job anyway and collects your tax dollars to do the same thing.
Yep, ban collection and purchase of such data for everyone. Exceptions usually just mean private companies hop in to offer the "service".
I think the current insane development is surveillance capitalists trying to rush their panopticon into place to solidify their power. Guess that means no reasonable privacy law for the US, even under a hypothetical President Newsom.
If the government is only restricted from buying the data, then they'll just have someone else buy it. Palantir is not the government. So they can buy the real time feed, analyze it in real time, and give the real time results of that analysis to the government without issue.
Restricting the government from buying that data does nothing. If you want to stop the government taking advantage of the data, you have to outlaw its collection altogether, so that the initial collection of the data by anyone is illegal.
Personally, I don't think that's gonna happen. There's way too many people making way too much money telling the government who hangs out with who, who cheats with who, and so on and so forth.
> the indiscriminate nature of the data capture and the sheer volume of data being captured
It took a lot of naivete, to put it gently, and a head-in-the-sand attitude to believe otherwise. Flock had everything in place to collect a treasure trove of data, but they would decide not to do it? Out of principle? Or even the very charitable interpretation: that they don't do it today, but they'll also never cave to the pressure to do it?
I might be good with legal guarantees, meaning jail time for those involved, that the only place images on these devices went was local to the municipality collecting them and that they were only accessed for very well defined reasons by very specific people.
The core issue is that aggregation and exfiltration of this data mean privacy is dead, and AI allows analysis at almost no cost. We need a principle in our laws that restores the limited scope that technology has removed. If the police have to expend one person's worth of time to listen to a wiretap, the practice can't really get out of control. We need that level of cost associated with ALPR and all surveillance so that abuse of these systems stays contained. Make it appropriately hard and it won't be a problem.
There is another looming threat of modern-day surveillance: previously hidden correlations.
Data you considered benign to share in the past might allow unpleasant conclusions in the future, and it might not even come from you personally. Think about what toys you bought for your kids, or in what college milieu your worldview developed.
Then the feds come in with a national security law and bypass all those state/local protections and slurp it all up into an AI-powered Palantir database, the very existence of which is classified. Suddenly you’re the victim of parallel construction and don’t even know it.
The database CANNOT EXIST SAFELY. Why don’t people who “might be okay with this IF…” understand that?
Collect the data and it WILL be misused, eventually, with 100% certainty. Has nobody read Snowden’s book? They even have a name for intel agents casually spying on their partners/crushes.
The law does not apply to everyone equally. The intelligence agencies get to break any laws they want without consequences, by longstanding tradition (remember 007’s “license to kill” or the CIA’s famous heart attack gun?). There are NO legal safeguards that can prevent abuse, no matter how you word them, because there will always be some animals who are “more equal than others” to whom they do not apply (“national security carve-out”, “LEO exception”, etc).
Sadly, those to whom they do not apply are now coordinating with the new wannabe Sturmabteilung in what are called “fusion centers”.
The defense of the photos not being government business until accessed seems shaky. That the physical camera installations were purposeful intentions to conduct government business in those areas is a reasonable line; this doesn't set precedent for Google's information becoming public records because the police might do a google search, to use an extreme example.
The proposed legislative amendment that would exclude Flock footage from public records (which would make this judgment moot) makes sense in the light of red light cameras already being excluded by the same legislators. However, I'd like to see a more incisive law covering both that would compel a reasonable amount of public insight into the footage.
> The defense of the photos not being government business until accessed seems shaky.
It's reminiscent of the NSA's argument that data "collection" occurs only when a search is performed on existing "gathered" data. File under "Stuff that's only legal when the government does it."
Scott Alexander has a decent article (or rather a guest blog post) at https://www.astralcodexten.com/p/all-lawful-use-much-more-th... that brings up the subject in the context of Anthropic and OpenAI's dealings with the Department of War (sic), and how their contracts with the DoW might be interpreted with regard to mass surveillance of Americans.
Worth checking out. I'm not personally knowledgeable enough to vouch for the veracity, though.
My prediction: “It’s not a “search” when an AI looks at the stored database and does sentiment analysis, because an algorithm doesn’t violate your privacy. It’s only a “search” after it’s flagged and an actual human with human opinions sees your private chats criticizing the supreme leader.”
> Cameras that automatically capture images of vehicle license plates are being turned off by police in jurisdictions across Washington state, in part after a court ruled the public has a right to access data generated by the technology.
> For now, Everett’s Flock camera network remains offline, as the debate over transparency, privacy and public safety continues in the Legislature. The bill in Olympia that would put guidelines on Flock's data has passed in the Senate.
> “We were very disappointed,” Franklin said. “That means perpetrators of crime, people who are maybe engaged in domestic abuse or stalkers, they can request footage and that could cause a lot of harm.”
No concern over the dozens (or hundreds?) of cases of police or government employees themselves doing exactly what they’re afraid of here. Strange.
While I agree with the risk of domestic abusers or stalkers getting that data, this data is not known for being well protected against LOVEINT either. Quite the opposite: it is usually sold on grey markets.
"The masses/general populace are the enemy" - once you understand that this is the fundamental belief at the root of the elites' behaviour, everything makes sense. Flock cameras and AI surveillance are designed to rein in 'the enemy'.
I am less worried about Flock ALPRs (which are aimed in the direction of traffic flow to read rear number plates) than I am about the THOUSANDS of facial recognition cameras installed in the last year in all four directions at nearly every intersection in southern Nevada and in many, many cities in southern California (LA notably excepted). These are mounted above the stoplights and aimed against traffic to read the faces of drivers stopped at the lights.
I mention these locales specifically only because I have directly observed them. I would be surprised if this isn’t also happening in many other US metro areas, given how eagerly DHS/TSA/CBP/ICE are mass collecting facial geometries at every available opportunity.
A mix of public (city councils) and private (think HOAs that then donated access/equipment to the city) contracted with Flock in the past few years.
The questions of exactly who, when, and why, are very muddy especially with the HOAs who operate rather privately.
Presumably they’re not doing this for free.
You have to point out the Jews in Amsterdam. They had nothing to hide--until they did.
https://en.wikipedia.org/wiki/1943_Amsterdam_civil_registry_...
Or Martin Niemoller: a good Protestant German pastor who viewed the anti-theist attitudes of the Socialists and Communists as more threatening than Nazis. And then the Nazis put him in a concentration camp.
https://en.wikipedia.org/wiki/Martin_Niem%C3%B6ller https://en.wikipedia.org/wiki/First_They_Came
https://www.geekwire.com/2025/washington-state-cities-turn-o...
https://www.ycombinator.com/companies/flock-safety
https://www.everettpost.com/local-news/everett-temporarily-s...
Why does that not convince me?
"I wouldn't be surprised if this is also happening..."
Only the second sounds correct to me.
Flock is no more popular on the right than it is on the left.