Courts Weigh In On ALPR

In quick succession, two courts weighed in on the license plate reader problem – in very different ways.

In Schmidt v. Norfolk, the one you’ve probably heard of, a federal court declined to reconsider the “no right to privacy in a public place” axiom, even as Carpenter, Beautiful Struggle, and other cases have started to look at it in a different way. The judge, however, did leave the door open to a different conclusion in the future.

In Bartholomew v. Parking Concepts, the case you haven’t heard of, the California Court of Appeal overturned a lower court ruling, holding that an ALPR operator’s failure to have the written privacy policy required by SB 34 (2015) is, all by itself, a privacy harm for which an injured plaintiff can sue. The unanimous decision by a three-judge panel concluded that if a plaintiff cannot access the information they are entitled to by law about what is being collected, how it is being stored, and who has access to it, then they have been sufficiently harmed to file a lawsuit. This ruling immediately makes SB 34 a more enforceable and more useful regulatory tool.

You can read both court decisions below.

Automated License Plate Reader Use Doesn’t Add Up


by Tracy Rosenberg

In June of 2025, Oakland Privacy received a Flock audit log from the Riverside County Sheriff’s Office detailing the activity in the cloud database of license plate reader scans from the department’s own cameras. After keyword searches of the hundreds of thousands of entries, a handful of hits for out-of-state shares and references to searches “for” Customs and Border Protection got (rightfully) most of the attention and formed the basis for this Cal Matters story by Khari Johnson.

But looking in a more leisurely fashion at the rest of the data – and we are not picking on the Riverside County Sheriff in particular; these logs are very similar throughout the state – we wanted to pose some questions about the plain old, regular, legally compliant use of this technology. In short, we don’t completely understand what they are looking for.

How CA Tried to Address Algorithmic and Surveillance Pricing (Part 2)

by Samuel Leitch

In 2022, a time when rents continued to soar, the company RealPage boasted that it could help landlords increase profits even further.1 How was this possible? By pooling nonpublic pricing data from its clients, RealPage’s software offers landlords recommendations on the highest possible rent they can set for a given residential unit. Although it would be illegal for these landlords to communicate directly to agree upon rent prices, critics of RealPage claim that the software simply acts as an algorithmic middleman for price setting. RealPage is no small player, either: Greystar, the largest apartment manager in the United States, will have to pay $7 million to nine different states for its use of RealPage’s software.2

This scenario is an example of algorithmic price fixing, in which competitors use software to coordinate outcomes that would otherwise be illegal under antitrust law.

How CA Tried to Address Algorithmic and Surveillance Pricing (Part 1)

by Samuel Leitch

Do rideshare companies charge you more if your phone is about to die? The trending Internet rumor, based in part on a fabricated screenshot posted to Reddit, is ultimately unproven.1 However, it serves as an example of a real-world practice called surveillance pricing, which companies use every day in the era of big data.

In a recent study, the FTC found that “details like a person’s precise location or browser history can be frequently used to target individual consumers with different prices for the same goods and services.”2 Data collected from browser cookies, data brokers, and other sources can determine how much more willing you are to purchase a given product or service—and, consequently, how much more you might be charged.

Facts and Fiction on the California Invasion of Privacy Act and the problematic SB 690

by Don Marti and Robert Tauler

While other states debate a private right of action in privacy laws, California already has one, and it’s working for us. Since it was enacted, the California Invasion of Privacy Act (CIPA) has included an individual’s right to bring a civil suit when someone surreptitiously tracks them, allowing for more robust privacy enforcement than in any other state.

CIPA was ahead of its time, but some special interests want to weaken it. California Senate Bill 690, which passed the state Senate in June, would eliminate some of CIPA’s protections entirely if the surveillance was done for a “business purpose.” Unfortunately, a lot of Big Tech and data broker misinformation has come along with the bill, so we are taking the opportunity to clear the air.

Below are the fictions being advanced by Big Tech’s lobbying efforts, which we clarify with the facts.

An SF Supervisor Wants To Make SF’s Surveillance Transparency Law Unenforceable

San Francisco Supervisor Matt Dorsey, a former police department public affairs official, is taking aim at San Francisco’s 2019 surveillance transparency ordinance and facial recognition ban by trying to strip the provision that pays attorney fees to people who enforce the law in court.

If Dorsey’s ordinance passes, then only individually wealthy people would be able to bring suits in response to violations. We will largely be left with SF city government policing itself. 

Eliminating attorney fees eliminates public accountability and sanctions lawlessness. 

Use this easy one-click action to tell the San Francisco Board of Supervisors to vote NO on the Dorsey proposal.