Largest CCPA Settlement Yet: What GM’s $12.75M Penalty Changes About US Privacy Enforcement

On May 8, 2026, the California Attorney General announced that General Motors (GM) has agreed to pay $12.75 million to settle allegations that it illegally collected and sold driving and location data from hundreds of thousands of OnStar subscribers.
That makes it the largest California Consumer Privacy Act (CCPA) settlement to date; the previous record was a $2.75 million penalty handed down to Disney in February 2026. But the size of the penalty is not the most important part.
The real story is what California regulators tested: whether GM could justify why sensitive driving data was still retained, why it was later reused for insurance risk scoring, and why that actual data flow contradicted what consumers were told.
That matters far beyond connected vehicles. The GM case is the first enforcement action brought under the CCPA's data minimization and purpose limitation requirements.
For privacy and legal teams, the question is no longer just whether the policy says the right thing. It's whether data processing evidence can prove that data was collected, retained, shared, and deleted for the right reasons under the CCPA.
The story behind GM’s illegal sale of personal data
GM's OnStar platform collected a remarkable amount of data from connected vehicles: GPS coordinates, hard braking events, rapid acceleration, speed threshold crossings, seat belt usage, late-night driving patterns, and trip duration. All of this data was generated as a byproduct of providing emergency assistance, navigation, and crash response services.
In 2020, GM began selling that data to two data brokers, LexisNexis Risk Solutions and Verisk Analytics, both of which used it to build insurance risk scoring products. The sales continued until 2024. According to the complaint filed by the California Attorney General (AG), GM's own privacy notices stated explicitly that it would not sell driving or location data.
GM’s actual data flows said otherwise.
Investigators found that GM had retained consumers’ driving and location data long after it was needed to operate OnStar services. Then, the company monetized that retained data.
This is a direct violation of the CCPA's data minimization and purpose limitation requirements, which were added to California law in 2023.
The settlement requires GM to:
- Pay $12.75 million in civil penalties
- Stop selling driving data to consumer reporting agencies for five years
- Delete retained driving data within 180 days (absent consumer consent)
- Formally request that LexisNexis and Verisk delete the data already shared with them
GM also has to build out a strong privacy compliance program and submit regular assessments to CalPrivacy, the California Department of Justice, and several California district attorneys.
What makes this case so different
There have been significant CCPA actions before this. Most of them tested a pretty basic set of questions:
- Did the company honor opt-out requests?
- Did it properly disclose data sales?
- Did it use consumer data in the right context?
These questions are important, to be sure, but they’re also quite narrow.
The GM case is different in scope. It is the first CCPA action to specifically enforce data minimization requirements, a sign that regulators are changing what they test.
The GM case doesn't just ask whether you disclosed the data use correctly. It asks whether the data use was proportionate and purposeful in the first place. It asks whether you can prove that with technical evidence, not just policy language.
For most privacy teams, that is a materially harder question to answer.
The four failure modes regulators are now testing
The GM complaint is built around what we’ll call four types of drift. Each one has a direct parallel in how data programs fail inside real companies.
Purpose drift
Data collected for one legitimate purpose, like emergency assistance, navigation, or crash detection, was later used for something the original collection never contemplated: sale to data brokers. The complaint found that this purpose was never disclosed to consumers. Once data is collected for purpose A, using it for an undisclosed purpose B violates the CCPA.
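To make the pattern concrete, here is a minimal sketch of a purpose-limitation check, assuming an internal inventory that tags each data asset with the purposes disclosed at collection and the purposes observed in actual flows. The field names and purpose labels are illustrative, not GM's schema or any real product's API:

```python
# Minimal sketch: flag uses whose purpose was never disclosed at collection.
# All field names and purpose labels here are hypothetical.
from dataclasses import dataclass


@dataclass
class DataAsset:
    name: str
    disclosed_purposes: set[str]  # purposes stated in the privacy notice
    observed_uses: set[str]       # purposes seen in actual data flows


def purpose_drift(asset: DataAsset) -> set[str]:
    """Return observed uses that go beyond what was disclosed at collection."""
    return asset.observed_uses - asset.disclosed_purposes


onstar = DataAsset(
    name="onstar_telematics",
    disclosed_purposes={"emergency_assistance", "navigation", "crash_response"},
    observed_uses={"emergency_assistance", "sale_to_data_brokers"},
)

undisclosed = purpose_drift(onstar)
if undisclosed:
    print(f"Purpose drift on {onstar.name}: {sorted(undisclosed)}")
```

Any non-empty result is purpose drift: a use the original disclosure never covered.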
Retention drift
GM reportedly began collecting driving and location data in 2016. The data sales to LexisNexis and Verisk didn't start until 2020. That four-year gap is the violation in miniature: data that should have been deleted because the operational need had passed was instead sitting in storage. This kept it available to be monetized later. The data didn't have to be improperly shared immediately. It just had to exist past its purpose.
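The same logic can be automated for retention: if every documented purpose carries a retention window, anything that outlives its window is a deletion candidate, whether or not it is ever misused. A minimal sketch, with per-purpose retention periods that are illustrative assumptions, not legal guidance:

```python
# Minimal sketch: flag records retained past the window their purpose justifies.
# Retention periods here are illustrative assumptions, not legal guidance.
from datetime import datetime, timedelta, timezone

RETENTION = {
    "crash_response": timedelta(days=90),
    "navigation": timedelta(days=30),
}


def past_retention(purpose: str, collected_at: datetime,
                   now: datetime | None = None) -> bool:
    """True if a record has outlived the retention window for its purpose."""
    now = now or datetime.now(timezone.utc)
    window = RETENTION.get(purpose)
    if window is None:
        return True  # no documented purpose means no basis to keep the record
    return now - collected_at > window


collected = datetime(2016, 6, 1, tzinfo=timezone.utc)
if past_retention("crash_response", collected):
    print("Retention drift: delete this record or document a new, disclosed purpose")
```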
Disclosure drift
GM's privacy notices told subscribers their data wouldn't be sold. The actual data flows proved otherwise. When regulators compared the published policy against GM's third-party data flows, the gap between statement and reality became evidence of the violation.
Governance drift
We can assume that GM had a privacy governance infrastructure: vendor contracts, internal review processes, and a privacy team, for starters. At no point did the infrastructure catch or stop these violations. That's a structural observation: privacy governance that operates at the policy and documentation layer, without any visibility into what's happening in production, will miss exactly these kinds of failures. Yes, the program existed, but the data privacy practices drifted anyway.
These four types of drift aren’t unique to automotive companies. They happen anywhere that data is collected at scale for one purpose and then ages into a system where business incentives create pressure to use it differently.
How to avoid similar enforcement
Ensure internal data policies match external disclosures
- Align internal stakeholders on how privacy regulations apply to your data practices
- Publish privacy disclosures that accurately reflect how data is actually handled internally
- Monitor for changes across disclosed privacy policies, internal data practices, and applicable regulation (a sketch of one such consistency check follows this list)
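As one concrete shape for that monitoring, here is a minimal sketch that diffs published "we do not sell" statements against an internal inventory of third-party flows. Both structures are hypothetical stand-ins for whatever your policy and data-map sources actually are:

```python
# Minimal sketch: diff external disclosures against the internal data map.
# DISCLOSED and INTERNAL_FLOWS are hypothetical stand-ins for real sources.
DISCLOSED = {"sells_driving_data": False, "sells_location_data": False}

INTERNAL_FLOWS = [
    {"data": "driving_data", "recipient": "LexisNexis", "is_sale": True},
    {"data": "location_data", "recipient": "Verisk", "is_sale": True},
]


def disclosure_gaps(disclosed: dict, flows: list[dict]) -> list[dict]:
    """Return flows that contradict a published 'we do not sell' statement."""
    gaps = []
    for flow in flows:
        claim = f"sells_{flow['data']}"
        if flow["is_sale"] and disclosed.get(claim) is False:
            gaps.append(flow)
    return gaps


for gap in disclosure_gaps(DISCLOSED, INTERNAL_FLOWS):
    print(f"Disclosure drift: {gap['data']} sold to {gap['recipient']}")
```

On GM's facts, a comparison like this would have surfaced the contradiction years before regulators ran the same analysis.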
Build comprehensive data maps to track data flows against policies
- Map how all personal data is collected, used, stored, and shared
- Ensure real-time monitoring for high-volume personal data collection points, e.g., websites and apps
- Track data processing against all vendor contractual obligations and privacy policies
- Prioritize sensitive data, data sharing with advertising third parties, and any sale of personal data (see the triage sketch after this list)
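A data map only pays off if it supports triage. The sketch below shows one hypothetical flow record and a crude risk score that surfaces sales and sensitive third-party sharing first; the fields and weights are assumptions, not a standard:

```python
# Minimal sketch: a data-map flow record plus a crude triage score.
# Fields and weights are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Flow:
    element: str        # e.g. "gps_coordinates"
    destination: str    # internal system or third party
    purpose: str
    is_third_party: bool
    is_sale: bool
    sensitive: bool


def risk_score(flow: Flow) -> int:
    """Prioritize sales, then sensitive data, then any third-party sharing."""
    return 3 * flow.is_sale + 2 * flow.sensitive + 1 * flow.is_third_party


data_map = [
    Flow("gps_coordinates", "Verisk", "risk_scoring", True, True, True),
    Flow("trip_duration", "internal_analytics", "service_quality", False, False, False),
]

for flow in sorted(data_map, key=risk_score, reverse=True):
    print(f"risk={risk_score(flow)} {flow.element} -> {flow.destination} ({flow.purpose})")
```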
Establish risk controls to assess new activity and flag non-compliant data flows
- Build a scalable privacy assessment program that applies the right depth of review based on risk
- Implement checks at high-risk areas: websites, apps, new vendor reviews, new products, and software release pipelines (see the pipeline-gate sketch below)
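In a software release pipeline, that check can be a hard gate. A minimal sketch, assuming your tooling can diff the data flows a release introduces against a registry of approved privacy assessments; both inputs here are hypothetical:

```python
# Minimal sketch: block a release that introduces an unreviewed third-party flow.
# The approved-assessment registry and the flow diff are hypothetical inputs.
import sys

APPROVED_ASSESSMENTS = {("gps_coordinates", "LexisNexis")}  # (element, recipient)

new_flows_in_release = [
    ("gps_coordinates", "LexisNexis"),
    ("hard_braking_events", "NewAdTechCo"),  # hypothetical vendor, no assessment on file
]

unreviewed = [f for f in new_flows_in_release if f not in APPROVED_ASSESSMENTS]

if unreviewed:
    for element, recipient in unreviewed:
        print(f"BLOCKED: {element} -> {recipient} has no approved privacy assessment")
    sys.exit(1)  # fail the pipeline until privacy review completes
```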
How Privado AI enables privacy compliance at scale
Dynamic Data Maps
- Build complete and real-time data maps without manual assessments
- Scan all first- and third-party software, documentation, and contracts to identify personal data elements, third parties, flows, and purposes
- Flag potential risks for any data processing that violates your policies
Agentic Assessments
- Strengthen your privacy assessments and free up resources
- Instead of manually filling out forms and falling behind, hire Wren, our AI privacy analyst, to run the entire assessment process, from intake and research to risk analysis and approval routing
- Stakeholders simply input documents, and Wren populates the entire assessment
- Review AI-generated responses and evidence side-by-side
Web Auditor
- Continuously scan your websites to verify consent banners, cookies, pixels, and data flows are compliant with each privacy law in each location worldwide, including CCPA, CIPA, VPPA, GDPR, PIPEDA, etc.
- Flag sensitive data sharing by tracking each data element shared with each third party
- Immediately resolve risks with automated alerts that identify the root cause and route a ticket to the appropriate dev team
App Auditor
- Continuously scan iOS and Android app files to verify consent banners, SDKs, and data flows are compliant with each privacy law in each location, including CCPA, CIPA, VPPA, GDPR, etc.
- Flag sensitive data sharing by tracking each data element shared with each third party
- Immediately resolve risks with automated alerts that identify the root cause and route a ticket to the appropriate dev team




