Privacy Research

Privacy Case Studies

40 research case studies organized by the Privacy Transistors framework. Explore real-world privacy challenges across linkability, power dynamics, knowledge gaps, and jurisdictional conflicts.

40 case studies · 4 categories · ~150 pages total · 4 PDF downloads
T1 · SOLID

Linkability

Technical mechanisms that enable re-identification and tracking of individuals across systems

Definition: The ability to connect two pieces of information to the same person.

Download PDF
01

Browser fingerprinting

The Problem

Linking device attributes into a unique identity — screen, fonts, WebGL, canvas combine into a fingerprint identifying 90%+ of browsers.

Recommended Solution

Redact: completely removing fingerprint-contributing values eliminates the data points that algorithms combine into unique identifiers.

Compliance Mapping

GDPR Article 5(1)(c) data minimization, ePrivacy Directive tracking consent
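As an illustrative sketch of the "Redact" approach (field names are hypothetical, not from any specific product), fingerprint-contributing attributes can be stripped from a telemetry record before it is stored or shared:

```python
# Hypothetical sketch: strip fingerprint-contributing attributes from a
# telemetry record before storage. Field names are illustrative only.
FINGERPRINT_FIELDS = {"screen", "fonts", "webgl_renderer", "canvas_hash"}

def redact_fingerprint(record: dict) -> dict:
    """Return a copy with fingerprint-contributing values removed."""
    return {k: ("[REDACTED]" if k in FINGERPRINT_FIELDS else v)
            for k, v in record.items()}

record = {"page": "/pricing", "screen": "2560x1440",
          "fonts": "Arial;Helvetica", "canvas_hash": "a9f3c1"}
print(redact_fingerprint(record))
```

Removing the values outright, rather than generalizing them, is what prevents the combination step the fingerprinting algorithms rely on.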

02

Quasi-identifier re-identification

The Problem

87% of the US population is identifiable from ZIP code + gender + date of birth alone. The Netflix Prize dataset was de-anonymized via correlation with public IMDb ratings.

Recommended Solution

Hash: deterministic SHA-256 hashing enables referential integrity across datasets while preventing re-identification from original values.

Compliance Mapping

GDPR Recital 26 identifiability test, Article 89 research safeguards
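A minimal sketch of the "Hash" approach, with one assumption worth flagging: quasi-identifiers like ZIP codes have tiny value spaces, so plain SHA-256 is trivially reversible by dictionary attack. A keyed hash (HMAC, stdlib) keeps determinism for referential integrity while requiring the secret key to reverse:

```python
import hashlib
import hmac

# Keyed SHA-256 (HMAC) pseudonymization of a quasi-identifier. The secret
# key (illustrative value here) blocks dictionary attacks on small value
# spaces like ZIP codes; equal inputs still map to equal tokens.
SECRET_KEY = b"example-key-store-in-a-vault-and-rotate"

def hash_identifier(value: str) -> str:
    """Deterministic: same input, same token (referential integrity)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

assert hash_identifier("02139") == hash_identifier("02139")  # joinable
assert hash_identifier("02139") != hash_identifier("02140")
```

Note that under GDPR Recital 26 a keyed hash is pseudonymization, not anonymization, as long as the key exists.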

03

Metadata correlation

The Problem

Linking who/when/where without content — 'we kill people based on metadata' (former NSA director).

Recommended Solution

Redact: removing metadata fields entirely prevents correlation attacks that link communication patterns to individuals.

Compliance Mapping

GDPR Article 5(1)(f) integrity and confidentiality, ePrivacy Directive metadata restrictions

04

Phone number as PII anchor

The Problem

Linking encrypted communications to real-world identity via mandatory SIM registration in 150+ countries.

Recommended Solution

Replace: substituting phone numbers with format-valid but non-functional alternatives maintains data structure while removing the PII anchor.

Compliance Mapping

GDPR Article 9 special category data in sensitive contexts, ePrivacy Directive
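One way to sketch "Replace" for phone numbers (the mapping scheme below is an assumption, not a documented method): map each real number deterministically onto the US 555-01XX range, which is reserved for fictional, non-functional use, so the output stays format-valid but can never ring a real phone:

```python
import hashlib

# Sketch of "Replace": deterministically map a phone number onto the
# fictional US 555-0100..555-0199 range so the same input always yields
# the same format-valid, non-functional stand-in.
def replace_phone(number: str) -> str:
    digits = "".join(ch for ch in number if ch.isdigit())
    idx = int(hashlib.sha256(digits.encode()).hexdigest(), 16) % 100
    return f"+1-202-555-{100 + idx:04d}"

print(replace_phone("+1 (415) 867-5309"))
```

Determinism preserves joins across documents; the 555-01XX range removes the PII anchor.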

05

Social graph exposure

The Problem

Contact discovery maps entire relationship networks — personal, professional, medical, legal, political.

Recommended Solution

Redact: removing contact identifiers from documents prevents construction of social graphs from document collections.

Compliance Mapping

GDPR Article 5(1)(c) data minimization, Article 25 data protection by design

06

Behavioral stylometry

The Problem

Writing style, posting schedule, timezone activity uniquely identify users even with perfect technical anonymization. 90%+ accuracy from 500 words.

Recommended Solution

Replace: replacing original text content with anonymized alternatives disrupts the stylometric fingerprint that writing analysis algorithms depend on.

Compliance Mapping

GDPR Article 4(1) personal data extends to indirectly identifying information including writing style

07

Hardware identifiers

The Problem

MAC addresses, CPU serials, TPM keys — burned into hardware, persistent across OS reinstalls, the ultimate cookie.

Recommended Solution

Redact: completely removing hardware identifiers from documents and logs eliminates persistent tracking anchors that survive OS reinstalls.

Compliance Mapping

GDPR Article 4(1) device identifiers as personal data, ePrivacy Article 5(3)
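A small sketch of redacting one class of hardware identifier from logs; the regex below covers only the common colon- or hyphen-separated six-octet MAC format, and real pipelines would need patterns per identifier type:

```python
import re

# Redact MAC addresses from log text. Pattern covers the common
# colon/hyphen-separated 6-octet form only (illustrative, not exhaustive).
MAC_RE = re.compile(r"\b(?:[0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}\b")

def redact_macs(text: str) -> str:
    return MAC_RE.sub("[MAC-REDACTED]", text)

log = "wlan0 associated, client a4:5e:60:f1:02:9b joined"
print(redact_macs(log))
```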

08

Location data

The Problem

4 spatiotemporal points uniquely identify 95% of people. Used to track abortion clinic visitors, protesters, military.

Recommended Solution

Replace: substituting location data with generalized alternatives preserves geographic context while preventing individual tracking.

Compliance Mapping

GDPR Article 9 when location reveals sensitive activities, Article 5(1)(c) minimization
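The "Replace" idea for coordinates can be sketched as precision truncation: round the point to a grid cell so it identifies an area rather than an address. Two decimal places is roughly 1.1 km of latitude; the right granularity is a policy choice, not fixed by this sketch:

```python
# Generalize coordinates by truncating precision: the result identifies
# a ~1 km cell (at 2 decimal places) rather than a specific address.
def generalize(lat: float, lon: float, places: int = 2) -> tuple:
    return (round(lat, places), round(lon, places))

print(generalize(52.520008, 13.404954))  # (52.52, 13.4)
```

Note the linkability caveat from case 08 above: even generalized points can re-identify if enough of them are published together.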

09

RTB broadcasting

The Problem

Real-time bidding broadcasts location + browsing + interests to thousands of companies, 376 times per day per European user.

Recommended Solution

Redact: removing PII before it enters advertising pipelines prevents the 376-times-daily broadcast of personal information.

Compliance Mapping

GDPR Article 6 lawful basis, ePrivacy Directive consent for tracking, Article 7 consent conditions

10

Data broker aggregation

The Problem

Acxiom, LexisNexis combine hundreds of sources — property records, purchases, app SDKs, credit cards — into comprehensive profiles.

Recommended Solution

Redact: removing identifiers before data leaves organizational boundaries prevents contribution to cross-source aggregation profiles.

Compliance Mapping

GDPR Article 5(1)(b) purpose limitation, Article 5(1)(c) minimization, CCPA opt-out rights

T3 · STRUCTURAL LIMIT

Power Asymmetry

Imbalances in control between data subjects and data controllers that undermine meaningful consent

Definition: The collector designs the system, profits from collection, writes the rules, and lobbies for the legal framework.

Download PDF
01

Dark patterns

The Problem

One-click to consent, 15 steps to delete. Studies show dark patterns increase consent from ~5% to 80%+. Asymmetry by design.

Recommended Solution

Redact: anonymizing personal data entered through consent interfaces reduces value extracted through dark patterns.

Compliance Mapping

GDPR Article 7 conditions for consent, Article 25 data protection by design

02

Default settings

The Problem

Windows 11 ships with telemetry, ad ID, location, activity history all ON. Each default represents billions of users whose PII is collected because they didn't opt out.

Recommended Solution

Redact: removing tracking identifiers from data transmitted by default-on settings reduces PII collected through privacy-hostile configurations.

Compliance Mapping

GDPR Article 25(2) data protection by default, ePrivacy Article 5(3)

03

Surveillance advertising economics

The Problem

Meta's €1.2B GDPR fine equals less than 1% of annual revenue (~$135B). Fines are a cost of doing business, not a deterrent. Median GDPR fine under €100K.

Recommended Solution

Redact: anonymizing PII before it enters advertising systems reduces personal data available for surveillance capitalism.

Compliance Mapping

GDPR Article 6 lawful basis, Article 21 right to object to direct marketing

04

Government exemptions

The Problem

The largest PII collectors (tax, health, criminal records, immigration) exempt themselves from the strongest protections. GDPR Article 23 allows restricting data subject rights for 'national security'.

Recommended Solution

Redact: anonymizing government-issued identifiers in documents prevents use beyond the original collection context.

Compliance Mapping

GDPR Article 23 restrictions for national security, Article 9 special category data

05

Humanitarian coercion

The Problem

Refugees must surrender biometrics as condition of receiving food. Most extreme power imbalance: surrender your most sensitive PII or don't survive.

Recommended Solution

Redact: removing identifying information from humanitarian documents after processing protects vulnerable populations.

Compliance Mapping

GDPR Article 9 special category data, UNHCR data protection guidelines

06

Children's vulnerability

The Problem

PII profiles built before a person can spell 'consent.' School-issued Chromebooks monitor 24/7. Proctoring software uses facial recognition on minors.

Recommended Solution

Redact: anonymizing children's PII in educational records prevents lifelong tracking from data collected before meaningful consent.

Compliance Mapping

GDPR Article 8 children's consent, FERPA student records, COPPA parental consent

07

Legal basis switching

The Problem

Company switches from 'consent' to 'legitimate interest' when you withdraw consent. Continues processing same PII under different legal justification.

Recommended Solution

Redact: anonymizing personal data across legal basis changes prevents continued use of PII collected under withdrawn consent.

Compliance Mapping

GDPR Article 6 lawful basis, Article 7(3) right to withdraw consent, Article 17 erasure

08

Incomprehensible policies

The Problem

Privacy policies average 4,000+ words at a college reading level; reading every policy encountered would take an estimated 76 work days per year. 'Informed consent' is a legal fiction at internet scale.

Recommended Solution

Redact: anonymizing PII in submitted documents reduces personal data surrendered through policies nobody reads.

Compliance Mapping

GDPR Article 12 transparent information, Article 7 consent conditions

09

Stalkerware

The Problem

Consumer spyware captures location, messages, calls, photos, keystrokes. Installed by abusers. Industry worth hundreds of millions, operating in regulatory vacuum.

Recommended Solution

Redact: anonymizing device data exports removes PII that stalkerware captures, enabling victims to document abuse safely.

Compliance Mapping

GDPR Article 5(1)(f) integrity and confidentiality, domestic abuse legislation

10

Verification barriers

The Problem

To delete PII, you must provide even more sensitive PII — government ID, notarized documents. More verification to delete than to create.

Recommended Solution

Redact: anonymizing verification documents after deletion request completion prevents accumulation of sensitive identity data.

Compliance Mapping

GDPR Article 12(6) verification of data subject identity, Article 17 right to erasure

T6 · SOLID

Knowledge Asymmetry

Information gaps between privacy engineers and users that lead to implementation failures

Definition: The gap between what is known and what is practiced.

Download PDF
01

Developer misconceptions

The Problem

'Hashing = anonymization' believed by millions of developers. Hashed emails are still personal data under GDPR. Most CS curricula include zero privacy training.

Recommended Solution

Hash: proper SHA-256 hashing through a validated pipeline ensures consistent, auditable anonymization meeting GDPR requirements.

Compliance Mapping

GDPR Recital 26 identifiability test, Article 25 data protection by design

02

DP misunderstanding

The Problem

Organizations adopt differential privacy without understanding epsilon. DP does not make data anonymous, does not prevent aggregate inference, does not protect against all attacks.

Recommended Solution

Redact: anonymizing underlying PII before applying DP provides defense in depth — even if epsilon is set incorrectly, raw data is protected.

Compliance Mapping

GDPR Recital 26 anonymization standards, Article 89 statistical processing safeguards
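To make the epsilon point concrete, here is a minimal Laplace-mechanism sketch (the textbook DP building block, sampled via inverse CDF from the stdlib): epsilon only scales the noise added to a query answer, and the raw records underneath are untouched, which is exactly why DP alone is not anonymization:

```python
import math
import random

# Minimal Laplace mechanism: add noise scaled to sensitivity/epsilon to a
# count query. Smaller epsilon means more noise and a stronger guarantee,
# but the underlying raw records are never modified.
def laplace_noise(scale: float) -> float:
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
print(dp_count(1000, epsilon=0.1))  # noisy answer; raw data unchanged
```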

03

Privacy vs security confusion

The Problem

Users believe antivirus protects PII. But Google, Amazon, Facebook collect PII through normal authorized use. Primary threat is legitimate collection, not unauthorized access.

Recommended Solution

Redact: anonymizing PII in security logs addresses the gap between security and privacy — security tools protect systems, but PII requires anonymization.

Compliance Mapping

GDPR Article 5(1)(f) integrity and confidentiality, Article 32 security of processing

04

VPN deception

The Problem

'Military-grade encryption' marketed by companies that log everything. PureVPN provided logs to the FBI despite 'no-log' marketing. Free VPNs have been caught selling users' bandwidth.

Recommended Solution

Redact: anonymizing browsing data at the document level provides protection independent of VPN claims — whether or not the VPN logs, PII is already anonymized.

Compliance Mapping

GDPR Article 5(1)(f) confidentiality, ePrivacy metadata provisions

05

Research-industry gap

The Problem

Differential privacy published 2006, first major adoption 2016. MPC and FHE remain mostly academic after decades. Transfer pipeline from research to practice is slow and lossy.

Recommended Solution

Hash: providing production-ready anonymization bridges the 10-year gap between academic research publication and industry adoption.

Compliance Mapping

GDPR Article 89 research safeguards, Article 25 data protection by design

06

Users unaware of scope

The Problem

Most don't know: ISP sees all browsing, apps share location with brokers, email providers scan content, 'incognito' doesn't prevent tracking. Billions consent to collection they don't understand.

Recommended Solution

Redact: anonymizing personal data before it enters any system addresses the awareness gap — protection works even when users don't understand collection scope.

Compliance Mapping

GDPR Articles 13-14 right to be informed, Article 12 transparent communication

07

Password storage

The Problem

bcrypt available since 1999, Argon2 since 2015. Plaintext password storage still found in production in 2026. 13B+ breached accounts, many from trivially preventable mistakes.

Recommended Solution

Encrypt: AES-256-GCM encryption of credentials demonstrates the correct approach — industry-standard cryptography, not plaintext storage.

Compliance Mapping

GDPR Article 32 security of processing, ISO 27001 access control
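As a runnable stand-in for the bcrypt/Argon2 mentioned above (neither is in the Python stdlib), `hashlib.scrypt` demonstrates the same correct pattern: salted, memory-hard hashing with constant-time verification, never plaintext or reversible encryption of the password itself:

```python
import hashlib
import hmac
import os

# Correct password storage pattern using stdlib scrypt: per-user random
# salt, memory-hard derivation, constant-time comparison on verify.
def hash_password(password: str) -> tuple:
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

The cost parameters (n, r, p) shown are common illustrative values, not a recommendation for any specific deployment.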

08

Unused cryptographic tools

The Problem

MPC, FHE, ZKP could solve major PII problems but remain in academic papers. Theoretical solutions awaiting practical deployment for decades.

Recommended Solution

Redact: providing practical, deployable anonymization today addresses the gap while MPC/FHE/ZKP remain in academic development.

Compliance Mapping

GDPR Article 25 data protection by design, Article 32 state-of-the-art measures

09

Pseudonymization confusion

The Problem

Developers believe UUID replacement = anonymization. But if the mapping table exists, data remains personal data under GDPR. The distinction has billion-dollar legal consequences.

Recommended Solution

Redact: true redaction removes data from GDPR scope entirely — addressing the billion-dollar distinction between pseudonymization and anonymization.

Compliance Mapping

GDPR Article 4(5) pseudonymization definition, Recital 26 anonymization standard
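The distinction the case above turns on fits in a few lines. In this sketch (names are illustrative), pseudonymization keeps a mapping table, so the link to the person survives and the data stays personal data under GDPR; redaction destroys the link entirely:

```python
import uuid

# Pseudonymization vs redaction. As long as the mapping table exists,
# tokens remain linkable to people: still personal data under GDPR.
mapping: dict = {}

def pseudonymize(email: str) -> str:
    # Reversible in practice: the table links token back to email.
    return mapping.setdefault(email, str(uuid.uuid4()))

def redact(email: str) -> str:
    # Irreversible: no table, nothing to subpoena or breach.
    return "[EMAIL-REDACTED]"

token = pseudonymize("ada@example.org")
assert token in mapping.values()   # the re-identification risk lives here
assert redact("ada@example.org") == "[EMAIL-REDACTED]"
```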

10

OPSEC failures

The Problem

Whistleblowers search for SecureDrop from work browsers. Users resize the Tor Browser window, creating a unique viewport fingerprint. Developers commit API keys. A single careless moment permanently deanonymizes.

Recommended Solution

Redact: anonymizing sensitive identifiers in code and documents before sharing prevents single-careless-moment OPSEC failures.

Compliance Mapping

GDPR Article 32 security measures, EU Whistleblower Directive source protection
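A pre-share scan for the committed-API-key failure mode can be sketched as pattern matching; the two patterns below (an AWS-style access key ID and a generic `api_key = "..."` assignment) are examples, not an exhaustive ruleset:

```python
import re

# Illustrative pre-share secret scan. Patterns are examples only; real
# scanners (and real secret formats) are far more extensive.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                            # AWS key ID
    re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][^'\"]+['\"]"),  # assignments
]

def find_secrets(text: str) -> list:
    return [m.group(0) for p in SECRET_PATTERNS for m in p.finditer(text)]

snippet = 'api_key = "sk-test-123"\nprint("hello")'
print(find_secrets(snippet))
```

Running a scan like this before commit or document sharing converts the single-careless-moment failure into a caught-before-publication event.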

T7 · STRUCTURAL LIMIT

Jurisdiction Fragmentation

Legal and regulatory conflicts across borders that create protection gaps and compliance challenges

Definition: PII flows globally in milliseconds, while legal protection stops at national borders.

Download PDF
01

US federal law absence

The Problem

No comprehensive federal privacy law in the world's largest tech economy. Patchwork of HIPAA, FERPA, COPPA, and 50 state laws. Data brokers operate in regulatory void.

Recommended Solution

Redact: anonymizing PII across all US regulatory categories using a single platform eliminates the patchwork compliance problem.

Compliance Mapping

HIPAA Privacy Rule, FERPA student records, COPPA, CCPA consumer rights

02

GDPR enforcement bottleneck

The Problem

Ireland's DPC handles most Big Tech complaints, with 3-5 year delays. noyb has filed 100+ complaints, many still unresolved, and the DPC has been repeatedly overruled by the EDPB.

Recommended Solution

Redact: anonymizing PII before it becomes subject to regulatory disputes eliminates the enforcement bottleneck — anonymized data is outside GDPR scope.

Compliance Mapping

GDPR Articles 56-60 cross-border cooperation, Article 83 administrative fines

03

Cross-border conflicts

The Problem

GDPR demands protection vs CLOUD Act demands access vs China's NSL demands localization. Creates impossible simultaneous compliance.

Recommended Solution

Encrypt: AES-256-GCM encryption enables organizational control with jurisdictional flexibility — encrypted data protected from unauthorized government access.

Compliance Mapping

GDPR Chapter V transfers, US CLOUD Act, China PIPL data localization

04

Global South law absence

The Problem

Only ~35 of 54 African countries have data protection laws. Variable enforcement. PII collected by telecoms, banks, government without constraint.

Recommended Solution

Redact: anonymizing data collected by telecoms, banks, and governments prevents misuse where data protection laws are absent.

Compliance Mapping

African Union Malabo Convention, national data protection laws where they exist

05

ePrivacy stalemate

The Problem

Pre-smartphone rules still govern smartphone communications. The proposed ePrivacy Regulation, introduced in 2017, has faced nine years of stalemate driven by industry lobbying; the 2002 Directive remains in effect.

Recommended Solution

Redact: anonymizing tracking data regardless of ePrivacy status provides protection not dependent on resolving a nine-year regulatory stalemate.

Compliance Mapping

ePrivacy Directive 2002/58/EC, proposed ePrivacy Regulation, GDPR Article 95

06

Data localization dilemma

The Problem

African/MENA/Asian PII stored in US/EU data centers. Subject to CLOUD Act. But local storage in weak-rule-of-law countries may reduce protection.

Recommended Solution

Redact: anonymizing data at collection eliminates the localization dilemma — anonymized data does not require localization.

Compliance Mapping

GDPR Article 44 transfer restrictions, national data localization requirements

07

Whistleblower jurisdiction shopping

The Problem

Five Eyes intelligence sharing bypasses per-country protections. Source in Country A, org in Country B, server in Country C — three legal regimes, weakest wins.

Recommended Solution

Redact: anonymizing source-identifying information before documents cross jurisdictions prevents weakest-link exploitation.

Compliance Mapping

EU Whistleblower Directive, press freedom laws, Five Eyes agreements

08

DP regulatory uncertainty

The Problem

No regulator has formally endorsed differential privacy as satisfying anonymization requirements. Organizations invest in DP with uncertain legal status.

Recommended Solution

Redact: anonymizing PII using established methods provides legal certainty that DP currently lacks — regulators endorse anonymization but not DP.

Compliance Mapping

GDPR Recital 26 anonymization standard, Article 29 Working Party opinion

09

Surveillance tech export

The Problem

NSO Group (Israel) sells Pegasus found in 45+ countries — Saudi Arabia, Mexico, India, Hungary. Export controls weak, enforcement weaker, accountability zero.

Recommended Solution

Redact: anonymizing surveillance research documents prevents identification of targets and journalists investigating spyware proliferation.

Compliance Mapping

EU Dual-Use Regulation, Wassenaar Arrangement, human rights legislation

10

Government PII purchasing

The Problem

ICE, IRS, DIA buy location data from brokers. Purchasing what they cannot legally collect. Third-party doctrine loophole converts commercial data into government surveillance.

Recommended Solution

Redact: anonymizing location data before it reaches commercial datasets closes the third-party doctrine loophole — agencies cannot buy what is anonymized.

Compliance Mapping

Fourth Amendment, GDPR Article 6, proposed Fourth Amendment Is Not For Sale Act

Download All Case Studies

Access all 40 case studies organized into 4 comprehensive PDF documents. Each PDF contains detailed analysis of 10 privacy challenges with real-world examples.

About the Privacy Transistors Framework

The Privacy Transistors framework categorizes privacy challenges into distinct types based on their underlying mechanisms and potential solutions:

  • SOLID transistors (T1, T6) represent technical challenges that can be addressed through better engineering, tools, and education.
  • STRUCTURAL LIMIT transistors (T3, T7) represent systemic issues rooted in power imbalances and regulatory gaps that require policy interventions.

This research helps organizations understand where PII anonymization tools like anonym.legal can provide protection (SOLID challenges) versus where broader systemic changes are needed (STRUCTURAL LIMITS).

Frequently Asked Questions

What is the Privacy Transistors framework?

The Privacy Transistors framework categorizes privacy challenges into distinct types based on their underlying mechanisms. SOLID transistors (T1, T6) are technical challenges addressable through engineering and tools. STRUCTURAL LIMIT transistors (T3, T7) are systemic issues requiring policy interventions.

What are the 4 categories of privacy case studies?

The 40 case studies are organized into 4 categories: T1 Linkability (re-identification and tracking mechanisms), T3 Power Asymmetry (consent and control imbalances), T6 Knowledge Asymmetry (information gaps leading to implementation failures), and T7 Jurisdiction Fragmentation (cross-border legal conflicts).

How can anonym.legal help with SOLID privacy challenges?

anonym.legal addresses SOLID challenges (T1 Linkability, T6 Knowledge Asymmetry) through PII detection and anonymization. By detecting and removing identifiers like browser fingerprints, quasi-identifiers, and metadata, organizations can prevent re-identification risks covered in these case studies.

What is the difference between SOLID and STRUCTURAL LIMIT transistors?

SOLID transistors represent technical challenges that can be solved with better tools, engineering practices, and education. STRUCTURAL LIMIT transistors represent systemic problems rooted in power imbalances (dark patterns, surveillance capitalism) or regulatory gaps (GDPR enforcement delays, cross-border conflicts) that require policy changes.

Where can I download the full case study PDFs?

All 4 case study PDFs are available for free download at anonym.community. Each PDF contains 10 detailed case studies (~37 pages per document) covering real-world privacy challenges with analysis and examples.

Apply These Insights

Understanding privacy challenges is the first step. anonym.legal helps you address SOLID privacy risks with practical PII detection and anonymization tools.