When Personalization Breaks Privacy: Lessons from the TikTok Decision
In September 2025, Canada’s privacy regulators issued a major decision about TikTok’s data practices, reshaping what personalization means under Canadian privacy law. The Office of the Privacy Commissioner of Canada (OPC), together with Quebec, Alberta, and British Columbia’s privacy authorities, found that TikTok’s personalization systems violated core principles of consent, transparency, and appropriate purpose under PIPEDA and related provincial laws.
While this investigation focused on TikTok, the implications extend far beyond social media. If your organization uses AI, algorithms, or data analytics to personalize content, this decision should be on your radar.
What TikTok Did Wrong: A Breakdown
TikTok’s business model relies on personalized content recommendations, delivered through the “For You” feed and powered by complex algorithms. These systems collect and analyze enormous amounts of behavioural, biometric, and inferred data.
The OPC found that TikTok’s personalization practices were opaque and intrusive, particularly for children and teens.
Below is a summary of the TikTok practices the regulators reviewed:
| TikTok Practice | What Happened | Why the OPC Objected |
|---|---|---|
| Profiling users with inferred and biometric data | TikTok analyzed facial movements, voice tone, and engagement patterns to predict traits like age, interests, and gender. | These are biometric and inferred data, considered highly sensitive under PIPEDA and requiring explicit consent. |
| Vague privacy disclosures | Personalization, ad targeting, and analytics were bundled into a single consent statement. | Users couldn’t distinguish what they were agreeing to, making consent invalid. |
| Using implied consent | TikTok assumed users agreed to personalization by using the platform. | The OPC ruled that this exceeded users’ reasonable expectations; express consent was required. |
| Profiling minors | Many underage users had accounts and were shown personalized content. | Collecting or using children’s data for profiling was deemed an inappropriate purpose. |
| Insufficient transparency | Explanations of personalization were buried in legal text. | The OPC expects clear, up-front disclosures in plain language. |
What Regulators Expect Going Forward
The OPC made clear that personalization is a high-impact data use, and businesses must handle it accordingly.
| Expectation | What Businesses Should Do |
|---|---|
| Explicit, informed consent | Ask users to actively opt in before personalizing content based on profiling, inference, or biometrics. |
| Front-loaded transparency | Explain how personalization works during signup or onboarding, not buried in a policy. |
| No profiling of minors | Disable personalization and ad targeting for users under 18. |
| Privacy-first defaults | Default personalization settings to “off” until users opt in (see the consent-preferences sketch after this table). |
| Granular consent | Let users choose between types of personalization (e.g., “recommended content” vs. “targeted ads”). |
| Plain-language explanations | Avoid jargon; use examples and visuals to show how personalization affects experience. |
| PIAs and algorithm audits | Regularly assess personalization tools for fairness, accuracy, and privacy impact. |
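To make the “privacy-first defaults” and “granular consent” rows concrete, here is a minimal TypeScript sketch of a consent-preferences model. All identifiers in it (ConsentPreferences, canPersonalizeAds, and so on) are hypothetical, not taken from the decision:

```typescript
// Hypothetical consent-preference model: each personalization use
// is consented to separately (granular consent), and every flag
// starts as false (privacy-first defaults).
interface ConsentPreferences {
  recommendedContent: boolean; // e.g., a "For You"-style feed
  targetedAds: boolean;        // ad personalization
  inferredInterests: boolean;  // profiling based on inferences
}

// Privacy-first default: nothing is personalized until the user
// actively opts in to each specific use.
const defaultConsentPreferences: ConsentPreferences = {
  recommendedContent: false,
  targetedAds: false,
  inferredInterests: false,
};

// Personalization code checks the specific flag it needs,
// never a single blanket "user consented" field.
function canPersonalizeAds(prefs: ConsentPreferences): boolean {
  return prefs.targetedAds;
}
```

The design point is that each field maps to one user-visible choice, so consent stays meaningful and revocable per feature rather than bundled into a single statement.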
Why This Decision Matters for Canadian Businesses
1. Personalization Is No Longer “Business as Usual”
The OPC’s TikTok findings make it clear: personalization is a regulated activity, not a routine marketing tool. Whether you’re in retail, tech, or professional services, any time your system tailors content or recommendations, it triggers consent and transparency obligations.
2. Children’s Data Requires Extra Protection
Even if your company doesn’t target minors, regulators expect you to design systems that identify and protect young users. The standard is now “data minimization + no profiling for youth.”
3. Transparency Must Be Practical
Canadian regulators are pushing for “just-in-time transparency”: short, contextual notices that appear as people make decisions. For example:
“We’ll use your activity to recommend content. You can change this anytime in settings.”
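As a rough illustration, a just-in-time notice can be modelled as a small object surfaced at the decision point rather than buried in a policy. The names below (JustInTimeNotice, personalizationNotice) are invented for this sketch:

```typescript
// Hypothetical just-in-time notice: shown at the moment
// personalization would begin. The caller renders `message` in the
// UI and records the user's choice before proceeding.
interface JustInTimeNotice {
  message: string;
  settingsPath: string; // where the user can revisit the choice later
}

function personalizationNotice(): JustInTimeNotice {
  return {
    message:
      "We'll use your activity to recommend content. " +
      "You can change this anytime in settings.",
    settingsPath: "/settings/personalization",
  };
}
```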
4. “Reasonable Expectation” Sets the New Consent Bar
If your personalization involves inferences, biometrics, or sensitive traits, you can no longer rely on implied consent. The OPC considers these beyond what a “reasonable person” would expect.
5. National Consistency Is Coming
The TikTok investigation was a joint effort by four regulators, signalling more coordinated enforcement across Canada. Businesses can expect consistent requirements from province to province, regardless of where they operate.
How to Build Compliant and Trusted Personalization
Here are a few proactive measures to stay compliant and earn consumer trust:
Step 1: Map Your Data
Identify what data feeds your personalization engine (e.g., behaviour, demographics, location, biometrics, or inferences) and link each to a clear purpose.
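One way to capture this mapping is a simple inventory that ties each data source to a purpose and a sensitivity level; anything sensitive then feeds Step 2’s assessment. A sketch with hypothetical names:

```typescript
// Hypothetical data inventory: every input to the personalization
// engine is linked to a stated purpose and a sensitivity level.
type Sensitivity = "standard" | "sensitive" | "biometric";

interface DataSourceEntry {
  source: string;   // where the data comes from
  purpose: string;  // why personalization needs it
  sensitivity: Sensitivity;
}

const personalizationInventory: DataSourceEntry[] = [
  { source: "watch history", purpose: "recommend similar content", sensitivity: "standard" },
  { source: "inferred interests", purpose: "rank recommendations", sensitivity: "sensitive" },
];

// Flag anything beyond "standard" for the PIA in Step 2.
const needsPiaReview = personalizationInventory.filter(
  (entry) => entry.sensitivity !== "standard"
);
```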
Step 2: Conduct a Privacy Impact Assessment (PIA)
Assess whether personalization is necessary, proportional, and appropriate. Document risks, mitigations, and lawful bases.
Step 3: Design for Opt-In Consent
Create an active choice moment, for example: “Personalize my experience (uses your activity and preferences).”
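In code, an active choice moment means consent is captured as an explicit event, with the exact prompt text and a timestamp, never assumed from use of the service. A minimal sketch with invented names:

```typescript
// Hypothetical consent event: created only when the user actively
// opts in; the absence of a record means no personalization.
interface ConsentEvent {
  userId: string;
  feature: "personalization";
  granted: boolean;
  promptText: string; // exactly what the user was shown
  timestamp: string;  // ISO 8601, for accountability records
}

function recordOptIn(userId: string, granted: boolean): ConsentEvent {
  return {
    userId,
    feature: "personalization",
    granted,
    promptText:
      "Personalize my experience (uses your activity and preferences).",
    timestamp: new Date().toISOString(),
  };
}
```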
Step 4: Explain Inferences Clearly
Tell users what’s being inferred (e.g., “We may predict your interests to show relevant content”) and how it affects their experience.
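One way to keep those explanations honest is to store the user-facing wording next to the inference logic itself, so the disclosure cannot drift from what the system actually infers. A hypothetical sketch:

```typescript
// Hypothetical map from inference type to the plain-language
// explanation shown to users, maintained alongside the inference code.
const inferenceDisclosures: Record<string, string> = {
  interests: "We may predict your interests to show relevant content.",
  ageRange: "We may estimate your age range to apply age-appropriate settings.",
};

function disclosureFor(inference: string): string | undefined {
  return inferenceDisclosures[inference];
}
```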
Step 5: Limit or Disable Personalization for Minors
Use technical measures to prevent profiling for under-18 users and block under-13 accounts entirely.
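A minimal sketch of such a gate, with invented names; a real system would also need age-assurance signals beyond a self-declared birthdate:

```typescript
// Hypothetical age gate: under-13 accounts are blocked outright;
// 13-17 accounts are allowed but with profiling and ad targeting off.
type AccessDecision =
  | { allowed: false; reason: "under-13" }
  | { allowed: true; profilingEnabled: boolean };

function gateByAge(age: number): AccessDecision {
  if (age < 13) {
    return { allowed: false, reason: "under-13" };
  }
  // 13-17: account permitted, but no profiling or ad targeting.
  return { allowed: true, profilingEnabled: age >= 18 };
}
```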
Step 6: Monitor and Audit Algorithms
Run regular checks for bias, over-collection, or creep in personalization data sources.
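Creep in data sources can be caught mechanically by diffing what the engine actually reads against the inventory from Step 1. A sketch, again with hypothetical names:

```typescript
// Hypothetical creep check: any data source the personalization
// engine reads that is missing from the documented inventory is
// flagged for review.
function findUndocumentedSources(
  observedSources: string[],
  documentedSources: string[],
): string[] {
  const documented = new Set(documentedSources);
  return observedSources.filter((source) => !documented.has(source));
}

// Example: "microphone audio" is flagged because it was never
// documented as a personalization input.
const flagged = findUndocumentedSources(
  ["watch history", "microphone audio"],
  ["watch history", "inferred interests"],
);
```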
Step 7: Keep Records
Document your consent flows, privacy communications, and audit findings. The OPC expects evidence of accountability.
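That evidence is easiest to produce later if records are append-only and tied to the notice version in force at the time. A sketch with hypothetical names:

```typescript
// Hypothetical accountability record: links each event to the exact
// privacy-notice version the user saw, so you can later show what
// was disclosed and when.
interface AccountabilityRecord {
  event: "consent" | "notice-shown" | "audit-finding";
  userId?: string;       // omitted for audits not tied to one user
  noticeVersion: string; // which privacy notice was in force
  detail: string;
  recordedAt: string;    // ISO 8601
}

const auditTrail: AccountabilityRecord[] = [];

function appendRecord(record: AccountabilityRecord): void {
  auditTrail.push(record); // in practice: durable, append-only storage
}
```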
The Bigger Trend: Personalization Meets Accountability
TikTok’s case is part of a broader shift. Regulators in Canada, the EU, and beyond are scrutinizing algorithmic personalization, AI-driven recommendations, and automated decision-making.
Businesses that proactively align personalization with privacy-by-design will gain both regulatory resilience and consumer trust.
The message is simple: you can personalize, but you must do it responsibly.