Brazil LGPD AI Compliance: Automated Decision Rights Under Article 20
Brazil's Lei Geral de Proteção de Dados (LGPD, Lei 13.709/2018) is the comprehensive privacy framework governing all organizations processing personal data of individuals in Brazil. Article 20 grants every data subject the right to request review of decisions made solely through automated processing — including AI credit scoring, hiring algorithms, and insurance underwriting. The ANPD (Autoridade Nacional de Proteção de Dados) began enforcement in August 2021. Fines reach up to 2% of the entity's gross revenue in Brazil for the prior fiscal year, capped at R$50 million per infraction. AI teams serving Brazilian users must: (1) maintain an Article 20 review process; (2) document automated decision criteria and disclose them on request; (3) conduct DPIAs for high-risk AI processing; (4) appoint a DPO (Encarregado); and (5) notify ANPD of material data breaches. Unlike GDPR Article 22's prohibition-with-exemptions model, LGPD Article 20 creates an unconditional review right: no consent or legal-basis exemption removes it.
Article 20: The Core Right to Human Review of AI Decisions
Article 20 of LGPD (Lei 13.709/2018) grants every data subject the right to request a review of decisions made solely on the basis of automated processing of personal data that affect their interests — including decisions defining their personal, professional, consumer, or credit profile, or aspects of their personality. Key elements: (a) Trigger: any automated decision with legal effects or that significantly affects the data subject — credit scoring, hiring AI, insurance underwriting, fraud-detection blocks, clinical AI; (b) Review right: the data subject can request that the decision be reviewed, with the reviewer assessing the criteria used and confirming or overturning the output; (c) Criteria disclosure: the controller must disclose the criteria and procedures used in the automated processing, the categories of data used as inputs, and whether sensitive data was involved; (d) No exemptions: unlike GDPR Article 22 (which permits automated decisions when necessary for a contract, legally authorized, or made with explicit consent), LGPD Article 20 does not provide exemptions from the review right. One nuance: Lei 13.853/2019 removed the original text's requirement that the review be performed by a natural person; even so, controllers commonly satisfy the right through human review, and ANPD guidance indicates the review must be substantive: rubber-stamping an automated output does not satisfy Article 20.
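The review workflow above can be sketched as a small data model. This is an illustrative sketch only: the class names, fields, and the "confirmed"/"overturned" vocabulary are assumptions of this example, not terms from the statute or ANPD guidance.

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """One automated decision, logged so it can be reviewed later."""
    decision_id: str
    subject_id: str
    outcome: str                 # e.g. "credit_denied"
    criteria: list               # decision criteria, disclosed on request
    input_categories: list       # categories of personal data used
    used_sensitive_data: bool

@dataclass
class ReviewRequest:
    """An Article 20 review request routed to a reviewer."""
    decision: AutomatedDecision
    reviewer_id: str = ""
    assessment: str = ""         # substantive notes, not a rubber stamp
    result: str = ""             # "confirmed" or "overturned"

    def complete(self, reviewer_id, assessment, result):
        # A review without a substantive written assessment is rejected,
        # mirroring the "no rubber-stamping" expectation described above.
        if not assessment.strip():
            raise ValueError("review must include a substantive assessment")
        if result not in ("confirmed", "overturned"):
            raise ValueError("result must be 'confirmed' or 'overturned'")
        self.reviewer_id = reviewer_id
        self.assessment = assessment
        self.result = result

    def disclosure(self):
        """Criteria-disclosure package for the data subject."""
        d = self.decision
        return {"criteria": d.criteria,
                "input_categories": d.input_categories,
                "used_sensitive_data": d.used_sensitive_data}
```

Keeping the disclosure fields on the decision record itself, rather than reconstructing them later, is what makes the criteria-disclosure obligation in (c) answerable on request.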
ANPD Enforcement: Penalties, Sanctions, and Audit Risk
The Autoridade Nacional de Proteção de Dados (ANPD, created by Lei 13.853/2019) is Brazil's national data protection authority. Enforcement timeline: ANPD formed August 2020; enforcement began August 2021; first formal sanctions issued 2023; AI-specific enforcement prioritized from 2024. LGPD penalty structure: fines up to 2% of the legal entity's gross revenue in Brazil in the prior fiscal year, with a maximum of R$50 million per infraction; additional sanctions include: public warning (published online, creating reputational risk), blocking of data processing operations (which can shut down AI systems), deletion orders for unlawfully processed data, and partial or total suspension of activities. Per-infraction model: where GDPR caps linked infringements in a single processing operation at the amount set for the gravest one, LGPD penalties apply per infraction — an automated system generating unlawful decisions could accumulate multiple simultaneous infractions. ANPD enforcement priorities: the 2024-2026 Strategic Plan identifies automated decision systems, consent management, and child data protection as top priorities. AI compliance is not a future concern in Brazil — it is an active enforcement area.
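The fine ceiling described above reduces to simple arithmetic. A sketch (the revenue figures used below are hypothetical inputs, not data from any enforcement action):

```python
CAP_BRL = 50_000_000   # statutory cap: R$50 million per infraction
RATE = 0.02            # 2% of prior-year gross revenue in Brazil

def max_fine_per_infraction(brazil_revenue_brl):
    """Upper bound of a single LGPD fine: 2% of revenue or the cap."""
    return min(RATE * brazil_revenue_brl, CAP_BRL)

def worst_case_exposure(brazil_revenue_brl, infraction_count):
    """Each infraction carries its own cap, so exposure stacks."""
    return infraction_count * max_fine_per_infraction(brazil_revenue_brl)
```

For a hypothetical R$1 billion in Brazilian revenue the 2% rate binds (R$20 million per infraction); above R$2.5 billion the R$50 million cap binds instead, but the per-infraction stacking means total exposure still grows with each additional infraction.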
LGPD Legal Bases for AI Processing: The 10 Grounds
Unlike GDPR's 6 legal bases, LGPD Article 7 provides 10 legal bases for processing personal data, giving organizations more flexibility. Most relevant for AI: (I) Consent — freely given, informed, specific, and unambiguous; must specify the AI-processing purpose; can be withdrawn at any time; (IX) Legitimate interest — the controller's or a third party's legitimate interests, provided the data subject's fundamental rights are not overridden; ANPD guidance requires a documented proportionality assessment for AI processing based on legitimate interest; (V) Execution of a contract — applicable to AI systems directly used to fulfill contractual obligations (e.g., fraud detection in banking transactions); (X) Credit protection — a basis with no GDPR equivalent, directly relevant to credit-scoring AI. Important: the legal basis for collecting data binds downstream AI processing. Repurposing data collected for one purpose to train AI for a different purpose requires a fresh legal-basis assessment. Sensitive personal data (health, racial/ethnic, biometric, genetic, religious, political, or sex-life data) requires explicit consent or one of the specific statutory hypotheses in Article 11, II (e.g., legal obligation, health protection, fraud prevention).
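The purpose-binding rule above can be enforced mechanically by recording, at collection time, which purpose and basis each dataset was gathered under. The Article 7 item numbers are from the statute; the registry structure, dataset names, and function below are hypothetical illustrations:

```python
# Selected LGPD Art. 7 legal bases (item numbers from the statute).
LEGAL_BASES = {
    "I": "consent",
    "V": "contract_execution",
    "IX": "legitimate_interest",
    "X": "credit_protection",
}

# Hypothetical registry: each dataset records the purpose and basis
# it was originally collected under.
collection_register = {
    "tx_history": {"purpose": "fraud_detection", "basis": "V"},
}

def can_reuse_without_reassessment(dataset, new_purpose):
    """Reuse is safe only for the original collection purpose; any
    other purpose (e.g. training a new model) needs a fresh
    legal-basis assessment before processing starts."""
    return collection_register[dataset]["purpose"] == new_purpose
```

A gate like this at the start of a training pipeline turns the repurposing rule from a policy document into a hard stop.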
Data Protection Impact Assessment (DPIA/RIPD) for High-Risk AI
LGPD Article 38 requires controllers to produce a Data Protection Impact Assessment (DPIA — called Relatório de Impacto à Proteção de Dados Pessoais or RIPD in Portuguese) for high-risk personal data processing when requested by ANPD. ANPD guidance triggers DPIA for: large-scale processing of sensitive data; systematic monitoring of individuals in public areas; AI systems using personal data for scoring, profiling, or behavioral prediction at scale; processing involving vulnerable populations (children, elderly, economically disadvantaged); and cross-border transfers of personal data for AI training. DPIA content requirements: description of processing operations and purposes; assessment of necessity and proportionality; identification of risks to data subjects' rights and freedoms; measures to mitigate those risks; and mechanisms for data subjects to exercise their rights including Article 20 review. ANPD can make DPIAs public — they serve as accountability evidence but also create external scrutiny of AI system design decisions.
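The trigger list above lends itself to a screening checklist run before any new AI processing activity launches. A minimal sketch (the flag names and activity dict are illustrative, mirroring the ANPD trigger categories summarized above):

```python
# One flag per high-risk trigger category described in the text.
RIPD_TRIGGERS = (
    "large_scale_sensitive_data",
    "public_area_monitoring",
    "scoring_or_profiling_at_scale",
    "vulnerable_population",
    "cross_border_training_transfer",
)

def ripd_required(activity):
    """True if any high-risk trigger applies to the processing
    activity, meaning a RIPD should be prepared before launch."""
    return any(activity.get(flag, False) for flag in RIPD_TRIGGERS)
```

Running such a check in a launch review catches the obligation early; the RIPD itself still needs the content items listed above (purposes, necessity, risks, mitigations, rights mechanisms).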
LGPD vs GDPR: Key Differences for Global AI Compliance Teams
Organizations deploying AI globally commonly serve both Brazilian and EU/UK users, creating dual compliance requirements. Critical differences: Article 20 vs Article 22 — LGPD Art. 20 creates an unconditional review right with no exemptions; GDPR Art. 22 creates a restriction on automated decisions with three exemptions (contract, legal authorization, explicit consent) and a right to human intervention within those exemptions — structurally opposite. Legal bases — LGPD has 10 bases vs GDPR's 6; LGPD legitimate interest requires a proportionality analysis similar to GDPR's but codified differently, and LGPD adds credit protection as a basis with no GDPR equivalent. Sensitive data — LGPD Article 11 requires explicit consent or one of its specific statutory hypotheses; GDPR Article 9 requires explicit consent or one of nine other specific exceptions. Penalty structure — LGPD applies its R$50M cap per infraction (multiple caps can stack); GDPR caps linked infringements at the amount for the gravest, up to €20M or 4% of global turnover, whichever is higher. Data subjects' access rights — both require meaningful access; LGPD specifically includes AI-derived records in access scope. For global compliance: a decision ledger that captures complete AI decision traces with input attribution satisfies both LGPD Article 20 (criteria disclosure) and GDPR Article 22 (meaningful information about the logic) simultaneously.
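The dual-regime decision ledger described above might look like the following. All field and method names are illustrative assumptions of this sketch, not regulatory terms, and ranking attribution scores is just one way to summarize "meaningful information about the logic":

```python
from dataclasses import dataclass

@dataclass
class DecisionTrace:
    """One ledger entry with enough detail to answer both an LGPD
    Art. 20 criteria-disclosure request and a GDPR Art. 22 request
    for meaningful information about the logic involved."""
    decision_id: str
    model_version: str
    outcome: str
    criteria: list               # decision criteria (LGPD disclosure)
    input_attribution: dict      # feature -> contribution to the outcome
    timestamp_utc: str

    def lgpd_disclosure(self):
        # Criteria plus the categories of data used as inputs.
        return {"criteria": self.criteria,
                "input_categories": sorted(self.input_attribution)}

    def gdpr_logic_summary(self):
        # Highest-magnitude contributions stand in for a summary of
        # the decision logic.
        ranked = sorted(self.input_attribution.items(),
                        key=lambda kv: abs(kv[1]), reverse=True)
        return {"model_version": self.model_version,
                "top_factors": ranked[:3]}
```

Because both disclosure views derive from the same stored trace, the ledger is written once per decision and answers either regime's request without re-running the model.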