Fair Housing Act and AI: What Rental Screening, Mortgage, and Valuation AI Must Document
The Fair Housing Act (42 U.S.C. § 3604) prohibits discriminatory housing decisions based on race, color, national origin, religion, sex, disability, and familial status, regardless of whether an AI system or a human makes the decision. HUD and DOJ apply both disparate treatment and disparate impact theories to algorithmic rental screening, mortgage underwriting, and automated valuation models. Because disparate impact claims proceed through a three-step burden-shifting test, any feature that produces statistical disparities across protected classes effectively needs business necessity documentation prepared before deployment.
FHA Protected Classes and AI Proxy Feature Risk
The FHA protects race, color, national origin, religion, sex, disability, and familial status. State and local laws typically add source of income, sexual orientation, gender identity, marital status, and age. AI creates FHA liability through two mechanisms: disparate treatment (intentional discrimination, including through proxy features that stand in for protected classes) and disparate impact (facially neutral practices with disproportionate adverse effects). Common proxy features in housing AI include ZIP code (race, via historical redlining), eviction history (eviction records disproportionately affect Black renters, a disparity amplified during the COVID era), income source type (Section 8 vouchers are a protected income source in many jurisdictions), certain credit score components, language and name patterns that encode national origin, and social media activity that can encode religion or disability.
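As a rough illustration of how proxy risk can be screened before deployment, the sketch below scores how well each candidate feature, taken alone, predicts membership in a protected class; a feature with a high score is flagged for the business necessity and less discriminatory alternative analysis discussed later in this section. Column names (zip_code, eviction_count, protected_group) and the 0.60 AUC threshold are illustrative assumptions, not regulatory standards.

```python
"""Proxy feature screening sketch. Column names and the AUC threshold are
illustrative assumptions; demographic data is used only for testing, never
as a model input."""
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder


def proxy_screen(df: pd.DataFrame, candidate_features: list,
                 protected_col: str, auc_threshold: float = 0.60) -> pd.DataFrame:
    """Score how well each candidate feature, alone, predicts a binary
    protected-group indicator. A high AUC suggests the feature may act as a
    proxy and needs a documented business-necessity justification."""
    y = df[protected_col]
    rows = []
    for feature in candidate_features:
        X = df[[feature]]
        if X[feature].dtype == object:
            # Categorical inputs such as ZIP code are one-hot encoded.
            model = make_pipeline(OneHotEncoder(handle_unknown="ignore"),
                                  LogisticRegression(max_iter=1000))
        else:
            model = LogisticRegression(max_iter=1000)
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
        rows.append({"feature": feature, "proxy_auc": round(auc, 3),
                     "flagged": auc >= auc_threshold})
    return pd.DataFrame(rows).sort_values("proxy_auc", ascending=False)


# Hypothetical usage: applicants.csv holds screening inputs plus a
# protected-group indicator collected or imputed for testing purposes only.
# report = proxy_screen(pd.read_csv("applicants.csv"),
#                       ["zip_code", "eviction_count", "income_source"],
#                       protected_col="protected_group")
```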
FHA Obligations by Housing AI Use Case
Rental screening AI must maintain per-applicant records with inputs and output, adverse action notices giving specific reasons for rejection, disparate impact analysis by protected class, and documentation of the disability accommodation process. Mortgage underwriting AI faces FHA and ECOA simultaneously, which means specific reason codes under Reg B, HMDA monitoring for geographic disparities, SR 11-7 model documentation, and CFPB fair lending exam readiness. Automated valuation models must comply with the 2024 Interagency AVM Rule, whose quality control standards include nondiscrimination testing under ECOA and FHA. Targeted advertising AI must document demographic reach analysis demonstrating that no protected class is excluded from housing ads; the Meta/Facebook FHA consent decree (2022) establishes the enforcement standard.
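A minimal sketch of the per-decision record keeping these use cases share appears below. The field names and the JSONL append-only log are assumptions; neither the FHA nor ECOA prescribes a schema, only that inputs, output, and specific adverse action reasons be captured as the decision is actually made.

```python
"""Per-decision record sketch. Field names and the JSONL log format are
assumptions, not a regulatory schema."""
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class ScreeningDecisionRecord:
    applicant_id: str
    model_version: str
    inputs: dict                       # exact feature values the model saw
    score: float                       # raw model output at decision time
    decision: str                      # e.g. "approve" / "deny" / "refer"
    adverse_action_reasons: list = field(default_factory=list)
    decided_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())


def log_decision(record: ScreeningDecisionRecord,
                 path: str = "decision_log.jsonl") -> None:
    """Append the record to an append-only log; re-scoring historical
    applications later does not substitute for capturing the decision as it
    was made."""
    with open(path, "a") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")


# Hypothetical usage:
# log_decision(ScreeningDecisionRecord(
#     applicant_id="A-1042", model_version="screen-v3.2",
#     inputs={"monthly_income": 4200, "eviction_count": 0},
#     score=0.71, decision="deny",
#     adverse_action_reasons=["rent-to-income ratio above threshold"]))
```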
FHA Disparate Impact: The Three-Step Burden-Shifting Test
Under HUD's 2013 Disparate Impact Rule (reinstated in 2023), the test proceeds in three steps. Step 1: the plaintiff establishes a prima facie case through statistical evidence of a disproportionate adverse effect on a protected class, such as approval rate disparities, pricing differentials, or selection rate analysis by race or national origin. Step 2: the defendant must demonstrate business necessity, meaning the specific practice causing the disparity is necessary to serve a legitimate business interest, not merely convenient. Predictive accuracy alone may not suffice; the specific feature or methodology at issue must be shown to be necessary. Step 3: the plaintiff can defeat the defense, even after business necessity is shown, by identifying a less discriminatory alternative with substantially equal predictive performance. Documentation of the less discriminatory alternative analysis must be created before deployment, not assembled in response to an investigation.
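The statistical showings in Steps 1 and 3 can be sketched in a few lines. The code below assumes hypothetical column names ("race", "approved", prediction columns for the deployed and alternative models) and uses a ratio-to-the-most-favored-group comparison purely for illustration; courts weigh statistical evidence case by case rather than applying a fixed threshold.

```python
"""Disparate impact sketch for Steps 1 and 3. Column names and the comparison
metrics are illustrative assumptions, not the legal standard."""
import pandas as pd


def selection_rate_disparity(df: pd.DataFrame, group_col: str,
                             approved_col: str) -> pd.DataFrame:
    """Step 1 evidence: approval rate per group and the ratio of each group's
    rate to the most favored group's rate."""
    out = df.groupby(group_col)[approved_col].mean().rename("approval_rate").to_frame()
    out["ratio_to_highest"] = out["approval_rate"] / out["approval_rate"].max()
    return out


def compare_alternative(df: pd.DataFrame, group_col: str, outcome_col: str,
                        current_pred: str, alt_pred: str) -> dict:
    """Step 3 evidence: does an alternative model reduce the disparity while
    keeping substantially equal predictive performance (here, accuracy)?"""
    def worst_ratio(pred_col: str) -> float:
        return selection_rate_disparity(df, group_col, pred_col)["ratio_to_highest"].min()

    return {
        "current": {"accuracy": (df[current_pred] == df[outcome_col]).mean(),
                    "worst_group_ratio": worst_ratio(current_pred)},
        "alternative": {"accuracy": (df[alt_pred] == df[outcome_col]).mean(),
                        "worst_group_ratio": worst_ratio(alt_pred)},
    }


# Hypothetical usage: decisions.csv has one row per applicant with a protected
# class label, the actual decision, an observed outcome, and 0/1 predictions
# from the deployed model and a candidate alternative.
# df = pd.read_csv("decisions.csv")
# print(selection_rate_disparity(df, "race", "approved"))
# print(compare_alternative(df, "race", "repaid", "model_approve", "alt_approve"))
```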
Documentation Requirements for Housing AI
Required documentation across the FHA, ECOA, and the EU AI Act: per-decision records with all inputs and the AI output at decision time (re-scoring historical applications does not substitute); adverse action notices with specific reasons tied to the applicant's actual inputs; disparate impact analysis by protected class before deployment and periodically thereafter; training data demographic composition and known limitations; proxy variable identification with business justification; less discriminatory alternative analysis for each feature causing a disparity; and ongoing monitoring records showing demographic performance over time. ECOA adds a 25-month retention requirement for mortgage decisions. The EU AI Act adds human oversight documentation for systems within EU jurisdiction.
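A minimal sketch of the ongoing monitoring record follows, assuming a per-decision table (for example, one built from the decision log sketched earlier), hypothetical column names, and a monthly reporting grain. The obligation is to retain evidence of demographic performance over time, not any particular layout.

```python
"""Ongoing demographic monitoring sketch. Column names, the monthly grain, and
the CSV history file are assumptions."""
import os
import pandas as pd


def monitoring_snapshot(decisions: pd.DataFrame, group_col: str,
                        approved_col: str, decided_at_col: str) -> pd.DataFrame:
    """Approval rate and decision volume per protected group per calendar month."""
    df = decisions.copy()
    df["period"] = pd.to_datetime(df[decided_at_col]).dt.to_period("M").astype(str)
    snap = (df.groupby(["period", group_col])[approved_col]
              .agg(approval_rate="mean", decisions="count")
              .reset_index())
    # Ratio of each group's approval rate to the most favored group that month.
    snap["ratio_to_highest"] = (snap.groupby("period")["approval_rate"]
                                    .transform(lambda s: s / s.max()))
    return snap


def append_to_history(snapshot: pd.DataFrame,
                      path: str = "monitoring_history.csv") -> None:
    """Append the latest snapshot to a cumulative history retained for exams;
    ECOA's 25-month retention is a floor, not a target, for mortgage AI."""
    snapshot.to_csv(path, mode="a", header=not os.path.exists(path), index=False)
```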
Recent FHA AI Enforcement: Meta/Facebook, AVM Rule, DOJ Investigations
HUD and DOJ have been active enforcers against algorithmic housing discrimination. Meta/Facebook consent decree (2022): a DOJ and HUD settlement over algorithmic ad targeting that excluded protected classes from housing ads; Meta was required to rebuild its housing ad system. HUD algorithmic advertising guidance (2023): confirmed that FHA disparate impact liability applies to AI housing ad targeting. Interagency AVM Rule (2024): the CFPB and five other federal regulators implemented Dodd-Frank §1473, requiring institutions that use AVMs in mortgage origination to document nondiscrimination testing under ECOA and FHA. DOJ pattern-or-practice investigations: AI-assisted rental screening platforms and mortgage underwriting AI systems are under active investigation for demographic disparities in approval rates and pricing.
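For the AVM nondiscrimination testing the rule contemplates, one possible sketch is to compare valuation error across demographic groups of neighborhoods. The column names and the census-tract grouping below are illustrative assumptions; the rule requires quality control standards, not this specific test.

```python
"""AVM nondiscrimination testing sketch. Column names and groupings are
illustrative assumptions."""
import pandas as pd


def avm_error_by_group(valuations: pd.DataFrame, group_col: str,
                       avm_col: str, sale_price_col: str) -> pd.DataFrame:
    """Compare AVM accuracy across demographic groups: median percentage error
    and the share of properties undervalued by more than 10%."""
    df = valuations.copy()
    df["pct_error"] = (df[avm_col] - df[sale_price_col]) / df[sale_price_col]
    return (df.groupby(group_col)["pct_error"]
              .agg(median_error="median",
                   undervalued_10pct=lambda s: (s < -0.10).mean(),
                   n="count")
              .reset_index())


# Hypothetical usage: sales.csv pairs AVM estimates with subsequent sale prices
# and a census-tract demographic label used only for testing.
# print(avm_error_by_group(pd.read_csv("sales.csv"),
#                          "tract_majority_group", "avm_value", "sale_price"))
```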