HIPAA Security Rule Technical Safeguards for AI Systems: Access Controls, Audit Logs, Encryption, and BAAs
The HIPAA Security Rule's technical safeguards (45 CFR § 164.312) establish five standards for electronic PHI: access control, audit controls, integrity, person or entity authentication, and transmission security. AI/ML systems handling ePHI (clinical decision support, diagnostic AI, EHR analytics, LLM-based clinical tools) are subject to all of them. The audit controls standard (§ 164.312(b)) is required with no addressable flexibility; AI inference logs that record ePHI access are its primary technical implementation. Business Associate Agreements are required for every AI vendor that receives ePHI. The Security Rule update proposed in late 2024 would add technology asset inventory requirements covering AI systems, mandatory encryption (removing the addressable designation), and annual penetration testing of AI infrastructure.
What ePHI Means for AI Systems Under HIPAA
ePHI for AI includes: clinical notes and medical records used as model inputs; model inference outputs linked to specific patients (risk scores, diagnosis predictions, recommendations); training datasets containing identified patient data; vector embeddings and learned representations of patient records (often functionally re-identifiable); AI audit logs containing patient identifiers; and model evaluation datasets drawn from patient populations. Data de-identified under the HIPAA safe harbor method (all 18 identifiers removed, with no actual knowledge that the remainder could identify an individual) is exempt, but data that is re-identified through AI inference or a re-identification attack regains ePHI status. LLM prompts containing patient-identifying information (name, DOB, MRN, diagnosis) transmitted to external APIs are ePHI in transmission and require the same security controls as any other ePHI transmission.
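To make that last point concrete, a minimal outbound-prompt guard might look like the sketch below. The regex patterns, the `vendor_has_baa` flag, and the function names are all hypothetical, and pattern matching of this kind is nowhere near a complete safe-harbor de-identification; the point is only to show where the control sits in the request path.

```python
import re

# Illustrative patterns for a few safe-harbor identifiers (hypothetical;
# not a substitute for a vetted de-identification pipeline).
PHI_PATTERNS = {
    "MRN": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def guard_outbound_prompt(prompt: str, vendor_has_baa: bool) -> str:
    """Gate a prompt before it leaves the covered entity's boundary."""
    if vendor_has_baa:
        return prompt  # ePHI may flow under the BAA; still log the transmission
    for label, pattern in PHI_PATTERNS.items():
        if pattern.search(prompt):
            # No BAA on file: refuse rather than transmit suspected ePHI.
            raise ValueError(f"possible {label} detected in prompt for non-BAA vendor")
    return prompt
```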
Access Controls (§ 164.312(a)) for AI Service Accounts and Model APIs
Access controls require technical policies and procedures that allow only authorized persons or software programs to access ePHI. For AI: unique user identification (§ 164.312(a)(2)(i)) means AI model service accounts need unique IAM identities; shared service account credentials are non-compliant. Emergency access procedures must be documented for AI-assisted clinical workflows. Automatic logoff is addressable, so AI API sessions and agent sessions holding ePHI in memory should implement session timeouts, or the entity must document an equivalent alternative. Encryption at rest is likewise addressable, but in practice ePHI in model registries, training datasets, and inference stores should be encrypted (AES-256 is the common baseline); see the sketch below. The 2024 proposed rule would make encryption required for data both at rest and in transit.
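As a sketch of the encryption-at-rest control, the snippet below encrypts a training shard with AES-256-GCM using the `cryptography` package. Key management (a KMS, rotation, access policy) is assumed to exist elsewhere; the function names and the choice to store the nonce alongside the ciphertext are illustrative, not a mandated scheme.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_shard(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                      # unique nonce per encryption
    ct = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ct                           # store nonce with the ciphertext

def decrypt_shard(blob: bytes, key: bytes) -> bytes:
    nonce, ct = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ct, None)  # raises if tampered

key = AESGCM.generate_key(bit_length=256)       # in practice: fetched from a KMS
blob = encrypt_shard(b"training shard bytes ...", key)
assert decrypt_shard(blob, key) == b"training shard bytes ..."
```

GCM is a reasonable default here because it authenticates as well as encrypts: a modified ciphertext fails to decrypt rather than silently yielding corrupted ePHI.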
Audit Controls (§ 164.312(b)) — A Required Standard for AI Inference Logs
Audit controls is a required standard with no addressable option: covered entities must implement hardware, software, and procedural mechanisms to record and examine activity in systems containing ePHI. For AI: every model inference call that reads ePHI must be logged (which patient, which model version, what output, what time). AI systems that write outputs to patient records must log those write operations. Logs must be queryable: compliance officers must be able to pull all AI accesses for a specific patient or model version. Logs must be tamper-evident: append-only storage with cryptographic verification prevents after-the-fact modification. Retention: six years minimum, matching the Security Rule's documentation retention period (§ 164.316(b)(2)). OCR's $5.5M Memorial Healthcare System settlement (2017) turned on the failure to review records of information system activity, a lesson that applies directly to modern AI clinical systems.
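One way to make inference logs append-only and tamper-evident is a hash chain: each record carries the SHA-256 hash of its predecessor, so any after-the-fact edit breaks the chain. The sketch below is a minimal illustration with hypothetical field names; production systems would more likely use WORM object storage or a managed ledger, but the queryability and verification properties are the same.

```python
import hashlib
import json
import time

class InferenceAuditLog:
    def __init__(self):
        self._records = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, patient_id: str, model_version: str, action: str, output_ref: str):
        entry = {
            "ts": time.time(),
            "patient_id": patient_id,        # whose ePHI was touched
            "model_version": model_version,  # which model touched it
            "action": action,                # e.g. "read", "write-to-ehr"
            "output_ref": output_ref,        # a pointer, not the output itself
            "prev_hash": self._last_hash,
        }
        serialized = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(serialized).hexdigest()
        self._last_hash = entry["hash"]
        self._records.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; any edited or dropped record breaks it."""
        prev = "0" * 64
        for e in self._records:
            body = {k: v for k, v in e.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev_hash"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True

    def for_patient(self, patient_id: str):
        # The compliance query from above: every AI access for one patient.
        return [e for e in self._records if e["patient_id"] == patient_id]
```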
Transmission Security (§ 164.312(e)) — AI APIs and LLM Vendor BAAs
Transmission security requires technical security measures for ePHI transmitted over electronic networks. For AI: EHR-to-model data feeds require TLS 1.2+ with mutual TLS authentication; inference result pipelines to the EHR require TLS 1.2+ plus integrity verification (HMAC or digital signature on the inference payload); cloud AI training data transfers require encrypted transfer (private network link or encrypted SFTP) with access logging. The LLM API problem: if your clinical AI sends patient data to an external LLM API (OpenAI, Anthropic, Google), that LLM provider must be a business associate with a BAA in place before receiving any ePHI. Sending patient notes, diagnoses, or other HIPAA identifiers to an LLM API without a BAA is a transmission security violation — and potentially a Privacy Rule violation.
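To make the integrity-verification step concrete, the sketch below signs an inference payload with HMAC-SHA256 and verifies it on the EHR side with a constant-time comparison. The shared key, payload shape, and field names are assumptions; in practice the key would come from a KMS and the transport would already be TLS 1.2+.

```python
import hashlib
import hmac
import json

def sign_payload(payload: dict, key: bytes) -> str:
    # Canonical serialization so signer and verifier hash identical bytes.
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify_payload(payload: dict, signature: str, key: bytes) -> bool:
    # compare_digest avoids leaking signature bytes through timing.
    return hmac.compare_digest(sign_payload(payload, key), signature)

key = b"provisioned-out-of-band"  # placeholder; fetch from a KMS in practice
result = {"patient_id": "MRN-0001", "model_version": "risk-v3", "risk_score": 0.82}
sig = sign_payload(result, key)
assert verify_payload(result, sig, key)
```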
Business Associate Agreements for AI Vendors and 2024 Security Rule Updates
Any AI vendor that creates, receives, maintains, or transmits ePHI on behalf of a covered entity is a business associate and must sign a BAA. BAAs with AI vendors must address: use limitation (the vendor cannot use ePHI to train models for other customers), safeguard obligations (the vendor must comply with the HIPAA Security Rule), subcontractor flow-down (sub-processors must also sign BAAs), breach notification (without unreasonable delay, and no later than 60 days after discovery), and return or destruction of ePHI on termination (including training data). The proposed 2024 Security Rule update would add: a mandatory technology asset inventory (including all AI systems that access ePHI), annual penetration testing of AI infrastructure, written documentation requirements for all Security Rule implementations, and encryption as a required specification. AI teams should treat these as the emerging compliance baseline.
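Anticipating the proposed asset-inventory requirement, a covered entity might track each AI system in a structured record like the sketch below. The schema is entirely illustrative (no regulator-mandated format exists yet); the gap check at the end shows the kind of query the inventory should support.

```python
from dataclasses import dataclass

@dataclass
class AIAssetRecord:
    # Illustrative inventory schema, not a mandated format.
    system_name: str
    model_versions: list[str]
    ephi_access: bool            # creates/receives/maintains/transmits ePHI?
    vendor: str                  # "internal" or the vendor's name
    baa_in_place: bool
    encryption_at_rest: bool
    encryption_in_transit: bool
    last_pen_test: str           # ISO date of the most recent test, or ""

inventory = [
    AIAssetRecord("sepsis-risk-scorer", ["risk-v3"], True,
                  "internal", False, True, True, "2025-03-01"),
    AIAssetRecord("discharge-summary-llm", ["llm-v1"], True,
                  "ExampleLLMVendor", False, True, True, ""),
]

# Gap check: ePHI-touching vendor systems without a BAA on file.
gaps = [a.system_name for a in inventory
        if a.ephi_access and a.vendor != "internal" and not a.baa_in_place]
print(gaps)  # ['discharge-summary-llm']
```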