Immuta, the leading provider of enterprise data management solutions for artificial intelligence (AI), released its Health Insurance Portability and Accountability Act (HIPAA) Compliance Playbook for applying access policies to healthcare data. Based on its work with global healthcare clients, Immuta outlines how organizations and medical providers can govern data disclosure and ensure HIPAA compliance for AI and machine learning initiatives.
The HIPAA Privacy Rule sets forth two main strategies for legally using and disclosing protected data: Expert Determination and Safe Harbor de-identification. Immuta’s HIPAA Compliance Playbook illustrates how healthcare organizations can use its data management platform for AI to quickly and easily apply the data privacy and anonymization policies critical for de-identification and data sharing. The playbook works with sample medical data containing protected health information (PHI), including potentially identifying fields such as name, address, and the date the information was collected.
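To make the de-identification step concrete, the sketch below applies Safe Harbor-style transformations to a record with the sample fields named above: the name is replaced with a salted hash, the address is dropped, and the collection date is generalized to its year. This is a minimal illustration only; the field names, salt, and specific transformations are assumptions, not Immuta’s actual policies.

```python
import hashlib
from datetime import date

def deidentify(record: dict, salt: str = "example-salt") -> dict:
    """De-identify a sample PHI record: hash the name, drop the
    address, and generalize the collection date to its year."""
    out = dict(record)
    # Replace the direct identifier with a salted one-way hash
    # (consistent hashing keeps the field joinable across tables).
    out["name"] = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:16]
    # Remove the address entirely.
    out.pop("address", None)
    # Generalize the exact date to reduce re-identification risk.
    out["collected"] = record["collected"].year
    return out

sample = {"name": "Jane Doe", "address": "1 Main St", "collected": date(2019, 3, 14)}
print(deidentify(sample))
```

The salted hash is deterministic, so the same patient maps to the same token across data sets without exposing the underlying name.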
According to a recent Accenture survey, two-thirds of healthcare executives believe they are developing platforms and services that fall into “regulatory grey areas” in which privacy, security, and ethical concerns are not clearly addressed. The Immuta platform dynamically applies a range of controls to data, enforcing access and policy restrictions in real time based on the organization’s needs and helping ensure HIPAA compliance.
“As healthcare organizations increasingly embrace AI to personalize patient care, provide accurate diagnoses, and improve outcomes, the ability to have fast, personal, and compliant data access is paramount,” said Andrew Burt, Chief Privacy Officer and Head of Legal Engineering at Immuta. “Immuta’s HIPAA Compliance Playbook illustrates how these organizations can use Immuta for quick and intuitive HIPAA compliance.”
Immuta recently announced that Cognoa, a Palo Alto-based provider of AI-based solutions for pediatric behavioral health diagnostics and digital therapies, uses Immuta’s platform to ensure data access policies are consistently and accurately enforced across the wide variety of data sources and users driving its machine learning programs. Cognoa trains algorithms to aid in the diagnosis of behavioral health conditions, including autism and ADHD, using highly sensitive data from a production database that lives in a HIPAA-compliant environment. Data privacy and security are paramount for the company, and Cognoa needed a platform that could enforce data access roles, permissions, and policies beyond standard resource- or table-based controls.
Immuta works with healthcare customers and partners around the world to enable regulatory compliance by enforcing:
- Differential Privacy: Immuta allows organizations to extract maximum value from large data sets while providing mathematical protections on sensitive data. The platform autonomously injects a specific, tailored amount of noise into query results to ensure that privacy is protected.
- Purpose-based Data Restrictions: Immuta dynamically enforces data access and policy restrictions in real time based on the data scientist’s stated purpose. By limiting how data can be used across a healthcare organization, the platform gives medical providers full visibility into how and why data is being used.
- Minimization: Immuta reduces the sample of data exposed so that not all records in a data set are available to the data user. When creating policies for a data set, minimization levels can be set to a specific percentage of the data source.
- Masking: Immuta applies broad masking techniques to replace values in data with a set of alternate values. The platform defaults to providing consistently hashed values for masking, but enables users to tailor masking policies to maximize the utility of the data.
- Generalization: As an advanced form of masking, Immuta preserves privacy by rounding values into ranges, minimizing the uniqueness of any single record.
- Attribute-Based and Row-Based Security: Immuta ensures that data used for data science is not shared beyond the groups or individuals who have approved access or who would benefit from that data.
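The differential privacy control above can be sketched with the classic Laplace mechanism: a count query has sensitivity 1, so adding Laplace noise with scale 1/epsilon yields epsilon-differential privacy. The sketch below is illustrative only, not Immuta’s implementation; the patient record layout and the epsilon value are assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw a sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Epsilon-differentially-private count query.

    Adding or removing one record changes a count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    Smaller epsilon means more noise and stronger privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical patient records, for illustration only.
patients = [{"age": 70, "dx": "A"}, {"age": 34, "dx": "B"}, {"age": 68, "dx": "A"}]
noisy = dp_count(patients, lambda r: r["age"] >= 65, epsilon=0.5)
print(f"noisy count of patients 65 and over: {noisy:.2f}")
```

The noise is unbiased, so aggregate analyses remain accurate while any single query result reveals little about whether one individual is present in the data.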