Privacy Hub's monthly synthesis of the major news items
affecting and shaping health data privacy,
with expert analysis and commentary
To subscribe to our newsletter, click here.
The last few weeks in a flash:
Iowa becomes sixth US state to enact comprehensive consumer privacy legislation
IAPP (March 29, 2023)
"On 29 March, Iowa became the sixth state to pass a comprehensive privacy law, joining Connecticut, Utah, Virginia, Colorado and California. The law will go into effect on 1 Jan. 2025, giving organizations 21 months to comply with the new requirements from this state with over 3 million residents." Keep reading
Indiana Legislature Passes Consumer Data Privacy Bill
Husch Blackwell LLP (April 14, 2023)
"The Indiana legislature is the seventh state legislature to pass consumer data privacy legislation. On April 13, 2023, the Indiana legislature passed SB 5. The bill largely tracks the Virginia Consumer Data Protection Act (VCDPA) with some limited variations." Keep reading
Washington State Poised to Enact “My Health My Data Act”
WilmerHale (April 18, 2023)
"On Monday, April 17, the Washington House passed an amended version of the My Health My Data Act (HB 1155) (the 'Act'), a bill that would impose sweeping new requirements on the collection, processing, and sale of consumer health data in the state. The Act had been passed by the Senate on April 5 and now moves to Governor Jay Inslee’s desk for signature. If enacted, the My Health My Data Act would constitute a major development in the U.S. privacy law landscape. While we have seen an increased interest in the regulation of health data by the Federal Trade Commission, the My Health My Data Act would represent a novel step towards regulating health data at the state legislative level. And the Act’s impact would be significant." Keep reading
Hospitals pledge to protect patient privacy. Almost all their websites leak visitor data like a sieve.
STAT News (April 3, 2023)
"A new study found that 99% of U.S. hospitals employed online data trackers in 2021 that transmitted visitors’ information to a broad network of outside parties, including major technology companies, data brokers, and private equity firms. . . The ubiquitous use of the tracking tools may clash with the privacy expectations — if not the legal protections — that consumers take for granted as they browse online in search of medical care and information." Keep reading
HHS proposes rule shoring up HIPAA to protect reproductive health data, including around abortions
Healthcare Dive (April 12, 2023)
"The Biden administration has proposed a new rule that would ban providers, health insurers and other entities covered by the HIPAA privacy law from sharing patient information that could be used to investigate abortions. The proposed regulation released Wednesday from the HHS Office for Civil Rights is meant to protect patient-provider confidentiality and prevent private medical records from being used against people seeking, obtaining or providing legal reproductive healthcare, including an abortion or miscarriage management, the HHS said." Keep reading
Unstructured health data—physician notes, patient histories, and free-form lab results, primarily—presents challenges to the clinical landscape as profound as its opportunities.
Within this reservoir of fluid, undefined information are rich insights that are only now beginning to be tapped through the power of large language models, medical training corpora, and named entity recognition. This utility makes unstructured data an increasingly valuable clinical resource.
However, the absence of structure also means that personally identifying information as direct as patient names or Social Security numbers can easily propagate through free-text records. To de-identify unstructured data under HIPAA, privacy experts must be able to quantify this risk. The most reliable approach is to manually review every record in a dataset, but this becomes prohibitively time-consuming for datasets larger than tens of thousands of records. Rule-based scans of the data—though quicker to execute—are insufficient to accurately detect high-risk information that can take limitless forms.
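To make that contrast concrete, here is a minimal, purely illustrative Python sketch of a rule-based scan. The patterns and the sample note are invented for this example; a scan like this catches well-formatted values such as Social Security numbers, but it has no way to flag a free-text name or an unusually formatted date.

```python
import re

# Illustrative patterns for a few direct identifiers (invented for this sketch).
# Rules like these only catch values in expected formats; names, addresses,
# and oddly formatted identifiers slip through untouched.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def rule_based_scan(note: str) -> list[tuple[str, str]]:
    """Return (identifier_type, matched_text) pairs found in a free-text note."""
    hits = []
    for label, pattern in PATTERNS.items():
        hits.extend((label, m.group()) for m in pattern.finditer(note))
    return hits

sample = "Pt. John Smith, DOB 4/12/1957, SSN 123-45-6789, callback (555) 867-5309."
print(rule_based_scan(sample))  # finds the SSN, date, and phone, but not the name
```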
The same language model technologies developed for clinical utility extraction offer a potential solution if they can be deployed to capture patient identifiers such as names, addresses, and birth and death events. Several organizations have already trained models to capture select identifiers, with encouraging results. To translate these successes into a robust framework for efficiently evaluating unstructured data, two key obstacles remain (a brief sketch of the model-based approach follows the list below):
1. The HIPAA Privacy Rule requires that the risk of re-identification be very small for data to qualify as de-identified. Any automated tool would therefore face the steep challenge of finding all of the patient identifiers present in a given dataset, with only a very small margin for missing any necessary redactions.
2. HIPAA requires that the information itself be assessed directly. This means that either the output of a tool would have to be explicitly reviewed to determine whether it was de-identified, or the input data would have to be shown to be sufficiently statistically similar to data that had previously undergone this testing.
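By way of illustration only, the model-based approach might look like the following sketch. It assumes a named entity recognition model fine-tuned to tag patient identifiers (the model name below is a placeholder, not a recommendation) and uses the Hugging Face transformers pipeline API; the recall check reflects obstacle 1, where the tool must find essentially every identifier a manual reviewer would.

```python
# Minimal sketch, assuming a token-classification (NER) model fine-tuned to tag
# patient identifiers. "your-org/deid-ner-model" is a placeholder name.
from transformers import pipeline

deid_tagger = pipeline(
    "token-classification",
    model="your-org/deid-ner-model",  # placeholder: any identifier-tagging model
    aggregation_strategy="simple",    # merge word pieces into whole entity spans
)

def find_identifiers(note: str) -> set[str]:
    """Return the text spans the model tags as patient identifiers."""
    return {entity["word"] for entity in deid_tagger(note)}

def identifier_recall(predicted: set[str], annotated: set[str]) -> float:
    """Obstacle 1 in practice: the share of manually annotated identifiers the
    tool actually found. A 'very small' re-identification risk implies recall
    must sit very close to 1.0, so every missed identifier matters."""
    if not annotated:
        return 1.0
    return len(predicted & annotated) / len(annotated)
```

Obstacle 2 is harder to capture in code: it is a process requirement that either the tool's output be reviewed directly, or the input data be shown to be statistically similar to data that has already been through such review.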
While these challenges are far from trivial from technical and compliance standpoints, the rapid evolution of large language models and their application to unstructured health data is extremely promising, both for extracting clinical value and for mitigating the risk of patient re-identification that accompanies it.
HIPAA reform should protect patients, scale back silos around medical data
Healthcare Dive (March 24, 2023)
"Lifespark chief executive Joel Theisen argues for an update of HIPAA that acknowledges an advanced technology landscape and gives providers a fuller picture of patient health." Keep reading
Automated De-Identification For Personal Health Data Privacy
Forbes (April 11, 2023)
"People create data. Every interaction we humans make with our apps, machines, devices, services and computing platforms inititiates computing ‘events’ which in turn create log files and ultimately form some part of the planet’s ever-growing data mountain. As we now increasingly digitize our lives, more and more of that ‘people data’ is individually-specific to ourselves and therefore sensitive from a privacy and security perspective." Keep reading
Treating Healthcare Data Responsibly
Forbes (April 11, 2023)
"Gathering and analyzing various datasets about the different points in the customer journey and putting them together compliantly is critical to delivering high-quality, personalized care to patients. By analyzing data to identify patterns and trends, healthcare providers can deliver faster care, bespoke treatments and lower costs. Compliance with regulations is critical to building trust with patients and ensuring that data is used in a responsible and ethical manner." Keep reading
THREE-COMPANY INTERVIEW:
How Synthetic Data Can Help Train AI and Maintain Privacy
Information Week (April 17, 2023)
"It is not always feasible, or ethical, to use live data to train AI or test out software platforms -- making a case for synthetic and augmented data to solve certain development needs. Stakeholders such as IBM, Gartner, and Datavant share some insights on benefits synthetic data can offer." Keep reading