Privacy Hub's fortnightly synthesis of the major news items affecting and shaping health data privacy, with expert analysis and commentary
The last few weeks in a flash:
Leading Stories
States' long-awaited data privacy laws are going into effect
Axios (January 3, 2023)
"Starting this week, companies operating in Virginia and California are subject to a new set of data protection laws. . . After federal lawmakers failed (again) to pass a privacy law last year, companies face what they've always feared and lobbied against: a patchwork of state-level laws that dictate how they collect, store and share consumer data." Keep reading
OCR Joins Chorus of Regulators Warning About Health Data Tracking Technology
The National Law Review (December 29, 2022)
"Organizations must conduct a fact-based analysis to determine whether health data collection and tracking technology deployed on their websites and mobile apps complies with the federal Health Insurance Portability and Accountability Act ('HIPAA') and other applicable laws and guidance." Keep reading
NIST's De-Identifying Government Data Sets: Third Public Draft - Comments Due on Jan. 15
NIST (November 15, 2022)
"[On November 15th, the National Institute of Standards and Technology] released the third public draft of NIST Special Publication (SP) 800-188, De-Identifying Government Data Sets, [a document that provides specific guidance to government agencies that wish to use de-identification], for public comment. The comment period closes on January 15, 2023." Keep reading
Judge Denies Injunction Banning Meta from Collecting Patient Data via Meta Pixel Code
HIPAA Journal (December 29, 2022)
"Plaintiffs in a consolidated class action lawsuit against Meta recently sought an injunction to stop the company from collecting and transmitting data collected from the websites of healthcare providers through Meta Pixel tracking code. The plaintiffs claim the use of Meta Pixel code on appointment scheduling pages and patient portals allows sensitive information, including patient communications, to be collected and monetized by Meta, which violates federal and state privacy laws. William Orrick, U.S. District Judge for the Northern District of California, has recently issued a ruling denying the injunction." Keep reading
Building on Anita Allen’s recent remarks at HIMSS about the shifting narrative around data sharing and privacy, Dr. Patrick Baier, HIPAA Privacy Expert at Privacy Hub by Datavant, continues his earlier commentary on current attitudes toward privacy and the most significant compliance blind spots among entities that use health data:
While I think HIPAA has broadly been successful in advancing patient privacy, there are obviously challenges and shortcomings. The more widely de-identified health data is shared, the harder it becomes to retain effective control over the environment, and the policies and procedures, surrounding its use.
Moreover, not all identifiable health information today is Protected Health Information (PHI). A patient’s heart rate is PHI when taken in a GP surgery, but not when measured by a sports watch and uploaded to a website. Data collected by a health app may or may not be PHI, depending on whether the app was supplied by a medical provider or simply downloaded privately by the patient. From the viewpoint of ethical data use, it is hard to see why one blood pressure reading should be accorded a different level of protection from another, but legally the regulations have not quite caught up, and this can cause ambiguity and confusion.
The Covid pandemic has highlighted both the potential benefit of using health data for the common good and the risks to patient privacy. Without the sharing of health data, it would not have been possible to study the spread and effects of the virus, and the epidemiological benefits of health data use have become ever more obvious since the days of John Snow in Victorian London. However, attitudes towards patient confidentiality also changed as a result: governments, airlines, restaurants, and all sorts of public venues came to expect disclosure of health information, almost as a condition of participation in public life.
Much of this was perhaps understandable as an immediate reaction to a poorly understood public health threat. But we now need to review these arrangements, advance our privacy practices, develop new technologies, and ensure that patient privacy remains at the forefront of our thinking and remains embedded and effective in all these new ways of using health information, ideally without hampering the beneficial use of health data.
Every patient deserves to have their privacy protected and their data used ethically. Confidentiality of patient information should require no external justification; it deserves to be protected in its own right. And privacy protection is of course also a practical necessity, because if patients do not trust that their doctor will keep their health information confidential, they may not seek treatment for fear of negative personal consequences.
De-identification is therefore key to making health data available for wider use. A failure to honor the expectation of confidentiality, and any material harm to individuals that results, can undermine the public's willingness to allow their health information to be de-identified and shared, and damage the consensus between society and the de-identified healthcare data industry, depriving us of a vital tool with the potential to do much good.
So there is a lot to do. As privacy professionals, we should constantly promote privacy awareness, invest as much effort as we can in developing new and better ways to protect patient privacy, highlight and stop or fix any failures, and adapt our tools to the many new ways of using health information, so as to preserve de-identified health information as an asset for the future.
Food for Thought
Op-Ed: Another threat to abortion privacy? Health websites tracking and sharing your data
Los Angeles Times (December 29, 2022)
Opinion piece by Ari Friedman and Matthew McCoy
"Our research team at the University of Pennsylvania’s School of Medicine and Carnegie Mellon’s CyLab found that 99% of abortion clinic webpages surveyed have at least one third-party tracker on their site. The situation is usually worse: The average clinic webpage transfers data to nine different companies. Across all abortion clinics in the U.S., we found that trackers sent data to 66 different companies, from tech behemoths to businesses with little or no consumer-facing presence." Keep reading
Could the EU's decision against Meta affect data privacy policies in the U.S.?
Healthcare IT News (January 6, 2023)
Andrea Fox interviews Odia Kagan
"The social media giant says it will appeal the European Union's decision that Meta Platforms violated GDPR. We asked one privacy lawyer whether the decision might penetrate the company's reliance on contractual necessity in the U.S. . . According to [them], the decision means: The company can no longer rely on a legal basis of contractual necessity to run behavioral ads and will instead have to ask users for their consent; within three months, Meta must enable users to have a version of its social media apps that does not use personal data to surface ads; the company must allow users to withdraw consent at any time, and it may not limit the service if users choose to do so; [and] Meta may still use nonpersonal data to personalize ads or to ask users for consent to ads with a yes or no choice." Keep reading | Share your thoughts
REVIEW:
Biometric privacy 2022 year-in-review
Biometric Update (December 27, 2022)
"2022 was another banner year for biometric privacy, with a number of high-profile developments taking place in this space, the most notable being the first Illinois Biometric Information Privacy Act ('BIPA') jury verdict in Rogers v. BNSF Ry. Co., No. 19 CV 3083 (N.D. Ill.). In addition, class action filings continued apace, several decisions on key BIPA issues extended the boundaries of liability exposure for non-compliance even further, and a number of eight- and nine-figure class action settlements pushed the already-inflated value of BIPA claims even higher. At the same time, state and municipal lawmakers in other parts of the country unsuccessfully attempted to install greater controls over the collection and use of biometric data, and are likely to continue these pursuits during the 2023 legislative session. At the federal level, lawmakers also introduced legislation that would have governed biometrics practices in a uniform fashion across all 50 states, while the Federal Trade Commission (“FTC”) commenced its own rulemaking activities which (among other things) focuses on evaluating the need for more stringent regulation over biometric technologies by the country’s de facto federal privacy regulator." Keep reading
INTERVIEW:
AI is fast addressing data requirements and advancing interoperability, says one expert
Healthcare IT News (December 27, 2022)
Andrea Fox interviews Vignesh Shetty
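For readers curious what "using NLP to identify protected health information in unstructured data" can look like in practice, the sketch below is a purely illustrative example of ours, not anything from the interview or any vendor's pipeline. It uses the open-source spaCy library's general-purpose named-entity recognizer to flag entities (names, dates, places, organizations) that roughly map onto HIPAA Safe Harbor identifier categories and masks them; a production de-identification system would rely on clinical-domain models and rigorous expert validation.

```python
# Illustrative sketch only: flag and mask likely PHI entities in free text
# using spaCy's general-purpose NER model. Real de-identification pipelines
# use clinical-domain models and expert review; this is not such a system.
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# Rough, assumed mapping from spaCy entity labels to HIPAA Safe Harbor
# identifier categories (not an official mapping).
PHI_LABELS = {
    "PERSON": "NAME",
    "DATE": "DATE",
    "GPE": "GEOGRAPHY",
    "ORG": "ORGANIZATION",
}

def mask_phi(text: str) -> str:
    """Replace entities that look like PHI with category placeholders."""
    doc = nlp(text)
    masked = text
    # Replace from the end of the string so earlier offsets stay valid.
    for ent in sorted(doc.ents, key=lambda e: e.start_char, reverse=True):
        label = PHI_LABELS.get(ent.label_)
        if label:
            masked = masked[:ent.start_char] + f"[{label}]" + masked[ent.end_char:]
    return masked

if __name__ == "__main__":
    note = "Jane Doe was seen at Mercy Hospital in Boston on March 3, 2022."
    print(mask_phi(note))
    # e.g. "[NAME] was seen at [ORGANIZATION] in [GEOGRAPHY] on [DATE]."
```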
"Leveraging AI, machine learning and neural networks can help healthcare standardize data, comply with info blocking requirements and improve health outcomes. . . [Shetty says], 'With the enforcement of the info blocking rule, we notice that the saliency of insights becomes the main source of value. Investments that would appear unnecessary in the context of data scarcity become far-sighted in the context of abundance. . . In the case of structured data, machine learning algorithms can identify protected health information features to stay compliant from a privacy standpoint. For unstructured data, we can use a combination of deep learning algorithms and natural language processing to structure them and address interoperability.'" Keep reading