Regulations

Impact Of the EU-U.S. Data Privacy Framework For Health Data And Clinical Trials


The European Commission has adopted an adequacy decision for EU-U.S. data transfers. The decision covers the EU-U.S. Data Privacy Framework (DPF), which governs data transfers between the EU and the U.S. Special provisions were created for health data, and in particular for clinical trials: the DPF requires that information be provided to patients who withdraw from clinical trials and sets out rules on express consent. Additionally, the DPF underlines sponsors' obligation to inform patients of any data transfers.


DPF Self-Certification Process


The EU-U.S. Data Privacy Framework (DPF) requires U.S.-based organizations to self-certify their adherence to the DPF principles by submitting a self-certification to the U.S. Department of Commerce’s International Trade Administration (ITA). The process involves several steps, such as drafting a privacy policy containing the required statements, designating a contact person, and putting a verification mechanism in place. Once the certification is validated, the organization is added to the Data Privacy Framework List.


Rise of U.S. Health Data Privacy Regulations


In the wake of Washington's My Health My Data Act, other U.S. states recognised the need for regulation of health data, especially in the absence of a comprehensive national privacy law. Nevada, Connecticut and Illinois passed their own health data privacy bills. The U.S. Federal Trade Commission (FTC) joined this movement by applying the Health Breach Notification Rule, which led to the GoodRx, BetterHelp and Premom enforcement actions.


The European Commission Reinforces The GDPR In Cross-Border Cases


The European Commission proposed a law establishing procedural rules for the authorities enforcing the GDPR. The proposal aims to bring legal certainty for businesses and quicker remedies for individuals. It would harmonise procedural rules across Member States' data protection authorities and facilitate cooperation and dispute resolution, including by setting deadlines for cross-border procedures.


PETs - Privacy Enhancing Technologies

New MIT Privacy Technique For LLMs

Large language models (LLMs) in the clinical domain are trained on considerable amounts of sensitive data that could be extracted by malicious agents. The Massachusetts Institute of Technology (MIT) developed a privacy technique that allows LLMs to remain accurate while protecting sensitive data: the "Probably Approximately Correct" (PAC) Privacy metric, which determines how much noise, or randomness, must be added to the data for sensitive information to remain protected.
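The core idea of trading a little accuracy for privacy by adding calibrated noise can be sketched in a few lines of Python. This is a simplified, differential-privacy-style illustration with made-up patient ages, not MIT's actual PAC Privacy algorithm, which computes the required noise far more carefully:

```python
import random
import statistics

def noisy_mean(values, noise_scale=1.0, seed=None):
    """Return the mean of `values` with Gaussian noise of scale
    `noise_scale` added, so that no single record can be read back
    exactly from the published statistic."""
    rng = random.Random(seed)
    true_mean = statistics.fmean(values)
    return true_mean + rng.gauss(0.0, noise_scale)

# Hypothetical patient ages -- illustrative data only.
ages = [34, 41, 29, 55, 62, 47]
print(noisy_mean(ages, noise_scale=0.5, seed=42))
```

Larger `noise_scale` values give stronger protection but a less accurate statistic; PAC Privacy's contribution is quantifying how little noise suffices for a given privacy guarantee.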


Synthetic Data As Privacy Enhancing Strategy


The UK's data protection authority, the ICO, published a study paper titled 'Exploring Synthetic Data Validation – Privacy, Utility and Fidelity' in cooperation with the Financial Conduct Authority (FCA) and the Alan Turing Institute. This PET uses a mathematical model or algorithm to generate statistically realistic but ‘artificial’ data. Synthetic data can help businesses mitigate data quality issues, model new and emerging scenarios, and protect commercially sensitive data. In healthcare, and most notably in clinical trials, the technique can alleviate privacy risks around sensitive data.
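As a rough illustration of the idea (not the ICO's or the FCA's methodology), the Python sketch below fits a simple per-column Gaussian model to a handful of hypothetical patient measurements and samples 'artificial' rows from it. Production synthetic-data generators use far richer models that also preserve correlations between columns:

```python
import random
import statistics

def fit_and_sample(rows, n_samples, seed=None):
    """Fit an independent Gaussian to each column of `rows` and draw
    `n_samples` synthetic rows. This sketch only preserves each
    column's mean and spread, not the relationships between columns."""
    rng = random.Random(seed)
    columns = list(zip(*rows))
    params = [(statistics.fmean(col), statistics.stdev(col)) for col in columns]
    return [tuple(rng.gauss(mu, sigma) for mu, sigma in params)
            for _ in range(n_samples)]

# Hypothetical systolic/diastolic blood pressure readings.
real_rows = [(120, 72), (135, 88), (118, 70), (142, 95)]
synthetic_rows = fit_and_sample(real_rows, n_samples=3, seed=0)
```

The generated rows are statistically plausible but correspond to no real patient, which is what makes the approach attractive for privacy; the ICO paper's focus is on validating how well such data balances privacy, utility and fidelity.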


Confidential Computing: A Solution For The Application Of Privacy Principles In AI?


Generative AI draws on many datasets, so data governance issues can arise. Confidential computing delivers proof of processing, providing hard evidence of a model’s authenticity and integrity. It ensures code integrity and encrypts data, while data and IP are completely isolated from infrastructure owners and made accessible only to trusted applications running on trusted CPUs. Data and code are protected and isolated by hardware controls, and all processing happens privately inside the processor without the possibility of data leakage.


Artificial Intelligence

Google's AI For Medicine: Accuracy Over 90% On Clinical Answers


Med-PaLM, Google's AI model for medicine, shows promising results on clinical questions. Over the past few months, the model was opened up to a select group of healthcare and life science organizations for testing. The model still carries risks inherent to LLMs, such as misinformation and bias. As these risks rise, harnessing massive amounts of data becomes an essential strategy, and the healthcare sector finds itself caught in a tug-of-war between data-driven solutions and the looming spectre of privacy breaches.


LLM, Health and Regulations


Large Language Models (LLMs) and other AI-based medical technologies may require different regulations, as they differ in how they are built and in the risks they pose. The authors argue that regulations must strive to ensure safety, maintain ethical standards, and protect patient privacy without stifling innovation. They propose a framework that guarantees privacy while still allowing large quantities of data to be used to train the models. Regulations should be adapted to this new technology and take its specificities into account; the FDA already regulates some aspects of AI-based medical technologies.


Cybersecurity

U.S. Health Department Breached By MOVEit Hacking Campaign


The U.S. Health Department experienced a data breach that could affect 15 million people, a number inferred by a data analyst. The breach resulted from hackers exploiting a vulnerability in the file-transfer software MOVEit; files that companies and organizations had uploaded to MOVEit were affected. Among the impacted organizations were public institutions such as the New York City Department of Education and the Louisiana Office of Motor Vehicles, potentially affecting everyone in the state with a state-issued licence.


HCA Healthcare Data Breach Impacted 11 Million People


HCA Healthcare, a healthcare organization operating 180 hospitals and 2,300 ambulatory sites of care across 20 states and the United Kingdom, suffered a data breach impacting 11 million people. The breach occurred on an external storage location used exclusively to automate the formatting of email messages. It exposed information used for email messages as well as personal data such as patient names, cities, states, zip codes, email addresses, phone numbers, gender, dates of birth, and appointment information.


Data Privacy Enforcement

FTC Bans BetterHelp From Sharing Sensitive Health Data For Advertising


The Federal Trade Commission (FTC) launched an enforcement action against BetterHelp in March 2023 and has since finalized its order requiring BetterHelp to pay $7.8 million and prohibiting it from sharing consumers’ health data for advertising. The order also requires the company to observe privacy principles. BetterHelp had shared its users' personal information with Facebook, Snapchat, Criteo, and Pinterest for advertising purposes, despite promising consumers that it would only use or disclose personal health data for limited purposes.


GoodRx, Google, Meta and Criteo Seek Dismissal Of A Class-Action Privacy Complaint


Following the FTC's enforcement action against GoodRx, GoodRx users filed a class-action privacy complaint over the wrongful sharing of health information with third parties. GoodRx, together with Meta, Google and Criteo, is seeking dismissal of the class action. GoodRx argues that it informed consumers about data sharing and that users consented to sharing their information in exchange for coupons. The users counter that, for a period of time, GoodRx's policies stated that it would never provide third parties with users' personal health information.

