We Need to Update Privacy Protections and Support Clinical AI Innovation
- Ozzie Paez
- Oct 25, 2024
- 3 min read
Updated: Feb 3
Privacy protection for personal health-sensitive data is devolving into a patchwork of laws and regulations marked by unnecessary complexity, security gaps, and mounting vulnerabilities. Well-intended laws like the Health Insurance Portability and Accountability Act (HIPAA) and its implementing regulations have saddled clinicians and providers with risks and uncertainties while providing patients with limited privacy protections. HIPAA’s limited scope excludes most collectors of health-sensitive data, including makers of wearable monitors and providers of direct-to-consumer (DTC) analytical services.

A notable example is the limited regulatory oversight of DTC DNA profiling services, which base their privacy protections on end-user agreements. It’s not clear how those agreements would hold up when providers go out of business, merge with competitors, or are acquired. For example, financial struggles at 23andMe, the second-largest DTC DNA vendor, are raising concerns about control, access, and disposition of the 14 million profiles stored and archived on its systems.
Similar concerns apply to makers of wearable health and fitness devices like the Apple Watch and Google’s Fitbit. These devices continuously collect and analyze panels of physiological data that overlap with, and frequently exceed, those collected by medical providers. Manufacturers often share their customers’ physiological data with partners and third parties. For instance, Apple has shared Apple Watch–collected physiological and other data with institutions like Stanford, Harvard, and the National Institute of Environmental Health Sciences. These are legitimate research institutions, and Apple requires Apple Watch users to acknowledge and approve sharing requests. Other makers may not be as thorough with their policies. As with DTC services, makers of wearable fitness monitors base their privacy protections on end-user agreements that they can unilaterally change.
Effective privacy protections are indispensable for building and sustaining broad patient and consumer confidence. In this context, access to and control over personal and patient monitoring data are key. On the other hand, researchers and innovative makers of AI-based health, fitness, and clinical technologies need access to large, representative datasets to train, validate, and test their models. We need legal and regulatory reforms that clarify, balance, and manage the conflicting interests of patients and consumers, who demand robust privacy protections, and legitimate researchers and innovators, who need data to advance health and fitness technologies. Unfortunately, it is not clear at this point which private institutions, organizations, and government agencies will lead these efforts, and, to our knowledge, only limited initiatives are in place.