A health information and privacy expert offers insights into HIPAA compliance, wearables and voice assistants.
Health privacy and security rules still apply to emerging technologies like wearables.
Navigating healthcare privacy regulations has never been simple, but compliance has only grown more complex with the advent of cutting-edge digital technologies. So, how can healthcare leaders satisfy the demands of the Health Insurance Portability and Accountability Act (HIPAA) as wearable technology, automated voice assistants and other digital health innovations enter the clinic?
Some answers to this question are surprisingly straightforward, while others demand more thought and research. But there are a few ways that health systems, developers and healthcare leaders can best position their innovations and tech acquisitions to achieve HIPAA compliance, thus enabling better patient outcomes and greater efficiencies.
Nicholas Heesters, an information privacy and security specialist and HIPAA compliance and enforcement official for the U.S. Department of Health and Human Services, Office for Civil Rights (OCR), provided insights into this healthcare privacy challenge last week at the Infosecurity North America meeting in New York City.
The first thing to note: No matter which new tech enters the care continuum, health privacy and security laws are designed to handle it, Heesters said. So, by consulting with OCR experts and scrutinizing guidelines, healthcare providers and other stakeholders should, in theory, be capable of mapping out a plan to use groundbreaking tech in the clinic.
“The HIPAA security rule gets a lot of grief for being too vague,” he said. “By design, it’s not vagueness, but it’s the fact that it’s flexible, scalable, technology-neutral.”
The OCR has fielded many more HIPAA complaints and undertaken many more investigations than it has doled out penalties. The office has struck 56 resolution agreements, with penalties ranging from thousands of dollars to last month’s $16 million settlement with the insurer Anthem, which suffered the largest health data breach on record.
But where do healthcare providers, vendors and other stakeholders go wrong? The most common misstep occurs during risk analysis, Heesters said. Too often, health privacy violators don’t know where all of their electronic protected health information (ePHI) resides, how the data flow through their environment or what risks arise at each step.
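What might such an inventory look like in practice? Below is a minimal sketch in Python, using hypothetical system names and flows, of a simple ePHI data-flow register that records where the data originate, where they travel and the risks noted at each step. It is an illustration only, not an OCR-prescribed format.

```python
from dataclasses import dataclass

@dataclass
class EphiFlow:
    """One step in an ePHI data flow, documented for risk analysis."""
    source: str        # system where the data originate
    destination: str   # system the data move to
    transport: str     # how the data move (API, SFTP, HL7 feed, etc.)
    safeguards: str    # controls already in place
    risks: str         # residual risks identified for this step

# Hypothetical entries; a real analysis would cover every system and flow
# that creates, receives, maintains or transmits ePHI.
inventory = [
    EphiFlow("smartwatch app", "EHR integration service",
             "HTTPS API", "TLS in transit, OAuth tokens",
             "vendor retains raw readings"),
    EphiFlow("EHR integration service", "EHR database",
             "internal HL7 feed", "encrypted at rest, role-based access",
             "overly broad service-account permissions"),
]

for step in inventory:
    print(f"{step.source} -> {step.destination}: {step.risks}")
```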
HIPAA, of course, applies to covered entities — like providers and insurers — and to business associates, meaning vendors that handle protected health information on a covered entity’s behalf. Wearable technology has raised new questions as devices such as smartwatches collect more health data, sometimes for clinical use. But data gathered via wearables don’t always fall under HIPAA security guidelines.
For example, if a person buys a Fitbit and then uses it to track information like number of steps taken per day, calories consumed and heart rate, the data aren’t protected under HIPAA, Heesters said. Why? The arrangement involves neither a covered entity nor a business associate.
But consider this: At the direction of a healthcare provider, a patient downloads a smartwatch app that monitors health data points that are then integrated into an electronic health record. The app developer or marketer, meanwhile, is receiving money from the provider for the digital service. In that case, the developer is generating, collecting, storing and sharing data on behalf of a covered entity — and, as a business associate, it must abide by HIPAA.
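The threshold question in both scenarios is the same, and it can be summed up in a few lines. The sketch below, written in Python with made-up inputs, simply encodes the reasoning described above: HIPAA’s rules attach when ePHI is handled by a covered entity or by someone acting on its behalf.

```python
def hipaa_applies(handled_by_covered_entity: bool,
                  handled_on_behalf_of_covered_entity: bool) -> bool:
    """Rough illustration of the threshold question: HIPAA attaches when ePHI
    is handled by a covered entity (provider, health plan, clearinghouse)
    or by a business associate acting on its behalf."""
    return handled_by_covered_entity or handled_on_behalf_of_covered_entity

# Consumer buys a Fitbit and tracks steps on their own: no covered entity
# or business associate is involved, so HIPAA does not apply.
print(hipaa_applies(False, False))  # False

# Provider-directed smartwatch app feeding the EHR, with the developer paid
# by the provider: the developer acts as a business associate.
print(hipaa_applies(False, True))   # True
```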
Amazon’s Alexa, Apple’s Siri and other artificial intelligence voice assistants are enjoying growing adoption rates, sparking interest in the technology’s application in healthcare. Startups have cropped up to develop such products, and some are already entering clinics across the country. But designing voice assistants to meet health privacy and security standards is no easy task.
“If someone tells you right now that they’re complying with HIPAA in this area, don’t believe them,” Elena Elkina, a former attorney and a data privacy and protection expert for Aleada Consulting, claimed while moderating Heesters’ talk. “It hasn’t happened yet.”
Although the high-tech nature of voice assistants might lead some healthcare leaders to assume the privacy rules get more complicated here, Heesters pointed back to the flexibility of HIPAA: It’s meant to be scalable and technology-neutral, and that means there is a viable path forward for voice assistants.
Healthcare providers should also scrutinize their business associate agreements so that they understand what “HIPAA-compliant services” actually means. In some cases, the phrase may mean only that the developer has supplied tools for auditing, encryption and granular access controls; it could still be the provider’s responsibility to configure and implement them.
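To make that division of labor concrete, here is a minimal sketch, in Python, of the kinds of controls such a clause might leave to the provider: encrypting a record at rest with the third-party cryptography package, checking a user’s role before release and writing an audit log entry. The role names, record format and key handling are hypothetical; a production deployment would rely on managed key storage and the provider’s own access policies.

```python
import logging
from cryptography.fernet import Fernet  # pip install cryptography

logging.basicConfig(filename="ephi_audit.log", level=logging.INFO)

# Hypothetical role model; real access rules come from the provider's policies.
ALLOWED_ROLES = {"clinician", "care_coordinator"}

def read_record(user: str, role: str, ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt a stored record only for permitted roles, and audit the access."""
    if role not in ALLOWED_ROLES:
        logging.warning("DENIED access by %s (role=%s)", user, role)
        raise PermissionError("role not permitted to view ePHI")
    logging.info("GRANTED access by %s (role=%s)", user, role)
    return Fernet(key).decrypt(ciphertext)

# Demo with an in-memory key; production keys belong in a key management system.
key = Fernet.generate_key()
stored = Fernet(key).encrypt(b"patient=jane_doe;heart_rate=72")
print(read_record("dr_smith", "clinician", stored, key))
```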
In the end, regardless of the precise tech, healthcare providers must return to the risk analysis process, making sure they protect all ePHI, no matter where it comes from or where it goes. Each new technology calls for a fresh evaluation of its specifications, the rules that apply and its security features.