The increased use of technology by health care providers requires new and innovative platforms to provide better services to patients. One platform that has garnered attention since its introduction in late 2014 is Amazon’s Echo, a brand of smart speakers that connect to the voice-controlled intelligent personal assistant service, Alexa. The device responds to certain “wake words” (the default being “Alexa”) and is capable of playing music, making lists, setting alarms and accessing other real-time information. Beyond everyday tasks, Alexa has been slated for use in providing health care services, such as assisting physicians in taking notes, allowing patients to access their medical information, or allowing physicians to remotely monitor patients.
Health care providers have begun to design innovative ways that this platform could be utilized at their facilities; however, one big problem exists—Alexa is not compliant with federal privacy protections under the Health Insurance Portability and Accountability Act (“HIPAA”). While HIPAA compliance is expected in the near future, Amazon has not yet implemented the proper technical and security safeguards. Although HIPAA sets forth the minimum privacy and security standards, state laws may be more restrictive, and any use of the Echo or similar devices would also have to be compliant with applicable state law.
Another concern is Alexa’s “passive listening” function. Although the device records only one second of ambient sound at a time while listening for the specific “wake word,” Alexa is listening at all times. This capability creates the foreseeable possibility that information, including protected health information, will be recorded, whether intended or not. Accordingly, the security measures in place for storing such information become critically important to defend against hackers. The vulnerabilities inherent in using a voice-controlled personal assistant highlight the challenges in trying to protect an individual’s medical record information.
Voice-controlled personal assistants can enhance the efficiency of services provided to patients. Providers must, however, implement safeguards when using these assistants, including consideration of the following:
- Until Alexa becomes HIPAA compliant, any use of the device should be limited to non-identifiable health information;
- Once Alexa is HIPAA compliant, providers will need to execute a Business Associate Agreement with Amazon, or its related entities;
- Providers should implement and revise their policies and procedures to ensure device use is compliant with HIPAA;
- Providers should update their privacy notices to include the use of Alexa or other Alexa-enabled devices.
For the tech savvy, the benefits of incorporating evolving technology into the day-to-day routine of providing medical care are readily apparent, but traditional privacy concerns must still be considered and addressed. Before implementing any new technology, providers need to be certain that a compliance review is completed and that all state and federal legal and regulatory requirements are met.