
December 28, 2018

JD Insider: Wearable Technology

By: Ashley Pusey

Depending on your definition of “wearable technology,” wearables can be traced back to the 13th century and the invention of eyeglasses. In the 17th century, during the Qing Dynasty, the abacus ring was developed as a counting tool. In the 21st century, the age of fitness trackers and wearable technology, designers and engineers are changing the functionality of clothing. The marriage of traditional fabrics and technology has created “smart fabrics,” which can serve a purpose similar to other wearables such as the Fitbit and the Apple Watch. A smart fabric can function as a Bluetooth-enabled garment that syncs with the wearer’s accompanying app and can therefore track health information on a daily basis. Personal health information (“PHI”) collected via wearable technology is invaluable, particularly in the medical industry: it allows doctors and researchers to keep up with a patient’s medical records in “real time.” However, these technologies aggregate PHI at an unprecedented rate, so much so that data has become the new gold rush of the 21st century. Because data is invaluable to various industries, such as the medical, insurance, and advertising industries, PHI is extremely vulnerable to unauthorized third parties who seek to seize it.

Why are third parties interested in your PHI? The value of PHI on the black market is extremely high, even higher than that of your Social Security number or credit card information. Electronic health records (“EHR”) contain a wealth of exploitable information that attracts hackers because “your EHR contains all of your demographic information, names, historical information of where you live, where you worked, the names and ages of your relatives, financial information like credit cards and bank numbers.” According to Robert Lord, “the medical record is the most comprehensive record about the identity of a person that exists today.” You can cancel your credit cards and change your Social Security number, but “your EHR is immutable.” Our society is already experiencing these effects, and will continue to experience them, as technology progresses and the number of data breaches rises. While the law has attempted quick fixes for these issues, it has neglected to take substantial measures to address the underlying privacy and data security concerns.

While there are government agencies charged with addressing these concerns, the government appears hesitant to take substantive measures to combat the rising and very prevalent data privacy and security issues. Data breaches have become the norm, and we are already experiencing the chilling effect of this haphazard legal framework in social media platforms and the health industry. Yet the dichotomy between the privacy and data security issues of social media and those of health tracking devices is striking. It seems almost second nature to put on a garment or fitness device that tracks PHI, with no regard for who has access to the data being collected or where that information is distributed; the same cannot be said of social media platforms. Social media gives users a measure of control over privacy restrictions: it allows users to decide who can view their profile or story. Studies suggest that there is a general expectation that the information users share “will remain within some closed universe of relationships.” Although social media platforms are also subject to data and privacy breaches, there appears to be a different level of consciousness concerning privacy and PHI. Are people more concerned about who can view their profile than about the consequences of their PHI being obtained illegally and bought and sold by third parties? Is there a different expectation of privacy when it comes to PHI?

  1. Risks and Implications of Disseminating Health Information

As our society continues to advance technologically, coherent legislation must be implemented in a way that can adapt to technological change while protecting the public’s fundamental right to privacy and maintaining accountability. The general mentality of our society tends to focus on creating the “next best thing.” That mentality and the pace of innovation have encroached on the public’s interests and fundamental rights, namely privacy and security. We have allowed wearables and other tracking devices to “out-dress” the law, rather than fashioning the devices in a way that works with government oversight. There are two major concerns with the increasing use of wearable technologies: (1) the lack of safeguards protecting electronic PHI (“ePHI”), and (2) the threat of cybercriminals intercepting ePHI because of weak security measures.

  a. Lack of Safeguards

At the federal level, data privacy and security laws are weak. No federal law regulates wearables, and it is unclear which regulatory regimes govern the dissemination of PHI from these devices. This absence of law allows the information to be shared voluntarily or obtained illegally. Wearable devices are often set to a public default privacy setting, which means that “the profile the consumer created is possibly searchable on the Internet.” Because users are often unaware that their data is being collected, these factors contribute to the illegal seizure of this information by hackers and other unauthorized third parties.

A study conducted by the Privacy Rights Clearinghouse (“PRC”) revealed that “information privacy is not currently a priority for developers of mobile health and fitness applications, even though building technical protections are not that difficult.” PRC identified three main privacy risks in health tracking devices that heighten the chance of hackers obtaining private information: (1) unencrypted network connections (illustrated in the sketch following the excerpt below); (2) advertising; and (3) analytics. It also revealed opaque privacy policies among health tracking devices and their associated apps:

We found that among the apps with a privacy policy, the majority of technical practices that we considered a risk to users’ privacy were not accurately disclosed or described in a way that would enable non-technical users to understand what is actually going on. Even users who read most of the privacy policies we found would be surprised to learn what data is actually collected, transmitted, and shared with unidentified third parties.
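To illustrate the first risk PRC flagged, the following sketch shows, in a hedged and hypothetical way, the difference between a fitness app uploading a reading over plain HTTP and over HTTPS. The payload fields are invented, and httpbin.org (a public request-echo service) merely stands in for a fitness vendor’s server; no real wearable API is depicted here.

```python
# Hypothetical illustration of PRC's first risk: an app uploading a heart-rate
# reading with and without transport encryption. httpbin.org is a public
# request-echo service standing in for a fitness vendor's API; the payload
# fields are invented for this sketch.
import requests

reading = {"user_id": 12345, "heart_rate_bpm": 72, "recorded_at": "2018-12-28T09:00:00Z"}

# Risky: plain HTTP sends the reading as readable text, so anyone positioned on
# the network path (public Wi-Fi, a compromised router) can read or alter it.
requests.post("http://httpbin.org/post", json=reading, timeout=10)

# Safer: HTTPS wraps the same request in TLS, encrypting the payload in transit
# and verifying the server's identity before any data is sent.
requests.post("https://httpbin.org/post", json=reading, timeout=10)
```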

 

These revelations are alarming for several reasons. First, the government has negligently left the law in the hands of corporations and the private sector, permitting self-regulation. Because there are no regulations for wearables and their accompanying apps, developers need not impose privacy measures for users; no law will hold them accountable.

Second, the government has essentially permitted the private sector to compel users of wearables and health tracking apps to forfeit their fundamental right to privacy, because the core functionality of these apps is distributing the data to developers and brokers. Third, as previously mentioned, developers do not have to implement reasonable safeguards to protect users’ PHI (including personally identifiable information (“PII”)). Thus, by being compelled to forfeit their privacy rights in order to use a fitness tracking device or its accompanying app, users inadvertently surrender their privacy to potential hackers, because the transmission of the data is not necessarily secured through encryption.

For these reasons alone, wearables and their accompanying health tracking apps are a double-edged sword: while wearables serve beneficial functions and purposes, those benefits come at the price of your PHI becoming ascertainable by hackers, because the data is frequently left unencrypted both at rest and in transit.

  b. ePHI Threats

“Encryption is the application of an algorithm to readable information (plaintext) to translate it into information that would be unintelligible (ciphertext).” It involves turning data into a “scrambled form” that makes it nearly impossible for any “intercepting party to read, understand and make any sense of it, except the recipient to whom it is intended.” When the data reaches its “rightful recipient,” the scrambled data reverts to plaintext and becomes perfectly readable and understandable again. (A short sketch of this round trip appears after the excerpt below.) Such protection is invaluable in the context of ePHI, because medical information is worth ten times more than a credit card number on the black market.

Essentially, data mining is the California Gold Rush of the 21st century, and both corporations and cybercriminals are eager to get their hands on the data. While their reasons for obtaining this information may differ, the underlying purpose is the same: to take advantage of the user in some way. We are likely to see the effects of data mining through an economic lens, such as “credit worthiness, insurance, employability, the revelation of consumer preferences,” and fraud (e.g., Medicare fraud). As for PHI breaches, “insurance industry experts say they are one of many expenses ultimately passed onto Americans as part of rising health insurance premiums.” This is a serious epidemic: corporations are using individuals’ PHI against them for their own economic gain. As Lifewire put it:

Whenever you send private data to another computer or server on the Internet, which you do many times a day, it is like Red Riding Hood's mother sending her to her grandmother's at the other side of the woods. These woods, which she has to cross alone without defense, has wolves and other dangers far more lethal than the wolf of the bed-time story.
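To make the plaintext-to-ciphertext round trip described above concrete, here is a minimal sketch using Python’s third-party cryptography package. The key and the sample health reading are invented for illustration and are not drawn from any particular wearable or app; the point is simply that only the party holding the key can turn the ciphertext back into readable data.

```python
# Minimal sketch of symmetric encryption, assuming the third-party
# "cryptography" package is installed (pip install cryptography).
# The health reading below is a made-up example.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # secret key shared only with the intended recipient
cipher = Fernet(key)

plaintext = b'{"heart_rate_bpm": 72, "steps": 8400}'
ciphertext = cipher.encrypt(plaintext)   # unintelligible to anyone who intercepts it
print(ciphertext[:40], b"...")

# The rightful recipient, holding the key, recovers the readable data.
print(cipher.decrypt(ciphertext))        # b'{"heart_rate_bpm": 72, "steps": 8400}'

# An interceptor without the key (or with the wrong key) gets nothing useful.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: ciphertext stays unreadable")
```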

 

When ePHI is sent to another server or entity, it is extremely vulnerable to cybercriminals who intercept and retrieve the highly sensitive information, thereby enabling “full identity theft.” The concept of “full” identity theft reflects the fact that the information in the hackers’ hands includes not only personally identifiable information (e.g., name, address, email address) but also health information, which is arguably even more private. Yet despite these threats, many business and medical entities fail to implement encryption because it is not required by law.

Protected health information belonging to more than 120 million people, roughly a third of the U.S. population, has been compromised since 2009. Since the beginning of 2018, lax encryption practices have continued to affect patient privacy. Thus far, at least 6.1 million individuals have reportedly been affected by data breaches, many of which resulted from a lack of encryption. While these breaches do not involve PHI obtained via wearables, individuals were still affected in spite of the privacy regulations already in place. Of the breaches reported to HHS’ Office for Civil Rights, almost 4.4 million individuals were affected by hacking-related incidents and unencrypted devices. July 2018 was the worst month for healthcare data breaches this year: the breaches reported that month impacted more than two million patients and health plan members. Of the top five healthcare data breaches reported in 2018 so far, three resulted from hacking or IT incidents. Based on the top 20 breaches from each of the past three years, the trend of data breaches caused by hacking will presumably continue through the end of 2018 and into 2019. By comparison, hacking incidents accounted for only 11 of the top 20 data breaches in 2016 and 12 of the top 20 in 2015, but in 2017 they accounted for 17 of the top 20, which supports the expectation that breaches via hacking will continue.

  2. (In)adequacies of HIPAA

HIPAA attempts to protect the privacy of individuals’ PHI through the Privacy Rule and the Security Rule, as well as the Omnibus Rule. The Security Rule’s main purpose is to protect the privacy of individuals’ health information while remaining flexible enough for covered entities to implement new policies and standards as they adapt to technological advances. While that is a sound policy in theory, it is apparent that covered entities are not implementing the best standard of patient protection, because millions of people have been affected by data breaches at the hands of criminal hackers.

The rule provides “addressable implementation specifications,” which allow covered entities to exercise discretion in meeting the standards with a particular security framework. A covered entity has three options: (a) implement the addressable implementation specification; (b) implement one or more alternative security measures that accomplish the same purpose; or (c) implement neither the specification nor an alternative. To comply with the addressability standard, a covered entity must implement an addressable specification “if it is reasonable and appropriate to do so, and must implement an equivalent alternative if the addressable implementation specification is unreasonable and inappropriate, and there is a reasonable and appropriate alternative.” This decision logic is restated in the sketch below.
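As a rough, non-authoritative restatement of that decision logic, the following sketch is illustrative only: the function name and its inputs are invented, it is not the text of the regulation, and the documentation requirement is noted only in a comment.

```python
# Rough restatement of the Security Rule's "addressable" decision logic.
# Illustrative only: the function and its inputs are invented here, and this
# is not the text of the regulation or legal advice.
def handle_addressable_spec(is_reasonable_and_appropriate, equivalent_alternative=None):
    if is_reasonable_and_appropriate:
        return "implement the addressable implementation specification"
    if equivalent_alternative:
        return "implement the equivalent alternative: " + equivalent_alternative
    # If neither is implemented, the entity must document why the specification
    # is not reasonable and appropriate and how the standard is otherwise met.
    return "implement neither, and document the rationale"

print(handle_addressable_spec(False, "an equivalent safeguard of the entity's choosing"))
```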

What constitutes a reasonable and appropriate “addressable implementation specification”? For the sake of argument, suppose that encryption, more specifically end-to-end encryption, is the most adequate security framework, and that the argument against it is that it is too costly and time consuming. Under the addressability standard, the covered entity would then need to find an alternative security framework equivalent to encryption. If such an equivalent alternative existed, one could reasonably conclude that there would be far fewer data breaches at the hands of hackers, precisely because the alternative would be equivalent to encryption. But if there is no equivalent alternative, then covered entities that forgo encryption are inherently non-compliant with HIPAA, because their security framework falls short of the reasonable and appropriate equivalent, which appears to be the case. Thus, anything short of the equivalent of encryption is inadequate.

It has been established that the current HIPAA framework does not necessarily treat app developers as business associates, even though the developers of fitness tracking devices and their accompanying apps maintain users’ PHI. Moreover, HIPAA makes no mention of wearables, leaving the privacy concerns uncertain. HHS has, however, explained when an app developer would be considered a HIPAA business associate; many app developers need not comply if they do not fit the following scenario:

A patient is told by her provider to download a health app to her smartphone. The app developer and the provider have a contract for patient management services that includes remote patient health counseling, patient messaging, monitoring the patient's food and exercise, and electronic health record (“EHR”) integration and application program interfaces. Furthermore, the information the patient inputs into the application is automatically incorporated in the EHR.

 

The Apple HealthKit is one example that would require the developer to comply with HIPAA. While some studies suggest that the privacy of PHI is on the back burner for many developers, some are taking the initiative to make their wearables and accompanying apps HIPAA compliant. For example, in 2015, Fitbit announced its HIPAA compliance program, Fitbit Wellness, which provides companies with business-to-business “turnkey software and services to help organizations drive engaging, effective and motivating wellness programs.” Proponents of expanding the definitions of “covered entities” and “business associates” would likely argue that HHS should consider these efforts, as they provide more reasons to treat developers, wearables, and accompanying apps as business associates. While this is a positive step toward protecting the privacy of individuals’ PHI, the risk of unauthorized parties obtaining this information over unencrypted networks remains significant.

On October 29, 2018, HHS opened its Health Sector Cybersecurity Coordination Center (“HC3”), which should satisfy proponents who advocate additional government oversight dedicated to cybersecurity. The new center can potentially unify the “patchwork” system of data privacy and security networks. HC3’s mission is to strengthen and improve cybersecurity information sharing within the healthcare industry, gain a better understanding of current threats, and develop strategies to combat them. It is clear that HHS established the new center to address the security breaches plaguing the healthcare industry. In 2017, the healthcare industry had its worst year in terms of the number of security breaches reported. While some of those breaches involved non-compliance with HIPAA, the main causes of healthcare security breaches were hacking/IT incidents and unencrypted electronic devices containing ePHI. At this point, there is not much information on how the center plans to minimize the threat posed by criminal hackers, but time will tell.

  3. Application of the Fourth Amendment

In the last twenty-five years, the Fourth Amendment has drawn sustained scholarly attention in the criminal justice context, particularly regarding law enforcement’s use of “sensorveillance,” the data trails collected by smart devices and wearables. Many of these conversations involve wiretapping and law enforcement’s tracking of a suspect’s geolocation to undermine an alibi defense, and whether those acts constitute an unreasonable “search” or “seizure.” The Fourth Amendment establishes the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.” The question is whether the Fourth Amendment extends the same privacy protection to PHI collected in the form of data as it would if the information were in tangible form. Is there a reasonable expectation of privacy for individuals who use wearables to track the myriad health metrics that technology can capture? While privacy and data encryption are at the forefront of the legal debate over law enforcement’s seizure of smart devices, there is an overwhelming silence concerning the application of the Fourth Amendment to security breaches by hackers who seize ePHI.
