Data Privacy in the Age of Wearables
by Daniel Pyne
With the recent explosion in the power of technology, big data has never been as important as it is today. Companies across the globe use data in some way, whether it is Google or Facebook building a profile of users to better serve ads, or a wellness program tracking the successes and failures of its initiatives. Much of this data is de-identified, a process that removes direct personal information like name and address but leaves information like ZIP code, date of birth and gender, along with everything collected from the apps and devices you use; the result is ultimately passed on to a third party for analysis. Many employees are under the assumption that this information is protected by laws like the Health Insurance Portability and Accountability Act of 1996 (HIPAA) or the Genetic Information Nondiscrimination Act of 2008 (GINA), but this is not always the case. Information generated by a participation-based wellness program, for example, does not need to be HIPAA compliant. Some employers go so far as to have their employees waive these rights as part of their health risk assessment (HRA). Consumers have little understanding of how their data is protected, what data is protected, how it is used and how best to protect themselves.
How Companies Collect Data
There are many ways to generate health data. Apps, wearables, self-reported information and medical exams all generate lots of data points. Take the popular MapMyFitness app, for example. This app helps users track their jogging routes, daily physical activity and what they eat and drink, and lets them share it with friends and family via social media. In 2013, athletic apparel company Under Armour bought the app as well as the data of more than 20 million registered users.1,2
While this may seem innocuous (after all, they are just using information to serve their customers), the scope of what they know is staggering. If you connect a wearable device to the app and regularly update it, then Under Armour, like Santa Claus, knows when you’re sleeping, knows when you’re awake and knows if you’ve been bad or good. They know your personal habits better than you do.
Under Armour is far from unique in these practices. Companies collect this data at every opportunity, even if it is just an email address or phone number at the checkout lane. Not doing so would mean losing the data arms race companies are currently engaged in, causing them to fall behind consumer trends, and that means losing money. While the practice might seem innocent, it poses a major threat to individuals’ right to privacy if this data were to fall into the wrong hands.
This does not apply only to data collected from apps. Employers can collect and accumulate the same kind of data through their wellness programs.
“The industry has grown a lot since 2000,” said Kathy Downing, HIPAA expert and Senior Director of Information Governance at the American Health Information Management Association (AHIMA). “Then it used to be just a pedometer, but now employers can gather a lot of data on their employees, whether they know it or not. They can effectively get the health of their entire operation.”
The Myth of Anonymous Data
Whether it arrives at a data warehouse contracted by an employer to collect and analyze wellness program data, or comes directly to a major wearable device manufacturer, this data is often de-identified. However, this layer of anonymity is no longer enough as our ability to analyze data improves.
A recent working paper from Harvard demonstrates this. A team of researchers took publicly available de-identified health information from the Personal Genome Project (PGP), compared it to data found in publicly available voter lists (chiefly five-digit ZIP code, gender and date of birth) and was able to “re-identify” names for 22 percent of the records with 84 to 97 percent accuracy. If they can do this, then hackers and other entities invested in big data can as well.
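The linkage technique behind this kind of study can be illustrated with a minimal sketch. All of the records below are made up for illustration (the names, ZIP codes and conditions are hypothetical): the idea is simply to join de-identified health records to a public voter list on the quasi-identifiers of ZIP code, gender and date of birth.

```python
# Hypothetical de-identified health records: the name is gone, but the
# quasi-identifiers (ZIP, gender, date of birth) remain.
deidentified_health = [
    {"zip": "02138", "gender": "F", "dob": "1961-07-22", "condition": "diabetes"},
    {"zip": "02139", "gender": "M", "dob": "1984-03-14", "condition": "asthma"},
]

# A hypothetical public voter list: the same quasi-identifiers, plus a name.
voter_list = [
    {"name": "Alice Example", "zip": "02138", "gender": "F", "dob": "1961-07-22"},
    {"name": "Bob Example",   "zip": "02140", "gender": "M", "dob": "1990-01-01"},
]

def reidentify(health_records, voters):
    """Join the two data sets on (ZIP, gender, date of birth)."""
    index = {(v["zip"], v["gender"], v["dob"]): v["name"] for v in voters}
    matches = []
    for record in health_records:
        key = (record["zip"], record["gender"], record["dob"])
        if key in index:  # a unique quasi-identifier combination leaks the name
            matches.append({"name": index[key], **record})
    return matches

print(reidentify(deidentified_health, voter_list))
```

Because that combination of ZIP code, gender and birth date is unique for a large share of the U.S. population, a simple join like this is often all it takes; no cryptographic break is involved.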
The PGP allows users to post their personal genetic information for use in research, including medications prescribed, procedures and any illnesses they may have. Prior to publishing their information, participants are required to sign consent forms and pass an entrance exam, so they had at least some indication that sharing this information could become a problem in the future, a risk ordinary app users may not even be aware of.
What’s the Worst that Could Happen?
An important lesson from 2015 was the vulnerability of personal health data, when major hacks at Anthem, Premera Blue Cross and others affected millions of people. This trend had been growing long before 2015. According to the Journal of the American Medical Association, data breaches doubled between 2010 and 2013.4 So why exactly do these hackers want personal health data?
According to cybersecurity firm Dell SecureWorks, personal health data is worth 20 times more than stolen credit card numbers on the black market. This information is more valuable than a credit card because of its relative permanence.5 If a criminal buys a stolen card, there is a good chance that it is no longer valid; information like a patient’s height, medical conditions and Social Security number, however, is unlikely to change.
Collecting this information also provides the criminal with an intimate look at a victim’s life. With wearable data, they know where and when you sleep. They can even get access to the victim’s password, which may not sound too bad, but many people use only one password for everything. Criminals start trying that password and your email address on other sites, and eventually gain access to something potentially harmful.
“HR most likely didn’t know that identifying information was in their data sets,” explained Downing. “A lot of thought goes into creating these data sets, but monitoring it can be difficult.”
This is only the criminal side; the information we are willingly providing to technology firms and our employers may come back to haunt us as well. In the hands of an employer, this sensitive data could lead to employment discrimination: an employer that does not want to cover the medical expenses of diabetics, for example, might simply not hire them. Information generated by a wearable could find its way to a bank or credit card company that denies sedentary people because they are at a higher risk of defaulting. A life insurance company might deny a couch potato. Insurer John Hancock is already offering discounts to customers who agree to wear a Fitbit, share their data and reach certain predetermined health metrics.6 There are laws that are supposed to prevent this, of course, but like all laws they contain loopholes and misunderstandings that allow it to happen.
HIPAA, GINA and Loopholes in the Alphabet Soup
There are laws determining how, and with whom, an individual’s health plan can share their personal information. HIPAA, in very simple terms, compels health care and insurance providers to disclose how they will use health information, and if they want to share health information, they must receive consent from the individual first. After permission is given, the provider can only supply the minimum amount of information needed.
Wellness programs can also be subject to HIPAA Privacy Rules if tied to a health plan in some manner. For example, if a wellness program supplies an insurance premium discount as an incentive, it falls under HIPAA; whereas a program that only encourages more activity and proper nutrition would not. Furthermore, the individual themselves is not subject to HIPAA and can share their health information with whomever they please.
Some wellness vendors may have an individual waive this right as part of their participation, often as part of an initial HRA, and can still comply with HIPAA. Many of these privacy policies include clauses allowing them to share identifiable data with unnamed agents and third parties “working to improve employee health.” Additionally, HIPAA does not cover de-identified health information, which as we already have seen, can be easily re-identified.7
“Where a workplace wellness program is offered by an employer directly and not as part of a group health plan, the health information that is collected from employees by the employer is not protected by the HIPAA Rules,” said Deven McGraw, Deputy Director for Health Information Privacy, HHS Office for Civil Rights (OCR). “However, other Federal or state laws may apply and regulate the collection and/or use of the information.”
GINA, for example, is an antidiscrimination law, not a privacy law. The 2008 law prohibits group health plans from utilizing genetic information to discriminate in terms of insurance. Title II of GINA bars employers from using genetic information to make decisions about hiring, promotion or firing, as well as placing limits on an employer’s ability to ask for or purchase genetic information. However, GINA does not apply to long-term care, life or disability plans and Title II only applies to companies with 15 or more employees.8
In 2009, Congress adopted the Health Information Technology for Economic and Clinical Health (HITECH) Act to promote the adoption and use of electronic health records. While most of the law deals with implementing the infrastructure and usage of electronic health records, portions of it reinforce HIPAA by mandating that the technology comply with the law and by establishing notification rules that covered entities and their business associates must follow should a breach occur.9
Security breaches affecting fewer than 500 people are reported annually to HHS, while those affecting more than 500 people must be reported to HHS, the affected individuals and the media immediately.
“An employer that administers a wellness program as part of a group health plan is prohibited from using or disclosing individuals’ health information for employment-related actions or other purposes not permitted by HIPAA,” said McGraw. “Marketing without an individual’s express authorization, for example. Employers must implement reasonable and appropriate administrative, technical, and physical safeguards to protect the information. In the event of an unauthorized disclosure, the employer that is administering aspects of the wellness program must notify the affected individuals, the Department of Health and Human Services (HHS), and in some cases the media, of the breach.”
There are laws protecting employees, but they still do not cover non-HIPAA entities, nor do these protections extend to important voluntary benefits like life insurance. Still, these protections hold covered entities responsible for their data security practices, right?
Since 2009, more than 1,500 data breaches affecting over 500 people each have been reported to the Office for Civil Rights.10 There have also been more than 120,000 breaches affecting fewer than 500 individuals.11 This puts the total number of people affected at well over 150 million, according to figures available from the Office for Civil Rights.12
For all of these violations, only 32 covered entities have faced fines.13 Under HITECH, the Office for Civil Rights has the authority to levy fines of up to $1.5 million per violation for data breaches, but rarely uses it.14 After the high-profile data breaches of 2015, it appears the Office for Civil Rights is taking the issue more seriously. As of February 2015, it had pursued just 22 cases, filing another 10 after the incidents.15 If the laws do not go far enough, and the regulatory bodies in charge do not show enough teeth, how is an individual supposed to protect their personal health data?
Data will only continue to become more important in all facets of decision-making. As technology becomes better, faster and smaller, we will collect more data, fitting more sensors measuring more variables into our phones and wearable devices. The future will see wearable contact lenses that continually monitor blood glucose levels, and wearable devices that provide health vitals to doctors in real time while the patient goes about their day. These are all potentially vulnerable data points that could profoundly affect someone’s life. The cause of these problems seems to be twofold. First, as is often the case, the law lags behind the capability of technology.
HIPAA protects the health data generated by physicians and insurance companies, but there was no way the people who originally drafted the bill in 1996 could predict the rise of wearable technology and the impact it would have on our daily lives. While the regulators are beginning to take a stand on the issue, until real changes are made there will still be loopholes allowing this data to remain unprotected.
Second, the consumers who generate and share this data need to be aware of the dangers it could pose to them. More than marketers and tech companies will use the data they generate; it could fall into the hands of banks, potential employers or even hackers. While laws like GINA are intended to prevent this, proving that discrimination was the basis for an employer’s or bank’s decision is notoriously difficult. Individuals need to be aware that the data they generate is a product that businesses sell, and it is very difficult to track all the hands through which that data passes.
For example, suppose a wearable company de-identifies your data and sells it to a research organization. After receiving and analyzing the data, the research organization eventually goes out of business. What happens to that data? It could very well still be on a hard drive somewhere, just waiting for someone to access it. This is more than a theoretical example. During its recent bankruptcy, RadioShack attempted to sell the consumer data it had collected. The deal looked good until Apple stepped in and declared that RadioShack could not sell any data connected to an iPhone user.16
Until more companies like Apple make data privacy a key issue, data will be vulnerable. The only way these companies will care is if their consumers are aware of the dangers and demand that their personal data remain secure, or if consumers take matters into their own hands. So what are employees supposed to do?
But sometimes even that might not be enough. For those who are very concerned about the security of their health data, there is only one option.
“Wellness plans need to have optional participation,” said Downing. “So, the most effective method to preserve data is to opt out. This can be difficult for some because the benefits are too much to pass up. Unfortunately, it is difficult to protect against this.”