Safeguarding health data
October 28, 2016

Data analytics, increasingly used in health care, can promote cures and deliver efficiencies, but massive data collection about individual health and social status may lead to loss of privacy, unequal treatment, and the perpetuation of health inequality, says Virginia Tech business law professor Janine Hiller.
Technology and legislation are transforming health care in the United States, Hiller says. Among the developments of concern are the ubiquitous collection of health and lifestyle information and the increasing commercialization of that data.

In a recent journal article, Hiller notes that harm is done in several ways, including an overreliance on data to produce cost savings; the unquestioned collection and use of data; and the unaudited use of data analytics.
Asking the hard questions
Her article seeks to discuss broad societal questions, Hiller says. “It’s not just about data, but about how we should use the data responsibly and ethically to make decisions about people’s lives and health.”
Noting the adage about “not seeing the forest for the trees,” Hiller says: “When lots of talented people in the data and health care worlds work on solving problems, it can be difficult to see how all those efforts might add up to potential problems in the wider context.
“The more we depend on data to solve health problems, the more we look outside the health system — and into data about the way people live their personal lives: who they associate with, what they eat, and what their financial problems are. All these stressors can affect health.”
Such data is sought after by health-care providers, Hiller says, because the current payment structure rewards them for keeping people healthy.
What particularly concerns her is the secondary use of health data by employers, insurers, marketers, and others — especially given a public that may not know or understand the implications of supplying health information.
Data brokers are able to infer or obtain health information in a wide variety of ways, she says.
“Data amassed from private-sector health website visits, personal health devices, mobile health applications, and social networks are being linked together in a big data environment,” she says.
What we don’t know about our health information
“People think that their health information is the most personal type of information, and that it should be protected no matter who has it. But most people don’t know that those commercial websites or health tracking devices are not covered by the laws that protect health information held by hospitals, for example.”
Moreover, she says, health data gets mixed up with other, non-health data. “For example, one health-care system bought the data from a grocery-store loyalty card and then used that data to predict health problems or treatments for patients.”
Indeed, Hiller worries that “predictive analytics” (which uses mathematical algorithms to calculate future outcomes from large data sets) may end up worsening health-care disparities by segmenting groups by income, race, or other characteristics and resulting in differential care.
“Using past health data can build in past discriminatory findings, if the data set and analysis are not carefully designed,” she says.
“Interpretation of data sometimes makes it easy to overlook the real societal problems, like poverty, that lead to personal decisions and actions. We must be careful not to create a discriminatory health system with the use of data analytics. Data is not always objective.”
“Trust needs to be earned.”
Hiller stresses that her study is not an argument against using health data.
“However, it is important that we think about the unintended consequences of data utopianism — depending on data and predictions to solve all our health-care problems. Sometimes we grant the data too much power, which tends to reduce the focus on the individual and her autonomy.”
Her message for health-care providers and policy makers: “ask hard questions about how harms to personal privacy can be avoided, stigmas prevented, and threats of unbridled commercialization ameliorated.”
As for patients and consumers of health-care products and services, Hiller says: “Folks need to be vigilant about how they share health information. It is very easy to think that it will not go any further, when in reality it will be made a part of a huge commercial database somewhere. Trust needs to be earned.”
Hiller’s article, “Healthy Predictions? Questions for Data Analytics in Health Care,” was published in American Business Law Journal.
Plans for Protecting Data
The interactions among three aspects of the personal health data environment influence personal privacy and health priorities in ways that have not been critically acknowledged, Janine Hiller says.
First, policies promise to deliver equality in health care and to protect personal privacy, yet fail to incorporate specific steps that will do both, she argues. “Data take priority.”
Second, Hiller notes, participation of businesses outside the traditional health-care industry is mushrooming, as entities as diverse as data brokers and consulting firms collect and manage health data from within and outside the health system.
“The aggregation and manipulation of individual health data is occurring in ways that make it impossible for individuals to control its reach,” Hiller says, “and laws are inadequate to provide for robust privacy and antidiscrimination protection.”
Lastly, she says, data-driven discrimination is a real prospect when “policy that treats data as a solution” is coupled with “increasing data fusion.”
Addressing these problems will require a multipronged approach, Hiller says, “beginning with policy leadership that recognizes the existence of the problems and the importance of addressing them with specific strategies.”
She suggests that health-care and other entities adopt a risk management framework to help them review their data practices and products, assess the problems that they may create for individuals, and develop a plan to address them.
Lastly, Hiller calls for legislative action to address potential discriminatory applications of data analytics and the resulting surveillance, and to modify incentives.
The goals of the Affordable Care Act — improving health care and eliminating health disparities — can and should be “achieved by means of healthy data policy and practices, so that they do not lead to unhealthy consequences.”
Hiller’s research focuses on the intersection of law, ethics, and technology in the context of the business environment. She has examined privacy and cyber security laws and regulations in diverse areas.

Positive Data Sharing
The total health data stream includes individual patient social media posts, such as tweets, blogs, and Facebook updates, says Janine Hiller.
Websites or Facebook pages devoted to patients with specific health problems, she says, are often maintained by for-profit organizations that encourage individuals to share personal health successes and failures.
One example, she says, is PatientsLikeMe, which ran a promotion asking patients to “simply share their health data for good.”
By creating a personal profile, tracking their symptoms and treatments, and making the information public, “patients not only help themselves, but help others who can learn from their experiences, and advance research,” the site said.
PatientsLikeMe is part of a wider initiative to encourage individuals to share health information for the public good of curing illnesses, Hiller says.
“It is great that individuals can make that choice. The problem is, however, that anyone signing up to contribute to the social good will likely be contributing information to a far wider circle than perhaps he or she anticipated, and benefiting purely commercial organizations as well as medical researchers.”
Individuals can also earn stars for sharing personal health information, she adds — three stars earn the person a t-shirt.
The PatientsLikeMe website, Hiller says, does state that it sells the information patients share to its partners, and that it recognizes the potential harm of sharing health data and personal information, including the possibility that a member could be identified and could be discriminated against or experience repercussions as a result.
“For example, it is possible that employers, insurance companies, or others may discriminate based on health information,” the site notes.
“Not surprisingly,” Hiller says, “neither the data use nor the patient warning information is found explicitly at the main page or at the promotional page that advocates for the social good of sharing. Instead, the foregoing is provided as a link at the registration page where one must check ‘I agree’ in order to enroll.”
