For centuries, medical procedures, prescriptions, and other medical interventions have been based largely on experience—what is known about a set of symptoms. The doctor looks at those symptoms, tests you in various ways (blood tests, X-rays, MRIs), and interprets the results based on experience with past patients or what is widely known in the medical community. Then she prescribes a treatment. There are two problems with this.
First, diagnosis relies on the doctor’s or medical profession’s interpretation of an examination and test results. Second, treatments themselves target populations, not people: This is the treatment that worked for most people in the past, so this treatment should work for you.
This isn’t to bad-mouth the medical or pharma community. But medicine has been, and still is, essentially statistical in nature. It’s based on populations, not individuals. That has been the most cost-effective way to treat the most people in the most efficient way possible. Neither the technology nor, more to the point, the time has existed to test every patient for every possible pathogen he might ever have been exposed to, or to personally interview every family member to understand the patient’s family health history.
That was true until recently. Over the past few decades, diagnostics—including reading your DNA—and treatment methods have improved exponentially to the point where we’ve approached the age of personalized medicine. The Personalized Medicine Coalition defines this as “an evolving field in which physicians use diagnostic tests to determine which medical treatments will work best for each patient. By combining the data from those tests with an individual’s medical history, circumstances, and values, healthcare providers can develop targeted treatment and prevention plans.” (Emphasis mine.)
By collecting ever larger amounts of data, doctors are increasingly able to say what is wrong with you, not just a group of people who had similar symptoms. They can create personalized medications aimed just at you. They can 3D-scan your body and 3D-print implants, prostheses, even tissue, specifically for you. I’m no longer a statistic; I’m a me.
More data, more accuracy, less privacy
But, is there a cost? It depends. All of the technologies listed above are the result of being able to collect and store all the data that describe the minutiae of your body’s makeup, collected by your healthcare provider over time, from blood work, to EKGs, to your genetic makeup, to 3D images of every part of your body. Further, if you are a person who tracks your health using some sort of fitness watch or device, those data can be collected, too. Have an IoT-enabled medical device, like a pacemaker? That’s hoovered up as well. We have the capability to track everything about ourselves, and correlate that to what our body is doing, and how it’s changing at the most fundamental level, sometimes in real time. Doctors and pharmaceutical companies can then mine those data and potentially save or prolong our lives. Personalized medicine hinges on sharing everything about our health. The more data, the more accurate—and personalized—the diagnosis and treatment. And if it’s our life that is saved, we are probably OK with that.
On the other hand, what’s happening with this huge amount of data when it isn’t saving my life? Who’s using it, and for what purpose? I might not mind my doctor tracking my health, but I might not be excited if my insurance company knows I am genetically predisposed to cancer. Do I really want advertisers to know what I eat (donuts), how much I lift (a 12 oz. beer), or how many miles I run each day (... what?). I can stand only so many Nugenix ads, after all. And do I really want all the liability attorneys in the world knowing that I was exposed to asbestos?
What data are protected?
The problem with the huge trove of personal medical and fitness data is not the medical uses for it. It’s what else it might be used for. So what is being done to protect these data and make sure they get where they’re needed—e.g., my doctor—but don’t go where I don’t want them—advertisers, data brokers, insurance companies, or law enforcement?
It’s an incredibly complex and quickly evolving problem, but there’s a lot being done on several fronts, as it turns out, from consumer pressure to legislation intended to protect your data. For instance, the Tom’s Guide article, “Here Are the Most (and Least) Secure Fitness Trackers” draws on in-depth testing by the noted AV-TEST security research lab to rate how well fitness trackers protect your data.
Author Caitlin McGarry writes, “A fitness tracker can tell where you are at any given moment... how fast your heart beats, how stressed you are, and even how deeply you sleep. That’s why it’s essential for wearables to prove they can be trusted with that data.”
According to AV-TEST, the candidates featured in the list were tested in four categories: security of local communication, security of external communication, app security, and data protection. How well data were protected ranged from very good to “egregious security deficiencies.” The latest AV-TEST results can be found here.
Public testing such as AV-TEST’s shines a light on lackluster products. No manufacturer wants that. The same is true with reporting on data breaches, such as the breach of Under Armour’s MyFitnessPal smartphone app, which affected 150 million user accounts and led to a 4.6-percent drop in Under Armour’s share value. Consumer pressure works.
HIPAA doesn’t always protect you
But while the devices, their communication, and the data can be protected technically, what about the companies’ ability to sell or use the data? Having the app developer provide a secure platform for your data is meaningless if the company sells your data to other entities.
In the past, most health data were handled by doctors and medical institutions, and those data are protected by the Health Insurance Portability and Accountability Act (HIPAA). But today, a huge amount of shopping data (including pharmaceutical purchases), biometric data, and health data are collected by entities that don’t legally fall under HIPAA: wearables, for instance (fitness bands or watches), or the estimated 165,000 health and wellness apps.
So, why not expand HIPAA or something similar into the commercial space, making it illegal to share health data that are tied to a specific person? For the most part, it is illegal to share or use data that are specific to you without your consent. The problem is, just what is health information? My cancer diagnosis? Most likely. My doctor appointment? Maybe. That I parked outside the only doctor’s office on the block? Probably not.
Furthermore, the problem isn’t so much the protection of innocuous, individual pieces of data. It’s how each of these pieces connect to reveal a bigger picture about us.
The Guardian reported that “data, such as information from fitness devices and search engines, are completely unregulated and have identities and addresses attached. A third kind of data called ‘predictive analytics’ cross-references the other two and makes predictions about behavior with... a surprising degree of accuracy.” Commercial companies can mine your personal “big data” and connect the public “you” (GPS data, wearable data, purchases, none of which are protected by HIPAA) to the private you, and make pretty good inferences about your health. At a minimum, that makes us targets for ads, but it could also breed discriminatory practices, such as being turned down by a bank or insurance company.
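The cross-referencing described above can be sketched in a few lines of code. Everything here—the data streams, the identifiers, and the inference rule—is invented for illustration; real data brokers use far larger datasets and statistical models, but the principle of joining “harmless” streams into a sensitive inference is the same:

```python
# Hypothetical sketch: linking "public" data streams (GPS pings, retail
# purchases) through an identity graph to infer a private health fact.
# All names, streams, and rules are invented for illustration.

from collections import defaultdict

# Unregulated stream 1: location pings (device ID, place visited)
gps_pings = [
    ("device-42", "oncology clinic"),
    ("device-42", "grocery store"),
    ("device-77", "gym"),
]

# Unregulated stream 2: loyalty-card purchases (card ID, item bought)
purchases = [
    ("card-42", "anti-nausea medication"),
    ("card-77", "protein powder"),
]

# Stream 3: an identity graph mapping device and card IDs to one person
identity_graph = {
    "device-42": "alice", "card-42": "alice",
    "device-77": "bob",   "card-77": "bob",
}

def build_profiles(pings, buys, ids):
    """Cross-reference the streams into one profile per person."""
    profiles = defaultdict(set)
    for key, fact in pings + buys:
        person = ids.get(key)
        if person:
            profiles[person].add(fact)
    return profiles

def infer_health_risk(profile):
    """A naive 'predictive analytics' rule: two individually weak
    signals combine into a strong, sensitive inference."""
    return ("oncology clinic" in profile
            and "anti-nausea medication" in profile)

profiles = build_profiles(gps_pings, purchases, identity_graph)
for person, profile in sorted(profiles.items()):
    print(person, infer_health_risk(profile))
```

Neither a clinic visit nor a pharmacy purchase is protected health information on its own; only the join makes the inference possible, which is why piecemeal regulation of individual data types falls short.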
The U.S. government certainly isn’t turning a blind eye to cybersecurity risks and data mining, but the problem is complex. Just how complex is addressed in a U.S. Department of Health and Human Services (HHS) report, “Health Information Privacy Beyond HIPAA: A 2018 Environmental Scan of Major Trends and Challenges.” The report is an in-depth look at healthcare big data and its expanding uses and users, including cybersecurity threats and approaches, evolving technologies for privacy and security, and even evolving consumer attitudes toward security.
The report is an eye-opener about what is and isn’t covered under medical privacy rules and touches on data mining from seemingly innocuous data. For instance, suppose Google Maps or similar GPS monitoring tracks me to an address, and the only occupant of that address is a psychiatrist. A data broker then correlates that information with my purchases from medical supply companies or pharmacies. Is location data—any of these data, really—actually protected medical data? Nicole Gardner, vice president of IBM’s Services Group, was cited in the report as suggesting that if “50 percent of health is governed by social determinants not traditionally classified as health data, then nutrition, sleep, exercise, smoking, drinking, and friends may be health data.”
According to the HHS report, “health data, whether it originates entirely in the commercial, unregulated [HIPAA] sphere, or ‘leaks’ into commercial databases from the HIPAA-regulated world, can remain essentially forever in files of data brokers and other consumer data companies. Health data may remain valuable for the entire lifetime of the data subject, and it may have uses with respect to relatives of the data subject.” And that means misuse of your data may affect not only you, but your relatives and children as well. Think in terms of family history information and your genetic information.
The good, the bad…
So there’s the tradeoff. On the one hand, there’s a lot to be said for how data mining can help us tackle medical issues. FDA Commissioner Scott Gottlieb wrote in FDA Voices that “mobile wearable technologies can complement traditional [patient reported outcomes] surveys by generating objective, continuous activity and physiologic data. Obtaining reliable wearable device data on activity level, coupled with direct patient reports on their ability to carry out important day-to-day activities, can provide information on physical function that is directly relevant and important to the quality of life of cancer patients.”
Furthermore, writes Gottlieb, advanced computing and systems biology is making healthcare more personalized, and technology is closing the gap between clinical research and real-world patient care.
On the other hand, as the HHS report points out, as the use of trackers, smart watches, internet-connected clothing, and other wearables becomes more widespread and more sophisticated, the type and volume of data will explode. Biosensors can already capture your heart rate, body temperature, and movement, and are quickly moving to also monitor brain activity, moods, and emotions. “These data can, in turn, be combined with personal information from other sources—including healthcare providers and drug companies—raising such potential harms as discriminatory profiling, manipulative marketing, and data breaches.”
… the worse
Where we go from here is the question. We know that government regulations constantly lag behind technology. So, while biometric, pharmaceutical, and healthcare privacy laws are being introduced—and challenged—at both the state and federal level, the amount of health data that can be shared about you is still considerable and continues to grow.
It would be nice to say that there is something you can do about it. But, even if you are willing to completely unplug yourself from the world and never visit a doctor or pharmacy, there probably isn’t. We spew data, and there is always someone out there to lap it up. The only recourse is to pressure lawmakers to protect your privacy and shun hardware and software products that play fast and loose with your data.
As with any medical instrument, big health data is just a tool. It can be wielded with the care and consent of the consumer and improve their quality of life, or it can be used indiscriminately to serve someone else’s bottom line, disregarding your needs. The trick is how to leverage one and mitigate the effects of the other.