PHOTO ILLUSTRATION: BRYAN GEE. SOURCE IMAGE: iSTOCK
Catherine Stinson is a senior policy associate at the Mowat Centre.
At a recent appointment, my doctor was stumped by my big toe. What looked like a fungal infection in my nail had unexpectedly tested negative, so she took a quick picture on her phone to send to a friend who specializes in that particular gross corner of the medical world. She used an app that lets her call up my medical chart and add impromptu pictures such as the one she’d taken of my toenail. Her professional opinion was, “Check this out! This is so cool!”
Storing medical records online can make the work of health-care providers easier, and benefit patients, but given the recent mess of stories about leaked data from just about every app ever made, I couldn’t help but wonder how secure my chart was on my doctor’s phone. Are Russian agents looking at my toenail right now? Please get in touch, Boris and Natasha, if you have a diagnosis.
The privacy of our health-care data may not seem like a big deal. Nobody other than my mother wants to know the details of my colonoscopy. It’s all the same to me if the entire readership of this newspaper finds out that I have a prescription for cortisone cream. There is a certain doctor who might find it awkward if it were to get out that he misdiagnosed my textbook case of appendicitis, but even that isn’t very valuable information. Or is it? Weird things can happen when data is let loose on the internet. Think of how Michael Chong’s picture mysteriously ended up on posters for bathroom hygiene in Guatemala.
Obviously, some information is more sensitive than a politician’s posed picture. STDs, abortions, tuberculosis, nose jobs and cancer are sometimes closely guarded secrets. Even minor, treatable medical issues can be embarrassing if they’re contagious, yucky or involve our private parts, although apparently many people are open enough about their pus-filled cysts that they brazenly upload the popping videos to YouTube. (I know because I watched them for about an hour.) Some medical details also have legal implications – such as HIV status – or carry enough stigma that the information might put off potential in-laws, bosses or landlords.
Even for people who don’t have any medical skeletons in the closet, the move toward storing health records online and sharing that data with other providers and third parties such as medical researchers and tech startups has potentially creepy repercussions.
Let’s say Health Canada wanted to know whether toenail-fungus rates are going up among 30- to 50-year-olds. The most efficient way to get that information would be to get doctors to automatically send in all their records about patients in that age group. That level of data sharing isn’t possible yet, because provinces and health-care providers use a hodgepodge of different storage methods, but this is on the near horizon. Of course, when this kind of information is shared for research purposes such as fungus forecasting, all of your personally identifying information – including your name, health card number, address, etc. – should be removed before the file is sent.
Unfortunately, in the age of big data, removing personally identifying information isn’t nearly enough to protect your privacy. Take the information I’ve divulged here: colonoscopy, prescription for cortisone cream, appendectomy, undiagnosed toenail weirdness. On their own, none of these details are particularly interesting, and each of them is pretty common, but together, they make for a fairly specific profile of me. If you add a few more easy-to-guess details about me (female, age between 30 and 50, cyst-popping-video enthusiast), it may well be enough to uniquely identify my medical records and link them to my social-media profile. So even if all of the personally identifying information is stripped off my record before it’s stored online, it could still be traced back to me. In the data-science world, that’s called re-identification, and it’s what Facebook was planning to do with the anonymized health records it was arranging to get from major U.S. hospitals before the Cambridge Analytica scandal hit the news and they decided it was poor timing.
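For the curious, here is a minimal sketch in Python of how such a linkage attack works. Every field and record below is invented for illustration, but real re-identification studies follow the same logic at scale:

```python
# Hypothetical illustration of re-identification (a "linkage attack").
# All records and field names here are invented for the example.

deidentified_charts = [
    {"id": "A", "sex": "F", "age_range": "30-50",
     "conditions": {"colonoscopy", "cortisone", "appendectomy"}},
    {"id": "B", "sex": "F", "age_range": "30-50",
     "conditions": {"colonoscopy"}},
    {"id": "C", "sex": "M", "age_range": "30-50",
     "conditions": {"cortisone", "appendectomy"}},
]

# Details gleaned from a public social-media profile.
public_profile = {"sex": "F", "age_range": "30-50",
                  "known_conditions": {"colonoscopy", "cortisone", "appendectomy"}}

# Keep only the charts consistent with everything we know publicly.
matches = [chart for chart in deidentified_charts
           if chart["sex"] == public_profile["sex"]
           and chart["age_range"] == public_profile["age_range"]
           and public_profile["known_conditions"] <= chart["conditions"]]

if len(matches) == 1:
    # One candidate left: the "anonymous" chart is re-identified.
    print("Unique match:", matches[0]["id"])
```

Three details, none of them secret on its own, and only one chart survives the filter.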
The sort of information that many of us freely share online without a second thought makes re-identification easy. I tweeted about how my recent trip to the dentist was a bloodbath (has regular dental cleanings, doesn’t floss). When I announced the births of my kids on Facebook, I didn’t divulge exact birth dates to protect their privacy (it’s not paranoia if we really are under surveillance!), but nevertheless divulged more health information about myself (two live births). Boris and Natasha have already figured out how many drinks a week I had when I was 25, and how many sexual partners.
To take a simpler example, consider someone with a rare disease that only a handful of people in Canada have. Knowing just their diagnosis and maybe one other detail about them (such as the province in which they live or their approximate height) would be enough to uniquely identify that person based on their anonymized medical records. But rare diseases are rare, right? Well, actually, no. There are more than 7,000 different rare diseases affecting about one million Canadians. If you have a rare disease, that one fact about you makes it impossible for your chart to be made anonymous. Your diagnosis gives you away just as surely as your name does, or more so if you have a common name such as Mary Lee.
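Privacy researchers quantify this with what’s called k-anonymity: the number of records that share your exact combination of identifying details. A minimal sketch, again on invented data:

```python
from collections import Counter

# Invented records: (province, diagnosis) acts as the quasi-identifier.
records = [
    ("ON", "influenza"), ("ON", "influenza"), ("BC", "influenza"),
    ("ON", "fibrodysplasia"),   # a rare diagnosis
]

# k for each record is the size of the group sharing its quasi-identifier.
group_sizes = Counter(records)

for quasi_id, k in group_sizes.items():
    if k == 1:
        # k = 1 means this record is unique: the diagnosis alone
        # gives the person away, name or no name.
        print(f"{quasi_id} is uniquely identifying (k = {k})")
```

Once k falls to 1, stripping out names accomplishes nothing.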
Doctors, tech startups and pharmaceutical companies are eager to get more access to medical information. People who study the spread of disease could do a better job of predicting where the next measles outbreak will happen if they had access to the social networks of unimmunized children. Dozens of tech companies are competing to make the best app for fighting depression. If they could link data about mental-health history with social-media use and GPS tracking, they might be able to design effective interventions. Of course, cyberbullies could use the same information to harass their neighbours or to prey on vulnerable people, and big pharma wants to deliver personalized drug ads straight to your desktop as soon as you notice the first symptom. There are a lot of things that could be done with our health data. But should they be done, and, if so, how?
Data-sharing projects are spreading like lice in a kindergarten. The Children’s Hospital of Eastern Ontario got $13-million to expand a project to better diagnose rare diseases by sharing patient data across Canada. The Centre for Addiction and Mental Health is using a $15-million gift to open a centre to use big data, artificial intelligence (AI) and machine learning to identify, manage and treat mental illness. I talked to medical researchers working in fields as diverse as cancer research, public health and internal medicine. All of them had projects in the pipeline to share patient data and to use AI to discover patterns in that data. Torontonians with heart problems can choose between the “data lake” at the Peter Munk Cardiac Centre, and the “biobank” at the Ted Rogers Centre for Heart Research.
Hospitals across Canada are rushing to collect large data sets for AI analysis because the potential rewards are huge. One of the biggest promises is personalized medicine. Medical treatments work differently on different people. One notorious example is the drugs commonly prescribed for depression, which work well for some patients, have no effect on others and occasionally make things much worse, increasing suicide risk. Currently, the only way doctors can find out how these drugs will work on you is trial and error.
Obviously, a better way of predicting drug responses would be very useful when the wrong drug can lead to suicide, or when you’re facing a deadly diagnosis. Cancer research is one area where “personalized medicine is starting to gain traction and treatment is beginning to be tailored to the individual,” according to Christopher Taylor of the New England College of Optometry.
Although the potential benefits are many, increased data sharing has its downside, too. In some cases, preventing suicide can depend on not sharing data. Trans people have a very high risk of suicide (43 per cent have attempted suicide), yet this population avoids seeking medical care (21 per cent have avoided going to the ER even when they needed emergency care) because of discrimination by health-care workers. These numbers are from the Trans Pulse project, which surveyed hundreds of trans people across Ontario to gather data that would otherwise have been impossible to get, given that population’s mistrust of doctors. People’s willingness to participate in the survey depended on trusting the researchers to keep their personal information safe, and not to use it for purposes that might harm them.
That trust was nearly jeopardized by a court order requiring the lead researcher, Greta Bauer of Western University’s Department of Epidemiology and Biostatistics, to hand over the study’s raw data. “Without trust, we risk producing biased research where participants do not reflect the actual population, and in which they either do not answer sensitive questions or do not feel they can answer them honestly,” Ms. Bauer says. Finding solutions to critical health issues requires accurate data, which Ms. Bauer says “unconditionally requires establishing the data privacy required to build and maintain trust.”
Alexander McClelland, a researcher from Concordia University who studies the experiences of people living with HIV, notes that in Canada, unlike most countries, “research and participant communications are not legally protected, and so while confidentiality may be promised, it is not legally enforceable.” We all stand to benefit from more research about the effects of stigma and criminalization on HIV transmission, but current laws don’t protect the privacy of research subjects, which means many are reluctant to talk. That’s bad for all of us.
In order for medical research to happen, the public needs to participate by letting researchers biopsy our tumours, record our symptoms and analyze our blood samples, and by donating our bodies to science after death. But the public is only willing to get on board if we know how our information will be used, and trust that we won’t be harmed by the research. If I thought I might end up as the poster child for dowager’s hump, I might not have signed my organ donation card.
Henrietta Lacks is probably the most famous case in which medical information was used without informed consent. In 1951, cells were taken to diagnose Lacks’s cervical cancer, but without her knowledge, the cells were also used for research. It turned out that her cells reproduce much more easily than most, creating an “immortal” cell line that came to be known as HeLa. HeLa cells were instrumental in the development of the polio vaccine, cloning and in vitro fertilization, and are still bought and sold for medical research today. Lacks’s children, meanwhile, couldn’t afford medical insurance, and had no idea that their mother had played an essential role in modern biology. Rebecca Skloot tells their story in The Immortal Life of Henrietta Lacks. At the time, it was normal not to ask research subjects for permission, not to inform them about how their information would be used and not to ensure that they would stand to benefit from the research.
These days, medical information is only supposed to be shared with consent. But there are loopholes that sometimes lead to harmful over-sharing. Many patients don’t know that their records are shared within a “circle of care” that includes their doctor, plus any specialists, hospitals, paramedics or pharmacists they visit, based on “implied consent.” Some information is also shared without consent, such as cases of infectious disease that are reported to public-health organizations. People who have been treated for issues such as psychosis or addiction (even briefly, long ago) report that because of their medical histories, health-care workers don’t believe them when they describe their symptoms, so they end up with longer wait times, are denied needed drugs or are sent away without treatment. Opting out of that data sharing can only be done on a case-by-case basis, which doesn’t help much when you’re being wheeled into the ER.
In other cases, more sharing of data can help patients. The way you look and talk is known to affect your care. Research shows that black patients and women are less likely to be given painkillers, and people who are perceived as uneducated (including anyone not fluent in the local language) get less medical attention. Doctors decide on a likely diagnosis within 18 seconds of seeing you, and when they peg you wrong, it can be deadly. An Inuvialuit man with a history of strokes died after nurses insisted his stroke symptoms were drunkenness. A woman with sickle-cell anemia, which predominantly affects black and Middle Eastern people, died after she couldn’t get an ambulance or the high dose of painkillers she needed to treat her sickle-cell crisis, because paramedics assumed she was a drug addict. In cases of chronic illness, it is in patients’ interests to have their treatment information available as soon as possible, so that bias doesn’t get in the way of speedy treatment.
What’s the difference between the cases in which data sharing is bad and those in which it is good? Control. When patients have control over what information is shared and with whom, and can have incorrect or misleading information removed from their records, their interests can be protected. The European Union just started enforcing its new General Data Protection Regulation (GDPR) on May 25, and there is talk that Canada may follow its lead. One promise of GDPR is stronger consent rules: The owner of the data gets to decide who can see it, what they can use it for and how long it can be kept on file. If you and your psychiatrist decide that nobody else needs to know about your crippling fear of clowns, the nurse setting your broken leg doesn’t have to find out, but if you suffer from asthma attacks, and want to support any and all research that might help you breathe, you can share your medical records widely. Stronger consent rules allow for choice.
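What might such consent rules look like in software? Here is a toy sketch of the idea; the field names and rules are my own invention, not anything the GDPR prescribes:

```python
from datetime import date

# One consent record per data item: who may see it, for what
# purpose, and until when. All entries are invented examples.
consents = {
    "psychiatric_notes": {"audience": {"psychiatrist"},
                          "purpose": {"treatment"},
                          "expires": date(2020, 1, 1)},
    "asthma_history":    {"audience": {"any"},
                          "purpose": {"treatment", "research"},
                          "expires": date(2030, 1, 1)},
}

def may_access(item, role, purpose, today=date(2018, 6, 1)):
    """Allow access only if consent is on file, unexpired,
    and covers both the requester's role and their purpose."""
    c = consents.get(item)
    if c is None or today > c["expires"]:
        return False  # no consent on file, or it has lapsed
    audience_ok = "any" in c["audience"] or role in c["audience"]
    return audience_ok and purpose in c["purpose"]

print(may_access("psychiatric_notes", "er_nurse", "treatment"))  # False
print(may_access("asthma_history", "researcher", "research"))    # True
```

The nurse setting your broken leg never sees the clown notes; the asthma researchers get everything you chose to give them.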
Some researchers are trying to move in the opposite direction. Maya Goldenberg, who studies evidence-based medicine and public trust in science at the University of Guelph, says that some health-care insiders are trying to chip away at informed consent. “I’ve heard that ‘consent bias’ distorts research outcomes; that consent is prohibitively burdensome for some studies (given the large numbers of data sets…) and even that consent may not be necessary since the information is already available,” Ms. Goldenberg reports. But she dismisses these worries as “over-enthusiasm about big data” and insists that there is “no moral justification for lessening the ethical standards.”
There are ways of making informed consent more efficient without destroying trust. Consent forms can use simplified language, and researchers can automate many stages of the informed-consent process, such as opting in, revisiting consent periodically and informing participants about what their data will be used for. New techniques for anonymizing data and preventing re-identification are continually being developed. GDPR implements former Ontario Privacy Commissioner Ann Cavoukian’s Privacy by Design principles.
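One long-standing anonymization technique (a simplification; real systems layer several defences) is generalization: coarsening details until every record shares its combination of identifying attributes with at least k others. A rough sketch with invented ages:

```python
from collections import Counter

# Invented ages; exact ages are the detail to be coarsened.
ages = [31, 33, 34, 47, 48, 49]

def generalize(age, width):
    """Replace an exact age with a band of the given width."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def smallest_group(width):
    """Size of the rarest age band at this banding width."""
    return min(Counter(generalize(a, width) for a in ages).values())

# Widen the age bands until the rarest band holds at least k = 3 people.
width = 1
while smallest_group(width) < 3:
    width += 1
print(f"Publish ages in bands of {width} years, e.g. {generalize(31, width)}")
```

The trade-off is plain to see: wider bands mean better privacy but blunter data for researchers.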
The Ontario Brain Institute’s Brain-CODE database is an example in which strict protocols are followed for encrypting and anonymizing data, and for restricting access to select purposes and fully vetted partners. The tools are available. We need to mandate their use before the rash of big data projects spreads.
It’s in everyone’s interest to make sure that the public trusts health-care providers and researchers. If we lose trust, we’ll stop sharing our data, stop trusting medical advice and both treatment and research will suffer. In order to have trust, we need transparency, privacy and consent, which means clear information and choice. Some people want to share the videos of their cyst-popping procedures, and they should have that choice. I would prefer my colonoscopy video not to go viral, and I hope my doctor’s phone is secure enough to prevent leaks.