Two years ago, Latanya Sweeney created a graphic on the widespread sharing of medical files that shocked lawmakers, technologists and doctors.
Sweeney, who founded the Data Privacy Lab at Harvard University, produced a “health data map” that looks like a windshield cracked by a few big rocks. At the center is someone’s health record, medical provider and insurance company. Emanating from them are webs of more than two dozen organizations that could have legitimate access to the file, including transcription services, medical researchers, and even data-mining firms and pharmaceutical companies.
“Collectively, you’d hear a gasp and then a moment of silence — that was pretty universal,” she said, describing the reaction during her congressional testimony and presentations to privacy summits, academic conferences and medical schools.
However, Sweeney said there are limitations in tracking the movement of medical data, and many doctors are in the dark about where their patients’ data go. So at the Health Privacy Summit in Washington, D.C., which starts Wednesday, she plans to unveil a new project to harness the collective knowledge of doctors, data-breach victims, whistle-blowers, technology specialists and others to build a new, more comprehensive health data map.
“If we can get a lot of people to march in this direction and keep them there and entertained and incentivized, I think what we’ll uncover will be mind-blowing,” said Sweeney, who is a computer scientist.
Her project comes amid a U.S. push to digitize patient records, which has created lifesaving benefits but has also made it easier for medical files to end up in unexpected places. As I reported last month in a special report for Bloomberg.com, loopholes in federal law have allowed the collection and sharing of private medical information without patients’ consent.
Sweeney’s work has focused on identifying those unexpected places and on showing that it’s possible to determine some people’s identities from medical data, even after the records have been stripped of personal information. Adding to the alarm, she said the number of third-party entities receiving medical data has more than doubled in the past decade, and some firms that once received only “anonymized” data now get records that identify people.
While Sweeney’s earlier mapping effort drew on her own expertise and her work with the Data Privacy Lab, her new project, thedatamap.org, needs submissions from others to help sketch a more complete picture of how medical data are shared.
At first, she’s seeking submissions of Internet links that document data-sharing relationships between medical providers and others. People will sign up with an e-mail address to become “data detectives,” and their submissions will be vetted for accuracy by other registered detectives. Eventually, the map could incorporate information from other sources.
Deborah Peel, a physician and founder of Patient Privacy Rights, the Austin, Texas-based group putting on the conference, said a promising aspect of Sweeney’s project is its open nature, which will help ensure accuracy by allowing organizations that are mentioned on the map to respond.
“There’s some self-regulation there — we’re pretty hopeful that if somebody says something wrong about a hospital or a corporation, that they’d respond and provide the right information,” Peel said. “It’s kind of ridiculous we’re forced to resort to this because there’s no chain of custody for our data.”
Even if the project gets little public input, the research can still be used to pressure lawmakers to mandate greater transparency in data-sharing arrangements, Peel said.
Sweeney said a goal of the research is to identify areas where patient data might be vulnerable to theft or abuse. It’s not to prevent the sharing of medical data entirely, she said.
“Because you don’t know where your data is going, harms are almost impossible to report and detect,” she said. “We don’t want to stop data sharing. There are a lot of uses and benefits that come from it. But how do we do it in a responsible way? As long as the data sharing is invisible, you can’t possibly do that.”