A box of vials filled with Pfizer’s Covid-19 vaccine, five doses per vial, sits next to a pile of stickers that read “I’ve been vaccinated for Covid-19” on the preparation table at Thomas Jefferson University Hospital in Philadelphia, Pennsylvania, on December 16, 2020.
The city of Philadelphia had to end its relationship with a vaccine clinic provider after it changed its privacy policy to say data might be sold. | Gabriella Audi/AFP via Getty Images

How a vaccine clinic scandal in Philadelphia shows the need for better health privacy laws.

When the city of Philadelphia announced its “unique public/private partnership” to create a mass vaccination clinic with an upstart nonprofit called Philly Fighting COVID in early January, it seemed like an objectively good thing. The clinic could vaccinate thousands of people per day, and Philly Fighting COVID’s website allowed not-yet-eligible Philadelphians to preregister for vaccines by supplying their name, birthday, address, and occupation — which the city encouraged residents to do because it hadn’t made a preregistration site of its own.

It’s not looking so good now. Philadelphia ended the partnership after it was reported that the organization had changed its status from nonprofit to for-profit and updated its privacy policy to say it could sell the preregistration data its site collected (Philly Fighting COVID maintains that it had no intention of selling data and didn’t even realize that language was in its privacy policy). Now, the city is scrambling to reassure residents that their data won’t be sold and to reschedule their vaccine appointments with other providers. Philadelphia’s district attorney and Pennsylvania’s attorney general are threatening to launch investigations. A clinic nurse has accused Philly Fighting COVID’s CEO of taking unused vaccines from the clinic.

The Philly Fighting COVID debacle is a cautionary tale about the importance of properly vetting health vendors. It’s also a cautionary tale about the importance (and lack) of privacy protections for sensitive health data during the Covid-19 pandemic.

“Across the country, we are seeing governments and their private contractors collecting lots of our Covid-related data, often with insufficient privacy safeguards,” Electronic Frontier Foundation senior staff attorney Adam Schwartz told Recode. “This is bad for public health efforts, which depend on public trust. More must be done to secure our private information.”

Now Democrats in both houses of Congress are trying to pass better health privacy laws that could both prevent what happened in Philadelphia from happening again and reassure the public that their sensitive health information stays private.

Sens. Richard Blumenthal (CT) and Mark Warner (VA) and Reps. Suzan DelBene (WA), Anna Eshoo (CA), and Jan Schakowsky (IL) announced their Public Health Emergency Privacy Act on Thursday. Among other things, the bill would prohibit the use of health data for anything but public health purposes: it couldn’t be used to sell ads or shared with other, unrelated government agencies. Tech companies would also have to take certain measures to keep user data secure and delete it once the pandemic is over.

“Technology has become one of our greatest tools in responding to the Covid-19 pandemic but we need to build trust with the broader public if we are going to reach its full potential,” DelBene said in a statement. “Americans need to be certain their sensitive personal information will be protected when using tracing apps and other Covid-19 response technology and this pandemic-specific privacy legislation will help build that trust.”

Privacy advocates have long sounded the alarm over how the pandemic response may erode civil liberties, including health privacy. Over the last year, many government agencies have touted public-private partnerships to facilitate contact tracing, testing, data collection, and now, vaccine distribution. Private companies have stepped up to do what public health authorities didn’t have the resources to do themselves. But these efforts have had mixed results, and come with privacy issues that threaten to undermine public trust — and public health.

Verily, the life sciences company owned by Alphabet, created an online platform in March for people to sign up for tests and receive their results. But users needed a Google account to use the portal, and they had to supply personal information (Google is also owned by Alphabet). California’s San Francisco and Alameda counties ended the program in October over accessibility and data privacy concerns, noting that some people didn’t want to give their information to Google, even though the company said their data wouldn’t be shared without their consent.

In April, North Dakota became the first state to use digital contact tracing with its Care19 app. A month later, a privacy software company discovered that the app sent data to Foursquare via an SDK (Foursquare told the Washington Post that it discarded any data received from the app). Adoption of digital contact tracing has remained slow in America, partially because of privacy concerns.

And in Florida, some counties resorted to using Eventbrite to schedule vaccine appointments after their own registration sites failed or weren’t ready in time. That’s arguably better than not having a vaccine registration system at all — some counties forced people to wait for hours in first-come, first-served lines — but Eventbrite doesn’t appear to have any special protections for vaccine registrants’ data (the company did not answer questions from Recode regarding its handling of vaccine registration data).

Again, there’s no evidence that these companies sold or misused health data in these cases. The issue is that there isn’t much to stop them from doing so. The Health Insurance Portability and Accountability Act (HIPAA), which dates back to 1996, doesn’t cover a lot of data that many of us consider to be health-related, nor does it cover many of the health-related services we now use. And in some cases where data would otherwise be protected, the government has granted special exceptions to HIPAA compliance requirements. Meanwhile, we’re relying more than ever on private companies to assist with the pandemic because public health authorities were woefully underprepared, understaffed, and under-resourced to do it themselves.

“Technologies like contact tracing, home testing, and online appointment booking are absolutely essential to stop the spread of this disease, but Americans are rightly skeptical that their sensitive health data will be kept safe and secure,” Blumenthal said in a statement. “Legal safeguards protecting consumer privacy failed to keep pace with technology, and that lapse is costing us in the fight against Covid-19.”

If people don’t trust that their health data will be protected, they may be more reluctant to seek out treatment — that includes getting a vaccine that many are already wary of, and which requires widespread adoption to achieve herd immunity. Better health privacy laws might reassure the public that their health data will be kept safe. Unfortunately, there hasn’t been much interest in passing those laws. Last year, Republicans and Democrats in both houses of Congress proposed pandemic-related health privacy bills. None of them went anywhere, and they joined an ever-growing stack of failed privacy laws.

But now there’s a new Congress and a new administration, and lawmakers are trying again. Maybe this time they’ll succeed.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
