About two weeks ago, the Federal Trade Commission announced a proposed settlement with online counseling service BetterHelp over allegations that the company “revealed consumers’ email addresses, IP addresses, and health questionnaire information to Facebook, Snapchat, Criteo, and Pinterest for advertising purposes.”1 A month earlier, the FTC settled with prescription discount service GoodRX for similar reasons.2 Since these two settlements, a number of additional companies have come forward notifying the public (and likely the commission as well) that they, too, have been sharing this kind of data with social media companies.

That so many companies have, in such a short period of time, been found sharing such sensitive information about patients’ lives points to an unhealthy trend within the health technology industry: a significant lapse in understanding (or discipline) around how these companies share and protect sensitive data about their customers.

What about HIPAA?

By this point, you’re likely wondering why this wasn’t taken care of earlier. After all, doesn’t the United States have an entire law dedicated to health data privacy? Well, kind of. HIPAA only addresses health data processed by “covered entities,” which includes doctors, hospitals, health insurers, and healthcare businesses. I won’t attempt to rehash the details of what the law covers, but it’s a rather extensive set of practices that apply when sharing data related to an individual’s medical care, including physical and digital security, consent for sharing, requirements for de-identification, and so on.

But many of these companies are not necessarily “healthcare providers” within the scope of HIPAA. This means you end up with many companies collecting the same data with far fewer restrictions on how it’s used.

Digging into the legal weeds a bit, both of these actions were brought under the FTC’s Section 5 authority, the general statutory provision that allows the FTC to take action against “unfair or deceptive” business practices. The FTC has had a rule on the books covering improper sharing of healthcare data since 2009, but the GoodRX case was the first time that rule was invoked in an enforcement action.3

The Data on the Desperate

The most disturbing part of these stories to me is that both companies monetized data provided by customers they knew were struggling. Mental health in the United States is an endemic crisis that is taking an increasing toll on individuals and society at large. Most people who turn to services like these likely don’t have access to quality healthcare, adequate insurance coverage, or any of the typical resources found in traditional healthcare settings. Combined with the sympathetic advertising these companies run, it feels like adding insult to injury.

The examples we see in these cases paint a disturbing pattern of exploitation that we would consider shocking were it to come from a major hospital network or insurance provider. It effectively punishes the poor and sets a dangerous precedent that the most sensitive parts of individuals’ lives are for sale by default.

More concerning is that we’ve heard nothing from any of the major companies that received this data. In lieu of any clarity, we have to assume the data is being handled just like any other data sent to them: combined and linked into an increasingly intimate profile of each individual.

The Myth of Analytics

In general, the use of health data in this way is not a new phenomenon, but rather an expression of a broader practice: collecting and using data for its own sake. There are many opportunities to profit off data collected from consumers, and an unquestioning willingness to do so. The myth you often hear is that the more data you have, the more you can leverage it for useful things.

This leads to a pattern of using data on the assumption that more is always better, without questioning its quality or ethical sourcing, and without applying critical thinking before drawing sweeping inferences from it. Companies build comprehensive pictures of individuals often without knowing exactly what they intend to do with them. Perhaps there was a plan or goal that the likes of GoodRX and BetterHelp wished to pursue “eventually.” Unfortunately, without discipline, foresight, and caution applied to the use of data, many people will now be less apt to trust those services for a while.

If there is a lesson to be extracted from what we’ve seen, it is this: data without direction or criticism is at best confused and at worst exploitative and destructive. In a world where an entire picture of ourselves exists in the digital realm, there needs to be a stronger discipline of trust and care.
