- Data brokers sell personal health and mental health information collected online.
- A Duke University study shows the magnitude of this issue.
- While this is legal, there are ways you can protect your personal information.
In a world that makes it almost impossible to stay completely offline, it’s safe (or not-so-safe) to say that your personal information is out there.
For the most part, Americans accept that what they post online is up for public exposure, but many may not know that their health and mental health information is being sold based on their digital footprint.
According to a study by Duke University’s Sanford School of Public Policy, the names and addresses of people diagnosed with conditions like depression, anxiety, post-traumatic stress, or bipolar disorder, along with the medications they take, are sold to data marketers.
“Marketers and people in the data broker industry…collect information from third parties and find people who can use it and sell it to them,” John Gilmore, head of research at DeleteMe, told Healthline. “[Personal] health information has always been a very highly valued metric.”
For instance, third-party apps used to help manage mental health conditions often sell information to brokers, the Duke report discovered.
For the study, researchers contacted data brokers and found that 11 companies sold health data, including the antidepressants people took and the conditions they lived with, such as anxiety, insomnia, Alzheimer’s disease, bladder-control difficulties, and more.
While some of the data sold was aggregated, such as “X number of people living in zip code X have depression,” other data included the names, addresses, and incomes of people who might have certain conditions.
“While this is quite alarming, all of this is legal and under the general public’s radar. It’s been happening for years and is a long-standing breach that places health information at risk,” Deborah Serani, PsyD, author of Living with Depression and professor at Adelphi University in New York, told Healthline.
While it seems that the Health Insurance Portability and Accountability Act of 1996 (HIPAA) should protect people against this type of invasion, it doesn’t.
“Just because ‘privacy’ is in the name, it’s wrong to think of it as a law that keeps data private,” said Gilmore. “[Data] brokers are not regulated entities under HIPAA. There is not a law that regulates data brokers. If they collect and purchase health information about people, they can do with it what they want.”
HIPAA has no impact on private use of information that is voluntarily handed over in commercial transactions or other sources, he added.
The U.S. Department of Health and Human Services states that HIPAA applies to health plans, healthcare clearinghouses, and healthcare providers who conduct certain healthcare transactions electronically. The law establishes national standards to protect people’s medical records and other individually identifiable health information as it relates to these entities.
“Deliberate sharing of patient data outside of HIPAA protection is legal,” said Serani. “Our entire healthcare system depends on patients trusting their personal mental health and medical information is confidential. And while this may feel true in the consultation office, outside in the digital world, we have learned that it’s not.”
Over the years, the sources that can feed into a person’s health profile have grown more numerous and richer, Gilmore noted.
“Most hospitals have data sharing agreements and will directly sell sets of data on patients and conditions for epidemiological reasons. But there are no rules on who can buy them, so even though that information might be extremely valuable to people who are developing drugs or treatments, there’s no restriction on a consumer marketer buying that same set of data…and [building] products from it,” he said.
Having others know about your health and mental health information without you telling them can feel like an invasion of privacy. But experts say it can also have the following serious ramifications.
When it comes to mental healthcare, Serani said people may be less likely to reveal their challenges if they are worried about their privacy.
“Some patients may even refuse seeking psychotherapy or medication for mental illness struggles,” she said.
To reassure her clients, she doesn’t use electronic records and instead keeps handwritten notes and files.
“I let my patients know this. It’s my way of keeping privacy for the children and adults I work with,” she said.
Gilmore said the issue could also discourage people from seeking out options for care or information from reputable websites.
For instance, if someone experiencing anxiety and sleeplessness wants to use a mobile health application to help but learns that what they share on the app is collected and sold, they may be discouraged from using it or even from researching information about their struggles.
“Mental health is not always a permanent medical condition, so people who are experiencing temporary problems will find themselves saying, ‘I’ll just gut it out because I don’t want to end up being flagged,’” said Gilmore.
Insurance premiums could also be affected, he said.
For instance, getting insurance coverage requires a medical exam from a doctor, which determines base coverage and premiums. If the physical shows that you’re in good health, but the insurance company learned through a third party that you took Prozac five years earlier for depression, it could interpret you as being at higher risk for depression, and in turn, you may pay higher premiums, said Gilmore.
“The issue is that all the information is coming from commercial third-party sources that are making judgments about people without any transparency or control,” he said.
People don’t have the right to access information that insurers are looking at.
“I can request to delete it without knowing what it is, but I don’t have the ability to correct the record. And if their record is being created from passive data collected by commercial third parties, it’s giving authority to sources of information that are scraped off the internet and highly unreliable,” said Gilmore.
Reputational and financial harm
Because the costs of hiring employees keep rising, Gilmore said, employers turn to companies that offer data analytics and consumer credit reporting to evaluate potential employees.
“[People] may not know that their red flag is based on mental health data. Employers may be scoring lower confidence in this person because they’re considering potential mental health risk,” he said.
The same is true for credit scores.
“You’d assume a credit score is based entirely on a person’s credit history, but it’s not; the people who build credit scores, they integrate every piece of information they can,” Gilmore said.
Risk of legal consequences
When Roe v. Wade was overturned, Gilmore said, it produced real examples of how health data could lead to the potential prosecution of people.
“If you went and searched for an abortion in a state that restricts it, Facebook could share that information with law enforcement to create lists of people that they should investigate,” he said.
Other forms of legal harm could include civil litigation. For instance, a person could give testimony in court and then be discredited if information gathered online shows they were taking medication for psychotic episodes.
“A lawyer could simply say, ‘have you ever experienced times in your life where you were delusional?’ and the person says ‘no’ and then the lawyer says ‘well I have evidence here that you once took a drug,’” said Gilmore.
Additionally, he said data collected by third parties is used by law enforcement for general warrant purposes. A general warrant means law enforcement doesn’t have a suspect, so they look into groups of people to try to find a suspect.
Under the Fourth Amendment, law enforcement doesn’t have the right to pursue general warrants, but data services provide a loophole that lets them do so legally.
“For instance, if you have a hate group incident where someone paints something racist on a wall and the police have no suspects or camera footage. They’ll go ‘ok who in this zone is currently a mental health patient? Let’s go talk to them,’” said Gilmore. “You can find yourself suddenly subject to investigation because you fit a certain kind of category or profile.”
While in theory this is unconstitutional, he said, information gathered this way is not used in prosecutions, so it is never submitted as evidence, and because it is never submitted as evidence, its constitutionality is never challenged.
To some degree, it’s impossible to completely protect all your personal information.
“Every single American, including many non-citizens, has thousands of data points about themselves sold every day,” said Gilmore.
However, there are actions you can take to minimize your digital footprint. Consider the following:
1. Skip third-party apps
While health apps offer benefits like help with monitoring mood, timing medication, learning self-care techniques, and connecting with others going through the same things, Serani said the fact that such apps share information makes them a deal-breaker.
“Digital access makes things easy. But easy isn’t always better,” she said.
She tells her clients to go old-school and consider going to the library to check out a book on meditation or buying a blank journal for a diary to track feelings and moods.
“Before computers entered our lives, we knew how to do self-care. It was more active and required more from us. In fact, I often believe that doing more hands-on things offers better problem-solving skills. We have to think, plan, and put things into action. Read, write, and use skills that are becoming less desirable,” Serani said.
These approaches help access brain regions that aren’t reached when we use digital apps, and they help people be more accountable for their health and self-care, she noted.
“If we can do this more independently, without aids, we internalize the skills more fully,” Serani said.
2. Create a burner email address
Gilmore suggested creating an email address that doesn’t include your name in it to use for apps and websites you don’t trust and to sign up for webinars and such. With the address, you can create a dummy profile to use the site or app to get the information you need.
“This way whatever you do online stays segmented away from your core personal profile — your real number and email address,” he said.
He noted that 30% of DeleteMe customers use Proton Mail for this reason because it is an encrypted email service with no connection to your personal profile.
He also suggested always using a fake phone number, since phone numbers are more valuable than email addresses: they link to your mobile device, which can reveal your location, and your location can reveal your address.
3. Turn off tracking on your phone and computer
Whatever computer you use, consider going into your settings’ privacy options and refusing permission for tracking.
“Also, if your computer system holds active history, lose that feature. Further, if your computer has a feature that wants to send information, reports, etc., to your Windows, Apple, etc., click that off too,” said Serani.
If you decide to keep apps on your phone, she suggested setting privacy settings to not allow apps permission to track you.
However, Gilmore noted that phones map your offline life, too. Even if you disable location tracking, turn off GPS, and revoke location access from every app that collects it, your location is still being tracked by your ISP, your cell carrier, and more.
“You don’t need to use internet or search for anything; it’s creating a profile of you based on where you go and what you do. How often do you travel to this place, who do you spend time with. It’s done collectively and done passively…and it’s being collected and being sold,” he said.
Still, eliminating as much of your footprint as possible can make some difference, he noted.
4. Clean up your social media footprint
If you have years of posting on social media, Gilmore said everything you shared is mined by bots and artificial intelligence.
“No one cares about you as a person; they care about you as a set of data, a set of information that when you put millions and millions of you together can be sold,” he said.
Because the data can have real-world consequences, he suggested erasing your online presence as much as possible by deleting old accounts that you don’t use on social media, deleting old tweets and posts, and removing online resumes that contain personal information you don’t use anymore.
5. Clear your browser history
Serani suggested clearing your web history every day or setting it to self-delete.
“I know this means you have to sign into mail, websites, etc., but it’s worth it because data brokers use this to gather information,” she said.
Using a VPN (virtual private network) on public networks may also provide protection, as VPNs encrypt your internet traffic and disguise your online identity.
“There’s generally a cost for this. But you may feel its security is worth it,” Serani said.
6. Discuss digital information with your therapist
When in therapy, Serani said to ask your therapist how your health information is being recorded. If they use electronic notes, ask if they would consider using handwritten notes instead.
“But do be aware that when it comes to insurance claims, electronic or handwritten bills may still be at risk once they get to the insurance company,” she said.
While it is concerning that your personal information may be accessed and sold, try not to let it overwhelm you or deepen your mistrust or suspicion of the health field.
“Most doctors and health professionals are champions of keeping personal information confidential,” said Serani. “Take steps to control what you can and accept for the time being that legal and ethical laws are coming.”