The Cambridge Analytica scandal that’s rocked Facebook, both in headlines and on Wall Street, has revealed the potential risks for users who want to keep their online data private.
As the story of the breach has unfolded and expanded to include a second data firm, it’s brought up incidents in which social media platforms and other large tech firms have questionably handled the personal health data of their users.
How big tech and private health information can collide
The Cambridge Analytica scandal started after it was revealed that the “political firm” obtained millions of users’ Facebook data under false pretenses in order to create “psychographic profiles” of U.S. and U.K. voters.
But even before this scandal, there have been multiple stories about how social media companies like Facebook can compromise personal health information and privacy.
In 2016, Splinter News reported that Facebook’s “people you may know” feature was suggesting that patients of the same psychiatrist friend one another, effectively identifying each patient in the practice to one another. In that case, the psychiatrist hadn’t even “friended” her patients, but Facebook had her phone number, which her patients also had.
The algorithm likely analyzed those shared contacts and inferred that the patients would want to connect online. Facebook said at the time that, without additional information, it couldn’t determine why the patients were recommended as “friends” to one another.
Other companies, including mobile health apps used by doctors and potentially even Google, have also been caught in the crosshairs due to a lack of protection of user data.
That at least appears to be the case for a young woman named Rose (whom Healthline agreed to identify by her middle name). After she and her partner experienced a pregnancy scare, Rose began to Google questions about Plan B, her menstrual cycle, and, a week later, pregnancy symptoms.
“A condom broke and I was being paranoid. I didn’t get to the point of thinking I was pregnant,” she told Healthline. Several weeks later, she received a text from her mother. A promotional StrongMoms package of baby formula from the company Similac had been shipped to her family’s home.
Rose said she’s close with her parents and she would tell them if she became pregnant.
Her mother’s text read along the lines of “‘Oh, this is funny, this came as an accident,’ and I was like, ‘Correct. It did come as an accident,’” she said.
Rose said she didn’t know why she was targeted for this promotion. And she has no proof that her parents received the package as a result of her activity online.
But searching online, she found a pattern. She saw other stories of women whose actual pregnancies were revealed by the package, or who’d received the packages months after miscarrying.
Rose told Healthline that she did purchase a baby shower gift from Babies R Us six months prior to receiving the package, and said potentially that could also have had something to do with the formula arriving. She saw online that other women who bought something at that store had received the formula.
A local 2017 NBC report on the packages suggested that the list of names to send the package to was generated by third-party data aggregators, although it had no definitive answer on whether the mailings were related to Google searches.
In response to Healthline’s questions about cases like Rose’s, Abbott Nutrition said in part: “We have partnerships that provide us with information about expectant parents who may find our information or products useful. The overwhelming majority of people we send our gift packs to enjoy them and we receive a lot of positive feedback from parents on the StrongMoms program. We also work quickly to ensure anyone who tells us they’d like to be removed from the StrongMoms program is quickly removed from our mailing list.”
Google didn’t immediately respond to Healthline’s request to clarify whether Abbott Nutrition could have received information derived from users’ search histories.
No matter how the formula came to be addressed to Rose, the story points to a larger issue: companies can infer information about users — even sensitive health information — from their online activity.
DePaul University computer scientist Jacob Furst said third-party data aggregators are likely looking at activities that might include a user’s Google searches. These aggregators, he said, may be able to glean information from those searches and then send related materials.
“There are so many interconnections among applications, among data aggregators. There are often many paths to the same end,” Furst told Healthline. “It can be very hard to understand, and it can be a lot of sources put together.”
A common theme that emerges in stories like Rose’s is the interconnectedness of each platform. It’s hard to disentangle a Google search from an ad served on Facebook because of an app downloaded on your smartphone.
But the lesson is that your online interactions can mean companies may have more information about your health conditions than you realize.
And in the wake of the news that Cambridge Analytica obtained user data under false pretenses, reporters are finding more and more examples of the misuse of people’s personal information.
Facebook faced further scrutiny after it came to light that the company had plans to share user medical data with hospitals for research purposes, according to CNBC. A spokesman told CNBC that the project never progressed past the planning stage.
Earlier this month, BuzzFeed reported that the app Grindr was sharing its users’ HIV statuses with other companies. After the report, Grindr announced that it would no longer share this data with third-party companies. Grindr’s chief security officer told BuzzFeed that the situation was “being conflated with Cambridge Analytica.”
Weeks before that news broke, the company was praised for its move to remind users to get tested for HIV.
What can users do to protect themselves?
While these problems will be difficult to solve, Furst recommends that individual users restrict the data they grant to apps when asked. That means not blindly clicking yes when an app requests access to your contact list, location, or photos.
But more importantly, he thinks there needs to be a larger movement to regulate these tech companies to protect users.
“I think ultimately there needs to be enough of a societal outcry that we need to get the government to pass laws,” Furst said.
Furst added that he doesn’t expect that will happen right away, but he points to Europe as a potential way forward. The passage of Europe’s General Data Protection Regulation is poised to impose much stricter regulations on the handling of people’s data.
While Furst is skeptical about how each aspect of the regulation will be enforced, he said it gives him reason to think the United States may follow the E.U. in the coming years.
Joseph Frankel is a writer and reporter based in New York. He is on Twitter @JosephFrankel.