Applying HIPAA would help Facebook transform mental health for good

Facebook and other social media companies have accurate information about the thoughts, feelings, and behaviors of millions of individuals. What these companies know is often more than what users’ therapists know.

If therapists and other health care professionals must guard what they know about a patient’s mental health as protected health information under the Health Insurance Portability and Accountability Act (HIPAA), Facebook and other social media companies should, too.

HIPAA itself

Under HIPAA, protected health information includes information that is “recorded in any form or medium” that “relates to the past, present, or future physical or mental health or condition of an individual.” Applying this law to Facebook would ensure that the health information the company holds about its users is protected with the highest privacy standards and is disclosed and used only with the user’s permission, and only when doing so is considered to benefit the user’s health.

If Facebook wants to use a user’s information for ad targeting or any other purpose, HIPAA would require explicit permission from the user, a process in which they would be made aware of the potential health risks of participating on Facebook’s platform.

Social media companies infer a lot from what, when, and how long a user “engages” on the platform. They use this information to build a model of a user’s personality, including their attitudes, choices, and aspirations, which can then be used to serve highly targeted ads. For Facebook, more eyeballs on those ads mean more revenue.
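Facebook’s actual targeting systems are proprietary and far more sophisticated, but a minimal sketch can illustrate the general idea. The code below (all names, topics, and numbers are hypothetical) turns a handful of engagement events, recording what was viewed, when, and for how long, into a normalized per-topic interest score of the kind an ad-targeting model might consume.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical engagement event: what was viewed, when, and for how long.
@dataclass
class EngagementEvent:
    topic: str            # e.g. "fitness", "dieting", "parenting"
    timestamp: float      # Unix time of the interaction
    dwell_seconds: float  # how long the user lingered on the content

def build_interest_profile(events: list[EngagementEvent]) -> dict[str, float]:
    """Aggregate dwell time per topic into a crude, normalized interest score."""
    totals: dict[str, float] = defaultdict(float)
    for event in events:
        totals[event.topic] += event.dwell_seconds
    total_time = sum(totals.values()) or 1.0
    # Normalize so the scores sum to 1 -- a toy stand-in for a personality/interest model.
    return {topic: dwell / total_time for topic, dwell in totals.items()}

if __name__ == "__main__":
    events = [
        EngagementEvent("dieting", 1_700_000_000, 95.0),
        EngagementEvent("fitness", 1_700_000_300, 40.0),
        EngagementEvent("dieting", 1_700_000_900, 120.0),
    ]
    print(build_interest_profile(events))  # topics with the highest scores attract targeted ads
```

In a real system, scores like these would feed a much larger model, but the shape of the input is the same: behavioral signals aggregated per user.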

Many researchers have spent time trying to understand how social media relates to mental health. This work has aimed to answer two main questions:

  • How does social media use affect mental health?
  • How can social media be used to predict mental health status?

Frances Haugen’s testimony before Congress and the documents leaked about Facebook’s internal research on teen mental health covered only the first question. The hearing confirmed that Facebook “knows” about the harmful effects of Instagram on teenage girls’ mental health but has decided to remain silent. Facebook’s response to the report on how Instagram affects teen mental health claims that its platform has done more good than harm and that the company continues to maximize the good and minimize the harm.

Whether the use of social media directly causes worse mental health remains somewhat murky

Studies, including Facebook’s internal research, show only a correlation between social media use and poorer mental health. Some studies have gone further and shown that the correlation persists over time. Correlation, of course, does not equal causality. It is possible that participants in these studies were already predisposed to certain mental health conditions and that their social media use reflected that, or that factors outside of social media use were primarily responsible for their declining mental health.

To say conclusively whether social media use causes worse mental health, in whole or in part, researchers would need to look at all the information Facebook and Instagram, which Facebook owns, have on users and analyze it in relation to their mental health. That information is inaccessible to researchers, and this is not unique to Facebook and Instagram: platforms like TikTok and Snapchat, which are more popular among teens than Instagram, are even more restrictive.

If lawmakers are able to implement Haugen’s recommendation for more transparency and independent research, they might be able to pressure these companies to allow broader access to the data they have amassed, which would help researchers more definitively understand how social media affects mental health.

Social media as a tool to predict mental health

The second main research question, whether social media can be used to predict mental health status, has essentially been answered in the affirmative. Facebook has the data that can be used to diagnose an individual’s mental health status, and its artificial intelligence often knows about a user’s mental health state before the user does.

Studies have repeatedly confirmed that users’ mental health states are accurately reflected on social media. Indicators like stress and mood variability can be measured from a user’s media content, captions, and usage patterns. It is even possible to predict the mental health state of study participants using just their social media archives, and these predictions can often be as accurate as diagnoses given by therapists.

The rich online behavioral data present on platforms like Facebook can be used to predict anorexia, depression, schizophrenia, anxiety, and more. For example, the presence of distorted thoughts, which therapists often use to diagnose depression, can be detected algorithmically by analyzing the language in a user’s posts. Some of these predictions are accurate up to three months before a clinical diagnosis.
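Published work typically relies on validated lexicons or trained language models rather than hand-written rules, but a deliberately simplified sketch can show the general shape of this kind of language analysis. The pattern list below is illustrative only and is not drawn from any clinical instrument.

```python
import re

# A tiny, hand-picked set of phrases associated with "all-or-nothing" and
# catastrophizing language. Illustrative only -- real studies use validated
# lexicons or trained language models, not a short keyword list like this.
DISTORTION_PATTERNS = [
    r"\balways\b", r"\bnever\b", r"\bnobody\b", r"\bworthless\b",
    r"\beveryone hates\b", r"\bruined everything\b", r"\bcan't do anything right\b",
]

def distortion_score(post: str) -> float:
    """Return the fraction of distortion patterns found in a post (0.0 to 1.0)."""
    text = post.lower()
    hits = sum(1 for pattern in DISTORTION_PATTERNS if re.search(pattern, text))
    return hits / len(DISTORTION_PATTERNS)

if __name__ == "__main__":
    posts = [
        "Had a nice walk with the dog today.",
        "I always mess up. Nobody cares and I can't do anything right.",
    ]
    for post in posts:
        print(f"{distortion_score(post):.2f}  {post}")
```

A real classifier would weigh many more signals, such as posting frequency, timing, and engagement patterns, but the principle is the same: language and behavior carry measurable markers of mental state.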

Perhaps unintentionally, Facebook and others have collected highly sensitive health information about billions of people that can easily be used to determine mental health status the same way a therapist would, but better and faster. There are currently no incentives for these companies to use this information for good, or to help researchers do so.

While Facebook does provide resources to support mental health, such as connections to online communities and educational materials, that is far from enough. If Facebook agreed to comply with HIPAA, it would open the door to transparent research and make the company more responsible and liable for its users’ health.

This would help create new health care products that use the mental health information collected from users to warn them of potential upcoming health challenges and guide them to the appropriate resources. Mental health providers would be able to use this information to point people to the right treatment and monitor their progress.

The U.S. is in the midst of its worst mental health crisis in history, with 4 in 10 Americans needing mental health support and an average of 11 years passing before someone receives treatment. In such a crisis, it makes sense to combine these research findings with the power and scale of social media to develop tools for the early detection of mental health disorders, helping millions of people get the timely support they need.

In addition to updating privacy laws to catch up with social media, lawmakers should strongly consider applying highly protective laws like HIPAA to social media companies. This would truly maximize the good these companies say they do and create the right incentives for them to turn the tide on the mental health crisis.

Param Kulkarni is a practitioner and researcher with Cornell Tech, founder of GetAwareHealth.com, and leads the machine learning team at Ginger.io.
