The Digital Therapist Guide: The Hidden Privacy Dangers of Mental Health Apps

Posted on Oct 16, 2023 by Julia Olech

More than 20% of Americans live with a mental health condition, yet only 60% have access to the treatment and care they need. An array of mental health apps has emerged to fill this growing gap, offering everything from mood tracking to stress management tools. Although the market for these apps reached a staggering $5.2 billion in 2022, many of them come with serious underlying privacy concerns.

Our research found that most mental health apps breach your trust and exploit your privacy, putting even your most vulnerable data at risk. Therapy apps like BetterHelp, Talkspace, and Youper are among the biggest data-harvesting machines, with vague, basic, or even deceptive privacy policies.

We ran in-depth checks on the 2023 privacy policies of popular therapy apps and analyzed how they put your data at risk. We’ll explain some of the difficult-to-understand terminology and share tips on how to safeguard your most personal information.

Privacy Pitfalls of Mental Health Apps

Some studies have found that over 80% of tested apps collect your most personal data, including the symptoms of your mental health condition. Worse yet, many of them pass this information on to third parties, usually without your permission or knowledge.

We looked into this ourselves. We created a list of therapy apps that are frequently recommended by medical and health websites, and verified their popularity by checking for high app store download volumes. What we found in their 2023 privacy policies was worse than expected, with extensive data collection being just the tip of the cyber iceberg.

Mass data collection and distribution is possible because mental health apps aren’t licensed medical platforms, so they don’t fall under HIPAA (the Health Insurance Portability and Accountability Act). US federal law also doesn’t regulate what mental health apps do with your data, giving them free rein over it. And although the FDA can review and approve these platforms, that approval doesn’t certify an app’s data security.

While tools like a VPN for PC or mobile can secure your location and metadata, they can’t protect the information you freely give away. That’s why the app you’re using should have a trustworthy privacy policy in place and shouldn’t harvest the data a VPN can’t secure.

A Concerning Lack of Privacy Protection

Most mental health apps are a privacy nightmare

You can currently pick from around 20,000 mental health apps across multiple platforms. Though Forbes found that around 77% of therapy apps have privacy policies, these documents often use tricky language that makes it hard to tell which clauses protect you and which allow data collection for marketing. That lets companies hide shady practices in plain sight, and we found they do just that.

BetterHelp — 1 Million+ Google Play Downloads

BetterHelp’s privacy policy is very carefully written, often burying data exploitation and empty promises in legal jargon, which makes the red flags subtle. For example, BetterHelp openly says it tracks your IP address, which can reveal your location, but it never really explains why. The same goes for all other parts of your data, including your address, the worksheets you fill in with your therapist, and your journal entries.

BetterHelp purposely avoids saying whether it sells your data or not

Instead of giving a straight answer to the question “Do you sell my data?”, BetterHelp evasively states that they “aren’t paid for data.” Though the company may not receive money directly, it could still be sharing your details with partners or third parties.

Even if BetterHelp doesn’t sell your data, the company can hand over your details if subpoenaed. That implies it can easily access, view, and read your files, including most (if not all) of your messages. BetterHelp encrypts only some of your texts and doesn’t state whether it uses end-to-end encryption. Without it, your messages can be decrypted and stored on the company’s servers, making it easy for BetterHelp to view everything you share with the platform.
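To see why this matters, here’s a minimal Python sketch (purely illustrative, using the cryptography library, and not based on BetterHelp’s actual systems) of the difference end-to-end encryption makes: when the key stays on your device, the provider only ever stores ciphertext, but if the provider also holds the key, it can read everything on its servers.

```python
# Illustrative only: why it matters who holds the encryption key.
from cryptography.fernet import Fernet

# With end-to-end encryption, this key is generated on your device and never leaves it.
client_key = Fernet.generate_key()
client_box = Fernet(client_key)

message = b"Journal entry: I've been feeling anxious all week."
ciphertext = client_box.encrypt(message)

# An end-to-end encrypted service only ever stores ciphertext like this.
server_storage = {"user_42": ciphertext}

# Without end-to-end encryption, the provider holds a copy of the key too,
# so it can decrypt and read anything stored on its servers.
provider_key = client_key  # the crucial difference
print(Fernet(provider_key).decrypt(server_storage["user_42"]))
```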

You may think it’s not such a big deal, since you can ask the company to delete your data. Unfortunately, it’s not that straightforward: you have to go through a long back-and-forth with the company before it wipes your details from its systems. Otherwise, BetterHelp keeps your data for 10 years and continues to share it with others as it pleases.

Talkspace — 500,000+ Google Play Downloads

Talkspace isn’t much better. Its privacy policy reveals that it collects a huge amount of highly invasive data. It offers only a vague definition of what its data processing involves, mentioning “using” and “transferring” your data without specifying where or for what purpose. The document states that this is to provide you with the best service, but it doesn’t name any of the third-party partners who may be involved in handling your data.

Talkspace collects your medical data which could identify you

Though Talkspace complies with HIPAA, it logs various parts of your medical data, including medication prescribed by the company’s psychiatrists. The app also shares these details with various medical professionals and therapists, but it doesn’t specify whether the data is de-identified or protected against re-identification.

If it collects so much data, you’d think Talkspace boasts impressive security features, but no. The privacy policy briefly mentions “commercially reasonable steps” the app takes to protect you, but this could mean anything. Without encryption, logging, regular security testing, and other industry-standard safeguards, the app leaves your data at risk of breaches and leaks.

Talkspace also reserves the right to change its privacy policy without any notice. This means you’re responsible for regularly checking for potential updates. If you don’t, you might inadvertently agree to some significant changes to your data privacy, which you may not be comfortable with. This is completely unacceptable and takes away your right to consent to what happens to your data.

Better Stop Suicide — 10,000+ Google Play Downloads

The Better App Company, responsible for apps like Better Stop Suicide, offers an extremely vague privacy policy. It doesn’t specifically state what data it collects or processes, but it does say it will only share your data with trusted partners rather than strangers, as if that sounds any better. We couldn’t find any information on who these partners are or what role they play in your mental health journey beyond “enhancing your experience.”

Another worrying part is the constant use of phrases like “might” and “may”, indicating speculation rather than concrete policies. This allows for a lot of leeway as to what actually happens with your data.

Take this as an example: “This might be your email address, name, billing address, home address, etc – mainly information that is necessary for delivering you a product/service or to enhance your customer experience with us.”

Reading this statement (and many more) left us with more questions than answers. Will the app collect your details or not? Will it take more data and what type? The whole privacy policy is written in the same style. This concerns us as it lets Better Stop Suicide take any data it deems valuable and use it however it sees fit.

You have the right to delete your data, but only in unnamed circumstances

This vague approach extends to the rights you have over your own data. Better Stop Suicide lets you delete your details only in certain circumstances, without explaining what those are. You can technically request that the company restrict the collection, processing, or storage of your data, but it isn’t obliged to listen.

Cerebral — 100,000+ Google Play Downloads

Cerebral’s privacy policy raises serious concerns, especially when considering the sensitive medical data the company handles.

It asks you to enter a lot of information, which feels extremely invasive and unnecessary. The app wants your full medical history, complete with prescriptions and lab results, your Social Security number, and your emotional and physical characteristics. It also collects audio, images, and video of you during your sessions, so anyone with access to your files can identify you in seconds.

Cerebral doesn’t do much to protect the information it collects, either. In fact, the company actively shares your details with partners, advertisers, social media platforms, and other third parties. Earlier in 2023, Cerebral admitted to sharing the private data of 3.1 million users with Facebook, TikTok, and other social media giants.

You can opt out of targeted ads – but Cerebral says it may not work!

That’s not all. Cerebral openly says it uses your data for tailored advertising, but that you can choose to opt out. The caveat? The company doesn’t guarantee that opting out will work, admitting it doesn’t know how accurate the mechanism is. This potentially allows the app to keep exchanging data with advertisers without your consent.

Youper — 1 Million+ Google Play Downloads

Youper’s privacy policy is open about the types of data the app collects, like login details, usage data, and device information. It gets really invasive because Youper is an AI chatbot, which means it stores everything you share with it in order to diagnose you and suggest appropriate treatment.

Youper doesn’t specify what security measures it uses to protect your data

Additionally, Youper admits it can’t guarantee to keep your data 100% safe. The company promises it implements “appropriate and reasonable” security measures, but we’re not entirely sure what that means. You can only hope it’s strong enough to keep persistent cyberattackers, scammers, and other unauthorized parties away from your personal sessions with the bot.

Luckily, Youper doesn’t seem to share your data with third parties like advertisers. The only mention of data sharing in the privacy policy relates to possible mergers or transfers of company assets. While it’s not completely trustworthy, Youper seems a little better at protecting your data than some other apps.

What Data Do Mental Health Apps Actually Collect?

Image showing the type of data mental health apps collect
 You may share a lot of personal details.

We found that BetterHelp, Talkspace, Better Stop Suicide, Cerebral, Youper, and many other apps actively monitor and store your mental health details. This includes your journal entries, mood checkers, medical symptoms, medication, and even full chat transcripts. Collecting all this without protection or consent can easily lead to unintentional or malicious misuse.

Of course, the level of infiltration depends on the app you use. Mood trackers or meditation apps may not get as much data as platforms that require you to fill in a questionnaire to sign up, like BetterHelp and Talkspace. These often ask for more personal details, like your address, date of birth, sexual orientation, and medical history, forcing you to share more vulnerable data. 

A lot of mental health apps also collect your metadata. This is information that describes how you interact with a platform, such as the time you log in, how long your sessions last, and which features you use most often. It may not seem immediately concerning, but your metadata lets companies build a more complete profile of who you are, especially when paired with the data you share.
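As a rough illustration (with hypothetical field names and made-up data, not any particular app’s schema), here’s how even bare metadata such as login times, session length, and feature choice can be turned into inferences about you:

```python
# Hypothetical sketch: building a profile from usage metadata alone,
# without reading a single journal entry or chat message.
from collections import Counter
from datetime import datetime

sessions = [
    {"login": "2023-09-04T02:13:00", "minutes": 48, "feature": "crisis_exercises"},
    {"login": "2023-09-05T01:40:00", "minutes": 35, "feature": "crisis_exercises"},
    {"login": "2023-09-07T23:55:00", "minutes": 20, "feature": "mood_tracker"},
]

# How often does this user open the app in the middle of the night?
late_night = sum(
    1 for s in sessions
    if datetime.fromisoformat(s["login"]).hour in (23, 0, 1, 2, 3)
)
top_feature, _ = Counter(s["feature"] for s in sessions).most_common(1)[0]

profile = {
    "possible_sleep_problems": late_night / len(sessions) > 0.5,
    "most_used_feature": top_feature,  # hints at specific symptoms
    "avg_session_minutes": sum(s["minutes"] for s in sessions) / len(sessions),
}
print(profile)
```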

Top Risks Data Collection Exposes You To

Since mental health apps collect an enormous amount of extremely personal data, they expose you to numerous privacy risks, including:

Data breaches — Compromised mental health data can have serious personal and social consequences and opens the door to many of the other risks on this list.

Identity theft — Using your mental health information, hackers and other malicious actors can create more complete profiles on you, allowing for elaborate identity scams.

Advertising and third-party sharing — Many apps share your vulnerable data with partners and advertisers to earn money and target you with tailored ads. 

Data inaccuracy — Mental health apps analyze your data, but misinterpretation could lead to inaccurate recommendations or interventions.

Long-term storage — Many apps don’t disclose how long they store your data or how secure their storage is, increasing the chances of it being compromised in a leak or breach.

Emotional manipulation — Detailed insights into your mental state put you at risk of emotional manipulation for advertising, political campaigns, or other purposes.

Integration with other services — Some mental health apps link your account with calendars, smart home devices, and other services, letting others create detailed digital profiles to exploit you.

Stigma in your personal life — Sadly, many mental health issues come with prejudice, so leaking such data could lead to potential misuse by insurers or employers, discrimination, and stigmatization.

AI in Mental Health Apps and Its Impact on Your Privacy

At its core, AI is still in its infancy, especially when dealing with human emotions. Because of this, mental health apps that use AI rely heavily on both their information databases and your input to deliver appropriate treatment. This creates additional privacy and security issues that many developers don’t warn you about.

When you share your details with an AI chatbot, it immediately logs your data and sends it to a shared database that its models learn from. Usually, it’s not just the one app that can access it; any other app the company is developing could see it, too. This means your most vulnerable information can resurface anywhere, at any point.

Mental health chatbots are the biggest data-sucking machines. Since they imitate in-person conversations, they may prompt you to share your deepest thoughts and feelings, just like you would with a human therapist. They then collect various parts of your data, including information you haven’t shared with anyone else, such as suicidal thoughts or depression symptoms. 

In many cases, mental health apps promise to anonymize your data before storing it. However, AI systems can often re-identify your information and link it back to you, sometimes without any human involvement. In fact, studies show that algorithms can correctly re-identify as many as 99.98% of people in supposedly anonymized datasets, allowing anyone with access to the servers to learn anything you’ve shared.
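To show how simple re-identification can be, here’s a toy Python example (entirely made-up data) of a classic linkage attack: an “anonymized” export with names removed is joined to a public dataset on quasi-identifiers like ZIP code, birth date, and gender.

```python
# Toy linkage attack on made-up data: quasi-identifiers are enough
# to put names back next to "anonymized" diagnoses.
import pandas as pd

anonymized_sessions = pd.DataFrame({
    "zip": ["10013", "94110"],
    "birth_date": ["1991-03-02", "1987-11-19"],
    "gender": ["F", "M"],
    "diagnosis": ["generalized anxiety", "major depression"],
})

public_records = pd.DataFrame({  # e.g. a voter roll or data-broker file
    "name": ["Jane Roe", "John Doe"],
    "zip": ["10013", "94110"],
    "birth_date": ["1991-03-02", "1987-11-19"],
    "gender": ["F", "M"],
})

# A single join re-identifies every "anonymous" record.
reidentified = anonymized_sessions.merge(
    public_records, on=["zip", "birth_date", "gender"]
)
print(reidentified[["name", "diagnosis"]])
```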

Due to high demand, many AI systems have been developed too quickly and released with software vulnerabilities. This leaves them open to cyberattacks, even ones carried out by relatively novice criminals. All they need to do is break into the system to access a wealth of your data, which they can then use for online harassment, scams, fraud, and stalking.

VR Therapy: A Future of Psychology or a Privacy Nightmare?

If AI is in its infancy, Virtual Reality (VR) is at an even earlier developmental stage. Until recently, you could only use this technology in professional settings under a therapist’s guidance, which meant any apps used for mental health treatment had to adhere to HIPAA’s strict patient confidentiality and data protection rules.

However, as VR headsets sell in record numbers each year, more developers are releasing mental health apps on popular virtual platforms. These aren’t too different from the apps you get on mobile phones or PCs, but their data collection is often a lot more invasive. That’s because VR headsets need far more intimate details to work than handheld devices do.

Image showing the type of data VR mental health apps collect
A virtual experience requires a lot of data to run smoothly.

When you put on a VR headset, the device maps your environment, interprets your gestures, and mimics your movements in the virtual world. It captures details like voice, eye movement, pupil size, height, heartbeat, and even brain activity. Mental health apps available on a VR headset can easily access this data and use it not only to ensure a smooth virtual experience but also for their own benefit.

Such extensive data collection, coupled with the mental health information you share, constitutes a grave privacy risk. It intensifies your exposure and, in the wrong hands, could enable sophisticated identity theft, credit card fraud, or even burglaries planned around a spatial map of your home. Ensuring privacy and security in VR mental health apps is therefore not just advisable but imperative.

How to Protect Your Privacy While Seeking Emotional Support

Tips to protect your personal data while using mental health apps
Take care of your mental and digital well-being!

1. Find privacy-conscious mental health apps

Even though a majority of mental health apps put your privacy at risk one way or another, you can still find reliable and trustworthy options. In our investigation, we were impressed by Wysa and PTSD Coach as they have clear privacy policies and are far more careful with your data than the other apps we looked into.

2. Download apps only through official stores or sites

Cyberattackers often create fake apps and websites to trick you into downloading malware onto your devices. The best way to find safe mental health apps is to use official app stores. These also let you browse user reviews, so you can judge the quality of the service based on other people’s first-hand experience.

3. Sign up with a disposable email and secure password

When registering with a mental health app, use a throwaway email to shield your primary accounts from unwanted spam, potential threats, and data breaches. Paired with a robust password, this approach offers an added layer of security and privacy. 
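If you’d rather not trust a website to generate your password, a few lines of Python’s standard library will do it locally; a minimal sketch:

```python
# Generate a strong, random password using only the standard library.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())
```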

4. Read through the privacy policy and terms & conditions

Instead of blindly accepting the T&Cs, give all the legal documents a quick once-over. It’ll give you a general understanding of what may happen to your data and what privacy protection you can expect. The legal language can be tricky to understand, but if you copy and paste it into ChatGPT and ask for a summary, you’ll get the full picture much more easily.

5. Opt out of cookies, ads, and analytics

Before you dive into its features, take the time to turn off analytics and cookies in the app’s settings. These collect small tidbits of data used primarily for marketing and personalized ads, so nothing that could help your wellbeing.

6. Adjust the apps’ privacy and security settings

After you opt out of data analytics, go through the available app settings and see if you can disable any further infiltration. This includes permission to access your camera or microphone, location sharing, and optional diagnostics. Most mental health apps don’t need these to function, so disabling them won’t hinder your experience.

7. Limit how much personal information you share

Some details you have to share; others aren’t required. As a rule of thumb, if you don’t need to reveal certain information, keep it to yourself. You can even use a fake name and address: they aren’t always necessary to receive the same treatment, and withholding them may boost your online privacy.

8. Don’t connect other accounts, like Facebook or Google

Many apps allow you to sign up with social media accounts or Google. While it is convenient, it links the platforms together, letting them exchange data as they please, usually without asking you for consent. This means your Facebook updates may fuel ads you see in your mental health app and vice versa.

Prioritize Your Privacy in Digital Wellbeing

With no federal laws or HIPAA protections covering them, mental health apps can do with your data as they please. At least 80% of apps don’t meet industry standards for privacy protection. They’re guilty of collecting identifiable and sensitive details, storing them even after you delete your account, and even sharing your files with multiple third parties.

As a user, staying informed and vigilant is your first line of defense. As you continue to embrace digital solutions for mental health, make sure your digital well-being doesn’t put your real-world security at risk.