The Digital Therapist Guide: The Hidden Privacy Dangers of Mental Health Apps

About 1 in 5 Americans live with a mental health condition, yet only 60% have access to the treatments and care they need. To meet this growing demand, a wave of mental health apps now offer everything from mood tracking to stress management. The mental health app market reached $7.48 billion in 2024 alone.
While these apps promise support, many quietly expose users to serious privacy risks. It turns out that even apps designed to help can leave your most vulnerable information at risk, and most people aren’t aware of what they’re agreeing to. A Pew Research survey found that 78% of Americans don’t fully read the privacy policies they accept.
Our investigation into some of the most popular mental health apps, including BetterHelp, Talkspace, and Headspace, found widespread issues with data collection, sharing, and storage, often hidden behind complex, hard-to-read policies.
In this article, we break down how mental health apps really handle your data, reveal the biggest privacy pitfalls we uncovered, and offer clear steps you can take to better protect yourself.
Table of Contents
Privacy Pitfalls of Mental Health Apps
A Concerning Lack of Privacy Protection
BetterHelp — 1 Million+ Google Play Downloads
Talkspace — 500,000+ Google Play Downloads
Headspace — 10 Million+ Google Play Downloads
Cerebral — 100,000+ Google Play Downloads
Youper — 1 Million+ Google Play Downloads
What Data Do Mental Health Apps Actually Collect?
Top Risks Data Collection Exposes You To
AI in Mental Health Apps and Its Impact on Privacy Issues
VR Therapy: A Future of Psychology or a Privacy Nightmare?
How to Protect Your Privacy While Seeking Emotional Support
Prioritize Your Privacy in Digital Wellbeing
Privacy Pitfalls of Mental Health Apps
Recent research shows that up to 88.5% of iOS apps track private user data, while 74% of the most popular mobile apps collect more information than they actually need to function. Among these, mental health apps stand out for gathering especially sensitive details, including symptoms of your mental health condition. Worse yet, many share this information with third parties — often without your permission or knowledge.
We looked into this ourselves. We created a list of therapy apps that are frequently recommended by medical and health websites, and verified their popularity by checking for high app store download volumes. What we found in their latest privacy policies was worse than expected, with extensive data collection being only part of the larger privacy concerns.
Mass data collection and distribution is possible because mental health apps aren’t licensed medical platforms, so they don’t fall under HIPAA (the Health Insurance Portability and Accountability Act). HIPAA protections only apply to healthcare providers, insurers, and their partners, not independent apps like mood trackers or therapy chatbots. Unless an app voluntarily adopts stricter policies, it can legally share, sell, or use your sensitive data.
There’s also no comprehensive federal privacy law in the US regulating these apps. While the Federal Trade Commission (FTC) reviews some apps for safety and can step in after a data breach or deceptive practice, there are no proactive privacy standards in place. In fact, investigations have found that many mental health apps continue sharing user data with advertisers and social media companies, despite promising confidentiality.
While using a VPN for Android or VPN for iOS can secure your location and protect metadata, it can’t safeguard the personal information you willingly share within the app. This is why it’s critical to choose mental health apps with strong, trustworthy policies that minimize the amount of sensitive data collected.
A Concerning Lack of Privacy Protection
Most mental health apps are a privacy nightmare
You can currently pick from around 20,000 mental health apps across multiple platforms, so there's a lot to choose from. Forbes found that around 77% of therapy apps have privacy policies, but those policies often use tricky language that makes it hard to tell which clauses actually protect you and which ones authorize marketing data collection. This lets developers hide shady practices in plain sight, and we found they do just that.
BetterHelp — 1 Million+ Google Play Downloads
- Website: https://www.betterhelp.com/
- App: iOS | Android
Over 5 million people across more than 100 countries have used BetterHelp’s services, making it one of the most popular online therapy platforms today. In 2024 alone, it served more than 900,000 new clients. Despite its growth, BetterHelp faced major scrutiny over its privacy practices.
In 2023, the FTC finalized an order against BetterHelp after charging the company with sharing sensitive health information with advertisers like Facebook and Snapchat. BetterHelp was fined $7.8 million and is now banned from using consumer health data for advertising.
Even so, we found BetterHelp's privacy policy to be carefully worded, often using legal language that makes it difficult for users to fully understand how their data is collected and shared. For example, BetterHelp openly says it tracks your IP address, which can reveal your location, but offers little explanation of why this is necessary. It also collects sensitive personal information, including onboarding questionnaires, worksheets, journal entries, and communications with therapists.
While BetterHelp claims it does not “sell” user data for payment, under California privacy laws, the sharing of certain Visitor Data for targeted advertising purposes can still be considered a “sale.” Users who opt into advertising cookies or tracking may have their information, such as device IDs and browsing behavior (but not therapy messages), shared with third parties for marketing.
BetterHelp states that all messages between you and your therapist are encrypted with 256-bit security. However, the company does not confirm the use of end-to-end encryption, meaning your messages could potentially be stored and accessed internally if necessary.
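To see why that distinction matters, here is a minimal Python sketch of server-side symmetric encryption in general, not BetterHelp's actual implementation (it assumes the third-party cryptography package; the key, message, and variable names are hypothetical):

```python
# Minimal sketch: server-side symmetric encryption (NOT end-to-end).
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In this model, the *service* generates and stores the key on its own servers.
server_key = Fernet.generate_key()
cipher = Fernet(server_key)

message = b"I have been feeling anxious about work lately."
ciphertext = cipher.encrypt(message)  # what gets stored "encrypted" in the database

# Because the provider holds the key, it can decrypt at will,
# for example for internal review, analytics, or a legal request.
print(cipher.decrypt(ciphertext).decode())
```

With true end-to-end encryption, only you and your therapist would hold the keys, so the provider could not read stored messages even with full access to its own servers.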
Even requesting the deletion of personal data can be complicated. While non-critical data may be erased within 24 hours of a request, BetterHelp retains key therapy-related records for up to 10 years, even if you delete your account. Unless you complete a formal erasure request through your account settings or customer service, your personal data remains stored and may continue to be shared with authorized partners under BetterHelp's policies.
Talkspace — 500,000+ Google Play Downloads
- Website: https://www.talkspace.com/
- App: iOS | Android
Talkspace isn’t much better when it comes to protecting user data. Its privacy policy reveals that it collects a wide range of highly invasive data, including medical history, therapy session details, symptom tracking data, and even clinical progress notes. While Talkspace states that therapy services comply with HIPAA regulations, other types of data collection, such as app usage, device tracking, and marketing analytics, may not be protected under those regulations.
The company’s privacy policy offers an unclear definition of what its data processing involves. It mentions “using,” “analyzing,” and “transferring” data, but provides little detail on where the information goes or how it may be handled by third parties. Although Talkspace claims it does not sell client information, it does use persistent identifiers (like device IDs) and collaborates with third-party platforms for marketing, analytics, and app functionality.
Talkspace collects your medical data which could identify you
Talkspace collects medical data that could easily identify you, including prescriptions issued by its psychiatrists. It also shares information with therapists, counselors, and support staff to deliver services. However, it does not clearly explain whether this data is fully de-identified or if there are safeguards to prevent re-identification.
Adding to the concern, Talkspace reserves the right to update its privacy policy at any time without notifying users directly. This means you are responsible for checking the policy periodically for changes, otherwise, you could unknowingly agree to new terms that weaken your privacy protections.
In 2025, Talkspace introduced AI-powered features that synthesize client data to help therapists track progress and generate personalized podcast episodes for clients. The company states that all AI features are HIPAA-compliant, but the integration of machine learning into sensitive health communications raises new questions about data security and the limits of privacy protections.
Headspace — 10 Million+ Google Play Downloads
- Website: https://www.headspace.com/
- App: iOS | Android
Headspace is one of the biggest mindfulness and meditation apps, with over 80 million downloads and nearly 3 million paid subscribers.
Its privacy policy reveals that Headspace collects a wide range of personal information, including health data, wellness goals, demographic details (like race, ethnicity, sexual orientation, and gender identity), and extensive usage data. If you access the app through a workplace program or healthcare plan, Headspace may also share enrollment information with your sponsor. The company claims this does not include specific details about your therapy or meditation sessions unless required for healthcare operations.
Headspace also discloses that it shares user data with third-party platforms for targeted advertising and analytics. While it emphasizes that it does not “sell” sensitive health data for payment, it admits that certain disclosures for advertising purposes may legally qualify as “selling” or “sharing” under privacy laws like California’s CCPA. Opt-out links are provided, but targeted ads and broader data sharing still occur unless users actively exercise these rights.
In 2024, Headspace expanded into virtual reality therapy with the launch of Headspace XR, aiming to offer immersive therapeutic experiences. Although Headspace XR is not specifically mentioned in the privacy policy, the app reserves the right to request data from Meta Quest device hardware, including information processed by the device’s sensors and functionalities.
Cerebral — 100,000+ Google Play Downloads
- Website: https://cerebral.com/
- App: iOS | Android
Cerebral’s privacy policy raises serious concerns, especially considering the sensitive medical data it collects.
The app requests a large volume of personal information, including your full medical history, prescriptions, lab results, social security number, and emotional and physical characteristics. It also collects audio, images, and video recordings during sessions, making users easily identifiable.
Despite collecting such sensitive data, Cerebral has shown a poor track record in protecting it. In 2023, the company disclosed that it had inadvertently shared the private health information of more than 3.1 million U.S. users with platforms like Facebook and TikTok through embedded tracking pixels. The data shared included names, contact information, dates of birth, IP addresses, mental health self-assessment responses, and potentially treatment and insurance details. The incident triggered an investigation by the U.S. Department of Health and Human Services (HHS), and Cerebral now offers credit monitoring services to affected users.
You can opt out of targeted ads – but Cerebral says it may not work!
Cerebral claims to offer users an option to opt out of tailored advertising. However, the company openly states in its privacy policy that it “makes no representation” about how effective these opt-out mechanisms actually are. To opt out of advertising cookies and tracking technologies, it directs users to follow instructions, but fails to provide a working link for them to follow. This leaves room for user data to continue flowing to advertisers even after opting out, without clear or reliable protections in place.
Youper — 1 Million+ Google Play Downloads
- Website: https://www.youper.ai/
- App: iOS | Android
Youper's privacy policy is relatively transparent about the types of data it collects, including email, device details, and app usage information. Since Youper uses an AI chatbot for mental health support, it also stores sensitive data like health information you share during conversations. The app says this helps it deliver services and improve its features, but it also means your private mental health data is retained and processed on an ongoing basis.
The policy also warns that while Youper implements “appropriate and reasonable” security measures, it cannot guarantee your data’s safety. In plain terms, your sensitive mental health records could still be vulnerable to hackers, breaches, or unauthorized access, despite their efforts.
On the positive side, Youper does not appear to actively sell or share your data with third parties, like advertisers. Data sharing is only mentioned in the context of corporate changes like a merger or acquisition. That’s better than many competitors, but it still leaves your personal information exposed if the company changes hands in the future.
What Data Do Mental Health Apps Actually Collect?
You may share a lot of personal details.
Many popular mental health apps, including BetterHelp, Talkspace, Headspace, Cerebral, Youper, and more, actively collect and store your sensitive information. This can include your journal entries, mood logs, mental health assessments, medical symptoms, medication details, and even full chat transcripts. Without strong protections, this personal data is at risk of both accidental leaks and intentional misuse.
Of course, the level of data collection depends on the app. Basic mood trackers or meditation apps might only gather limited information. But therapy apps that require questionnaires during signup, like BetterHelp and Talkspace, often ask for deeper personal details, like your address, date of birth, sexual orientation, and medical history. Using these services often means handing over vulnerable data before you even see a therapist.
Most apps also collect your metadata. This includes how often you log in, how long you stay, which features you use, and other patterns. On its own, metadata might seem harmless. But when combined with personal information, it can help companies create surprisingly detailed profiles about you, often for purposes you never agreed to.
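As a rough illustration, the Python sketch below (with entirely hypothetical timestamps and labels) shows how a handful of login times, on their own, can be turned into a behavioral inference:

```python
from datetime import datetime

# Hypothetical metadata a mental health app might log: nothing but session start times.
session_logins = [
    "2025-03-02T01:13:00", "2025-03-03T00:47:00", "2025-03-05T02:05:00",
    "2025-03-06T23:58:00", "2025-03-08T01:22:00",
]

hours = [datetime.fromisoformat(ts).hour for ts in session_logins]
late_night = sum(1 for h in hours if h >= 23 or h <= 4)

# Even without reading a single journal entry, a pattern emerges: frequent
# late-night sessions could be flagged as a "possible insomnia or anxiety" segment
# and fed into an advertising or risk-scoring profile.
if late_night / len(hours) > 0.5:
    print("Inferred segment: late-night user (possible sleep issues)")
```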
Top Risks Data Collection Exposes You To
Since mental health apps collect an enormous amount of extremely personal data, they expose you to numerous privacy risks, including:
⛔ Data breaches — If hacked, your mental health information could be exposed publicly, causing financial, emotional, and social harm.
⛔ Identity theft — Hackers can combine stolen mental health data with other information to create more complete profiles of you, enabling elaborate identity scams.
⛔ Advertising and third-party sharing — Many apps sell or share your vulnerable data with partners and advertisers to target you with personalized ads, or worse, sell your profile to data brokers.
⛔ Data inaccuracy — The algorithms analyzing your mental health aren’t perfect. Bad data could lead to poor advice, misdiagnoses, or harmful recommendations.
⛔ Long-term storage — Many apps don’t disclose how long they store your data or whether it will ever be deleted, raising long-term privacy concerns and increasing the chances of it being compromised in a leak or a breach.
⛔ Emotional manipulation — Detailed insights into your mental state could be used to push products, sway opinions, or influence you in subtle and harmful ways.
⛔ Integration with other services — Apps connected to calendars, smart home devices, email accounts, and other services allow even broader profiling that’s hard to track or control.
AI in Mental Health Apps and Its Impact on Privacy Issues
AI is still a developing technology, especially when dealing with human emotions and mental health. Many mental health apps that use AI heavily rely on user input and internal databases to deliver advice or support. This creates serious privacy and security risks that many developers fail to clearly disclose. When you share your personal details with an AI-powered app, your information may be logged and stored in databases that could be used to train other AI systems. It’s not always limited to the app you’re using. In some cases, the company’s other products may also access that data, raising concerns about where sensitive information, including emotional struggles or mental health disclosures, could eventually resurface.
Mental health chatbots are particularly aggressive in collecting data. By imitating human conversation, they encourage users to open up and share deeply personal information, sometimes including details they have never shared elsewhere.
Even when companies promise that your data is anonymized, the protection is not guaranteed. Research has shown that AI systems can re-identify up to 99.98% of anonymized online data, especially when cross-referencing it with other sources. This means that even information that was stripped of obvious identifiers could still potentially be traced back to you.
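The mechanics behind such re-identification are often no more than a simple join. The toy Python sketch below (all records, names, and field names are made up) shows how an "anonymized" health export can be linked back to individuals using quasi-identifiers such as ZIP code, birth year, and gender:

```python
# Toy example of a linkage (re-identification) attack on "anonymized" data.
# All records and field names here are hypothetical.

anonymized_health_rows = [
    {"zip": "94107", "birth_year": 1991, "gender": "F", "assessment": "moderate anxiety"},
    {"zip": "10002", "birth_year": 1978, "gender": "M", "assessment": "depressive symptoms"},
]

# A public or purchased dataset (voter rolls, data-broker lists, social media
# scrapes) that still carries names alongside the same quasi-identifiers.
public_rows = [
    {"name": "Jane Doe", "zip": "94107", "birth_year": 1991, "gender": "F"},
    {"name": "John Roe", "zip": "10002", "birth_year": 1978, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def link(anon, public):
    """Join the two datasets on quasi-identifiers to re-attach names."""
    for a in anon:
        for p in public:
            if all(a[k] == p[k] for k in QUASI_IDENTIFIERS):
                yield p["name"], a["assessment"]

for name, assessment in link(anonymized_health_rows, public_rows):
    print(f"{name} -> {assessment}")
```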
VR Therapy: A Future of Psychology or a Privacy Nightmare?
If AI is still developing, Virtual Reality (VR) is even earlier in its evolution for mental health treatment. Up until recently, VR was mostly used in professional settings under a therapist’s guidance. In those cases, apps had to comply with strict HIPAA patient confidentiality and data protection laws.
Now, with VR headsets becoming more common — over 9.6 million were shipped worldwide in 2024 alone — mental health apps, like Headspace, are moving onto mainstream VR platforms. These apps work much like the ones available on mobile phones or computers, but the data they collect can be far more extensive. That’s because VR headsets rely on more intimate details to function.
A virtual experience requires a lot of data to run smoothly.
When you put on a VR headset, the device maps your environment, interprets your gestures, and mimics your movements in the virtual world. It captures details like voice, eye movement, pupil size, height, heartbeat, and even brain activity. Some mental health apps available on VR headsets can access this data and use it not only to enhance your experience, but potentially also for their own benefit.
When combined with sensitive mental health information, this level of biometric data collection raises serious privacy concerns. If not properly secured, it could increase risks like identity theft, targeted scams, or misuse of personal insights. As mental health care expands into virtual environments, ensuring strong protections around user data will be critical to maintaining trust and safety.
How to Protect Your Privacy While Seeking Emotional Support
Take care of your mental and digital well-being!
1. Find privacy-conscious mental health apps
While many mental health apps pose privacy risks, you can still find more careful and transparent options. In our investigation, Wysa and PTSD Coach stood out for having clear privacy policies and stronger data protection practices compared to other apps we reviewed.
2. Download apps only through official stores or sites
Cyberattackers often create fake apps and websites to trick you into downloading malware on your devices. The safest way to find mental health apps is through official app stores, where you can also review user feedback and gauge the app’s reliability based on real experiences. For extra protection while browsing or downloading, consider using a VPN to encrypt your connection and shield yourself from malicious websites and phishing attempts.
3. Sign up with a disposable email and secure password
When creating an account, consider using a throwaway email address to protect your main inbox from spam, phishing, or data breaches. Pair it with a strong, unique password for better account security and privacy.
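If you want to generate a strong password yourself rather than rely on an app's suggestion, Python's built-in secrets module is enough; here is a minimal sketch:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # store it in a password manager, not inside the app
```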
4. Read through the privacy policy and terms & conditions
Before agreeing to anything, take a few minutes to review the app’s privacy policy and terms. It can give you a clearer idea of how your data is collected and stored. If the legal language is complicated, you can copy and paste it into a tool like ChatGPT and ask it to summarize it for you.
5. Opt out of cookies, ads, and analytics
When setting up the app, disable analytics, tracking cookies, and personalized ads where possible. These features mainly collect behavioral data for marketing, so turning them off reduces unnecessary data sharing.
6. Adjust the apps’ privacy and security settings
After you opt out of data analytics, go through the available app settings and see if you can limit data access further. This may include permission to access your camera or microphone, location sharing, and optional diagnostics. Most mental health apps don’t need these permissions to function, so disabling them won’t hinder your experience.
7. Limit how much personal information you share
Only provide the information that’s strictly necessary for the app to function. If optional details are requested — like your full name, address, or extensive medical history — consider leaving them blank or using general information to maintain greater privacy.
8. Don’t connect other accounts, like Facebook or Google
Although linking social media or Google accounts makes signing up easier, it often enables broader data sharing between platforms. This can lead to more targeted advertising and cross-platform tracking, reducing your control over where your information ends up.
Prioritize Your Privacy in Digital Wellbeing
Today’s mental health apps offer easy access to support, but they often do so at the cost of your privacy. Even major platforms like BetterHelp, Talkspace, and Headspace collect sensitive details without strong protections in place. Inconsistent privacy policies, vague data-sharing practices, and rapid tech developments have left serious gaps in user security.
As AI-driven therapy bots and immersive VR experiences become more common, it’s more important than ever to stay critical of how your mental health data is handled. Not every app is unsafe, but reading privacy policies, limiting what you share, and adjusting your settings can make a real difference. Your emotional well-being deserves better than vague promises — it deserves strong, thoughtful digital protections too.