When you tell a healthcare provider about mental health issues, doctor-patient confidentiality protects those conversations. However, that protection does not necessarily apply when you use one of the many mental health apps. Let’s see why.
Is your mental health app data secure and private?
The short answer is: it depends. One challenge is the sheer variety of mental health apps and how they work. They range from mood trackers and meditation libraries to chatbots and virtual one-on-one sessions with licensed therapists.
Each app has its own terms of service and data-protection practices. There have also been multiple reports of mental health app employees mining or sharing user data.
According to Salon, former employees of the mental health app Talkspace said people at the company regularly reviewed transcripts of sessions between patients and therapists to find common phrases and use them to improve marketing to potential users.
The investigation also described the experience of Ricardo Lori, a Talkspace user who took a job in the company’s customer service department. Leaders asked him to let them read excerpts from his therapy chat transcripts, promising anonymity. But word somehow got around that Lori was the patient described in the sessions.
These incidents underscore the need to carefully review a privacy policy before signing up for a service. It is quick and easy to agree to the terms without reading them, but doing so can jeopardize your privacy.
Mozilla reveals privacy practices for mental health applications
The Mozilla Foundation assessed the privacy and security of 32 mental health and prayer apps. The results showed that 25 failed to meet Mozilla’s minimum security standards, such as requiring strong passwords. Researchers also raised serious concerns about how 28 of the apps handle user data.
Jen Caltrider, the project lead, said most mental health and prayer apps are exceptionally creepy: they track, share, and capitalize on users’ most intimate personal thoughts and feelings, such as moods, mental state, and biometric data.
It turns out that researching mental health apps is not good for your mental health, she added, because it reveals how careless these companies can be with our most intimate personal information.
Privacy risks do not end with data sharing
The unfortunate reality is that the healthcare industry is rife with unscrupulous companies and individuals trying to take advantage of people who are often in desperate situations. For example, the US Department of Justice charged 345 people in healthcare fraud schemes totaling more than $6 billion, according to the non-profit organization URAC.
Even when mental health apps keep data safe, they are not always upfront about other aspects of the service. According to Newsweek, Katie Mack posted on TikTok about her experience signing up for the Cerebral mental health app. She objected to how the service required payment information before connecting her with a therapist.
Mack did provide her details, but afterward she found there were only two therapists in her area, and neither’s expertise matched her needs. Mack then learned that the app’s policy was to issue only 30 percent refunds. She called the app a “scam” and threatened to report it to government agencies.
Health data is increasingly digitized
People have many ways to monitor their health through various apps. The 23andMe DNA testing service can tell people about risk factors for late-onset Alzheimer’s disease.
Meanwhile, people who use Apple Health can track everything from inhaler use to asymmetrical walking patterns. UCLA researchers have also used volunteers’ data from Apple Watches and iPhones to better understand depression.
When you install a new app, you may see permission settings governing which data it is allowed to read. Apple Health and similar services can pull information from other sources and compile it in one place, making it easier to review trends with your healthcare provider.
However, health data breaches are more common than you might think. A Politico analysis found that nearly three-quarters of health data breaches in 2021 involved hacking.
Another potential risk is how company acquisitions can give cybercriminals access to more data. Fitbit, which Google acquired in 2019, has more than 30 million active users. Google itself has a number of health initiatives for both users and providers.
Learn why apps need your data and what’s going on with it
Given these alarming revelations, the most proactive thing you can do is learn why a mental health app collects your data and what happens to it before you hand it over.