App-ocalypse

Beware of the reckless boom in mental health apps.

Tanmoy Goswami
🙏🏾
"Standing ovation! I was just cleaning out my inbox of way too many emails and this was a breath of fresh air. Bravo."

"This is exactly what I needed to read today, Tanmoy. Your decision is courageous and makes total sense."

"So glad you listened to your inner wisdom on this."

"Bravo, Tanmoy! I would imagine that this process was rather excruciating. You are trusting the process, which is never easy to do."

I am so moved by the countless loving emails you've sent me after my letter about turning down a funding offer for Sanity. I do this work for you, and I promise to keep trusting the process as long as I can.

If you are a free subscriber, please consider joining the Sanity supporter community by choosing one of the options below. Your contribution helps keep this platform alive, ad-free, and independent. Thank you.

"Our research has found that, in gathering data, the developers of mental health-based AI algorithms simply test if they work. They generally don’t address the ethical, privacy and political concerns about how they might be used."

Piers Gooding and Timothy Kariotis aren't activists or Luddites. Gooding is a senior research fellow at Melbourne Law School, University of Melbourne, whose work focuses on the law and politics of disability and mental health. Kariotis is a lecturer in digital government and a PhD candidate in digital health at the same university, where he researches the design of digital mental health technologies.

Together, they are experts on a subject that ought to keep you up at night:

The dangerous rise of unethical, privacy-destroying, venture capital-fattened mental health apps.

If you are a regular reader, you know by now that few things terrify me as much as the dystopia that is mental health technology. There's good reason for my fears. The Mozilla Foundation, which published a report on the privacy features of mental health and prayer apps earlier this year, called the vast majority of them 'exceptionally creepy' because 'they track, share, and capitalize on users' most intimate personal thoughts and feelings, like moods, mental state, and biometric data'.
