MoodSync

Why on-device privacy matters for a mood tracker

3 min read · Sources last checked: May 2026

A mood tracker holds some of the most sensitive data a person ever writes about themselves: substance use notes, suicidal ideation flags, medication history, episode records. The privacy posture of the app you choose matters.

What "private" really means

The word "private" is doing a lot of work in app marketing. It can mean:

  • Encrypted in transit (the data is encrypted when it travels to the server)
  • Encrypted at rest (the data is encrypted on the server)
  • End-to-end encrypted (only you have the key, even though the data passes through someone's server)
  • On-device only (the data never leaves your phone)
  • Synced through your own private cloud (e.g. iCloud / CloudKit — the app vendor never sees the data, even though it is on Apple's servers)

These are different postures with different trade-offs. The first two still mean the app vendor can read your data. The last three keep the data itself out of the vendor's reach.
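The key-holding difference between the middle two postures can be shown with a toy sketch. This is not real cryptography (a repeated-hash XOR stands in for a proper cipher, and all names here are hypothetical); it only illustrates who can decrypt in each posture:

```python
# Toy illustration of "encrypted at rest" vs "end-to-end encrypted".
# NOT real cryptography -- a hash-derived XOR keystream stands in for
# a proper cipher purely to show who holds the key.
import hashlib
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Derive a keystream from the key and XOR it over the plaintext.
    block = hashlib.sha256(key).digest()
    stream = (block * (len(plaintext) // len(block) + 1))[:len(plaintext)]
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

entry = b"mood: 2/10, skipped medication"

# Encrypted at rest: the SERVER generated and holds the key,
# so the vendor can decrypt your entry whenever it wants.
server_key = secrets.token_bytes(32)
stored = toy_encrypt(server_key, entry)
assert toy_decrypt(server_key, stored) == entry  # vendor can read it

# End-to-end: only YOUR DEVICE holds the key; the server stores
# ciphertext it cannot reverse.
device_key = secrets.token_bytes(32)
uploaded = toy_encrypt(device_key, entry)
assert uploaded != entry                           # server sees ciphertext
assert toy_decrypt(device_key, uploaded) == entry  # only you can read it
```

The on-device-only and private-cloud postures go one step further: there is no vendor server in the picture at all, so the question of who holds the key never arises for them.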

What to look for

Before you log anything sensitive in a new app, check three things.

1. The Apple App Store privacy section. Apple requires every developer to declare what data their app collects, in a public "App Privacy" section on the App Store listing [1]. The labels are self-reported by the developer, so treat them as the first check, not the whole audit. A short label (e.g. "Data Not Collected," or only Health and Product Interaction declared, with tracking off) is a useful starting signal, but it is worth pairing with the other two checks below.

2. The privacy policy. Look for what the app does with your data, and search for the words "third party" and "analytics." Mozilla's Privacy Not Included project has documented patterns of mental health apps sharing user data with third parties [2].

3. Whether it requires an account. An account-required app needs to identify you. An account-less app does not.

Why this matters specifically for mood data

NIMH notes that the proliferation of mental health apps has outpaced their evaluation, and that data privacy is one of the open concerns [3]. The risk is not abstract:

  • Insurance and employer data brokers exist
  • Mental health information carries more stigma than most other health data
  • Once a record exists on a server, it can be subpoenaed, breached, or sold

The simplest defense is to not collect the data in the first place.

The trade-offs of on-device-or-private-cloud storage

Keeping your data out of an app vendor's reach is not free. It costs you:

  • No web dashboard — your phone is the only viewing surface
  • No vendor-side AI — the app vendor cannot run a model on your data, because they do not have it
  • Cross-device access only through your own cloud — typically your own iCloud / CloudKit, which means it is yours, not the vendor's

For a mood log, those trade-offs are usually the right ones. For other categories of app, they may not be.

What MoodSync chose

MoodSync mood values stay on your device, with optional sync through your own private iCloud. There is no MoodSync account or email. The privacy feature page and the full privacy policy walk through the details.

The pragmatic check

Before logging anything sensitive in any new app, do this two-minute audit:

  1. Check the App Store privacy section
  2. Look for an account requirement
  3. Search for the app + "privacy" — see what users have written

If the app collects health data, requires an account, and has no clear independent privacy review, that is the moment to ask whether it is the right tool for what you are about to enter.

Honest limits

This is not legal or security advice. It is a pragmatic guide for the choice you make every time you install an app. Privacy is one factor among several: feature fit, sustainability, and your clinician's preference also matter.

Sources

  1. Apple Inc. (2024). App privacy details on the App Store. Apple Developer Documentation.
  2. Mozilla Foundation (2024). Privacy Not Included: mental health apps. Mozilla Foundation.
  3. National Institute of Mental Health (2024). Technology and the future of mental health treatment. NIMH.