Effective date: 2026-04-18
Last updated: 2026-04-18
Lume ("we", "us", "our", the "App") is an AI-assisted emotional wellness companion. This policy explains what we collect, why, and how you can control your data. We wrote it in plain language on purpose — if anything is unclear, email us at support@lume.app.
Lume is operated by an individual developer based in the United States.
All user data is stored on servers located in the United States (Google Cloud Platform, us-central1 region).
| Purpose | Data used |
|---|---|
| Provide the core service (save moods, show history, personalize AI replies) | Moods, conversations, preferences |
| Generate AI replies | Your conversation messages are sent to our AI provider (see §4) |
| Detect recurring emotional patterns (your "elephants") | Mood trigger text, conversation summaries |
| Notify you of admin replies, reminders | Push token |
| Improve the app, diagnose bugs | Technical info, usage events (de-identified where possible) |
| Respond to feedback you submit | Feedback content + attachments |
| Legal compliance, fraud prevention | All of the above, on a narrow legal basis |
We do not use your personal content to train AI models.
Lume's chat feature sends your typed messages to a third-party large language model provider to generate a reply. Currently we use OpenAI and Google Gemini (via Vertex AI); see the service-provider table below for the data shared.
These providers may temporarily process your messages to produce a response, but under our contractual agreements with them, your content is not used to train their models.
We do not have visibility into these providers' internal logs beyond what is necessary for abuse prevention. Do not share information you would not want seen by a human reviewer in the event of an abuse investigation (e.g. credit card numbers, government IDs).
We use the following service providers under data-processing agreements:
| Provider | Purpose | Data shared |
|---|---|---|
| Google Cloud Platform (US) | App hosting, database, file storage | All app data |
| Firebase (Google) | Authentication, push notifications, crash reports, analytics | Anonymous IDs, events |
| OpenAI / Gemini (Google Vertex AI) | AI chat replies | Your chat messages + mood context |
| Apple | App Store delivery, Sign In with Apple | Apple user ID |
| RevenueCat (if you subscribe) | Subscription receipt validation | Transaction ID, product ID, user ID |
We do not share data with advertisers, data brokers, or marketing networks.
You can request immediate deletion at any time — see §8.
We follow standard security practices: TLS encryption in transit, encryption at rest for database storage, access logging, and least-privilege access for the developer. No system is 100% secure; if a breach occurs that affects you, we will notify you within 72 hours.
Because Lume is built and operated by a single developer, you should not share highly sensitive information (e.g. Social Security numbers, medical records) through the App.
Lume is not intended for children under 13. We do not knowingly collect information from children under 13. If you believe a child under 13 has created an account, contact support@lume.app and we will delete the account.
Users aged 13–17 should use Lume only with parent or guardian consent.
Regardless of where you live, you can:
You have additional rights under California law, including:
To exercise any right, email support@lume.app with subject "CCPA Request". We will respond within 45 days.
We do not discriminate against you for exercising any of these rights.
Lume is a wellness / self-help tool. It is not a medical device, not a substitute for therapy, and not HIPAA-covered. The AI's replies are NOT medical, psychological, or professional advice.
If you are experiencing a mental health emergency, please contact:
We may update this policy. We will notify you in-app for material changes and post the new effective date here. Continued use after changes means you accept the updated policy.
Questions, requests, complaints: support@lume.app
If you are in the EU (even though we do not target the EU market), you also have the right to complain to your local data protection authority.