Privacy
How we handle your data. Written in plain English, honest about the limits.
The short version
Your vehicle data is stored encrypted under keys derived from your passphrase, keys we never keep. No other Tesla logger offers this. We built BlackCurrent around the assumption that we are the threat we protect you from.
What this means for you
- If our database is breached, your vehicle data is useless to the attacker.
- If one of our engineers goes rogue and dumps storage, they get ciphertext.
- If we're acquired, subpoenaed for historical data, or pressured to hand over what we have, we hand over ciphertext. We cannot decrypt what we don't have the key for.
- We cannot run analytics, ad-targeting, or data-brokerage on your content — even if we wanted to, the architecture doesn't allow it.
- Our business model is your subscription. That's the deal.
How it works
When you sign up, we generate a cryptographic keychain for your account on our server. We encrypt that keychain with a key derived from your passphrase (via scrypt / PBKDF2, which stretches your passphrase into something resistant to brute force) and store only the encrypted blob. Your passphrase is discarded before the signup request returns.
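The stretching step above can be sketched in a few lines. This is an illustrative sketch, not our production code; the scrypt parameters and key size shown are assumptions.

```python
import hashlib
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the passphrase into a 32-byte key. scrypt is deliberately
    # slow and memory-hard, which makes offline brute force expensive.
    return hashlib.scrypt(
        passphrase.encode("utf-8"),
        salt=salt,
        n=2**14,  # CPU/memory cost factor (assumed parameter)
        r=8,
        p=1,
        maxmem=64 * 1024 * 1024,
        dklen=32,
    )

salt = secrets.token_bytes(16)  # stored alongside the encrypted keychain blob
key = derive_key("correct horse battery staple", salt)

# The same passphrase and salt always re-derive the same key at login,
# so the key itself never needs to be stored ...
assert derive_key("correct horse battery staple", salt) == key
# ... while a different passphrase yields an unrelated key.
assert derive_key("wrong passphrase", salt) != key
```

Only the salt and the encrypted blob are persisted; the derived key and the passphrase both live in memory for the duration of one request.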
When you log in, we briefly re-derive the key, use it to unlock your keychain, re-wrap it with a short-lived session key, store the wrapped version in our session cache, and forget the passphrase again. Each page you load decrypts only as much data as needed to render that page, in memory, then discards it.
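The login re-wrap can be sketched as follows. The XOR "cipher" here is a toy stand-in for illustration only (a real implementation would use an authenticated cipher such as AES-GCM with proper nonce handling), and all the names are hypothetical.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: expand the key into a keystream with SHA-256
    # and XOR. Insecure placeholder -- nonce handling is omitted and
    # there is no authentication tag.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# -- signup produced this (see the previous sketch) --
keychain = secrets.token_bytes(32)        # the account's key material
passphrase_key = secrets.token_bytes(32)  # re-derived via scrypt at login
stored_blob = keystream_xor(passphrase_key, keychain)

# -- login: unlock, re-wrap under a short-lived session key, forget --
unlocked = keystream_xor(passphrase_key, stored_blob)
session_key = secrets.token_bytes(32)     # lives only as long as the session
session_blob = keystream_xor(session_key, unlocked)
del passphrase_key                        # the passphrase-derived key is gone

# Each page load unwraps the keychain from the session cache in memory.
assert keystream_xor(session_key, session_blob) == keychain
```

The point of the re-wrap is that nothing derived from your passphrase outlives the login request; only the short-lived session key can open the cached copy.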
When your Tesla sends us new telemetry, we can only write — we seal each batch to your public key so that we can't read it back until you log in.
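The write-only seal works like a hybrid public-key encryption: the ingest holds only your public key, so it can encrypt but never decrypt. The sketch below illustrates the shape of that construction with textbook Diffie-Hellman over a deliberately tiny (insecure) group and the same toy XOR cipher; a real system would use a sealed-box scheme such as libsodium's.

```python
import hashlib
import secrets

P = 2**61 - 1  # small Mersenne prime: a toy group, NOT secure
G = 2

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy keystream XOR, as in the login sketch. Illustration only.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def seal(recipient_pub: int, plaintext: bytes) -> tuple[int, bytes]:
    eph_priv = secrets.randbelow(P - 2) + 1
    eph_pub = pow(G, eph_priv, P)             # stored next to the ciphertext
    shared = pow(recipient_pub, eph_priv, P)  # only the key holder can recompute
    sym_key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return eph_pub, xor_cipher(sym_key, plaintext)  # eph_priv is discarded here

def unseal(priv: int, eph_pub: int, ciphertext: bytes) -> bytes:
    shared = pow(eph_pub, priv, P)
    sym_key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return xor_cipher(sym_key, ciphertext)

priv = secrets.randbelow(P - 2) + 1
pub = pow(G, priv, P)  # the only key the ingest server holds

batch = b'{"odometer_km": 48211, "soc": 0.73}'  # made-up telemetry fields
eph_pub, blob = seal(pub, batch)
# The server keeps only pub, eph_pub, and blob; without priv (unlocked
# from your keychain at login) the batch is unreadable to us.
assert unseal(priv, eph_pub, blob) == batch
```

Because the ephemeral private key is thrown away inside `seal`, even the process that encrypted the batch cannot decrypt it afterwards.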
What we see, and what we don't
| We see (always) | We see (only while logged in) | We never see |
|---|---|---|
| Your email + password hash | The decrypted contents of whatever page you're loading | Your data when you're logged out |
| Stripe customer ID + subscription status | | Your drives, charges, or battery history in backups |
| Your Tesla VIN (to route incoming telemetry) | | Exported data you download to your own device |
| Your public key (can't decrypt anything) | | Analytics aggregates across users |
| Timestamps, byte-counts, error rates | | Your passphrase (after login completes) |
How AWS encryption fits in
Three layers of encryption sit between an AWS-operator keystroke and your vehicle data. Each defends a different threat. Only one of them makes the privacy claim hold. We want to be precise about which is which because saying "we encrypt everything" without distinguishing who holds which key is the easy way to mislead.
| Layer | What's encrypted | Who holds the key | What this defends |
|---|---|---|---|
| 1. Infrastructure | EBS volumes, S3 default | AWS, automatically | Lost / stolen / decommissioned hardware. The disk is unreadable to anyone outside AWS without a valid request. |
| 2. Platform | CloudTrail logs, Identity Center session, Tesla OAuth tokens, your session keychain wrap | AWS, on our account | Compromise of our application code or operator credentials. An attacker with our IAM credentials still has to call KMS to decrypt; KMS calls leave a CloudTrail record we can audit. |
| 3. Your vehicle data | Every drive, charge, GPS point, battery cycle in our storage | You alone, derived from your passphrase | Everything else. AWS can read Layers 1 and 2 under lawful compulsion. They cannot read Layer 3 because the key never reaches their hardware. |
What this means concretely:
- A lawful warrant served on AWS can compel them to decrypt at Layers 1 and 2. They will hand over CloudTrail records, your Tesla OAuth tokens (giving the requester the same view of your Tesla that we have — Tesla's API, not your historical data), and the ciphertext blobs that store your vehicle data. They cannot decrypt the blobs themselves; that needs your passphrase, which AWS has never seen and we discard immediately after login.
- A compromise of our code or operator credentials is bounded by the same architecture. Layer 2 is auditable — we can see who decrypted what session and when via CloudTrail. Layer 3 is unreadable to us regardless of whether we're compromised, because the unwrapped keychain only exists in memory while you're actively using the app.
- A compromise of you (your passphrase, your device) is the one thing this architecture cannot defend against. That's the inherent cost of holding the key yourself: nobody else can lose it, and nobody else can recover it for you. Write it down.
The limits of this promise
We want to be honest about where the architecture has edges. In order of who should care:
1. While you're logged in, we could in principle be compromised
Your session keychain lives in our memory during active requests. If our servers were compromised during your session, an attacker with access to that memory could read your data for as long as your session stayed open. Signal-style strict end-to-end encryption would rule this out; we don't offer that.
Mitigations we have in place:
- Aggressive idle session timeouts (default: log out after 30 minutes of inactivity).
- No plaintext ever written to disk or logs — enforced by a structured logger with an allow-list of safe keys.
- Production access is audited; every session-key decryption is a logged call on our cloud provider's key management service.
If this specific threat is your primary concern, a self-hosted alternative like TeslaMate is a better fit.
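The allow-list logger mentioned in the mitigations can be sketched like this. The key names and function are hypothetical, but the design point is real: fields are dropped before serialization, so plaintext values never exist in a log line at all.

```python
import json

# Hypothetical allow-list: structural metadata only, never payload fields.
SAFE_KEYS = {"user_id", "route", "status", "bytes", "duration_ms", "error_code"}

def log_event(event: str, **fields) -> str:
    # Keep only allow-listed fields. Dropping (rather than redacting) means
    # a sensitive value is never serialized in the first place; we record
    # only a count so the omission itself is visible.
    safe = {k: v for k, v in fields.items() if k in SAFE_KEYS}
    dropped = set(fields) - SAFE_KEYS
    if dropped:
        safe["dropped_field_count"] = len(dropped)
    return json.dumps({"event": event, **safe}, sort_keys=True)

line = log_event("page_render", user_id="u_123", status=200,
                 decrypted_page={"drives": ["..."]})
assert "drives" not in line  # plaintext never reaches the log
```

An allow-list fails closed: a new field added by a careless code change is invisible in logs until someone deliberately declares it safe.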
2. We cannot defeat legal compulsion of your future sessions
If a government orders us to intercept a specific user's future sessions, we cannot technically refuse. The data we already have at rest is useless to them (we cannot decrypt it without your key), but we cannot guarantee that a future session will not be subject to a compelled intercept. We will publish a transparency report.
3. Tesla OAuth tokens are held with a platform key, not yours
Your Tesla account credentials (the OAuth tokens Tesla gives us to pull data on your behalf) have to be usable while you're offline, otherwise we couldn't stream new telemetry into your account. These tokens are encrypted with our platform key, not with your keychain. We treat them as credentials to Tesla, not as your vehicle data. You can revoke them at any time from your Tesla account, and we delete them when you cancel.
4. The telemetry ingest briefly sees plaintext
When your car streams new data to our ingest server, that data exists in memory for a few hundred milliseconds before we seal it to your public key. It's never written to disk or logs in that form. The ingest encryptor source code is open so you can verify this.
What this means in practice
If we get hacked — the stolen database, object storage, or backups contain ciphertext only. The attacker needs every user's passphrase, independently, to read anything.
If we receive a subpoena for your historical data — we hand over ciphertext. We cannot produce plaintext. We'll tell you the subpoena happened if legally permitted.
If you want your data out — the "Export" function rebuilds a full copy from your decrypted data inside your session and hands it to you as a file. Your browser downloads plaintext; our servers never persist it.
If you lose your passphrase and your recovery phrase — your data is unrecoverable. By us, by you, by anyone. This is the other side of the same design that prevents us from reading it. We will not have a "reset password" button that decrypts your data; no such button can exist.
Verification
We encourage you not to trust our marketing. Three things help you verify the claim:
- The ingest encryptor (the one place where plaintext briefly lives) will be published as open source at launch.
- The full architecture document is published alongside the code.
- A third-party security audit will be published before general availability.
This page is the plain-English privacy description. A separate legal privacy policy covering GDPR data controller details, cookie information, and regional compliance language will be published before public launch. Questions: hello@blackcurrent.app.