Your memory is yours. Always.
Coral exists because persistent memory for AI is too important to do carelessly. This page is the plain-English version of how we handle your data — what we store, what we can read, and what we've committed to forever.
How Coral protects your memory
Encrypted client-side
Every memory is encrypted with AES-256 on your machine before it ever touches Coral's servers. The encryption key is derived from your passphrase and never leaves your device. Even if our database were dumped tomorrow, your memories would be unreadable.
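To make the claim concrete, here is a minimal sketch of the key-derivation step using Python's standard library. The function name and KDF parameters are illustrative assumptions, not Coral's actual settings — the point is that the key is computed from the passphrase on-device and can be verified to never need transmission.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 256-bit AES key from the user's passphrase, on-device.

    Illustrative parameters only (not Coral's real KDF settings). The salt
    is random but not secret; the passphrase and derived key never leave
    the machine.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), salt, iterations, dklen=32
    )

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)

assert len(key) == 32                                          # 256 bits, suitable for AES-256
assert key == derive_key("correct horse battery staple", salt) # same passphrase, same key
assert key != derive_key("wrong passphrase", salt)             # wrong passphrase, useless key
```

The derived key would then feed an AEAD cipher such as AES-256-GCM locally; only the resulting ciphertext is uploaded.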
Your keys never leave your device
We don't store your encryption passphrase. We don't have a "master key" that unlocks accounts. If you lose your passphrase, nobody — including us — can recover your memories. This is the trade-off, and we think it's the right one.
Open-source memory protocol
The memory format, the encryption scheme, and the sync protocol are all open-source. You can read the code, verify the claims, and even run your own Coral-compatible daemon if you want. We can't secretly weaken the protocol without everyone noticing.
Works on every Claude model, forever
Coral is model-agnostic. If Anthropic ships a better model, Coral works with it the same day. If Anthropic's pricing changes, your memory doesn't get held hostage. Your Coral data migrates cleanly because it's yours in plain markdown.
Not just Claude — every capable model
Starting with V1.5, Coral's memory classifier works with GLM-5.1 (the 754B open-source MIT-licensed model from Z.ai — currently #1 on SWE-Bench Pro) and any OpenAI-compatible provider. Claude stays the default, but you're never locked in. If the best model tomorrow isn't Claude, Coral still works the same day.
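"OpenAI-compatible" means every such provider exposes the same chat-completions route, so switching models is a configuration change, not a code change. A tiny hypothetical helper (the function and URLs below are illustrations, not Coral's real config surface):

```python
def chat_url(base_url: str) -> str:
    """Hypothetical helper: any OpenAI-compatible provider serves the same
    /chat/completions route under its own base URL, so swapping providers
    is just a base-URL swap."""
    return base_url.rstrip("/") + "/chat/completions"

# Placeholder URLs -- only the configuration changes between providers:
assert chat_url("https://api.provider-a.test/v1") == \
    "https://api.provider-a.test/v1/chat/completions"
assert chat_url("https://api.provider-b.test/v1/") == \
    "https://api.provider-b.test/v1/chat/completions"
```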
What we commit to, publicly
- We will never read your memories. Technically: we can't — client-side encryption makes it impossible. Philosophically: even if we could, we wouldn't. That's the whole point of Coral.
- We will never sell your data, ever. Not to advertisers. Not to AI training pipelines. Not to data brokers. If a feature would require us to, we don't ship the feature.
- We will never hold your memory hostage. Cancel anytime. Export your entire memory archive in plain markdown whenever you want. No retention emails. No "are you sure" dark patterns. No lock-in.
- We will be publicly honest when we mess up. If there's a security incident, a data issue, or a product decision that turned out wrong, we'll say so on the changelog and in email. Transparently. No spin.
- We will answer your questions personally. If you email founder@trycoralai.com, a real person (Malachi) reads it and responds. Not a support ticket system. Not a bot.
What we actually store on our servers
Transparency matters. Here's the complete list of what Coral's servers see:
- Encrypted memory blobs — your memories, encrypted before upload. We see ciphertext, not plaintext.
- Metadata — timestamps (when memories were captured), project tags (names you chose), agent tags, and memory type (decision/preference/fact/task/idea). No content.
- Your account — email, a salted hash used to verify your passphrase (never the passphrase itself), connected devices (up to 5), subscription status.
- Billing info — handled by Stripe, not us. We never touch your card number.
- Usage logs — request timestamps, error logs (anonymized), API call volumes. Rotated out after 30 days.
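Put together, a server-side record might look like the sketch below. The field names are assumptions for illustration, not Coral's actual schema — what matters is what's absent: no plaintext, no passphrase, no key.

```python
from dataclasses import dataclass

@dataclass
class StoredMemory:
    """Illustrative shape of a server-side record (field names are assumed,
    not Coral's real schema). The server holds only ciphertext plus the
    metadata listed above."""
    ciphertext: bytes     # encrypted memory blob -- opaque bytes to the server
    captured_at: str      # ISO-8601 timestamp
    project_tag: str      # name the user chose
    agent_tag: str
    memory_type: str      # one of: decision / preference / fact / task / idea

VALID_TYPES = {"decision", "preference", "fact", "task", "idea"}

record = StoredMemory(
    ciphertext=b"\x93\x1f\x07\xaa",   # the server sees only bytes like these
    captured_at="2026-04-01T09:30:00Z",
    project_tag="coral-landing",
    agent_tag="claude-code",
    memory_type="decision",
)
assert record.memory_type in VALID_TYPES
```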
The harness policy (April 4, 2026)
On April 4, 2026 at 12pm PT, Anthropic's Head of Claude Code, Boris Cherny, announced that Claude Pro and Max subscriptions would no longer cover usage on third-party tools like OpenClaw. Users of those tools now pay via extra-usage bundles or API credits billed separately. Anthropic explicitly confirmed that "Claude subscriptions continue to apply to Claude.ai, Claude Code, and Cowork" — first-party tools are exempt. (Reported by TechCrunch and The Register.)
Coral is structurally outside this policy. Coral is a local daemon that listens to Claude Code's official session hooks — it doesn't wrap Claude Code, doesn't proxy your sessions, and doesn't replay your subscription tokens. Your Claude Code sessions run normally on your Pro or Max plan exactly as they did before Coral was installed. The memory classifier step (which happens after a session ends) calls Anthropic using your own API key directly, so the small cost lands on your API bill where direct API usage has always been billed. Read the FAQ on the landing page for the full architectural breakdown.
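The "listens, never wraps" architecture can be sketched in a few lines. The payload field names below are assumptions (Claude Code's actual hook payload may differ); the structural point is that the daemon only reacts to a session-end event and queues work for the classifier — it never sits between you and Claude Code.

```python
import json

def handle_session_end(hook_payload: str, work_queue: list) -> None:
    """Sketch of the daemon's hook listener. Payload field names are
    assumptions for illustration. The daemon ignores live-session events
    entirely -- no proxying, no token replay -- and only enqueues finished
    sessions for the classifier, which bills to the user's own API key."""
    event = json.loads(hook_payload)
    if event.get("hook_event") != "SessionEnd":
        return  # not our concern: the session runs untouched on your plan
    work_queue.append({"transcript_path": event["transcript_path"]})

queue: list = []
handle_session_end(json.dumps({
    "hook_event": "SessionEnd",
    "transcript_path": "~/.claude/sessions/abc.jsonl",
}), queue)
handle_session_end(json.dumps({"hook_event": "PreToolUse"}), queue)

assert len(queue) == 1  # only the finished session was queued
```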
Questions about your data?
Email me directly. I read every message and respond personally. If you want to verify a specific claim on this page, I'll walk you through it — including pointing you at the exact lines of code.
founder@trycoralai.com
Full legal docs (coming with V1 launch): Privacy Policy · Terms of Service · Security Disclosure
Built by Malachi. Coral is a real product in active development. Last updated: April 2026.