Inside the engine.
This is how Altadore processes a message. Local classification, privacy scrubbing, structured memory. Type something and watch the real pipeline work.
It doesn't guess. It scores.
Every message hits a deterministic local gate before a single cloud API wakes up. The gate answers 22 binary questions to classify intent, score complexity, detect PII, and route the pipeline.
The split is fixed:
Simple inputs — greetings, confirmations — get pattern-matched and never reach the cloud. Real questions get real compute. The system scales API usage to match cognitive complexity, not message length.
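The gate described above can be sketched in a few lines. This is a minimal illustration, not Altadore's actual implementation: the real gate asks 22 binary questions, and the checks, thresholds, and routing rule below are invented stand-ins.

```python
import re

# Illustrative binary checks. Altadore's real gate asks 22 questions;
# these five are invented stand-ins for the idea.
CHECKS = {
    "has_question_mark": lambda t: "?" in t,
    "is_greeting": lambda t: bool(
        re.fullmatch(r"(hi|hey|hello|thanks|ok|yes|no)[.!]?", t.strip(), re.I)
    ),
    "is_long": lambda t: len(t.split()) > 30,
    "asks_how_or_why": lambda t: bool(re.search(r"\b(how|why|explain)\b", t, re.I)),
    "has_digits": lambda t: bool(re.search(r"\d", t)),
}

def gate(text: str) -> dict:
    """Answer every binary question, then route deterministically."""
    answers = {name: check(text) for name, check in CHECKS.items()}
    # Complexity scales with cognitive load, not message length.
    complexity = sum(answers[k] for k in ("has_question_mark", "is_long", "asks_how_or_why"))
    if answers["is_greeting"] and complexity == 0:
        route = "local"   # pattern-matched, never reaches the cloud
    else:
        route = "cloud"   # real questions get real compute
    return {"answers": answers, "complexity": complexity, "route": route}
```

Because every check is a pure function of the text, the same input always produces the same answers and the same route, which is what makes the gate deterministic rather than a guess.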
The gate score decides which pipeline fires.
Nothing leaves the building unless it has to.
Before any message reaches the cloud, a 3-layer PII scanner (word list, regex, NER) finds names, phone numbers, emails, addresses, and sensitive identifiers. Names become realistic pseudonyms — not bracket tokens. The cloud models see natural language they were trained on, not synthetic [PERSON_1] syntax. Real names never reach an API.
Real data stays here: ─── sanitized text ───▸ Cloud sees only:
Phil Henderson ──────────────────── Michael Chen
403-555-0192 ───────────────────── 403-555-0147
phil.h@example.com ──────────────────── [EMAIL_1]
◄──────────────── RESTORE ────────────────
rehydrate pseudonyms back to real values
The token map lives in local process memory. Never serialized. Never sent to any API. The cloud generates a response using pseudonyms, then the restore pass swaps them back before the user sees it.
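A toy sketch of that scrub-and-restore round trip, under stated assumptions: the pseudonym pool, the regex, and the function names are all hypothetical, and the real scanner layers a word list, regex, and NER rather than the two simplified passes shown here.

```python
import re

# Hypothetical scrub/restore sketch. Names get realistic pseudonyms; emails
# get tokens. The token map exists only as a local in-memory dict -- it is
# never serialized and never sent to any API.
PSEUDONYMS = iter(["Michael Chen", "Sarah Park", "David Okafor"])

def scrub(text: str, known_names: list[str]) -> tuple[str, dict]:
    token_map = {}
    # Word-list layer: swap known real names for natural-looking pseudonyms.
    for name in known_names:
        if name in text:
            alias = next(PSEUDONYMS)
            token_map[alias] = name
            text = text.replace(name, alias)
    # Regex layer: tokenize email addresses.
    for i, email in enumerate(re.findall(r"\b[\w.+-]+@[\w-]+\.\w+\b", text), 1):
        tag = f"[EMAIL_{i}]"
        token_map[tag] = email
        text = text.replace(email, tag)
    return text, token_map

def restore(response: str, token_map: dict) -> str:
    """Swap pseudonyms back to real values before the user sees the reply."""
    for alias, real in token_map.items():
        response = response.replace(alias, real)
    return response
```

The cloud model only ever sees the scrubbed text; `restore` runs locally on its response, so real values re-enter the conversation without ever leaving the machine.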
Every fact is scored, not stored.
Each piece of information in Altadore carries ten numerical scores — weight, depth, domain, expiry, sensitivity, confidence, urgency, valence, feedback, scope. The system doesn't search a text file. It runs weighted scoring against a SQLite table and pulls exactly what matters.
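In miniature, weighted recall over a scored table might look like this. The schema, weights, and facts are invented for illustration (three of the ten dimensions are shown), but the shape is the same: a scoring expression in SQL, ordered and limited, instead of a text search.

```python
import sqlite3

# Illustrative memory table: three of the ten score dimensions.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE facts (text TEXT, weight REAL, confidence REAL, urgency REAL)")
con.executemany("INSERT INTO facts VALUES (?,?,?,?)", [
    ("Prefers metric units",      0.9, 0.8, 0.1),
    ("Meeting moved to 3pm",      0.6, 0.9, 0.9),
    ("Once mentioned liking jazz", 0.2, 0.4, 0.0),
])

def recall(limit: int = 2) -> list:
    """Weighted scoring, not text search: pull exactly what matters most."""
    return con.execute("""
        SELECT text, 0.5*weight + 0.3*confidence + 0.2*urgency AS score
        FROM facts
        ORDER BY score DESC
        LIMIT ?""", (limit,)).fetchall()
```

Here the urgent meeting change outranks the heavier-weighted unit preference because the score blends all dimensions, which is the point: retrieval is arithmetic over the whole row, not a keyword match.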
What's inside
The engine is modular. Each piece does one thing. Green border means zero API cost — pure logic, math, and local ops. Accent border means cloud model calls.