Hallucination
What it is
Factuality and citation grounding: claims in AI output are checked against authoritative oracles.
When to use it
Statute references, OpenAPI endpoint citations, jurisdiction facts.
Example gates
Future slice: the hallucination-check primitive has shipped, but a production gate is still pending.
See also
- The pyramid — where this mode sits relative to the others
- Writing a gate — how to scaffold a new gate in this mode
- Decision tree — "I just wrote feature X — what tests do I owe?"