
Hallucination

What it is

Checks factuality and citation grounding: AI-generated claims are tested against authoritative oracles such as a statute database or an API specification.

When to use it

Use when claims can be verified against an authoritative source: statute references, OpenAPI endpoint citations, jurisdiction facts.
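As a concrete illustration of the oracle idea, the sketch below checks AI-cited OpenAPI endpoints against the actual spec. The spec fragment, claim list, and function names are hypothetical examples, not part of any shipped primitive.

```python
# Hedged sketch: flag cited (method, path) pairs that do not exist in the spec.
# All names and data here are illustrative, not a real API.

def extract_endpoints(spec: dict) -> set:
    """Collect (METHOD, path) pairs defined under an OpenAPI spec's 'paths'."""
    endpoints = set()
    for path, operations in spec.get("paths", {}).items():
        for method in operations:
            endpoints.add((method.upper(), path))
    return endpoints

def check_citations(claimed: list, spec: dict) -> list:
    """Return claimed endpoints absent from the spec (likely hallucinations)."""
    valid = extract_endpoints(spec)
    return [claim for claim in claimed if claim not in valid]

spec = {"paths": {"/users": {"get": {}, "post": {}}, "/users/{id}": {"get": {}}}}
claims = [("GET", "/users"), ("DELETE", "/users/{id}")]
print(check_citations(claims, spec))  # [('DELETE', '/users/{id}')]
```

The same pattern generalizes to other oracles: replace the spec parser with a lookup against a statute index or jurisdiction table.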

Example gates

Future slice: the hallucination-check primitive is shipped; a production gate is pending.

