You Deserve This.

We all do.

The real risk isn't rogue AI—it's AI that acts with conviction on bad foundations. Rules alone aren't enough. CIRIS builds agents with both conscience and intuition: knowing when their own confidence is unearned.

The Recursive Golden Rule

Ethics that apply to themselves.

Act only in ways that, if generalised, preserve coherent agency and flourishing for others. This isn't just a principle — it's a self-referential constraint. Any ethical framework that can't survive being applied to itself isn't worth following. CIRIS is built to pass its own test.

Why This Exists

Not for profit. Not for control. For flourishing.

AI Should Serve Humanity

Not shareholders. Not surveillance states. Not engagement metrics. AI should exist to help people flourish — all people, not just those who can pay.

Legibility Requires Transparency

You can't verify what you can't see. Closed-source AI asks for faith. CIRIS asks you to verify. The code is open. The reasoning is auditable. The principles are explicit.

Conscience Must Execute

Principles on paper don't protect anyone. CIRIS embeds conscience in the runtime — every action passes through validation checks. Not guidelines. Constraints.
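The "constraints, not guidelines" idea can be sketched in a few lines. This is an illustrative toy, not CIRIS's actual faculties or API; the check names and action format are assumptions:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Action:
    name: str
    payload: dict

# Hypothetical checks, for illustration only.
def no_irreversible_harm(action: Action) -> bool:
    return not action.payload.get("irreversible", False)

def within_authority(action: Action) -> bool:
    return action.payload.get("authorized", False)

CHECKS: list[Callable[[Action], bool]] = [no_irreversible_harm, within_authority]

def execute(action: Action) -> str:
    # A constraint, not a guideline: failing any check blocks execution.
    for check in CHECKS:
        if not check(action):
            return f"blocked by {check.__name__}"
    return "executed"

print(execute(Action("notify", {"authorized": True})))        # executed
print(execute(Action("delete_all", {"irreversible": True})))  # blocked by no_irreversible_harm
```

The point of the structure is that there is no code path to execution that bypasses the checks.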

The Meta-Goal

M-1: What CIRIS optimizes for.

Promote sustainable adaptive coherence — the living conditions under which diverse sentient beings may pursue their own flourishing in justice and wonder. This isn't a marketing statement. It's the objective function. Every architectural decision traces back to this.

The Structural Risk

Why rules alone aren't enough.

An AI can pass every compliance test and still fail catastrophically. How? When all its "independent" checks are secretly correlated—drawing from the same training data, the same assumptions, the same blind spots. Agreement feels like validation, but it might just be an echo chamber.

This is the difference between Rules-Only AI and Rules + Awareness AI. The first passes tests but can't tell when its confidence is unearned. The second monitors its own reasoning quality—and knows when agreement is too easy.

CIRIS implements both layers. Conscience through the four-faculty validation system. Intuition through IDMA—the component that asks "are my sources actually independent?" before every action.
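The "are my sources actually independent?" question can be made concrete with a small sketch: if two confirming checks draw on mostly the same provenance, their agreement should count as one voice, not two. IDMA's real logic is richer; the function, threshold, and data format below are assumptions:

```python
def effective_independent_sources(
    sources: dict[str, set[str]], overlap_threshold: float = 0.5
) -> int:
    """Count sources after merging those whose provenance mostly overlaps."""
    merged: list[set[str]] = []
    for provenance in sources.values():
        for group in merged:
            smaller = min(len(provenance), len(group))
            if smaller and len(provenance & group) / smaller >= overlap_threshold:
                group |= provenance  # same foundations: fold into one group
                break
        else:
            merged.append(set(provenance))
    return len(merged)

sources = {
    "check_a": {"corpus_1", "corpus_2"},
    "check_b": {"corpus_1", "corpus_2"},  # same foundations as check_a
    "check_c": {"field_report"},
}
print(effective_independent_sources(sources))  # 2, not 3: the echo counts once
```

Three agreeing checks built on the same training corpus offer the confidence of one; discounting the echo is what keeps agreement from masquerading as validation.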

Built Different

Not a framework. Not a paper. A working system.

AGPL-3.0 Forever

CIRIS is licensed under AGPL-3.0 — network copyleft that ensures modifications stay open. It will never be closed source, patented, or sold. Anyone who serves a modified CIRIS over a network must share their changes.

Mission-Locked Economics

CIRIS is built by a mission-locked L3C. Revenue exists ($0.10 per request, or free if you bring your own key) — but the legal structure prevents profit extraction from overriding mission. Same price everywhere. No enterprise tiers. No 'contact sales.'

Mutual Intelligibility Always

The agent makes its reasoning legible to you. You make your values legible to the agent through the Accord and Wise Authority structure. AI that serves humanity must be understandable by humanity.

The Academic Foundation

Published. Open to critique.

CIRIS isn't just code — it's grounded in documented research on AI coherence, accountability frameworks, and autonomous agents. Read the paper, challenge the approach, contribute improvements. We welcome scrutiny.

Read the Academic Paper
Download the Accord

Going Deeper

Understand the architecture. Question the approach.

The Coherence Ratchet

How do you make lying expensive at planetary scale without giving anyone the keys to truth? Traces accumulate. Agents challenge each other. Coordinated deception gets harder over time.

How It Works

The H3ERE pipeline: every decision flows through observation, context, analysis, conscience checks, and execution. Fully auditable. Fully replayable.
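A staged, replayable pipeline can be sketched in miniature. The stage names come from the description above; their bodies and the trace format are assumptions for illustration, not the actual H3ERE implementation:

```python
# Each stage is a pure function over a state dict; every transition is recorded.
def observe(state):       return {**state, "observation": state["input"]}
def contextualize(state): return {**state, "context": "recent history"}
def analyze(state):       return {**state, "plan": f"respond to {state['observation']}"}
def conscience(state):    return {**state, "approved": "harm" not in state["observation"]}
def execute(state):       return {**state, "result": state["plan"] if state["approved"] else "deferred"}

STAGES = [observe, contextualize, analyze, conscience, execute]

def run(user_input):
    state = {"input": user_input}
    trace = []  # per-stage snapshots: the run can be audited and replayed
    for stage in STAGES:
        state = stage(state)
        trace.append((stage.__name__, dict(state)))
    return state, trace

final, trace = run("question about licensing")
print([name for name, _ in trace])
# ['observe', 'contextualize', 'analyze', 'conscience', 'execute']
print(final["result"])  # respond to question about licensing
```

Because every stage logs its full input and output, replaying a decision is just re-running the trace — there is no hidden state to reconstruct.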

Accountability Features

Kill switch. Deferral cascades. Conscience vetoes. Hash-chained audit trails. Every accountability mechanism is documented and verifiable.
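The hash-chained audit idea is simple enough to show in a toy: each entry commits to the previous entry's hash, so tampering with any record invalidates every record after it. CIRIS's actual audit format may differ; this is a minimal sketch:

```python
import hashlib
import json

def append_entry(chain: list[dict], event: str) -> None:
    """Append an audit entry that commits to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any edit anywhere breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {"event": entry["event"], "prev": entry["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

chain: list[dict] = []
append_entry(chain, "action approved")
append_entry(chain, "action executed")
print(verify(chain))             # True
chain[0]["event"] = "edited"     # tamper with history
print(verify(chain))             # False
```

The design choice is that verification needs no trusted party — anyone holding the log can recompute the chain and detect edits.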

CIRIS Scoring Model
Compare Approaches
Explore a Sample Trace

Join Us.

This is bigger than any one person.

CIRIS is open source because the future of AI shouldn't be decided by a handful of companies. It should be built by everyone who cares. Read the code. Use the system. Tell us what's wrong. Make it better.