Quick Start

Get Kernora running in 30 seconds.

Installation

Run the installer:

curl -fsSL https://kernora.ai/install | bash

Setup Wizard

The installer launches an interactive wizard that asks three questions:

  1. LLM Provider — Choose your AI provider (Anthropic, OpenAI, Google, or Ollama). Kernora auto-detects your API keys from environment variables.
  2. Local Backup — Optional folder for iCloud or Dropbox sync. Leave blank to store everything locally at ~/.kernora.
  3. Team Mode — Enable S3 team sync? (Requires AWS credentials. Optional. Default: local only.)

That's It

Once setup completes, use Claude Code normally. When your session ends, Kernora automatically:

  • Captures your session transcript
  • Extracts patterns, decisions, bugs, and rules
  • Stores everything in encrypted SQLite
  • Injects relevant context into your next session

Your knowledge is preserved. Your decisions are never lost again.

CLI Reference

All commands run from your terminal. The nora and kernora commands are identical; nora is just shorter to type.

  • status — Check daemon status, database size, and session counts
  • kiq — Display your Knowledge Intelligence Quotient score (0-100, updated daily)
  • search <query> — Full-text search across patterns, decisions, bugs, and prompts
  • bugs — List top recurring bugs sorted by frequency
  • patterns — Show top patterns by effectiveness score and reusability
  • decisions — Recent architectural decisions with context and impact
  • savings — Token savings from memory injection (cumulative and per-session)
  • freshness — Pattern decay analysis (Intelligence Half-Life metric)
  • self-review <period> — Generate a self-review for a date range (e.g., --since 2026-03-01)
  • export <format> — Export the knowledge base to CSV or markdown
  • dashboard — Open the dashboard at localhost:2742

Configuration

Kernora stores configuration at ~/.kernora/config.toml. Edit directly or use CLI flags to override.

[mode] Section

[mode]
type = "byok"  # "byok" (default) or "team"

With byok, everything runs locally. With team, Kernora syncs knowledge to S3.

[keys] Section

[keys]
anthropic_key = "sk-ant-..."
openai_key = "sk-proj-..."
google_key = "AIzaSy..."

Auto-populated from environment variables. Override here if needed.
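The auto-population step can be sketched as a simple environment lookup. This is an illustrative sketch, not Kernora's actual implementation; the environment variable names are assumptions based on each provider's SDK conventions, and the detect_keys helper is hypothetical.

```python
import os

# Assumed provider-to-environment-variable mapping (SDK conventions).
ENV_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
}

def detect_keys(env=os.environ):
    """Return {provider: key} for every provider whose variable is set."""
    return {p: env[v] for p, v in ENV_VARS.items() if env.get(v)}
```

For example, with only OPENAI_API_KEY set, detect_keys would report a single "openai" entry; values written in [keys] take precedence over whatever is detected.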

[model] Section

[model]
provider = "anthropic"  # "auto" recommended
preferred = "claude-opus"

Default provider for session analysis. Kernora recommends the best provider based on your API keys.

[analysis] Section

[analysis]
run_frequency = "on_session_end"
extract_patterns = true
extract_decisions = true

[dashboard] Section

[dashboard]
port = 2742
auto_open = true

[backup] Section

[backup]
local_sync_folder = "~/Dropbox/kernora"

Optional folder for cloud backup. Kernora keeps this in sync.

[swarm] Section (Team Mode Only)

[swarm]
s3_bucket = "kernora-team-abc123"
aws_region = "us-east-1"
team_id = "team-xyz789"

Required if [mode] type = "team". Kernora syncs your knowledge base to S3.

How It Works

Kernora runs silently in the background, learning from every session you have in Claude Code.

The Pipeline

  1. Session Capture — When you end a Claude Code session, Kernora hooks the transcript.
  2. Spool to Daemon — Transcript spools to a local daemon running at localhost:2741.
  3. LLM Analysis — Daemon sends transcript to your configured AI provider (Claude, GPT-4, Gemini, or Ollama).
  4. Extraction — The LLM extracts structured data:
    • session_type — Feature build, debugging, refactor, learning
    • patterns — Reusable approaches, anti-patterns, proven workflows
    • decisions — Architectural choices with rationale and tradeoffs
    • bugs — Recurring issues, root causes, fixes
    • rules — Team conventions, language-specific best practices
    • prompts — Effective AI prompts for future reuse
  5. Storage — All data encrypted and stored in SQLite at ~/.kernora/echo.db.
  6. Dashboard — Web dashboard (localhost:2742) reads the database and displays insights.
  7. Hot Memory — Next time you start a session, Kernora injects relevant patterns, decisions, and context directly into the LLM's prompt.
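The hot-memory step above can be sketched as a small prompt-assembly function. This is a minimal illustration of the idea, not Kernora's actual injection code; the function name, record shapes, and limit parameter are all assumptions.

```python
def build_hot_memory(patterns, decisions, prompt, limit=3):
    """Prepend the highest-confidence patterns and most recent
    decisions to the user's prompt as a context preamble."""
    top = sorted(patterns, key=lambda p: p["confidence"], reverse=True)[:limit]
    lines = ["# Context from previous sessions"]
    lines += [f"- Pattern: {p['text']}" for p in top]
    lines += [f"- Decision: {d}" for d in decisions[:limit]]
    return "\n".join(lines) + "\n\n" + prompt
```

The key design point is that injection is additive: the original prompt is untouched, and only a bounded number of high-confidence items are prepended, keeping token overhead predictable.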

Zero Data Exfiltration

Your sessions never leave your machine except when YOU explicitly sync to S3 (team mode only). All analysis happens locally with your own API credentials.

Intelligence Half-Life

Patterns decay over time. A decision made 6 months ago is less relevant than one from last week. Kernora tracks freshness and automatically lowers a pattern's confidence as it ages. The nora freshness command shows you which patterns are stale.
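A half-life decay can be sketched as simple exponential scaling. This is an illustrative model only; Kernora's actual decay curve and half-life value are not documented here, so the 90-day default below is a placeholder assumption.

```python
def decayed_confidence(confidence, age_days, half_life_days=90):
    """Exponential decay: confidence halves every half_life_days.

    A pattern scored 0.8 today would read 0.4 after one half-life
    and 0.2 after two, under the assumed 90-day half-life.
    """
    return confidence * 0.5 ** (age_days / half_life_days)
```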

Uninstall

To remove Kernora:

Keep Your Data

Remove Kernora but preserve all captured knowledge:

bash ~/.kernora/app/uninstall.sh

Your database and configuration remain at ~/.kernora. You can reinstall later and pick up where you left off.

Full Purge

Remove Kernora completely, including all data:

bash ~/.kernora/app/uninstall.sh --purge

This deletes everything at ~/.kernora. The action is irreversible.