You are building the second brain you have always wanted, but you have already burned a weekend on CLAUDE.md iteration and your wiki output is still inconsistent. This page collects everything on aillm.wiki that is written specifically for developers running an LLM Wiki on Claude Code, Cursor, or any CLI-friendly LLM. If you are new to the pattern itself, the plain-English primer takes about 5 minutes. Otherwise, jump straight to the technical guides below.
Every Karpathy LLM Wiki tutorial assumes you can write a working CLAUDE.md on the first try. Most developers cannot — not because they lack the skill, but because the schema is the hard part, not the code. Tuning a schema until the LLM writes consistently takes 5-10 compiles, which is 5-10 weekends if you are doing it on the side. We have done that work for five common project shapes. If you would rather not redo it, the Starter Kit is the shortcut.
If you want to do it yourself, the resources below are the same ones we used.
CLAUDE.md, schema, ingest script, and lint loop. These pieces will get you to a working wiki in a weekend. If you want more depth, every blog post on aillm.wiki cross-links to related material at the bottom.
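To make the lint-loop piece concrete, here is a minimal sketch of what a schema lint check can look like. The section names and the `lint_page` function are illustrative assumptions, not the actual contents of any published lint.py:

```python
import re

# Hypothetical required sections; a real lint would read these from schema.md.
REQUIRED_SECTIONS = ["Overview", "Decisions", "Open Questions"]

def lint_page(markdown_text):
    """Return a list of schema violations for one wiki page."""
    problems = []
    # Collect all level-2 headings in the page.
    headings = re.findall(r"^##\s+(.+)$", markdown_text, flags=re.MULTILINE)
    for section in REQUIRED_SECTIONS:
        if section not in headings:
            problems.append(f"missing required section: ## {section}")
    return problems
```

The loop part is then just: generate a page, run the lint, and feed any violations back to the LLM as a correction prompt until the list comes back empty.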
We publish 3 production-grade schemas for free — General, Research, and Engineering. The Engineering schema is the one most developers want; it covers modules, decisions (ADR-style), issues, dependencies, and runbooks. Drop it into your project's meta/schema.md and you have a working starting point.
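To show the shape of a schema page type, here is a minimal sketch of an ADR-style decision entry. The section and front-matter names are illustrative, not the exact ones in the published Engineering schema:

```markdown
## Page type: decision

One page per architectural decision, filed under `decisions/`.

Required front matter:
- `status:` one of proposed | accepted | superseded
- `date:` ISO 8601

Required sections:
- `## Context` — what forced the decision
- `## Decision` — one paragraph, active voice
- `## Consequences` — trade-offs accepted, follow-up work
```

A schema like this is what makes the LLM's output consistent: every generated decision page has the same sections in the same order, so both humans and the LLM can navigate the wiki predictably.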
The remaining 2 schemas (Product and SEO) plus the full ingest pipeline ship in the Starter Kit.
If you have read the three core posts and want to skip the schema-tuning weekend, the Starter Kit is the shortcut. It includes:
- CLAUDE.md files (general, research, engineering, product, SEO)
- schema.md blueprints
- ingest.py with retries and diffing
- lint.py script with auto-fix suggestions

Launch price is $19 for waitlist subscribers, $29 after. One-time purchase, no subscription.
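The "retries and diffing" parts of an ingest script can be sketched in a few lines. This is a hypothetical pattern, not the kit's actual ingest.py; `generate` stands in for whatever LLM call you wire up:

```python
import difflib
import time

def generate_with_retries(generate, source_text, max_retries=3, delay=0.0):
    """Call an LLM page generator, retrying with backoff on failure.

    `generate` is any callable that takes source text and returns a
    markdown page, raising on API errors (hypothetical interface).
    """
    last_err = None
    for attempt in range(max_retries):
        try:
            return generate(source_text)
        except Exception as err:
            last_err = err
            time.sleep(delay * (2 ** attempt))  # exponential backoff
    raise RuntimeError(f"ingest failed after {max_retries} attempts") from last_err

def page_diff(old_page, new_page, path="wiki/page.md"):
    """Show what a regeneration changed before overwriting the page."""
    return "".join(difflib.unified_diff(
        old_page.splitlines(keepends=True),
        new_page.splitlines(keepends=True),
        fromfile=f"a/{path}", tofile=f"b/{path}",
    ))
```

Reviewing the diff before writing is the cheap insurance here: regenerating a page should usually change a few lines, and a diff that rewrites everything is a sign the schema or prompt has drifted.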
Get the $19 launch price + an early-access copy when it ships
Yes. The only change is that Cursor reads .cursorrules instead of CLAUDE.md; everything else in the pattern is identical. The Starter Kit ships with both formats.
For toy wikis (under 20 pages), free tiers are fine. For real use, you will want at least Claude Pro or a small API budget. Most developers spend $5-15/month on LLM API calls for a single-user wiki.
Yes, but the schema-following quality drops noticeably with smaller models. Llama 3.3 70B is the smallest model we have seen produce consistently usable LLM Wiki output. Anything smaller and you will spend more time fixing the wiki than reading it.
The pattern is just a folder of markdown files, so yes — drop it into Next.js, Astro, MkDocs, or anything else that renders markdown. Several readers run their LLM Wiki as a public-facing knowledge base for their team or product.
We send an occasional email with new schema patterns, fresh implementations, and the best community discussions — no fixed schedule, only when there is something worth your time. Subscribe below.