10 mistakes to avoid in your AI rules file
Common ways AI rules files fail in practice — and how to fix each one.
1. Vague platitudes
Write clean, readable code. Follow best practices. Use TypeScript.
The model already knows abstract good practice. Vague rules give it no leverage on your project specifically.
Fix: be concrete. Replace "write clean code" with specific patterns: "use named exports only," "no any — use unknown," "keep components under 100 lines."
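A concrete rule is checkable in a diff. A minimal sketch of what the "named exports" and "no `any`" rules look like when followed (the `parseConfig` helper is made up for illustration):

```typescript
// Rule: named exports only — no `export default`.
// Rule: no `any` — accept `unknown` and narrow it.

export interface Config {
  port: number;
}

// Hypothetical helper: turn an untrusted JSON payload into a typed config.
export function parseConfig(raw: unknown): Config {
  // `unknown` forces an explicit check before use, where `any` would not.
  if (
    typeof raw === "object" &&
    raw !== null &&
    typeof (raw as { port?: unknown }).port === "number"
  ) {
    return { port: (raw as { port: number }).port };
  }
  throw new Error("invalid config");
}
```

Rules at this level of specificity are ones the model can actually apply, and ones a reviewer can actually enforce.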
2. Outdated framework versions
This is a Next.js project. Use getServerSideProps for data fetching.
The rule was right when written. Then App Router happened. Now the AI confidently uses an obsolete API every turn.
Fix: pin versions explicitly ("Next.js 15, App Router only") and review on every framework upgrade. Search the rules file for the framework name when you upgrade — anywhere it appears is a candidate for staleness.
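A version-pinned entry in the rules file might read like this (the exact wording is illustrative, not a template you must copy):

```markdown
## Framework

- Next.js 15, App Router only. Never use `getServerSideProps`,
  `getStaticProps`, or anything from the `pages/` router.
- Data fetching happens in Server Components or route handlers.
- Re-check this section on every Next.js upgrade.
```

Naming the forbidden API explicitly matters: the model has seen years of `getServerSideProps` in training data, and a blanket "use the latest patterns" won't override that.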
3. Letting it grow unbounded
A rules file that started at 200 lines is now 1,400. Every team member added their pet rule. The model truncates the second half (Copilot) or barely attends to it (Claude past ~800 lines).
Fix: target 250–400 lines. When the file grows past 600, audit:
- 30% is usually restating language defaults the model already knows. Cut.
- Per-feature rules that only apply in one subdirectory belong in a CLAUDE.md inside that directory (Claude) or a scoped .mdc file (Cursor).
- Rules that nobody actually enforces in code review belong in the README, not the rules file.
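For the Cursor case, a scoped `.mdc` file uses frontmatter to limit where the rule applies. A sketch, assuming Cursor's frontmatter fields (`description`, `globs`) — verify against the version you're running:

```markdown
---
description: API route conventions
globs: src/api/**
---

- All handlers return a typed error envelope; never throw raw errors
  across the route boundary.
```

Rules scoped this way stop costing attention budget in every other part of the codebase.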
4. Leaking secrets
Our database URL is postgres://prod-user:correct-horse-battery-staple@db.prod...
Don't laugh — it happens. The rules file is committed to git.
Fix: never put secrets in the rules file. Reference env vars by name, never by value. Add *.env* to .gitignore and verify before every commit.
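In the rules file, name the variable rather than the value (the variable names here are illustrative):

```markdown
## Environment

- The database connection string lives in `DATABASE_URL`
  (see `.env.example` for the full list of required variables).
- Never write credential values into this file or into code;
  read them from the environment at runtime.
```

This gives the AI everything it needs (which variable to reference) without giving git anything it shouldn't have.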
5. Single-file in a monorepo
One root rules file, three packages with three different stacks (a Next.js app, a Go API, a Python data pipeline). The AI gets contradictory rules and follows the wrong ones.
Fix: for monorepos, use the tool's nested-file support. Claude merges nested CLAUDE.md files. Cursor .mdc lets you scope rules with globs. For tools without nesting (Copilot, Aider), keep the root file generic and rely on README conventions per package.
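For the three-package example above, a nested layout under Claude's convention might look like this (directory names are assumptions for illustration):

```
CLAUDE.md               # shared rules only: repo layout, commit style
apps/web/CLAUDE.md      # Next.js rules: App Router, Tailwind, etc.
services/api/CLAUDE.md  # Go rules: error handling, package layout
pipeline/CLAUDE.md      # Python rules: typing, test runner
```

The root file stays small and stack-agnostic; each nested file carries only the conventions that apply inside its subtree.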
6. Forgetting behavioral rules
The file lists every code convention but says nothing about how the AI should behave. Result: it installs random dependencies, modifies migrations, and ships unrelated refactors in your PR.
Fix: add explicit behavioral rules:
- Ask before installing dependencies.
- Run the linter and tests before declaring a task complete.
- Never modify a database migration after it's been committed.
- Don't refactor unrelated code in the same change.
- Edit existing files when possible; only create new ones when needed.
These are project policies the model would otherwise violate confidently. Every prevented mistake is a turn you didn't waste correcting.
7. Copy-pasted from someone else's repo
Someone's .cursorrules from a high-star GitHub repo looked great. You copied it. Now your project has rules for a stack you don't use, mentioning libraries you don't have, referencing patterns that don't apply.
Fix: start from a stack-matched template (e.g. the library on this site has 20 hand-curated stacks) and edit. Or generate from scratch with the wizard. Don't copy a stranger's repo wholesale.
8. Rules that contradict the actual code
The rules file says "use Drizzle." The codebase still uses Prisma in three places. The AI follows the rules and produces inconsistent code.
Fix: the rules file describes the current state, not aspirations. If you're migrating Prisma → Drizzle, write that explicitly: "Drizzle is the target ORM. Existing Prisma code stays until migrated; new code uses Drizzle."
9. No examples
Every rule is a sentence. No code samples. The model sometimes interprets the sentence in a way you didn't intend.
Fix: for any non-trivial convention, add a short code example. Especially for the "patterns to avoid" section — show the bad pattern explicitly so the model recognizes it.
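For instance, a rule like "don't mutate function arguments" is much harder to misread when the bad pattern is shown next to the good one (the function names are made up for illustration):

```typescript
// Avoid: mutating the caller's array in place.
export function doublePricesBad(prices: number[]): number[] {
  for (let i = 0; i < prices.length; i++) prices[i] *= 2; // caller's data changes
  return prices;
}

// Prefer: return a new array and leave the input untouched.
export function doublePrices(prices: number[]): number[] {
  return prices.map((p) => p * 2);
}
```

A pair like this in the rules file removes the ambiguity a one-sentence rule leaves behind, and gives the model a pattern to match against rather than a principle to interpret.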
10. Set-and-forget
The rules file was written 18 months ago when the project started. Since then: Next.js upgraded twice, Tailwind 3 became Tailwind 4, you switched from Jest to Vitest, you added a new package. None of it is in the file.
Fix: treat the rules file as living code. Update it in the same PR that introduces a convention change. Set a quarterly review reminder. The single biggest predictor of a good rules file is how recently it was edited.
What good looks like
The 20 stack files in the library are written to avoid every mistake on this list. Each one is opinionated, version-pinned, includes examples, has behavioral rules, and was reviewed in the past quarter. Use them as a baseline for what a maintained rules file looks like.