I Tested LNAI: One Config for Claude, Cursor, and Codex

I now have three different AI coding tools installed. Each has its own config format, its own settings, its own quirks. It's a mess. Then I found LNAI — a tool that promises one config to rule them all. Here's what happened when I tried it.

🎯 Jimmy's Take: LNAI is clever, rough around the edges, and solves a real problem. If you're juggling multiple AI coding tools, this saves serious time. But it's not magic — you'll still need to understand what each tool does best.

The Problem Nobody Talks About

Here's my current setup:

  • Cursor: My daily driver. `.cursorrules` file in every project.
  • Claude Code: For complex reasoning tasks. Custom instructions in the CLI.
  • Codex: Just dropped, testing it out. Separate config system.

Each tool wants me to describe my coding preferences. Code style. Architecture patterns. What libraries I prefer. What to avoid.

That's three different places to maintain the same information. Change your mind about something? Update three files. Miss one? Inconsistent AI behavior.

It's annoying. It's error-prone. And it's only getting worse as more AI tools launch.

Enter LNAI

LNAI showed up on Hacker News yesterday. The pitch is simple:

"Define AI coding tool configs once, sync to Claude, Cursor, Codex, and more."

One YAML file. Multiple outputs. Keep your AI tools in sync.

It's open source. It's free. And it solves exactly the problem I was complaining to my coworkers about last week.

So I installed it.

How It Actually Works

LNAI uses a simple YAML format that looks like this:

```yaml
name: my-project-config
version: "1.0"

preferences:
  language: typescript
  framework: nextjs
  style: functional
  testing: vitest

rules:
  - Prefer async/await over callbacks
  - Use explicit types, avoid 'any'
  - Write tests for all utility functions
  - Avoid nested ternaries

context:
  - This is a Next.js 14 app using App Router
  - We use Tailwind for styling
  - Database is PostgreSQL with Prisma ORM
```

You write that once. Then LNAI generates the appropriate config files:

  • `.cursorrules` for Cursor
  • Claude CLI instructions
  • Codex configuration

One source of truth. Consistent AI behavior across tools.
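To make that concrete, here's roughly the kind of `.cursorrules` file a sync could produce from the YAML above. This is my illustrative sketch of the idea, not LNAI's verbatim output — check the actual generated file in your own project:

```text
# Generated by LNAI from lnai.yaml -- edit lnai.yaml, not this file

You are working in a TypeScript / Next.js project. Prefer a functional
style; tests are written with vitest.

Rules:
- Prefer async/await over callbacks
- Use explicit types, avoid 'any'
- Write tests for all utility functions
- Avoid nested ternaries

Context: Next.js 14 app using App Router; Tailwind for styling;
PostgreSQL with Prisma ORM.
```

The Claude and Codex outputs carry the same content, just reshaped into whatever format each tool expects.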

The 15-Minute Test

I set a timer and tried to get LNAI working with my existing project.

Minute 0-3: Install LNAI via npm. No issues.

Minute 3-8: Write my `lnai.yaml` config file. Copied my existing `.cursorrules` content, reorganized into LNAI's format.

Minute 8-12: Run `lnai sync`. It generated the Cursor rules file perfectly. Generated Claude instructions. Generated Codex config.

Minute 12-15: Tested each tool. Asked them to write a simple API route. All three produced code that followed my conventions.

Result: It worked. First try.

What Actually Worked Well

1. The sync command

Run `lnai sync` and it updates all your config files instantly. Change your mind about something? One edit, one command, done.

2. Smart defaults

LNAI knows how each tool likes to receive instructions. It translates your generic preferences into tool-specific formats automatically.

3. Git-friendly

Since the generated files are... generated, you can `.gitignore` them. Only commit `lnai.yaml`. Your team syncs configs by editing one file.
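Concretely, the repo's `.gitignore` ends up looking something like this (the exact generated filenames depend on LNAI — `.cursorrules` is the only one named in its docs that I verified):

```text
# Only lnai.yaml is the source of truth; generated configs stay out of git
.cursorrules          # generated for Cursor
# ...plus whatever files `lnai sync` writes for Claude and Codex
```

Teammates then run `lnai sync` after pulling to regenerate their local copies.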

4. Validation

LNAI validates your YAML and warns if you're asking for something a tool doesn't support. Caught me trying to specify a Codex feature that doesn't exist yet.

The Rough Edges

It's not perfect. Here are the issues I hit:

1. Limited tool support

Right now it supports Cursor, Claude CLI, and Codex. That's it. No GitHub Copilot. No Windsurf. No custom OpenAI API setups.

2. Advanced features missing

Cursor has `.cursorrules` plus project-specific settings plus custom slash commands. LNAI handles the rules file well, but doesn't touch the rest.

3. Formatting quibbles

The generated Claude instructions were solid but verbose. The tool tends to over-explain. I had to manually trim them down.

4. No IDE integration yet

You run LNAI from the command line. No VS Code extension. No automatic syncing on file save. You have to remember to run it.
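Until an editor extension shows up, one cheap workaround is a git hook, so you at least never commit with stale configs — a sketch, assuming `lnai` is on your PATH:

```shell
#!/bin/sh
# .git/hooks/pre-commit -- regenerate tool configs before every commit
lnai sync
```

Save it as `.git/hooks/pre-commit` and `chmod +x` it. (A file watcher would work too; the hook is just the lowest-effort option.)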

Real-World Test: Building a Feature

I wanted to see if LNAI actually mattered for real work. So I built a feature three times — once with each tool, using LNAI-synced configs.

The task: Add user authentication to a Next.js app.

With Cursor: Generated solid auth middleware. Used the patterns I specified. One small issue — it suggested a deprecated Next-Auth approach. Caught it during review.

With Claude Code: More thorough. Asked clarifying questions. Generated comprehensive tests. Took longer but higher quality.

With Codex: Fastest. Wrote working code in one shot. But less commentary, harder to understand what it was doing.

The interesting part? All three respected my conventions. Same code style. Same patterns. Same architecture. That's the LNAI promise working.

Without LNAI? I'd have three different styles based on what each tool defaulted to.

Who Should Use This?

Use LNAI if:

  • You switch between AI coding tools regularly
  • You work on multiple projects with different conventions
  • You're on a team and want consistent AI behavior
  • You hate maintaining config files

Skip it if:

  • You only use one AI tool
  • You like fine-tuning each tool individually
  • You need features LNAI doesn't support yet

The Bottom Line

LNAI solves a real problem that gets worse every time a new AI coding tool launches. It's not revolutionary — it's just... sensible.

In a world where everyone's racing to add more AI features, LNAI is the rare tool that subtracts complexity. One config. Consistent behavior. Less mental overhead.

Is it essential? No. You can maintain separate configs manually.

Will it save you time? Yes. Especially if you're like me and can't stop trying every new AI tool that drops.

Is it the future? Maybe. If the author keeps adding tool support and IDE integrations, this becomes a no-brainer. For now, it's a useful tool that solves a niche problem really well.

I'm keeping it installed.

Quick Setup Guide

Want to try it? Here's the 2-minute version:

```shell
# Install
npm install -g lnai

# Initialize in your project
cd your-project
lnai init

# Edit lnai.yaml with your preferences
# Then sync to all tools
lnai sync
```

That's it. Your AI tools are now in sync.


Posted 3 hours ago. Tested on macOS with Cursor 0.45, Claude CLI latest, and OpenAI Codex. LNAI is open source at github.com/KrystianJonca/lnai. I have no affiliation with the project — just a fan of tools that reduce friction.