llms.txt for Nuxt

Ship a spec-compliant /llms.txt from your Nuxt site. Drop-in module, runtime Nitro route, or static asset.

Why llms.txt on Nuxt?

Nuxt apps build through Nitro, which serves static assets from public/ and dynamic handlers from server/routes/. That gives you three paths to llms.txt: a build-time module, a static asset, or a runtime route, none of which requires restructuring your app.

If your docs live in Nuxt Content (@nuxt/content), the @mdream/nuxt module reads your rendered pages and produces llms.txt and llms-full.txt automatically at build time. No runtime work, and the output regenerates on every deploy.

Install options

Pick whichever fits your stack. For most teams the module or a static file is enough; reach for the dynamic route when you need request-time content.

Recommended — module

Install the official module. Reads your pages at build and writes llms.txt into the output.

```bash
npx nuxi module add @mdream/nuxt
```
```ts
export default defineNuxtConfig({
  modules: ['@mdream/nuxt'],
})
```

@mdream/nuxt docs →

Static file

Drop a file at public/llms.txt. No build step, no server logic.

```md
# My Nuxt Project

> A short summary of what this project does and who it is for.

## Docs

- [Getting Started](https://example.com/docs/getting-started): Quick setup guide
- [API Reference](https://example.com/docs/api): Full API docs

## Examples

- [Starter Template](https://example.com/examples/starter): Minimal Nuxt starter

```

Dynamic route

Generate at request time from a CMS, MDX collection, or database. Handler at server/routes/llms.txt.ts.

```ts
export default defineEventHandler(async (event) => {
  setResponseHeader(event, 'Content-Type', 'text/plain; charset=utf-8')
  setResponseHeader(event, 'Cache-Control', 'public, max-age=0, s-maxage=3600, stale-while-revalidate=86400')
  return await buildLlmsTxt()
})

async function buildLlmsTxt() {
  // Replace with your own content source (Nuxt Content, filesystem…)
  return `# My Nuxt Project\n\n> Short summary.\n\n## Docs\n\n- [Getting Started](https://example.com/docs/getting-started)\n`
}
```

Nitro server route. Runs on every deploy target Nuxt supports — Node, Cloudflare, Vercel, Netlify, Deno.

Serve markdown to agents

Claude Code, Cursor, and Codex increasingly request markdown via the Accept: text/markdown header. Pairing your llms.txt with per-page markdown responses makes your whole site legible to agents, not just the index.
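One way to detect such requests is a small content-negotiation helper. The sketch below is illustrative, not part of any Nuxt or Nitro API: wantsMarkdown is a hypothetical helper, and the middleware wiring in the trailing comment assumes you keep a pre-rendered markdown variant of each page.

```typescript
// Decide whether a request prefers markdown based on its Accept header.
// Simplified: looks for a text/markdown entry without parsing q-values.
function wantsMarkdown(accept: string | undefined): boolean {
  return (accept ?? '')
    .split(',')
    .some((part) => part.trim().startsWith('text/markdown'))
}

// In a Nitro middleware (e.g. server/middleware/markdown.ts) you could
// then branch on it:
//
//   export default defineEventHandler((event) => {
//     if (wantsMarkdown(getHeader(event, 'accept'))) {
//       // serve the pre-rendered .md variant of the page
//     }
//   })
```

A production version would parse q-values properly, but a prefix check covers the headers these agents actually send.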

Generate llms.txt from your URL or browse the llms.txt directory for real examples.

Frequently Asked Questions

How do I add llms.txt to a Nuxt site?

Install @mdream/nuxt with nuxi module add @mdream/nuxt. The module reads your rendered pages and writes llms.txt and llms-full.txt into the build output. Both files are generated at build time, so there is no runtime overhead.

Does it work with Nuxt Content?

Yes. @mdream/nuxt works alongside @nuxt/content and will include your rendered markdown pages. If you want full control, a Nitro server route at server/routes/llms.txt.ts can call queryCollection() (Nuxt Content v3) or queryContent() (v2) directly and format the output yourself.
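As a sketch of that "format the output yourself" step, a pure helper can turn a page list into the llms.txt shape shown earlier. The DocPage type and formatLlmsTxt name are hypothetical; map whatever queryCollection()/queryContent() returns onto this shape.

```typescript
// Minimal page shape; map your Nuxt Content documents onto it.
interface DocPage {
  title: string
  path: string // e.g. '/docs/getting-started'
  description?: string
}

// Build a spec-shaped llms.txt: H1 title, blockquote summary, link list.
function formatLlmsTxt(
  site: string,
  summary: string,
  origin: string,
  pages: DocPage[],
): string {
  const lines = [`# ${site}`, '', `> ${summary}`, '', '## Docs', '']
  for (const page of pages) {
    const desc = page.description ? `: ${page.description}` : ''
    lines.push(`- [${page.title}](${origin}${page.path})${desc}`)
  }
  return lines.join('\n') + '\n'
}
```

Keeping the formatter pure means the same function works in a build script, a Nitro route, or a test.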

Does it work on Cloudflare, Vercel, and other deploy targets?

Yes. The module writes a static file into .output/public/, so every Nitro preset (cloudflare-pages, vercel, netlify, node-server, bun, deno) serves it with no extra configuration. Dynamic server routes also work on every preset.
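For example, pinning a preset explicitly changes nothing about how the file is produced. The cloudflare-pages choice here is purely illustrative; Nitro usually auto-detects the preset from the deploy environment.

```typescript
export default defineNuxtConfig({
  modules: ['@mdream/nuxt'],
  nitro: {
    // Explicit preset; normally auto-detected on the host.
    preset: 'cloudflare-pages',
  },
})
```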

What about llms-full.txt?

@mdream/nuxt generates both llms.txt and llms-full.txt by default. llms-full.txt embeds the full markdown of every page inline, which is useful for RAG pipelines and chat-with-docs apps. No extra configuration is needed.

Should I generate llms.txt or commit it by hand?

Prefer build-time generation via @mdream/nuxt — your llms.txt stays in sync with your pages automatically. Commit a static public/llms.txt only when you want a fully hand-curated file that never changes.