llms.txt for SvelteKit

Ship a spec-compliant /llms.txt from any SvelteKit app: as a hand-written static asset, or as a +server.ts endpoint that builds it from MDsveX content.

Why llms.txt on SvelteKit?

SvelteKit serves everything under static/ directly, so a hand-written llms.txt lands at the site root with no routing needed. For dynamic content, a +server.ts endpoint can return text/plain and is portable across every adapter: adapter-vercel, adapter-cloudflare, adapter-node, and adapter-static.

If you build docs with MDsveX or a content collection, mark the endpoint with export const prerender = true so llms.txt ships as a static asset on every deploy. Zero runtime cost, stays in sync with your pages.

Install options

Pick whichever fits your stack. For most teams a static file is enough; reach for the dynamic route when you need content generated from your source of truth.

Static file

Drop a file at static/llms.txt. No build step, no server logic.

```md
# My SvelteKit Project

> A short summary of what this project does and who it is for.

## Docs

- [Getting Started](https://example.com/docs/getting-started): Quick setup guide
- [API Reference](https://example.com/docs/api): Full API docs

```

Dynamic route

Generate at request time from a CMS, MDsveX collection, or database. The handler lives at src/routes/llms.txt/+server.ts.

```ts
// true = build once per deploy; set false (the default) to generate per request
export const prerender = true

export async function GET() {
  const body = await buildLlmsTxt()
  return new Response(body, {
    headers: {
      'Content-Type': 'text/plain; charset=utf-8',
      'Cache-Control': 'public, max-age=0, s-maxage=3600, stale-while-revalidate=86400',
    },
  })
}

async function buildLlmsTxt() {
  // Replace with your MDsveX collection or CMS query
  return `# My SvelteKit Project\n\n> Short summary.\n\n## Docs\n\n- [Getting Started](https://example.com/docs/getting-started)\n`
}

```

This SvelteKit endpoint works on any adapter. Keep prerender = true for static output, or set it to false for request-time generation.

Serve markdown to agents

Claude Code, Cursor, and Codex increasingly request markdown via the Accept: text/markdown header. Pairing your llms.txt with per-page markdown responses makes your whole site legible to agents, not just the index.
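A minimal sketch of that negotiation, written framework-free so the logic is easy to test (in SvelteKit the check would live in a handle hook in src/hooks.server.ts; the /docs/*.md variant convention below is an assumption, not part of the spec):

```typescript
// Does the client's Accept header express a preference for markdown?
export function prefersMarkdown(accept: string | null): boolean {
  return (accept ?? '').toLowerCase().includes('text/markdown')
}

// Map a docs route to a hypothetical markdown variant of the same page.
export function markdownVariant(pathname: string): string | null {
  return pathname.startsWith('/docs/') ? `${pathname}.md` : null
}

// In src/hooks.server.ts, a handle hook would wire the two together:
//
//   export const handle = async ({ event, resolve }) => {
//     const target = prefersMarkdown(event.request.headers.get('accept'))
//       ? markdownVariant(event.url.pathname)
//       : null
//     if (target) {
//       return new Response(null, { status: 307, headers: { Location: target } })
//     }
//     return resolve(event)
//   }
```

Redirecting with 307 keeps the original request method; serving the markdown body directly from the hook is equally valid if your pages already exist in both forms.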


Frequently Asked Questions

Should I use static/llms.txt or a +server.ts route?

Use static/llms.txt for a hand-curated file you rarely change. Use src/routes/llms.txt/+server.ts when you want to generate the file from your content collection. Don't ship both: the static asset wins and the endpoint never runs.

Can I prerender llms.txt with adapter-static?

Yes, as long as you add export const prerender = true. The adapter pre-renders the endpoint into a static llms.txt file at build time. This also works with adapter-cloudflare, adapter-vercel, adapter-netlify, and adapter-node.

How do I build llms.txt from my markdown posts?

In your +server.ts endpoint, use import.meta.glob() to import all your markdown posts, iterate over them, and return a formatted llms.txt body. Prerender the route so the file builds once per deploy.
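As a sketch, the collection query can be separated from the formatting so the latter stays pure and testable. The doc shape and URL layout below are assumptions; in a real endpoint the docs array would come from Vite's import.meta.glob:

```typescript
// Hypothetical doc shape — adapt to your frontmatter fields.
type Doc = { title: string; slug: string }

// Pure formatter: turn a doc list into a spec-shaped llms.txt body.
export function formatLlmsTxt(docs: Doc[], base = 'https://example.com'): string {
  const links = docs.map((d) => `- [${d.title}](${base}/docs/${d.slug})`)
  return `# My SvelteKit Project\n\n> Short summary.\n\n## Docs\n\n${links.join('\n')}\n`
}

// In src/routes/llms.txt/+server.ts, a glob import would feed the formatter:
//
//   const modules = import.meta.glob('/src/posts/*.md', { eager: true })
//   const docs = Object.entries(modules).map(([path, mod]) => ({
//     title: (mod as { metadata: { title: string } }).metadata.title,
//     slug: path.replace('/src/posts/', '').replace(/\.md$/, ''),
//   }))
//   return new Response(formatLlmsTxt(docs), {
//     headers: { 'Content-Type': 'text/plain; charset=utf-8' },
//   })
```

Keeping the formatter pure also means you can unit-test the llms.txt output without spinning up the dev server.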

Which Content-Type should llms.txt use?

text/plain; charset=utf-8 is the safest default. text/markdown; charset=utf-8 is also valid per the spec and plays well with agents that negotiate on Accept: text/markdown. Either works; pick text/plain if you want zero compatibility risk.