Documentation Index

Fetch the complete documentation index at: https://docs.sdk.anghami.com/llms.txt

Use this file to discover all available pages before exploring further.

/.well-known/llms.txt is a single markdown document optimized for being dropped into an LLM context window. It follows the llmstxt.org convention.

What’s in it

  • Identity — what the API is, who it’s for, the base URL.
  • Authentication — API key vs OAuth, header names, scopes.
  • Pagination — the cursor scheme.
  • Service inventory — one line per service with its purpose.
  • Pointers — the bundled OpenAPI URL, the docs site URL, this page.

The goal is to give an LLM enough scaffolding in a few hundred tokens to plan correct calls without reading the full OpenAPI (which is tens of thousands of tokens).
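Concretely, a file following the llmstxt.org convention has roughly this shape (an illustrative sketch mirroring the sections above — the actual titles, links, and wording come from the real file):

```markdown
# Anghami + OSN+ SDK

> One-paragraph identity: what the API is, who it is for, the base URL.

## Authentication

- API key vs OAuth, header names, scopes

## Pagination

- The cursor scheme, in a line or two

## Services

- [Service name](link): one-line purpose, one bullet per service

## Pointers

- Bundled OpenAPI URL, docs site URL, this page
```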

How to consume it

As a system prompt component

const res = await fetch("https://docs.sdk.anghami.com/.well-known/llms.txt");
if (!res.ok) throw new Error(`llms.txt fetch failed: ${res.status}`); // fetch only rejects on network errors, not HTTP errors
const llmsTxt = await res.text();

const messages = [
  {
    role: "system",
    content: `You are integrating with the Anghami + OSN+ SDK. Reference:\n\n${llmsTxt}`,
  },
  { role: "user", content: userQuery },
];

For agent planning

Use llms.txt to plan. When a specific operation is needed, fetch the relevant slice of the OpenAPI bundle to get exact request/response shapes — don’t load the full bundle into context.
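The "fetch the relevant slice" step can be sketched as a small helper over the parsed OpenAPI bundle (a sketch: `sliceOpenApi` and the operation IDs are illustrative, not part of the SDK):

```javascript
// Given a parsed OpenAPI document, return only the path + method + operation
// object for one operationId, so just that slice enters the model's context.
function sliceOpenApi(spec, operationId) {
  for (const [path, item] of Object.entries(spec.paths ?? {})) {
    for (const [method, op] of Object.entries(item)) {
      // Skip non-operation keys (e.g. a path-level "parameters" array).
      if (op && op.operationId === operationId) {
        return { path, method, operation: op };
      }
    }
  }
  return null; // unknown operationId
}
```

An agent would call this once per planned operation, after the plan is drawn up from llms.txt, instead of pasting the whole bundle into the prompt.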

For documentation grounding

Mintlify’s built-in “Ask AI” indexes both the MDX content of this site and the bundled OpenAPI — so on this docs site, you don’t need to wire up llms.txt yourself. It’s there for other agents that aren’t already integrated with our docs.

Caching

The file is served with Cache-Control: public, max-age=3600. Cache it for an hour and refetch once the entry expires. We update it whenever the service inventory or auth shape changes.
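A one-entry TTL cache honoring that max-age can be sketched like this (the injectable `fetchText` and clock are there for testability; only the URL comes from this page):

```javascript
const TTL_MS = 3600 * 1000; // matches Cache-Control: max-age=3600

// Returns an async getter that serves a cached copy of llms.txt for up to an
// hour, then refetches. fetchText(url) -> Promise<string>.
function makeLlmsTxtCache(fetchText, now = Date.now) {
  let entry = null; // { text, fetchedAt }
  return async function getLlmsTxt() {
    if (entry && now() - entry.fetchedAt < TTL_MS) return entry.text; // fresh hit
    const text = await fetchText("https://docs.sdk.anghami.com/.well-known/llms.txt");
    entry = { text, fetchedAt: now() };
    return text;
  };
}
```

In production you would pass `(url) => fetch(url).then(r => r.text())` as `fetchText` and keep the default clock.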

Why not just use OpenAPI?

OpenAPI is the machine contract — exact, exhaustive, large. llms.txt is the LLM briefing — concise, opinionated, readable. An LLM that sees only the OpenAPI has to figure out the high-level shape of the API from the operations alone, which wastes tokens and produces hesitant tool calls. An LLM that sees llms.txt first knows what the API is for, and uses the OpenAPI to fill in the exact details. Use both: llms.txt for planning, OpenAPI for execution.