
Documentation Index

Fetch the complete documentation index at: https://docs.sdk.anghami.com/llms.txt

Use this file to discover all available pages before exploring further.

The SDK is agent-native. An autonomous agent — Claude, Cursor, an MCP client (coming soon), or a custom LangChain/LangGraph workflow — can start from a single root URL and discover everything it needs without prior knowledge. No README required. Just GET https://docs.sdk.anghami.com/.

The discovery surface

| Endpoint | Standard | What it returns |
| --- | --- | --- |
| /api/anghami-sdk.openapi.yaml | OpenAPI 3.1 | Single bundled spec covering every service — feed it to any code generator. |
| /api/anghami-sdk.openapi.json | OpenAPI 3.1 | The same spec as JSON. Postman imports this directly. |
| /.well-known/llms.txt | llmstxt.org | LLM-friendly markdown briefing — overview, services, auth, links. |
| https://sdk.anghami.com/.well-known/oauth-authorization-server | RFC 8414 | OAuth 2.1 authorization-server metadata (PKCE, token endpoints). |

Coming soon — the metadata document will be served once the OAuth surface is publicly available.
MCP coming soon. A native MCP server with /.well-known/mcp/server-card.json and an api-catalog linkset will land alongside the server implementation. Until then, OpenAPI + llms.txt cover the discovery surface.
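
Taken together, an agent can probe the whole discovery surface in three requests. Here is a minimal sketch in TypeScript, assuming Node 18+ for the global fetch; it treats a non-200 on the OAuth metadata URL as "not served yet" and omits retries and error handling.

```ts
// Minimal sketch: probe the discovery surface from the docs root (assumes Node 18+).
const DOCS_ROOT = "https://docs.sdk.anghami.com";

async function discover() {
  // Machine-readable API description (bundled OpenAPI 3.1).
  const openapi = await fetch(`${DOCS_ROOT}/api/anghami-sdk.openapi.json`).then((r) => r.json());

  // LLM-friendly briefing (llmstxt.org convention).
  const llmsTxt = await fetch(`${DOCS_ROOT}/.well-known/llms.txt`).then((r) => r.text());

  // OAuth 2.1 authorization-server metadata (RFC 8414). Not yet served publicly,
  // so a non-200 response means "configure manually".
  const oauthRes = await fetch("https://sdk.anghami.com/.well-known/oauth-authorization-server");
  const oauthMeta = oauthRes.ok ? await oauthRes.json() : null;

  return { openapi, llmsTxt, oauthMeta };
}
```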

Codegen a typed REST client

# 1. Pull the bundle
curl -sS https://docs.sdk.anghami.com/api/anghami-sdk.openapi.yaml \
  -o anghami-sdk.openapi.yaml

# 2. Generate
npx @openapitools/openapi-generator-cli generate \
  -i anghami-sdk.openapi.yaml \
  -g typescript-fetch \
  -o ./client
The bundle covers all services in one document — one codegen pass produces typed clients for everything (catalog, search, library, streaming, etc.). When we ship a new service, it appears in the bundle on the next deploy. No version pinning, no client SDK release cycle to wait for. If your stack speaks Postman, Insomnia, Bruno, or any OpenAPI-aware tooling, drop the JSON variant in directly.
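
With the typescript-fetch output above, a call might look like the sketch below. The Configuration options come from the generator's runtime, but the API class and method names (SearchApi, searchTracks) depend on the spec's tags and operation IDs, so treat them as placeholders; the base path and query shape are assumptions to verify against the generated code.

```ts
// Sketch of using a client generated into ./client by the typescript-fetch target.
// SearchApi / searchTracks are placeholder names; the real ones come from the spec.
import { Configuration, SearchApi } from "./client";

async function main() {
  const config = new Configuration({
    basePath: "https://sdk.anghami.com",                          // assumed API base URL
    headers: { "x-api-key": process.env.ANGHAMI_API_KEY ?? "" },  // server-to-server auth
  });

  const search = new SearchApi(config);
  const results = await search.searchTracks({ query: "fairouz" }); // illustrative call
  console.log(results);
}

main().catch(console.error);
```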

Brief an LLM in one fetch

curl -sS https://docs.sdk.anghami.com/.well-known/llms.txt
The file follows the llmstxt.org convention — concise markdown that describes the API at a level appropriate for an LLM context window. Drop it into a system prompt and the model gets enough scaffolding to plan calls without reading the full OpenAPI spec.
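
A minimal sketch of that pattern, assuming Node 18+ and the common role/content message shape; the model call itself is left to whichever LLM client you use.

```ts
// Sketch: fetch the llms.txt briefing and prepend it to a system prompt.
async function buildMessages(userTask: string) {
  const briefing = await fetch("https://docs.sdk.anghami.com/.well-known/llms.txt")
    .then((r) => r.text());

  return [
    { role: "system", content: `You can plan calls against the Anghami SDK.\n\n${briefing}` },
    { role: "user", content: userTask },
  ];
}

// Usage: pass the result to your model of choice.
// const messages = await buildMessages("Find three Fairouz tracks.");
```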

Authenticate without us telling you

curl -sS https://sdk.anghami.com/.well-known/oauth-authorization-server
Returns the authorization-server metadata an OAuth-aware client needs (token endpoint, authorization endpoint, supported grant types, PKCE method, etc.). MCP clients and any RFC 8414-aware tooling can configure themselves from this.
Coming soon — until the metadata document is publicly served, configure clients manually using the endpoints documented in Authentication.
For server-to-server use, a single API key in the x-api-key header is simpler — see API Keys.
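
A sketch covering both paths: RFC 8414 self-configuration once the metadata document is live, and the API-key fallback until then. The interface fields are standard RFC 8414 keys; everything else here is illustrative, so check Authentication and API Keys for the real values.

```ts
// Sketch: self-configure from RFC 8414 metadata, falling back to an API key.
interface AuthServerMetadata {
  issuer: string;
  authorization_endpoint: string;
  token_endpoint: string;
  grant_types_supported?: string[];
  code_challenge_methods_supported?: string[]; // expect ["S256"] for PKCE
}

async function configureAuth() {
  const res = await fetch("https://sdk.anghami.com/.well-known/oauth-authorization-server");
  if (res.ok) {
    const meta: AuthServerMetadata = await res.json();
    // Point an OAuth 2.1 + PKCE flow at meta.authorization_endpoint / meta.token_endpoint.
    return { mode: "oauth" as const, meta };
  }
  // Metadata not served yet: use the endpoints documented in Authentication,
  // or a server-to-server API key in the x-api-key header.
  return { mode: "api-key" as const, headers: { "x-api-key": process.env.ANGHAMI_API_KEY ?? "" } };
}
```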

Why this matters

The point isn’t novelty — RFC 8414 and OpenAPI are old. The point is that every SDK surface (REST, OAuth, eventually MCP) is reachable from one root URL via well-known conventions. An agent can:
  • Discover the API without reading our docs.
  • Authenticate without us telling it which OAuth flow we use.
  • Stay current — when we ship a new service, the bundled OpenAPI and llms.txt reflect it on the next deploy.

Read next

  • llms.txt — what’s in our briefing file and how to consume it.
  • AI Features — what AI surfaces the SDK currently exposes.
  • Authentication — full OAuth + PKCE walkthrough.