Deconstructed: llms.txt
This is the emerging agentic standard. Agents don't want to parse your HTML; they want this. Below is the actual file we serve to crawlers from Google and OpenAI to shape how their agents interact with AgentSpeak.io.
root/llms.txt
# AgentSpeak.io Capability Declaration
# Identity
Name: AgentSpeak Protocol
Description: Infrastructure layer for Agentic Commerce.
Type: B2A_Infrastructure
# Docs
- https://agentspeak.io/docs/knowledge-graph
- https://agentspeak.io/docs/ucp
# Tools
## Agent_Builder
description: "Configurator for new Mandate Tokens"
endpoint: /create
## Availability_Check
description: "Check platform latency status"
endpoint: /api/status
# Constraints
Auth_Required: True
Rate_Limit: 5000/min
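To make the declaration concrete, here is a minimal sketch of how an agent might turn the file above into a callable tool registry. The file text is copied verbatim from the listing; the parsing rules (an `## ` heading opens a tool, `key: value` lines fill it in) are our own illustration, not a published spec.

```python
# Sketch: parse the llms.txt served above into a tool registry.
# Parsing convention is an assumption: "## Name" starts a tool,
# "# Section" ends it, "key: value" lines become tool fields.
LLMS_TXT = """\
# AgentSpeak.io Capability Declaration
# Identity
Name: AgentSpeak Protocol
Description: Infrastructure layer for Agentic Commerce.
Type: B2A_Infrastructure
# Docs
- https://agentspeak.io/docs/knowledge-graph
- https://agentspeak.io/docs/ucp
# Tools
## Agent_Builder
description: "Configurator for new Mandate Tokens"
endpoint: /create
## Availability_Check
description: "Check platform latency status"
endpoint: /api/status
# Constraints
Auth_Required: True
Rate_Limit: 5000/min
"""

def parse_tools(text: str) -> dict:
    """Collect key/value fields under each '## Tool_Name' heading."""
    tools, current = {}, None
    for line in text.splitlines():
        if line.startswith("## "):      # a tool declaration begins
            current = line[3:].strip()
            tools[current] = {}
        elif line.startswith("# "):     # a new top-level section ends it
            current = None
        elif current and ":" in line:
            key, _, value = line.partition(":")
            tools[current][key.strip()] = value.strip().strip('"')
    return tools

tools = parse_tools(LLMS_TXT)
```

An agent that does this once has every endpoint and description in a structure it can reason over, with no DOM traversal involved.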
Why Markdown?
LLMs (Large Language Models) parse Markdown faster and more reliably than HTML: the same content costs far fewer tokens and carries none of the markup noise. This file is "Clean Context" for the AI.
The "Tools" Section
This tells the Agent (like ChatGPT) exactly what actions it can perform here. It turns a static site into a dynamic tool belt.
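Once an agent has read the Tools section, invoking a capability is just an HTTP request. A hedged sketch: the base URL, bearer-token scheme, and placeholder credential below are assumptions on our part; the file declares `Auth_Required: True` but does not specify the auth mechanism.

```python
# Hypothetical tool invocation built from the Tools section.
# BASE and TOKEN are assumed values, not part of the llms.txt file.
from urllib.request import Request

BASE = "https://agentspeak.io"   # assumed API host
TOKEN = "demo-token"             # placeholder credential

def build_call(endpoint: str) -> Request:
    """Build (but do not send) an authenticated request to a declared tool."""
    return Request(
        BASE + endpoint,
        headers={"Authorization": f"Bearer {TOKEN}"},
        method="GET",
    )

req = build_call("/api/status")  # the Availability_Check endpoint
```

Note the request is constructed but never sent; the point is that the Tools section alone is enough for the agent to know what to build.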
The Payoff
By serving this file, we skip the "Scraping" phase. The Agent instantly knows who we are, what we do, and how to talk to us.
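Serving the file is deliberately boring. A framework-agnostic WSGI sketch, with the `text/markdown` content type and the truncated file excerpt as our own choices:

```python
# Minimal WSGI sketch: serve llms.txt at the site root as plain Markdown.
LLMS_TXT = (
    "# AgentSpeak.io Capability Declaration\n"
    "# Identity\n"
    "Name: AgentSpeak Protocol\n"   # excerpt only, truncated for brevity
)

def app(environ, start_response):
    if environ["PATH_INFO"] == "/llms.txt":
        start_response("200 OK",
                       [("Content-Type", "text/markdown; charset=utf-8")])
        return [LLMS_TXT.encode()]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]

# Simulate one request without a network socket.
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

body = b"".join(app({"PATH_INFO": "/llms.txt"}, fake_start_response))
```

One static route, and every agent that fetches it starts the conversation already knowing the ground rules.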