Build on Stripe with LLMs
Use LLMs in your Stripe integration workflow.
You can use large language models (LLMs) to assist in building Stripe integrations. We provide a set of tools and best practices for using LLMs during development.
Plain text docs 
You can access all of our documentation as plain text markdown files by adding .md to the end of any URL. For example, you can find the plain text version of this page itself at https://docs.stripe.com/building-with-llms.md.
This helps AI tools and agents consume our content and allows you to copy and paste the entire contents of a doc into an LLM. This format is preferable to scraping or copying from our HTML and JavaScript-rendered pages because:
- Plain text contains fewer formatting tokens.
- Content that isn’t rendered in the default view of a given page (for example, content hidden in a tab) is included in the plain text version.
- LLMs can parse and understand markdown hierarchy.
We also host an /llms.txt file which instructs AI tools and agents how to retrieve the plain text versions of our pages. The /llms.txt file is an emerging standard for making websites and content more accessible to LLMs.
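The .md convention above is easy to apply programmatically. Here's a minimal sketch (the helper name is our own) that assumes the docs URL has no query string or fragment:

```python
def to_plain_text_url(docs_url: str) -> str:
    """Return the plain text (markdown) variant of a Stripe docs URL.

    Assumes a bare docs URL with no query string or fragment.
    """
    return docs_url.rstrip("/") + ".md"

print(to_plain_text_url("https://docs.stripe.com/building-with-llms"))
# https://docs.stripe.com/building-with-llms.md
```

An agent could fetch this URL directly, or first consult https://docs.stripe.com/llms.txt to discover the available pages.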
Stripe Model Context Protocol (MCP) Server 
You can use the Stripe Model Context Protocol (MCP) server if you use code editors that use AI, such as Cursor or Windsurf, or general purpose tools such as Claude Desktop. The MCP server provides AI agents a set of tools you can use to call the Stripe API and search our knowledge base (documentation, support articles, and so on).
Local server 
If you prefer or require a local setup, you can run the local Stripe MCP server.
Remote server Public preview 
Stripe also hosts a remote MCP server, available at https://mcp.stripe.com/. To interact with the remote server, you need to pass your Stripe API key as a bearer token in the request header. We recommend using restricted API keys to limit access to the functionality your agent requires.
curl https://mcp.stripe.com/ \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk_test_BQokikJOvBiI2HlWgH4olfQ2" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "create_customer",
      "arguments": {"name": "Jenny Rosen", "email": "jenny.rosen@example.com"}
    },
    "id": 1
  }'
This remote server currently only supports bearer token authentication. To avoid phishing attacks, verify that you only use trusted MCP clients and double check that the URL used is the official https://mcp.stripe.com server.
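The same tools/call request can be constructed from any language. The sketch below is our own illustration (not an official client); it builds the headers and JSON-RPC 2.0 body without sending anything, and the key shown is a placeholder for your own restricted key:

```python
import json

# Official remote MCP server endpoint.
MCP_URL = "https://mcp.stripe.com/"

def build_tool_call(api_key: str, tool: str, arguments: dict, request_id: int = 1):
    """Build headers and a JSON-RPC 2.0 body for a tools/call request
    to the remote Stripe MCP server. Illustrative helper, not an SDK."""
    headers = {
        "Content-Type": "application/json",
        # Prefer a restricted API key (rk_...) scoped to what the agent needs.
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "jsonrpc": "2.0",
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
        "id": request_id,
    })
    return headers, body

headers, body = build_tool_call(
    "rk_test_placeholder",  # placeholder; substitute your own restricted key
    "create_customer",
    {"name": "Jenny Rosen", "email": "jenny.rosen@example.com"},
)
```

You would then POST `body` with those headers to `MCP_URL` using your preferred HTTP client.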
VS Code AI Assistant
If you’re a Visual Studio Code user, you can install the Stripe VS Code extension to access our AI Assistant.
With the Stripe AI Assistant, you can:
- Get immediate answers about the Stripe API and products
- Receive code suggestions tailored to your integration
- Ask follow-up questions for more detailed information
- Access knowledge from the Stripe documentation and the Stripe developer community
To get started with the Stripe AI assistant:
- Make sure you have the Stripe VS Code extension installed.
- Navigate to the Stripe extension UI.
- Under AI Assistant, click Ask a question.
- If you’re a Copilot user, this opens the Copilot chat where you can @-mention @stripe. In the input field, talk to the Stripe-specific assistant using @stripe followed by your question.
- If you’re not a Copilot user, it opens a chat UI where you can talk to the Stripe LLM directly.
Stripe Agent Toolkit SDK
If you’re building agentic software, we provide an SDK for adding Stripe functionality to your agent’s capabilities. For example, using the SDK you can:
- Create Stripe objects
- Charge for agent usage
- Use with popular frameworks such as OpenAI’s Agent SDK, Vercel’s AI SDK, Langchain, and CrewAI
Learn more in our agents documentation.
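To show the general pattern the toolkit automates, here's an illustrative sketch. The tool schema and handler below are hypothetical names of our own, not the actual Stripe Agent Toolkit API: an agent framework is given a function-calling tool description for a Stripe action, and a handler dispatches the agent's tool calls (stubbed here rather than calling the live API):

```python
import json

# Hypothetical function-calling tool spec for one Stripe action.
# A real toolkit generates specs like this for you.
create_payment_link_tool = {
    "name": "create_payment_link",
    "description": "Create a Stripe payment link for a price and quantity.",
    "parameters": {
        "type": "object",
        "properties": {
            "price": {"type": "string", "description": "A Stripe Price ID."},
            "quantity": {"type": "integer"},
        },
        "required": ["price", "quantity"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> str:
    """Dispatch an agent's tool call to a Stripe API wrapper (stubbed).

    A real handler would call the Stripe API with a restricted key and
    return the created object; here we return a placeholder response.
    """
    if name == "create_payment_link":
        return json.dumps({"url": "https://buy.stripe.com/test_placeholder"})
    raise ValueError(f"unknown tool: {name}")
```

The Stripe Agent Toolkit SDK wires specs and handlers like these into frameworks such as OpenAI’s Agent SDK, Vercel’s AI SDK, LangChain, and CrewAI, so consult the agents documentation for the real API.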