Working with LLMs
Large language models (LLMs) can help you troubleshoot your applications and fix issues faster. When integrated with Honeybadger’s error tracking and application monitoring tools, they become even more effective at helping you squash bugs and keep your systems running smoothly.
Honeybadger Model Context Protocol (MCP) server
The Honeybadger MCP server provides structured access to Honeybadger’s API through the Model Context Protocol, allowing AI assistants to interact with your Honeybadger projects and monitoring data.
Instead of manually copying error details or switching between tools, your AI assistant can automatically fetch error data, analyze patterns, and provide contextual debugging suggestions—all within your existing workflow.
What is the Model Context Protocol?
The Model Context Protocol (MCP) is a standard that enables LLMs to interact with external services in a structured and safe manner. Think of it as giving your AI assistant the ability to use tools - in this case, tools to manage your Honeybadger data and investigate errors and production issues.
Quick start
The easiest way to get started is with Docker:
```sh
docker pull ghcr.io/honeybadger-io/honeybadger-mcp-server:latest
```

You’ll need your Honeybadger personal authentication token to configure the server, which you can find under the “Authentication” tab in your Honeybadger user settings.
Claude Code
Run this command to configure Claude Code:
```sh
claude mcp add honeybadger -- docker run -i --rm -e HONEYBADGER_PERSONAL_AUTH_TOKEN="HONEYBADGER_PERSONAL_AUTH_TOKEN" ghcr.io/honeybadger-io/honeybadger-mcp-server:latest
```

Cursor, Windsurf, and Claude Desktop
Put this config in ~/.cursor/mcp.json for Cursor, or ~/.codeium/windsurf/mcp_config.json for Windsurf. See Anthropic’s MCP quickstart guide for how to locate your claude_desktop_config.json for Claude Desktop:
```json
{
  "mcpServers": {
    "honeybadger": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "HONEYBADGER_PERSONAL_AUTH_TOKEN",
        "ghcr.io/honeybadger-io/honeybadger-mcp-server"
      ],
      "env": {
        "HONEYBADGER_PERSONAL_AUTH_TOKEN": "your personal auth token"
      }
    }
  }
}
```

VS Code
Add the following to your user settings or .vscode/mcp.json in your workspace:
```json
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "honeybadger_auth_token",
        "description": "Honeybadger Personal Auth Token",
        "password": true
      }
    ],
    "servers": {
      "honeybadger": {
        "command": "docker",
        "args": [
          "run",
          "-i",
          "--rm",
          "-e",
          "HONEYBADGER_PERSONAL_AUTH_TOKEN",
          "ghcr.io/honeybadger-io/honeybadger-mcp-server"
        ],
        "env": {
          "HONEYBADGER_PERSONAL_AUTH_TOKEN": "${input:honeybadger_auth_token}"
        }
      }
    }
  }
}
```

See Use MCP servers in VS Code for more info.
Zed
Add the following to your Zed settings file in ~/.config/zed/settings.json:
```json
{
  "context_servers": {
    "honeybadger": {
      "command": {
        "path": "docker",
        "args": [
          "run",
          "-i",
          "--rm",
          "-e",
          "HONEYBADGER_PERSONAL_AUTH_TOKEN",
          "ghcr.io/honeybadger-io/honeybadger-mcp-server"
        ],
        "env": {
          "HONEYBADGER_PERSONAL_AUTH_TOKEN": "your personal auth token"
        }
      },
      "settings": {}
    }
  }
}
```

Running without Docker
If you don’t have Docker, you can build the server from source:
```sh
git clone git@github.com:honeybadger-io/honeybadger-mcp-server.git
cd honeybadger-mcp-server
go build -o honeybadger-mcp-server ./cmd/honeybadger-mcp-server
```

And then configure your MCP client to run the server directly:
```json
{
  "mcpServers": {
    "honeybadger": {
      "command": "/path/to/honeybadger-mcp-server",
      "args": ["stdio"],
      "env": {
        "HONEYBADGER_PERSONAL_AUTH_TOKEN": "your personal auth token"
      }
    }
  }
}
```

For detailed development instructions, check out the full documentation on GitHub.
What can you do with the MCP server?
Once you have Honeybadger’s MCP server running, your AI assistant gains the following capabilities:
- Project management: List, create, update, and delete projects, and get detailed project reports.
- Error investigation: Search and filter errors, view occurrences and stack traces, see affected users, and analyze error patterns.
- We’re actively developing additional tools for working with Honeybadger Insights data, account and team management, uptime monitoring, and other platform features. More to come!
For a complete list of available tools and their parameters, see the tools documentation in the GitHub README.
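The editor integrations above are the typical way to use these capabilities, but any MCP-compatible client can connect to the server. Here’s a minimal sketch using the official MCP Python SDK (the `mcp` package); the Docker invocation mirrors the configs above, and the commented-out tool name is a hypothetical placeholder — consult the tools documentation in the GitHub README for the actual tool names and parameters.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the Honeybadger MCP server over stdio via Docker, mirroring the
# editor configs above.
server_params = StdioServerParameters(
    command="docker",
    args=[
        "run", "-i", "--rm",
        "-e", "HONEYBADGER_PERSONAL_AUTH_TOKEN",
        "ghcr.io/honeybadger-io/honeybadger-mcp-server",
    ],
    env={
        "HONEYBADGER_PERSONAL_AUTH_TOKEN": os.environ["HONEYBADGER_PERSONAL_AUTH_TOKEN"],
        "PATH": os.environ["PATH"],  # so the docker binary can be found
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes (project management,
            # error investigation, etc.).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

            # Hypothetical tool call; check the GitHub README for real tool
            # names and parameters before using.
            # result = await session.call_tool("list_projects", arguments={})

asyncio.run(main())
```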
Example workflows
Here are some things you might ask your AI assistant to help with. Always review its work closely; LLMs can make mistakes.
“Fix this error [link to error]”
Your assistant can look up the project and error details, open the source file from the stack trace, and fix the bug. You could also try phrases like “Tell me more about this error,” “Help me troubleshoot this error,” etc.
“What’s happening with my Honeybadger projects?”
Your assistant can list your projects, show recent error activity, and filter faults by time and environment to provide a quick overview or help you triage.
“Create an interactive chart that shows error occurrences for my ‘[project name]’ Honeybadger project over time. Use your excellent front end skills to make it look very professional and well polished.”
Your assistant can fetch time-series data from your project and generate an interactive chart showing error trends.
Documentation for LLMs (llms.txt)
We provide machine-readable versions of our documentation optimized for LLMs. These files follow the llms.txt standard and are automatically generated from our documentation content.
Available formats
- /llms.txt - Index page with links to full and abridged documentation, plus specialized subsets
- /llms-full.txt - Complete documentation in text format
- /llms-small.txt - Abridged documentation with non-essential content removed
The abridged version (llms-small.txt) is optimized for token efficiency while preserving essential technical information.
Specialized documentation subsets
We also provide focused documentation subsets for specific use cases:
- The Honeybadger Data (REST) API and reporting APIs
- Honeybadger Insights and BadgerQL
- Honeybadger’s user interface and product features
- Individual documentation sets for each client library (Ruby, JavaScript, Python, PHP, Elixir, etc.)
Visit /llms.txt for the complete list with links to download.
Using llms.txt files
These files are designed to be consumed by LLMs in a few ways:
- Directly: Some LLM tools can fetch and process llms.txt files automatically
- As context: Copy and paste relevant sections into your AI assistant
- Via automation: Build tools that fetch and inject documentation into LLM prompts (see the sketch below)
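For the automation option, here’s a minimal sketch that downloads the abridged documentation and injects it into a prompt. The base URL (https://docs.honeybadger.io) and the example question are assumptions; adjust them for your setup and send the resulting prompt to whichever LLM you use.

```python
import urllib.request

# Assumed base URL for the docs site; the /llms*.txt paths above are
# relative to it.
DOCS_BASE_URL = "https://docs.honeybadger.io"


def fetch_docs(path: str = "/llms-small.txt") -> str:
    """Download one of the machine-readable documentation files."""
    with urllib.request.urlopen(DOCS_BASE_URL + path) as response:
        return response.read().decode("utf-8")


def build_prompt(question: str) -> str:
    """Inject the abridged docs into an LLM prompt as context."""
    docs = fetch_docs("/llms-small.txt")
    return (
        "You are helping with a Honeybadger integration.\n\n"
        f"Documentation:\n{docs}\n\n"
        f"Question: {question}\n"
    )


if __name__ == "__main__":
    # Example question; pass the full prompt to the LLM of your choice.
    print(build_prompt("How do I report a handled exception?")[:500])
```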
Responding to alerts in Slack and GitHub
Honeybadger includes full backtraces in Slack error notifications to provide the context that AI coding assistants need for effective debugging.
The backtrace appears as formatted code in Slack, allowing you to copy and paste it to AI debugging assistants like Cursor, Windsurf, or Copilot. If you use Cursor’s Background Agents, you can install their Slack integration to ask Cursor to fix the error directly from Slack.
We include similar information when creating issues for errors in GitHub and other issue trackers, which should help you—for example—assign bugfixes to GitHub Copilot.
See our integration docs to learn more about our 3rd-party integrations.
Going further
As we continue to develop LLM integrations, we’re exploring ways to make automated monitoring and debugging more intelligent. Some ideas we’re excited about:
- Root cause analysis and bug fixes
- Natural language queries for error search and BadgerQL
Do you have ideas for how LLMs could improve your Honeybadger experience? We’d love to hear from you! Drop us a line at support@honeybadger.io.