An MCP server for Clusterfudge

Empower your AI agents with insights from your GPU cluster

Sam · 2025-03-19

Clusterfudge + MCP

When AI labs deploy Clusterfudge across their GPU clusters, a previously inaccessible wealth of information becomes available to their teams: failed jobs, inefficient workloads, idle nodes. Our new clusterfudge-mcp server makes all this valuable data directly accessible to your favourite AI models right in your IDE, empowering you to interact with your cluster and workloads in novel and exciting ways.

Anthropic's Model Context Protocol establishes a standard for exposing actions and data sources to AI agents. With our MCP server, you can seamlessly enhance your AI-powered research workflow by connecting your models to all the data and capabilities available through Clusterfudge. Ask about underutilized nodes or inefficient workloads, cancel jobs or relaunch experiments — all without leaving your IDE.

The real magic happens when your AI models gain simultaneous access to both your working environment (codebase, scripts, documentation) and your cluster environment (jobs, nodes, GPUs). This unified context is particularly powerful for agentic systems: they can translate insights from your workloads into changes to your launch scripts and code, then cancel and relaunch workloads once those changes are in place.

Ready to transform your AI research workflow?

Our MCP server is a lightweight, standalone binary available for both Mac and Linux, designed to work seamlessly with Cursor, Claude for Desktop and Windsurf.ai.
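As an illustration, MCP-compatible clients like Claude for Desktop and Cursor discover local servers through an `mcpServers` entry in their configuration. A sketch of what registering the server might look like (the binary path and any arguments shown are assumptions, not documented values):

```json
{
  "mcpServers": {
    "clusterfudge": {
      "command": "/usr/local/bin/clusterfudge-mcp"
    }
  }
}
```

Once registered, the client launches the binary and speaks MCP to it over stdio, so your models can query cluster state without any extra plumbing.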

Join the beta

See how Clusterfudge can accelerate your research.