# DeployStack

> DeployStack is an open-source (AGPL-3.0) MCP hosting platform for AI workflows. Deploy any MCP server from GitHub to an HTTP endpoint in 30 seconds. Built for workflow automation platforms like n8n, Dify, Voiceflow, Langflow, Zapier, Make.com, and Activepieces — and for AI development tools like Claude Desktop, Cursor, and VS Code.

## What is DeployStack?

DeployStack turns stdio MCP servers into hosted HTTP endpoints. Most MCP servers on GitHub only run locally via stdio, while workflow automation platforms and cloud-based AI tools need HTTP URLs. DeployStack bridges that gap.

**Two ways to use it:**

1. **Deploy from GitHub** — Point DeployStack at a GitHub repo containing an MCP server. It detects the runtime (Node.js, Python, Docker), builds it, and gives you an HTTP endpoint URL. Auto-redeploys when you push.
2. **Install from Catalog** — Browse a curated catalog of popular MCP servers and install them with one click. No local setup needed.

Every MCP server you deploy or install gets a direct HTTP endpoint with a token. Paste the URL into n8n, Zapier, Make.com, or any MCP client.
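For tools without a built-in MCP client, calling a direct endpoint comes down to a JSON-RPC 2.0 POST with the instance token. The sketch below builds such a request; the endpoint URL shape and the `Authorization` header are placeholder assumptions — use the exact URL and auth details DeployStack shows for your server.

```python
import json

def build_tools_list_request(endpoint_url: str, token: str):
    """Build a JSON-RPC 2.0 `tools/list` request for a direct MCP endpoint.

    `endpoint_url` and `token` stand in for the per-server URL and instance
    token DeployStack displays after deployment (placeholders here).
    """
    headers = {
        "Content-Type": "application/json",
        # Assumed bearer-token auth for direct endpoints; check the
        # DeployStack docs for the exact header your server expects.
        "Authorization": f"Bearer {token}",
    }
    body = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/list",  # standard MCP method: list available tools
        "params": {},
    }
    return endpoint_url, headers, json.dumps(body)

# Hypothetical values for illustration only.
url, headers, payload = build_tools_list_request(
    "https://satellite.deploystack.io/your-server", "your-token"
)
```

Sending `payload` to `url` with those headers (via `urllib.request`, n8n's HTTP Request node, or similar) would return the server's tool list.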
## Core Features

### MCP Deployment

- **GitHub to URL in 30 seconds** — Connect your repo, select a branch, get an HTTP endpoint
- **Auto-redeploy on push** — Push to your repo, DeployStack rebuilds and redeploys automatically
- **Runtime detection** — Supports Node.js, Python, and Docker-based MCP servers
- **Direct MCP endpoints** — Each server gets its own URL and instance token
- Learn more: [MCP Deployment](https://deploystack.io/mcp-deployment)

### MCP Server Catalog

- **Curated catalog** — Browse and install popular MCP servers with one click
- **No local installation** — Servers run on DeployStack satellite infrastructure
- **Instant setup** — Install, add credentials, start using

### Token Optimization

- **Hierarchical Router** — Reduces token consumption from 75,000 to 1,372 tokens (a 98% reduction)
- **Two Meta-Tools Pattern** — Exposes `discover_mcp_tools(query)` and `execute_mcp_tool(tool_path, args)` instead of hundreds of individual tools
- **Scale to 100+ servers** — Use dozens of MCP servers simultaneously without degrading LLM performance

### Observability

- **Tool usage tracking** — See which MCP tools are called, how often, and whether they succeed or fail
- **Debugging** — Trace failing tool calls back to specific MCP servers
- Learn more: [MCP Observability](https://deploystack.io/mcp-observability)

### Team Management & Security

- **Encrypted Credential Vault** — Store API keys and tokens with encryption, auto-injected into MCP servers at runtime
- **Role-Based Access Control** — Control who can use which MCP tools
- **Team Isolation** — Complete separation between teams with audit logging
- **OAuth Support** — MCP servers requiring authentication (Gemini, Linear, etc.) work with proper OAuth flows

### Two Connection Methods

- **Hierarchical Router** — One URL (`https://satellite.deploystack.io/mcp`) gives AI agents access to all your MCP servers via OAuth. Best for Claude Desktop, Cursor, and VS Code.
- **Direct MCP Endpoint** — Per-server URL with token auth. Best for workflow automation (n8n, Zapier, Make.com), custom scripts, and no-code platforms.

## Use Cases

### For Workflow Automation (Primary)

- **n8n** — Deploy MCP servers, paste the endpoint URL into n8n's MCP Client Tool node
- **Zapier** — Add MCP server endpoints to Zapier's MCP Client integration as actions in your Zaps
- **Make.com** — Connect MCP server endpoints to Make.com's MCP Client module in your scenarios
- **Dify** — Dify v1.6+ supports only HTTP/SSE MCP servers; DeployStack provides the HTTP endpoints
- **Voiceflow** — Voiceflow does not support local MCP servers; DeployStack provides hosted endpoints
- **Langflow** — HTTP/SSE endpoints for production Langflow agent flows
- **Activepieces** — Use MCP server endpoints in Activepieces AI agent workflows

### For AI Development Tools

- **Claude Desktop** — Connect via the hierarchical router for access to all MCP servers
- **Cursor** — One URL in config, all MCP tools available in Cursor's chat
- **VS Code** — Add DeployStack as a remote MCP server, authenticate via OAuth

### For Teams

- Share MCP servers across the team without individual setup
- Centralize credential management in the encrypted vault
- Control access with RBAC
- Track usage and audit tool access
- Onboard new developers in minutes with a single URL

## Technical Architecture

DeployStack runs MCP servers on managed infrastructure called "satellites." When you deploy from GitHub or install from the catalog, DeployStack spawns the MCP server process on satellite infrastructure and exposes it via HTTP.
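The hierarchical router's two-meta-tool pattern (from Token Optimization above) can be sketched in miniature. Everything here is illustrative: the registry contents and the matching logic are made up, and the real router does discovery and dispatch server-side across all your deployed servers — only the two tool names come from the document.

```python
# Illustrative registry: tool_path -> (description, handler).
# In the real router this would span every tool on every MCP server.
REGISTRY = {
    "github/create_issue": ("Create a GitHub issue", lambda args: f"issue: {args['title']}"),
    "slack/send_message": ("Send a Slack message", lambda args: f"sent: {args['text']}"),
}

def discover_mcp_tools(query: str) -> list[dict]:
    """Meta-tool 1: return only the tools matching a query, instead of
    injecting every tool definition into the LLM's context window."""
    q = query.lower()
    return [
        {"tool_path": path, "description": desc}
        for path, (desc, _) in REGISTRY.items()
        if q in desc.lower() or q in path.lower()
    ]

def execute_mcp_tool(tool_path: str, args: dict):
    """Meta-tool 2: route a call to the named tool and return its result."""
    _, handler = REGISTRY[tool_path]
    return handler(args)

# An LLM first discovers, then executes — two tool definitions in context
# instead of hundreds.
matches = discover_mcp_tools("issue")
result = execute_mcp_tool(matches[0]["tool_path"], {"title": "Bug report"})
```

This is why token cost stays flat as servers are added: the LLM always sees the same two meta-tools, and full definitions are fetched on demand.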
**How it works:**

- Stdio MCP servers are run as managed processes on satellites
- Each server gets a direct HTTP endpoint with token authentication
- The hierarchical router aggregates all servers behind a single OAuth-protected URL
- Supports both stdio (deployed/catalog) and HTTP/SSE (remote) MCP servers

**Deployment Options:**

- **Global Satellites** — Cloud-hosted at cloud.deploystack.io
- **Team Satellites** — Self-hosted on your own infrastructure

## Getting Started

**Option 1: Deploy from GitHub**

1. Sign up at https://cloud.deploystack.io (GitHub, Google, or email)
2. Connect your GitHub account
3. Select a repository containing an MCP server
4. Get an HTTP endpoint URL and token
5. Paste the URL into your workflow tool or AI client

**Option 2: Install from Catalog**

1. Sign up at https://cloud.deploystack.io
2. Browse the MCP server catalog
3. Install a server with one click
4. Add credentials to the vault
5. Copy the endpoint URL into your tool

**Free Tier:**

- No credit card required
- Full features included

## Open Source

DeployStack is open source under the AGPL-3.0 license. Self-host on your own infrastructure or use the managed cloud platform.
**Repository:** https://github.com/deploystackio/deploystack

## Integrations

- [VS Code Integration](https://deploystack.io/integrations/vscode) — Connect VS Code to DeployStack via OAuth
- [n8n Integration](https://deploystack.io/integrations/n8n) — Deploy MCP servers for n8n workflows
- [Zapier Integration](https://deploystack.io/integrations/zapier) — Use MCP tools as Zapier actions
- [Make.com Integration](https://deploystack.io/integrations/make) — Connect MCP servers to Make.com scenarios
- [Activepieces Integration](https://deploystack.io/integrations/activepieces) — Use MCP servers in Activepieces AI workflows

## Key Resources

- [Homepage](https://deploystack.io) — Product overview and features
- [Cloud Platform](https://cloud.deploystack.io) — Sign up and start using DeployStack
- [GitHub Repository](https://github.com/deploystackio/deploystack) — Source code and issues
- [Documentation](https://docs.deploystack.io) — Technical docs
- [Blog](https://deploystack.io/blog) — Articles and tutorials
- [Changelog](https://deploystack.io/changelog) — Product updates
- [MCP Deployment](https://deploystack.io/mcp-deployment) — How deployment works
- [MCP Observability](https://deploystack.io/mcp-observability) — Monitoring and debugging
- [Discord](https://discord.gg/42Ce3S7b3b) — Community

## Positioning

**DeployStack vs FastMCP Cloud:** FastMCP Cloud is proprietary SaaS. DeployStack is open source (AGPL-3.0) and self-hostable, with 98% token reduction, a curated catalog, and an encrypted credential vault. FastMCP is general-purpose; DeployStack is built for workflow automation platforms.

**DeployStack vs mcp-remote:** mcp-remote is a local proxy that requires Node.js. DeployStack is cloud hosting — no local installation, no CLI setup.

**DeployStack vs DIY hosting (Fly.io, Railway):** DIY takes hours and requires DevOps skills. DeployStack takes 30 seconds with auto-redeploy.

**Category:** Open source MCP hosting for AI workflows.
## Common Questions

**Q: How does deployment work?**
A: Connect your GitHub account, select a repo with an MCP server, and DeployStack builds and hosts it. You get an HTTP endpoint URL. When you push to your repo, it redeploys automatically.

**Q: What workflow tools does it work with?**
A: n8n, Zapier, Make.com, Dify, Voiceflow, Langflow, Activepieces — any tool that can connect to an HTTP/SSE MCP endpoint.

**Q: What AI clients does it work with?**
A: Claude Desktop, Cursor, VS Code, and any MCP-compatible client.

**Q: How does token reduction work?**
A: Instead of injecting 150+ tool definitions (75,000 tokens) into every LLM conversation, the hierarchical router exposes 2 meta-tools (1,372 tokens) that let the LLM discover and execute only what it needs.

**Q: Can I self-host?**
A: Yes. DeployStack is AGPL-3.0 licensed. Deploy on your own infrastructure for full control.

**Q: How are credentials stored?**
A: Encrypted at rest and auto-injected into MCP servers at runtime. Team members never see raw API keys.

**Q: Is it free?**
A: A free tier is available, no credit card required.
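For the AI clients above, connecting via the hierarchical router is a one-URL config entry. A sketch for VS Code's `.vscode/mcp.json` (the `deploystack` server name is arbitrary, and the exact schema and OAuth prompt depend on your VS Code version — only the URL comes from this document):

```json
{
  "servers": {
    "deploystack": {
      "type": "http",
      "url": "https://satellite.deploystack.io/mcp"
    }
  }
}
```

On first use, the client would run the OAuth flow and then see all your MCP servers through the router's two meta-tools.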