🚀 Say hello to Volcano! Kong has just open-sourced a TypeScript SDK designed to change the way you build production-ready AI agents. It isn’t just another wrapper: it works across multiple LLM providers, has native Model Context Protocol (MCP) tool use built in, and aims to make your life as a developer a whole lot easier.

So, what’s the big deal about Volcano?

Imagine this: with as few as nine lines of code, you can create multi-step agent workflows that span multiple LLM providers and MCP tools. No more wrestling with 100+ lines of boilerplate for tool schemas, context management, provider switching, error handling, and HTTP clients; Volcano handles all of that for you.

Here’s a sneak peek into what Volcano can do:

```typescript
import { agent, llmOpenAI, llmAnthropic, mcp } from "volcano-ai";

// Setup: two LLMs, two MCP servers
const planner = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const executor = llmAnthropic({ model: "claude-4.5-sonnet", apiKey: process.env.ANTHROPIC_API_KEY! });
const database = mcp("https://api.company.com/database/mcp");
const slack = mcp("https://api.company.com/slack/mcp");

// One workflow
await agent({ llm: planner })
  .then({
    prompt: "Analyze last week's sales data",
    mcps: [database], // Auto-discovers and calls the right tools
  })
  .then({
    llm: executor, // Switch to Claude
    prompt: "Write an executive summary",
  })
  .then({
    prompt: "Post the summary to #executives",
    mcps: [slack],
  })
  .run();
```

But wait, there’s more! Volcano comes packed with features like:

– A compact, chainable API that passes intermediate context between steps and switches LLMs per step.
– Automatic tool discovery and invocation for MCP servers.
– Support for multiple LLM providers in a single workflow.
– Streaming of intermediate and final results for responsive agent interactions.
– Configurable retries and timeouts for reliability under real-world failures (see the sketch after this list).
– Hooks for customizing behavior and instrumentation.
– Typed error handling for actionable failures during agent execution.
– Parallel execution, branching, and loops for complex control flow.
– Observability via OpenTelemetry for tracing and metrics across steps and tool calls.
– OAuth support and connection pooling for secure, efficient access to MCP servers.
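
To make the reliability features a bit more concrete, here is a minimal sketch of what per-step retries, timeouts, and error handling could look like with Volcano’s chainable API. Only `agent`, `llmOpenAI`, `mcp`, `.then()`, and `.run()` appear in the official example above; the `retries` and `timeoutMs` option names and the catch-all error handling are assumptions for illustration, so check the repository for the actual configuration surface.

```typescript
import { agent, llmOpenAI, mcp } from "volcano-ai";

// NOTE: `retries` and `timeoutMs` are assumed option names for illustration only.
// The announcement lists configurable retries and timeouts as features but does
// not show the exact API; see the Volcano repo for the real option names.
const llm = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const database = mcp("https://api.company.com/database/mcp");

try {
  await agent({ llm })
    .then({
      prompt: "Summarize yesterday's failed orders from the database",
      mcps: [database],  // auto-discovers and calls the right tools
      retries: 3,        // assumed: retry this step up to 3 times on failure
      timeoutMs: 30_000, // assumed: fail the step if it exceeds 30 seconds
    })
    .run();
} catch (err) {
  // Typed errors are listed as a feature; the concrete error classes are not
  // shown in the announcement, so this sketch just logs whatever surfaces here.
  console.error("Agent workflow failed:", err);
}
```

The point is the shape: reliability knobs live on each step of the chain rather than in hand-rolled retry loops and HTTP plumbing around it.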

Where does Volcano fit in Kong’s MCP architecture?

Kong’s Konnect platform adds multiple MCP governance and access layers that complement Volcano’s SDK surface: AI Gateway gains MCP gateway features, the Konnect Developer Portal can be turned into an MCP server, and Kong has previewed MCP Composer and MCP Runner. Together, the SDK and these platform layers aim to make building and governing agents smoother and more efficient.

Key Takeaways:

– Volcano is an open-source TypeScript SDK for building multi-step AI agents with first-class MCP tool use.
– It provides production features like retries, timeouts, connection pooling, OAuth, and OpenTelemetry tracing/metrics for MCP workflows.
– Volcano composes multi-LLM planning and execution steps and automatically discovers and invokes MCP tools, minimizing custom glue code.
– Kong paired the SDK with platform controls: AI Gateway/Konnect add MCP server autogeneration, centralized OAuth 2.1, and observability.

Editorial Comments:

Kong’s Volcano SDK is a breath of fresh air in the MCP ecosystem. It’s a TypeScript-first agent framework that aligns developer workflow with enterprise controls, delivered via AI Gateway and Konnect. By prioritizing protocol-native MCP integration, Volcano cuts operational drift and closes auditing gaps as internal agents scale.

Ready to dive in? Check out the [GitHub Repo](https://github.com/kong/volcano-ai) and the [technical details](https://marktechpost.com/2023/03/15/kong-releases-volcano-a-typescript-mcp-native-sdk-for-building-production-ready-ai-agents-with-llm-reasoning-and-real-world-actions/).
