Model Context Protocol (MCP)

The Model Context Protocol (MCP) enables AI assistants to securely connect to your Microfn functions, allowing for dynamic, AI-driven interactions with your serverless infrastructure. This integration transforms your functions into tools that AI can discover, understand, and execute intelligently.

MCP is an open protocol that standardizes how AI assistants interact with external systems. With Microfn’s MCP integration, AI assistants can:

  • Discover your available functions
  • Understand function purposes through descriptions and metadata
  • Execute functions with appropriate parameters
  • Process results to provide intelligent responses
  • Chain operations to complete complex tasks

For AI assistants that support remote MCP servers (like Claude):

  1. Open your AI assistant’s integrations page
  2. Add a new MCP server
  3. Enter the Microfn MCP URL: https://mcp.microfn.dev/sse
  4. Authenticate with your Microfn account when prompted
  5. Your functions are now available to the AI assistant

For local development environments:

// Claude Desktop config example
// ~/Library/Application Support/Claude/claude_desktop_config.json (Mac)
// %APPDATA%/Claude/claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "microfn": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.microfn.dev/sse"]
    }
  }
}
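If the assistant does not pick up the server, a common cause is a malformed config file. The following is a minimal sketch of a shape check for the entry above; the `hasMicrofnServer` helper and the inlined config literal are illustrative, not part of the Microfn or Claude APIs:

```typescript
// Minimal shape check for a claude_desktop_config.json object.
// The config literal mirrors the example above.
type McpServerEntry = { command: string; args: string[] };
type ClaudeConfig = { mcpServers?: Record<string, McpServerEntry> };

function hasMicrofnServer(config: ClaudeConfig): boolean {
  const entry = config.mcpServers?.["microfn"];
  return (
    entry !== undefined &&
    entry.command === "npx" &&
    entry.args.includes("https://mcp.microfn.dev/sse")
  );
}

const config: ClaudeConfig = {
  mcpServers: {
    microfn: {
      command: "npx",
      args: ["-y", "mcp-remote", "https://mcp.microfn.dev/sse"],
    },
  },
};

console.log(hasMicrofnServer(config)); // true
```

In practice you would read and `JSON.parse` the file from the platform-specific path shown in the comments above before running such a check.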

Beyond just executing existing functions, the MCP integration allows an AI assistant to write, trigger, and modify any function in your Microfn workspace. This turns your AI assistant into a powerful collaborator that can actively help you build and manage your serverless applications.

Here are a few examples of how you can leverage MCP to interact with your Microfn functions.

You can create a function that uses the @microfn/kv module to give your AI assistant a persistent memory across conversations.

First, you could ask the AI to create the function:

User:

Create a function named remember that can store and retrieve key-value pairs using the @microfn/kv module.

AI:

I have created the remember function for you.

functions/remember.ts
import kv from "@microfn/kv";

/**
 * Stores or retrieves a piece of information.
 * @param input.key The key to store or retrieve.
 * @param input.value If provided, the value to store. If not, the function retrieves the value for the key.
 * @returns The stored value or a success message.
 */
export default async function main(input: { key: string; value?: any }) {
  const { key, value } = input;
  const kvKey = `memory:${key}`;
  if (value !== undefined) {
    await kv.set(kvKey, value);
    return { success: true, message: `I will remember that ${key} is ${JSON.stringify(value)}.` };
  } else {
    const storedValue = await kv.get(kvKey);
    if (storedValue !== null) {
      return { success: true, value: storedValue };
    } else {
      return { success: false, message: `I don't have a memory for "${key}".` };
    }
  }
}
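You can exercise the same logic locally before deploying. The sketch below reproduces the function with a Map-backed stub standing in for @microfn/kv (an assumption for illustration only; the real module persists data across invocations, the stub does not):

```typescript
// Local sketch of the remember function's behavior, with a Map-backed
// stub in place of @microfn/kv (illustrative; the real module persists
// values across invocations).
const store = new Map<string, unknown>();
const kv = {
  async set(key: string, value: unknown) { store.set(key, value); },
  async get(key: string) { return store.has(key) ? store.get(key) : null; },
};

async function remember(input: { key: string; value?: any }) {
  const { key, value } = input;
  const kvKey = `memory:${key}`;
  if (value !== undefined) {
    await kv.set(kvKey, value);
    return { success: true, message: `I will remember that ${key} is ${JSON.stringify(value)}.` };
  }
  const storedValue = await kv.get(kvKey);
  if (storedValue !== null) {
    return { success: true, value: storedValue };
  }
  return { success: false, message: `I don't have a memory for "${key}".` };
}

async function demo() {
  await remember({ key: "favoriteColor", value: "blue" });
  const result = await remember({ key: "favoriteColor" });
  console.log(result); // { success: true, value: "blue" }
}
demo();
```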

Now you can use this function in a conversation:

User:

Please remember that my favorite color is blue.

AI:

[Executes remember function with { key: "favoriteColor", value: "blue" }] Okay, I will remember that your favorite color is “blue”.

User:

What is my favorite color?

AI:

[Executes remember function with { key: "favoriteColor" }] Your favorite color is “blue”.

You can ask your AI assistant to create new functions on the fly.

User:

Create a new function called greet that takes a name and returns ‘Hello, [name]!’.

AI:

Sure, I can do that. I will create a new function named greet. [AI writes the new function file to functions/greet.ts]

functions/greet.ts
/**
 * Greets a user by name.
 * @param input.name The name to greet.
 * @returns A greeting message.
 */
export default async function main(input: { name: string }) {
  return `Hello, ${input.name}!`;
}
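The generated function can also be called directly, outside the AI conversation. A minimal local sketch (the name "Ada" is just sample input; in practice the assistant invokes the function through MCP):

```typescript
// Local copy of the greet function from functions/greet.ts.
async function main(input: { name: string }) {
  return `Hello, ${input.name}!`;
}

main({ name: "Ada" }).then((msg) => console.log(msg)); // Hello, Ada!
```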

AI:

I have created the greet function. You can now call it.