mcp

class agentopera.mcp.StdioMcpToolAdapter(server_params: StdioServerParams, tool: Tool)[source]

Bases: McpToolAdapter[StdioServerParams]

Allows you to wrap an MCP tool running over STDIO and make it available to agentopera.

This adapter enables agentopera agents to use MCP-compatible tools that communicate over standard input/output. Common use cases include wrapping command-line tools and local services that implement the Model Context Protocol (MCP).

Note

To use this class, you need to install the mcp extra for the agentopera package:

pip install -U "agentopera[mcp]"

Parameters:
  • server_params (StdioServerParams) – Parameters for the MCP server connection, including the command to run and its arguments

  • tool (Tool) – The MCP tool to wrap

See mcp_server_tools() for examples.
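If you only need a single tool rather than everything the server exposes, a minimal sketch follows. It assumes StdioMcpToolAdapter provides the same from_server_params factory shown for SseMcpToolAdapter below, reuses the mcp-server-fetch command from the later example, and takes its import path from the other examples on this page:

import asyncio

from agentopera.agents.tools.mcp import StdioMcpToolAdapter, StdioServerParams


async def main() -> None:
    # Parameters for an MCP server launched locally as a subprocess
    # (mcp-server-fetch is installed in the fetch example further down).
    server_params = StdioServerParams(command="uvx", args=["mcp-server-fetch"])

    # Wrap a single named tool exposed by that server. The from_server_params
    # factory is assumed to mirror SseMcpToolAdapter.from_server_params below.
    adapter = await StdioMcpToolAdapter.from_server_params(server_params, "fetch")

    # The adapter can now be passed in an agent's tools list,
    # e.g. AssistantAgent(..., tools=[adapter]).
    print(adapter)


if __name__ == "__main__":
    asyncio.run(main())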

class agentopera.mcp.StdioServerParams(*, command: str, args: list[str] = <factory>, env: dict[str, str] | None = None, encoding: str = 'utf-8', encoding_error_handler: Literal['strict', 'ignore', 'replace'] = 'strict')[source]

Bases: StdioServerParameters

Parameters for connecting to an MCP server over STDIO.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
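For reference, here is a params object with each field spelled out; the command, path, and environment values are illustrative only, and env, encoding, and encoding_error_handler are optional:

from agentopera.agents.tools.mcp import StdioServerParams

# Launch a local filesystem MCP server as a subprocess; the env and encoding
# settings are shown only to illustrate the available fields.
params = StdioServerParams(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    env={"NODE_ENV": "production"},
    encoding="utf-8",
    encoding_error_handler="strict",
)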

class agentopera.mcp.SseMcpToolAdapter(server_params: SseServerParams, tool: Tool)[source]

Bases: McpToolAdapter[SseServerParams]

Allows you to wrap an MCP tool running over Server-Sent Events (SSE) and make it available to agentopera.

This adapter enables agentopera agents to use MCP-compatible tools that communicate over HTTP using Server-Sent Events. Common use cases include integrating with remote MCP services, cloud-based tools, and web APIs that implement the Model Context Protocol (MCP).

Note

To use this class, you need to install the mcp extra for the agentopera package:

pip install -U "agentopera[mcp]"

Parameters:
  • server_params (SseServerParams) – Parameters for the MCP server connection, including URL, headers, and timeouts

  • tool (Tool) – The MCP tool to wrap

Examples

Use a remote translation service that implements MCP over SSE to create tools that allow agentopera agents to perform translations:

import asyncio
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.agents.tools.mcp import SseMcpToolAdapter, SseServerParams
from agentopera.chatflow.agents import AssistantAgent
from agentopera.chatflow.ui import Console
from agentopera.core import CancellationToken


async def main() -> None:
    # Create server params for the remote MCP service
    server_params = SseServerParams(
        url="https://api.example.com/mcp",
        headers={"Authorization": "Bearer your-api-key", "Content-Type": "application/json"},
        timeout=30,  # Connection timeout in seconds
    )

    # Get the translation tool from the server
    adapter = await SseMcpToolAdapter.from_server_params(server_params, "translate")

    # Create an agent that can use the translation tool
    model_client = OpenAIChatCompletionClient(model="gpt-4")
    agent = AssistantAgent(
        name="translator",
        model_client=model_client,
        tools=[adapter],
        system_message="You are a helpful translation assistant.",
    )

    # Let the agent translate some text
    await Console(
        agent.run_stream(task="Translate 'Hello, how are you?' to Spanish", cancellation_token=CancellationToken())
    )


if __name__ == "__main__":
    asyncio.run(main())

class agentopera.mcp.SseServerParams(*, url: str, headers: dict[str, Any] | None = None, timeout: float = 5, sse_read_timeout: float = 300)[source]

Bases: BaseModel

Parameters for connecting to an MCP server over SSE.

url: str
headers: dict[str, Any] | None
timeout: float
sse_read_timeout: float
model_config: ClassVar[ConfigDict] = {}

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
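For reference, here is a params object with every field set explicitly; the URL and token are placeholders:

from agentopera.agents.tools.mcp import SseServerParams

# Connection settings for a remote MCP endpoint reachable over HTTP/SSE.
params = SseServerParams(
    url="https://api.example.com/mcp",
    headers={"Authorization": "Bearer your-api-key"},
    timeout=10,  # seconds allowed for establishing the HTTP connection
    sse_read_timeout=300,  # seconds to wait between SSE events before giving up
)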

async agentopera.mcp.mcp_server_tools(server_params: StdioServerParams | SseServerParams) → list[StdioMcpToolAdapter | SseMcpToolAdapter][source]

Creates a list of MCP tool adapters that can be used with AgentOpera agents.

This factory function connects to an MCP server and returns adapters for all available tools. The adapters can be directly assigned to an AgentOpera agent’s tools list.

Parameters:

server_params (McpServerParams) – Connection parameters for the MCP server. Can be either StdioServerParams for command-line tools or SseServerParams for HTTP/SSE services.

Returns:

A list of tool adapters ready to use with AgentOpera agents.

Return type:

list[StdioMcpToolAdapter | SseMcpToolAdapter]

Examples

Local file system MCP service over standard I/O example:

Install the filesystem server package from npm (requires Node.js 16+ and npm).

npm install -g @modelcontextprotocol/server-filesystem

Create an agent that can use all tools from the local filesystem MCP server.

import asyncio
from pathlib import Path
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.agents.tools.mcp import StdioServerParams, mcp_server_tools
from agentopera.chatflow.agents import AssistantAgent
from agentopera.core import CancellationToken


async def main() -> None:
    # Setup server params for local filesystem access
    desktop = str(Path.home() / "Desktop")
    server_params = StdioServerParams(
        # "npx.cmd" is the Windows launcher for npx; use "npx" on macOS/Linux.
        command="npx.cmd",
        args=["-y", "@modelcontextprotocol/server-filesystem", desktop],
    )

    # Get all available tools from the server
    tools = await mcp_server_tools(server_params)

    # Create an agent that can use all the tools
    agent = AssistantAgent(
        name="file_manager",
        model_client=OpenAIChatCompletionClient(model="gpt-4"),
        tools=tools,  # type: ignore
    )

    # The agent can now use any of the filesystem tools
    await agent.run(task="Create a file called test.txt with some content", cancellation_token=CancellationToken())


if __name__ == "__main__":
    asyncio.run(main())

Local fetch MCP service over standard I/O example:

Install the mcp-server-fetch package.

pip install mcp-server-fetch

Create an agent that can use the fetch tool from the local MCP server.

import asyncio

from agentopera.chatflow.agents import AssistantAgent
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.agents.tools.mcp import StdioServerParams, mcp_server_tools


async def main() -> None:
    # Get the fetch tool from mcp-server-fetch.
    fetch_mcp_server = StdioServerParams(command="uvx", args=["mcp-server-fetch"])
    tools = await mcp_server_tools(fetch_mcp_server)

    # Create an agent that can use the fetch tool.
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    agent = AssistantAgent(name="fetcher", model_client=model_client, tools=tools, reflect_on_tool_use=True)  # type: ignore

    # Let the agent fetch the content of a URL and summarize it.
    result = await agent.run(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle")
    print(result.messages[-1].content)


asyncio.run(main())

Remote MCP service over SSE example:

import asyncio

from agentopera.chatflow.agents import AssistantAgent
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.agents.tools.mcp import SseServerParams, mcp_server_tools


async def main() -> None:
    # Setup server params for the remote service
    server_params = SseServerParams(url="https://api.example.com/mcp", headers={"Authorization": "Bearer token"})

    # Get all available tools
    tools = await mcp_server_tools(server_params)

    # Create an agent with all tools
    agent = AssistantAgent(name="tool_user", model_client=OpenAIChatCompletionClient(model="gpt-4"), tools=tools)  # type: ignore


asyncio.run(main())

For more examples and detailed usage, see the samples directory in the package repository.