Model Context Protocol (MCP)
Fahemai empowers your AI agents with the Model Context Protocol (MCP), allowing them to use local and remote tools seamlessly. By connecting MCP servers, you transform your LLM from a simple conversationalist into a powerful assistant capable of interacting with external data and services.
Pro Tip
- Supported transports: Streamable HTTP, SSE, and stdio (a minimal server sketch follows this list).
- Expansion: We recommend exploring Composio MCP for a vast library of ready-to-use tools.
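To make the stdio transport concrete, here is a minimal sketch of a local MCP server built with FastMCP from the official MCP Python SDK (`pip install mcp`). The server name and the `add` tool are illustrative placeholders, not part of Fahemai:

```python
# Minimal MCP server exposing one tool over stdio, built with FastMCP
# from the official MCP Python SDK. Server name and tool are examples.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    # stdio is the default transport; it suits local tools launched
    # directly by the host application.
    mcp.run(transport="stdio")
```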
1. Enable Cognitive Capabilities
Before an agent can use MCP tools, its "brain" (the LLM) must support Function Calling.
- Navigate to Models > LLM Models.
- Edit your desired model and toggle the Function Calling switch to ON.
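The platform handles the mechanics for you, but if you are curious what Function Calling means under the hood: it amounts to passing tool definitions alongside each chat request so the model can respond with a structured tool call instead of plain text. A hedged sketch, assuming an OpenAI-compatible endpoint; the base URL, model name, and `get_weather` tool are illustrative assumptions:

```python
# Sketch of a function-calling request against an OpenAI-compatible
# chat endpoint. base_url, model, and the tool schema are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-...")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="my-llm",
    messages=[{"role": "user", "content": "Weather in Tokyo?"}],
    tools=tools,  # a model without function calling support ignores or rejects this
)
# A function-calling model returns structured tool calls here.
print(resp.choices[0].message.tool_calls)
```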
2. Access MCP Orchestration
Manage all your tool connections from a centralized hub.
- Select Plugin Management from the sidebar.
- Open the MCP Management dashboard.
3. Register a New Server
Connect your local or cloud-based tools to the platform.
- Click the Add button (+) in the top-right corner.
- Select Create MCP Server.
- Configuration: Provide the connection URL (e.g., for a web scraper or database tool); a runnable example follows this list.
- Verification: Click Test to confirm the server is reachable and responding.
- Submit to register.
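As an example of a server you could register here, the sketch below exposes a FastMCP server over Streamable HTTP so it has a connection URL. The host, port, and `/mcp` path follow the Python SDK's defaults, and the `fetch_title` tool is a stub standing in for a real scraper:

```python
# The same kind of FastMCP server, served over Streamable HTTP so it
# has a URL you can paste into the registration form. Host, port, and
# the /mcp path follow the SDK defaults; fetch_title is a placeholder.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("web-scraper", host="0.0.0.0", port=8000)

@mcp.tool()
def fetch_title(url: str) -> str:
    """Stub standing in for a real page-scraping tool."""
    return f"title of {url}"

if __name__ == "__main__":
    # Serves at http://<host>:8000/mcp; pass transport="sse" instead
    # to expose an SSE endpoint at /sse.
    mcp.run(transport="streamable-http")
```

With this running, you would enter `http://localhost:8000/mcp` as the connection URL (or the `/sse` endpoint if you chose the SSE transport).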
4. Activation & Insights
Once registered, you must activate the server to make it available to your agents.
- Toggle Switch: Flip the switch on the server card to enable the connection.
- Detailed View: Click on the card to inspect provided tools and their schemas.
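The detailed view surfaces the same information as MCP's `tools/list` call. If you want to inspect a server's tools programmatically, a minimal client sketch using the official Python SDK might look like this (the server URL is an assumption):

```python
# List a server's tools and their schemas over Streamable HTTP,
# using the official MCP Python SDK. The URL is an assumption.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(tool.name, tool.description, tool.inputSchema)

asyncio.run(main())
```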
5. Deployment in Pipelines
Bring the tools to life in your AI agents.
- Open your Pipeline settings.
- Ensure the Runner is set to Local Agent.
- Select the LLM model for which you enabled Function Calling in Step 1.
Ready to Launch
Your agent can now automatically decide when to call these tools during a conversation to provide accurate, real-time results.
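Behind the scenes, this is a loop the platform runs for you: the model emits a tool call, the runtime executes it against the MCP server, and the result is fed back into the conversation. A schematic sketch, reusing the `session` from the inspection example above and a hypothetical `llm_reply` in OpenAI-style function-calling shape:

```python
# Schematic of the dispatch loop the platform performs automatically.
# `session` is an initialized MCP ClientSession; `llm_reply` is a
# hypothetical model response carrying OpenAI-style tool_calls.
import json

async def dispatch_tool_calls(session, llm_reply):
    """Execute each model-requested tool call against the MCP server."""
    for call in llm_reply.tool_calls or []:
        args = json.loads(call.function.arguments)
        result = await session.call_tool(call.function.name, args)
        # In the real loop, result.content goes back to the model as a
        # "tool" message so it can compose its final answer.
        print(result.content)
```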