Empowering LLMs: MCP Manager, a Rust Middleware for API Interaction via Model Context Protocol

Hey Dev Community!
I'm excited to share a new open-source project I've built: MCP Manager. Developed in Rust, this tool acts as a crucial piece of middleware designed to enable Large Language Models (LLMs) to interact with and call external APIs using the Model Context Protocol (MCP).
The Challenge
While LLMs are incredibly powerful at understanding and generating text, connecting them reliably and securely to external systems to perform actions is still a significant challenge. Existing methods can be ad-hoc, lack standardization, or require complex custom integrations.
The only integrations I was able to find are proprietary or paid. The only real alternative is Claude for Desktop, but it is limited to Anthropic models.
The Solution: MCP Manager + Model Context Protocol
MCP Manager provides a standardized way for LLMs (initially supporting Google Gemini and Azure OpenAI) to trigger API calls. The flow works like this:
- The LLM determines that an external action is needed based on its prompt/context
- The LLM communicates this intent to MCP Manager using its tool/function calling capabilities
- MCP Manager receives this request and, using the Model Context Protocol (MCP), communicates with a backend "MCP Server"
- The MCP Server, which is responsible for knowing about and interacting with specific external APIs, executes the required API call
- The response from the API is sent back through the MCP Server and MCP Manager, and back to the LLM
- The LLM can then either answer the user directly or continue calling functions to fulfill the user's needs
This approach separates the LLM's intent from the specifics of API execution, providing a cleaner, more maintainable architecture for building LLM-powered applications that interact with the real world.
Why Rust?
Building MCP Manager in Rust provides several key advantages for a middleware tool:
- Performance: Efficient handling of requests and communication.
- Reliability & Safety: Rust's strong type system and ownership model prevent common bugs, which is crucial for system-level tools; the program only crashes when we explicitly tell it to.
- Concurrency: Rust's excellent support for concurrent programming is ideal for handling multiple LLM interactions or API responses.
Current Status & Future
Currently, MCP Manager v0.1.0 supports integration with Google Gemini and Azure OpenAI and is designed to work only with local MCP server implementations.
Support for remote MCP servers is planned to enable more distributed architectures, but this feature is not yet implemented.
Get Involved
MCP Manager is fully open source, and I strongly believe in collaborative development. Whether you're a Rust developer, an AI/LLM enthusiast, or someone interested in AI or APIs, your contributions are highly welcome!
- Explore the Code: See how the MCP is integrated and how the LLM integrations work
- Contribute: Help add support for more LLMs, enhance the MCP integration, improve documentation, or suggest new features
- Provide Feedback: Share your thoughts on the project's direction and potential use cases
Check out the repository: https://gitlab.com/DMaxter/mcp-manager
Let's discuss how we can build robust bridges between LLMs and the external world! Looking forward to your thoughts and contributions.