How to Build an MCP Server in Python Using FastAPI


Mar 27, 2025 - 17:04

I’ve seen many people intimidated or confused by the new MCP protocol published by Anthropic, which is designed to make the connection between an AI agent and your application seamless and clear. But building such a server for your Python application doesn’t need to be complicated at all.
Before we jump in, let me convince you why it’s necessary.

The WHY

  1. Allows AI agents to integrate with your application
  2. Shifts complexity from human developers to AI agents, potentially streamlining application development
  3. Simplifies the process of connecting AI to numerous tools and data sources

If MCP becomes the next standard, you’ll have no choice but to adopt it, as people will shift from manual integrations to AI doing everything for them.
Now you might say, “No problem! That’s a big if, let’s wait and see.” But once you see how little effort it takes to adopt, you’ll want to stay ahead of the curve.

The HOW

Create a FastAPI Server with the Desired Functionalities

(If you already have one, skip ahead)

The FastAPI first-steps tutorial will surely be better than mine, but here’s the gist of it:

  1. Install dependencies:

    pip install fastapi uvicorn
    
  2. Create a FastAPI server:

    from fastapi import FastAPI
    
    app = FastAPI()
    
    @app.get("/")
    async def root():
        return {"message": "MCP is super cool"}
    
  3. For each functionality of your app, write a “path operation” — a function assigned to a specific path (here you see the root function under the path /), performing an HTTP operation (GET, POST, PUT, DELETE…). A second example appears right after this list.

  4. Run your app using uvicorn:

    uvicorn main:app --reload
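
For example, a second path operation that accepts a JSON body could look like the sketch below. The Item model and the /items/ path are purely illustrative, not part of the minimal app above:

    from pydantic import BaseModel

    # Illustrative request body model (not part of the minimal app above)
    class Item(BaseModel):
        name: str
        price: float

    # A second path operation: POST /items/ creates an item from the JSON body
    @app.post("/items/")
    async def create_item(item: Item):
        return {"message": f"Created {item.name}", "price": item.price}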
    

That’s it! Your FastAPI app is available at http://127.0.0.1:8000. Now let’s upgrade it to an MCP server.


From FastAPI to MCP server

  1. Install the fastapi-mcp open-source tool:

    pip install fastapi-mcp
    
  2. Add to your FastAPI code:

    from fastapi import FastAPI
    from fastapi_mcp import add_mcp_server
    
    # Your FastAPI app
    app = FastAPI()
    
    # Mount the MCP server to your app
    add_mcp_server(
        app,                 # Your FastAPI app
        mount_path="/mcp",   # Where to mount the MCP server
        name="My API MCP",   # Name for the MCP server
    )
    

That’s it! The MCP server is auto-generated and available at http://127.0.0.1:8000/mcp. This address can be configured in Cursor (or any AI agent that supports SSE; rumor has it that Cline support for MCP is on its way).
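
For reference, here’s what a complete main.py might look like with both snippets combined. This is just a minimal sketch built from the steps above: the /items/ endpoint is the illustrative one from earlier, and I’ve kept the add_mcp_server call after the route definitions on the assumption that tools are generated from the routes that exist when it runs:

    from fastapi import FastAPI
    from fastapi_mcp import add_mcp_server
    from pydantic import BaseModel

    app = FastAPI()

    class Item(BaseModel):
        name: str
        price: float

    @app.get("/")
    async def root():
        return {"message": "MCP is super cool"}

    @app.post("/items/")
    async def create_item(item: Item):
        return {"message": f"Created {item.name}", "price": item.price}

    # Mount the MCP server to your app
    add_mcp_server(
        app,                 # Your FastAPI app
        mount_path="/mcp",   # Where to mount the MCP server
        name="My API MCP",   # Name for the MCP server
    )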

Using the MCP

Configure your MCP in Cursor:

  1. Settings -> MCP -> Add new MCP
  2. Paste this in the JSON file that opens:

    {
      "mcpServers": {
        "My First MCP server": {
          "url": "http://127.0.0.1:8000/mcp"
        }
      }
    }
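
If Cursor doesn’t pick the server up, a quick way to check that the SSE endpoint is alive is to open the stream directly. This is my own sanity check rather than part of the fastapi-mcp setup, and it assumes httpx is installed (pip install httpx):

    # check_mcp.py - open the MCP SSE stream and print the first few lines.
    # Per the MCP SSE transport, the server should send an "endpoint" event
    # right away; if you see it, the /mcp mount is reachable.
    import httpx

    with httpx.stream("GET", "http://127.0.0.1:8000/mcp", timeout=5) as response:
        response.raise_for_status()
        for i, line in enumerate(response.iter_lines()):
            print(line)
            if i >= 3:  # the stream stays open indefinitely, so stop early
                break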
    

Configuring in Claude Desktop takes one more step (as it currently only supports stdio), but don’t worry — I will show you how to do it in a different post.

What’s Next?

Once set up, AI agents can interface with your application through a standardized protocol. You can expand your MCP server by adding more endpoints that expose different functionalities of your application.
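
For example, a hypothetical extra endpoint like the one below becomes another candidate tool. The path and docstring are purely illustrative; as far as I can tell, the descriptions an agent sees come from your endpoint’s OpenAPI metadata, so it’s worth writing docstrings and type hints carefully. I define it before the add_mcp_server call on the assumption that tools are generated from the routes that exist at mount time:

    # A hypothetical extra endpoint, defined before add_mcp_server is called
    @app.get("/items/{item_id}")
    async def read_item(item_id: int):
        """Fetch a single item by its numeric ID."""
        return {"item_id": item_id, "name": f"Item {item_id}"}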
Remember, the goal of MCP is to make integration simple — so don’t overcomplicate it! Start with basic functionality and expand as needed.
The beauty of using FastAPI with the fastapi-mcp library is that you maintain all the benefits of FastAPI (like automatic documentation and type validation) while adding MCP capabilities with minimal additional code.
Let me know if you tried it, and don’t forget to star the fastapi-mcp repository!