
FastAPI-MCP Tutorial for Beginners and Experts


Have you ever come across a situation where you wanted your chatbot to use a tool and then respond? Sounds complicated, right? But now MCP (Model Context Protocol) gives you a way to integrate your LLM with external tools easily, so the LLM can make full use of them. In this tutorial, we will walk through the process of converting a simple web app built with FastAPI into an MCP server using FastAPI-MCP.

FastAPI with MCP

FastAPI is a simple, modern framework built in Python that lets you build web applications using APIs. It is designed to be easy to use and fast at the same time. Think of FastAPI as a smart waiter who takes your order (HTTP requests), goes to the kitchen (database/server), and then brings your order back (the output) and presents it to you. It is a great tool for building web backends, services for mobile apps, and so on.
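To make that concrete, here is a minimal sketch of a FastAPI app (the endpoint path and message are just illustrative):

from fastapi import FastAPI

app = FastAPI()

# A single GET endpoint that returns a JSON response
@app.get("/hello")
async def say_hello():
    return {"message": "Hello from FastAPI"}

Running this file with uvicorn gives you a working API, along with FastAPI's automatically generated interactive documentation at the /docs path.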

MCP is an open standard protocol by Anthropic that provides a way for LLMs to communicate with external data sources and tools. Think of MCP as a toolkit that provides the right tool for the job at hand. We will be using MCP to create a server.

Now, what if these capabilities are handed to your LLM? It would make your life much easier! That is why FastAPI-to-MCP integration helps a lot. FastAPI takes care of the services from different sources, and MCP takes care of the context for your LLM. By using FastAPI with an MCP server, we can access any tool deployed over the web, expose it as an LLM tool, and make LLMs do our work more efficiently.

In the above image, we can see that there is an MCP server connected to an API endpoint. This API endpoint can be a FastAPI endpoint or any other third-party API service available on the internet.

What is FastAPI-MCP?

FastAPI-MCP is a tool that lets you convert any FastAPI application into something that LLMs like ChatGPT or Claude can understand and use easily. By using FastAPI-MCP, you can wrap your FastAPI endpoints so that they become plug-and-play tools in an AI ecosystem built around LLMs.
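Conceptually, the integration is just a couple of lines wrapped around an existing app. Here is a minimal sketch (the full, working version follows in the hands-on section below):

from fastapi import FastAPI
from fastapi_mcp import FastApiMCP

app = FastAPI()          # your existing FastAPI application
mcp = FastApiMCP(app)    # wrap the app as an MCP server
mcp.mount()              # expose the MCP endpoint alongside the regular API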

If you want to know how to work with MCP, read this article on How to Use MCP.

What APIs Can Be Converted into MCP Using FastAPI-MCP?

With FastAPI-MCP, any FastAPI endpoint can be converted into an MCP tool for LLMs. These endpoints include:

  • GET endpoints: converted into MCP resources.
  • POST, PUT, DELETE endpoints: converted into MCP tools.
  • Custom utility functions: can be added as additional MCP tools.

FastAPI-MCP is a very easy-to-use library that automatically discovers these endpoints and converts them into MCP tools. It also preserves the schemas as well as the documentation of these APIs.
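For instance, endpoints like the ones below (a hypothetical sketch, separate from the weather app we build later) would each be discovered automatically. Giving every route an explicit operation_id is a useful habit, since fastapi-mcp generally derives the tool names from the operation IDs:

from fastapi import FastAPI

app = FastAPI()

# A GET endpoint (read-only) that FastAPI-MCP can discover automatically
@app.get("/items/{item_id}", operation_id="get_item")
async def get_item(item_id: int):
    return {"item_id": item_id}

# A POST endpoint (state-changing) that FastAPI-MCP can discover automatically
@app.post("/items", operation_id="create_item")
async def create_item(name: str):
    return {"created": name}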

Hands-on with FastAPI-MCP

Let's look at a simple example of how to convert a FastAPI endpoint into an MCP server. First, we will create a FastAPI endpoint and then move on to converting it into an MCP server using fastapi-mcp.

Configuring FastAPI

1. Install the dependencies

Get your system ready by installing the required dependencies.

pip install fastapi fastapi_mcp uvicorn mcp-proxy

2. Import the required dependencies

Make a new file with the name ‘main.py’, then import the following dependencies inside it.

from fastapi import FastAPI, HTTPException, Query
import httpx
from fastapi_mcp import FastApiMCP

3. Define the FastAPI App

Let's define a FastAPI app with the title "Weather Updates API".

app = FastAPI(title="Weather Updates API")

4. Defining the routes and functions

Now, we will define the routes for our app, which determine which endpoint executes which function. Here, we are building a weather update app using the weather.gov API (free), which doesn't require any API key. We just need to hit https://api.weather.gov/points/{lat},{lon} with the right values of latitude and longitude.

We define a get_weather function that takes a state code and a city name as arguments, looks up the corresponding coordinates in the CITY_COORDINATES dictionary, and then hits the base URL with those coordinates.

# Predefined latitude and longitude for major cities (for simplicity)
# In a production app, you could use a geocoding service like Nominatim or the Google Geocoding API
CITY_COORDINATES = {
    "Los Angeles": {"lat": 34.0522, "lon": -118.2437},
    "San Francisco": {"lat": 37.7749, "lon": -122.4194},
    "San Diego": {"lat": 32.7157, "lon": -117.1611},
    "New York": {"lat": 40.7128, "lon": -74.0060},
    "Chicago": {"lat": 41.8781, "lon": -87.6298},
    # Add more cities as needed
}


@app.get("/weather")
async def get_weather(
    stateCode: str = Query(..., description="State code (e.g., 'CA' for California)"),
    city: str = Query(..., description="City name (e.g., 'Los Angeles')")
):
    """
    Retrieve today's weather from the National Weather Service API based on city and state
    """
    # Get coordinates (latitude, longitude) for the given city
    if city not in CITY_COORDINATES:
        raise HTTPException(
            status_code=404,
            detail=f"City '{city}' not found in predefined list. Please use another city."
        )

    coordinates = CITY_COORDINATES[city]
    lat, lon = coordinates["lat"], coordinates["lon"]

    # URL for the NWS API points endpoint
    base_url = f"https://api.weather.gov/points/{lat},{lon}"

    try:
        async with httpx.AsyncClient() as client:
            # First, get the gridpoint information for the given location
            gridpoint_response = await client.get(base_url)
            gridpoint_response.raise_for_status()
            gridpoint_data = gridpoint_response.json()

            # Retrieve the forecast data using the gridpoint information
            forecast_url = gridpoint_data["properties"]["forecast"]
            forecast_response = await client.get(forecast_url)
            forecast_response.raise_for_status()
            forecast_data = forecast_response.json()

            # Return today's forecast
            today_weather = forecast_data["properties"]["periods"][0]
            return {
                "city": city,
                "state": stateCode,
                "date": today_weather["startTime"],
                "temperature": today_weather["temperature"],
                "temperatureUnit": today_weather["temperatureUnit"],
                "forecast": today_weather["detailedForecast"],
            }

    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"NWS API error: {e.response.text}"
        )
    except Exception as e:
        raise HTTPException(
            status_code=500,
            detail=f"Internal server error: {str(e)}"
        )

5. Set up the MCP server

Let's now convert this FastAPI app into MCP using the fastapi-mcp library. The process is very simple: we just need to add a few lines of code, and fastapi-mcp automatically converts the endpoints into MCP tools and detects their schema and documentation.

mcp = FastApiMCP(
    app,
    name="Weather Updates API",
    description="API for retrieving today's weather from weather.gov",
)
mcp.mount()
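With mount(), the MCP server is exposed on a path of the same app; by default this is /mcp, which is why we will point Cursor at http://127.0.0.1:8000/mcp later on.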

6. Starting the app

Now, add the following at the end of your Python file.

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)

Then go to the terminal and run the main.py file.

python main.py

Now your FastAPI app should start successfully on localhost.
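Before wiring it into an MCP client, you can sanity-check the plain API yourself. Here is a quick sketch using httpx (the city and state below are just examples from the predefined dictionary):

# quick_check.py - assumes main.py is already running on port 8000
import httpx

resp = httpx.get(
    "http://127.0.0.1:8000/weather",
    params={"stateCode": "CA", "city": "Los Angeles"},
)
print(resp.status_code)
print(resp.json())

You can also open http://127.0.0.1:8000/docs in a browser to try the endpoint from FastAPI's auto-generated Swagger UI.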

Configuring Cursor

Let's configure the Cursor IDE to test our MCP server.

  1. Download Cursor from here: https://www.cursor.com/downloads.
  2. Install it, sign up, and get to the home screen.
  3. Now go to File in the header toolbar, then click on Preferences and then on Cursor Settings.
Cursor Settings
  4. From the Cursor Settings, click on MCP.
Configuring Cursor
  5. On the MCP tab, click on Add new global MCP Server.
    It will open an mcp.json file. Paste the following code into it and save the file.
{
   "mcpServers": {
     "Nationwide Park Service": {
         "command": "mcp-proxy",
         "args": ["http://127.0.0.1:8000/mcp"]
     }
   }
}
  6. Back in the Cursor Settings, you should see the following:
Connected MCP Server

If you are seeing this on your screen, it means your server is running successfully and is connected to the Cursor IDE. If it is showing some errors, try using the restart button in the right corner.
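A quick note on the mcp.json configuration above: mcp-proxy acts as a small bridge, so Cursor launches it as a local command and it forwards MCP traffic to the HTTP endpoint our FastAPI app exposes at /mcp. The server key ("National Park Service") is just a display label; you can rename it to something like "Weather Updates API" if you prefer.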

We have successfully set up the MCP server in the Cursor IDE. Now, let's test the server.

Testing the MCP Server 

Our MCP server can retrieve weather updates. We just have to ask the Cursor IDE for the weather update for any location, and it will fetch it for us using the MCP server.

Query: Please tell me what's today's weather in San Diego

Prompt Response 1

Query: New York weather?

Prompt Response 2

We can see from the outputs that our MCP server is working well. We just need to ask for the weather details, and it decides on its own whether or not to use the MCP server. In the second output, where we asked vaguely "New York weather?", it was able to understand the context of the query based on our earlier prompt and used the appropriate MCP tools to respond.

Conclusion

MCP allows LLMs to expand their answering capabilities by giving them access to external tools, and FastAPI offers an easy way to do that. In this comprehensive guide, we combined both technologies using the fastapi-mcp library. Using this library, we can convert any API into an MCP server, which helps LLMs and AI agents get the latest information from APIs. There is no need to define a custom tool for every new task; MCP with FastAPI handles everything automatically. The introduction of MCP brought a revolution to LLMs, and now FastAPI paired with MCP is revolutionizing the way LLMs access these tools.

Harsh Mishra is an AI/ML Engineer who spends more time talking to Large Language Models than actual humans. Passionate about GenAI, NLP, and making machines smarter (so they don't replace him just yet). When not optimizing models, he's probably optimizing his coffee consumption. 🚀☕
