Unleash Your LLM: Build a Python MCP Server for Cursor & Claude
Large Language Models (LLMs) are powerful, but they live in a text-generation bubble. Want to break them out and connect them to real-world data? This guide shows you how to build a Python MCP server to query databases and supercharge your AI interactions in apps like Cursor and Claude!
Model Context Protocol (MCP) is the key. It's a standardized way for LLMs to access external tools and data. Think of it as a universal adapter that lets your AI "plug in" to anything.
Why You Need an MCP Server: Bridging the AI Gap
LLMs like GPT and Claude are excellent at generating text, but they can't inherently access your files, query databases, or trigger actions. An MCP server acts as the bridge. Here's why it's crucial:
- Bypass LLM Limitations: Connect your AI to databases, APIs, and more.
- Real-World Interaction: Enable actions like sending emails or deploying applications.
- Context is King: Provide LLMs with the accurate, up-to-date information they need.
How an MCP Server Empowers Your LLM Workflow
Here's a breakdown of how an MCP server works its magic:
- You Ask: You send a request to the LLM within your host application (Cursor or Claude).
- Client Checks: The host's MCP client exposes the available tools, and the LLM decides whether one of them fits the request.
- Server Steps In: The MCP client forwards the request to your Python MCP server.
- Data Retrieval: The server queries databases, calls APIs, or performs other tasks.
- Results Returned: The server sends the data back to the client, which displays it in the host application.
Build a Python MCP Server: Step-by-Step Guide
Let's build a simple Python MCP server that queries a SQLite database for community chat data. This will give you a hands-on understanding of the process.
Prerequisites:
- Python 3.10 or later (required by the MCP Python SDK)
- SQLite (with a sample database)
- Cursor Pro and Claude Desktop
Step 1: Set Up Your Environment
- Create a Virtual Environment: Isolates your project dependencies.
- Install the MCP Python SDK: Makes building your server easier.
Step 2: Grab a Sample Database
Download a community.db file (containing a chatters table with name and messages columns).
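If you don't already have a community.db to hand, the short script below builds an equivalent one. The schema matches the tutorial (a chatters table with name and messages columns); the rows are placeholder sample data, so swap in whatever data you like.

```python
# create_db.py -- build a sample community.db for the tutorial (placeholder data).
import sqlite3

conn = sqlite3.connect("community.db")
conn.execute("CREATE TABLE IF NOT EXISTS chatters (name TEXT, messages INTEGER)")
conn.executemany(
    "INSERT INTO chatters (name, messages) VALUES (?, ?)",
    [("Alice", 120), ("Bob", 95), ("Charlie", 45)],
)
conn.commit()
conn.close()
```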
Step 3: Code Your MCP Server (sqlite-server.py)
This code connects to your database and retrieves the top chatters.
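Here is a minimal sketch of sqlite-server.py, assuming the official MCP Python SDK (the mcp package) and its FastMCP helper. The server name, the get_top_chatters tool name, and the database path are illustrative choices, not fixed by the tutorial; adjust them to your setup.

```python
# sqlite-server.py -- minimal MCP server exposing one database-backed tool.
import sqlite3
from pathlib import Path

from mcp.server.fastmcp import FastMCP

# Assumes community.db lives next to this script.
DB_PATH = Path(__file__).parent / "community.db"

mcp = FastMCP("Community Chatters")


@mcp.tool()
def get_top_chatters() -> list[dict]:
    """Return chatters with their message counts, highest first."""
    conn = sqlite3.connect(DB_PATH)
    try:
        rows = conn.execute(
            "SELECT name, messages FROM chatters ORDER BY messages DESC"
        ).fetchall()
        return [{"name": name, "messages": messages} for name, messages in rows]
    finally:
        conn.close()


if __name__ == "__main__":
    # stdio transport lets Cursor and Claude Desktop launch the server as a subprocess.
    mcp.run(transport="stdio")
```

FastMCP reads the function's type hints and docstring to describe the tool to the client, so the host application only needs to know how to launch the script.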
Integrate Your MCP Server with Cursor
Connect your server so your LLM can request data.
- Open Cursor: Go to Settings → MCP (requires Cursor Pro).
- Add Server: Click "Add a New Global MCP Server." This opens ~/.cursor/mcp.json.
- Configure Server: Update the mcp.json file with your server details, as shown in the example after this list.
- Verify: Save and return to MCP Settings. A green dot indicates a successful connection.
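As an example, a server entry in ~/.cursor/mcp.json could look like the following. The sqlite-server key and the script path are placeholders; if you installed the SDK inside a virtual environment, point command at that environment's Python interpreter instead of the system one.

```json
{
  "mcpServers": {
    "sqlite-server": {
      "command": "python",
      "args": ["/absolute/path/to/sqlite-server.py"]
    }
  }
}
```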
Test in Cursor: See Your MCP Server in Action
- Ask a Question: In Cursor, ask: "Show me the list of top chatters."
- Grant Permission: Approve the prompt asking to run the tool.
- Witness the Magic: The LLM queries the SQLite database and displays the results.
Integrate Your MCP Server in Claude Desktop
- Open Claude Desktop: Go to Settings → Developer → Edit Config.
- Add Server: Add the same server block to claude_desktop_config.json, as shown in the example after this list.
- Refresh: Save, close, and reopen Claude Desktop.
- Verify: Check Claude Desktop's settings for the MCP server. A tool icon in the chat indicates the connection is active.
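claude_desktop_config.json uses the same mcpServers layout, so the entry carries over with only the placeholders (server name and script path) changed to match your machine:

```json
{
  "mcpServers": {
    "sqlite-server": {
      "command": "python",
      "args": ["/absolute/path/to/sqlite-server.py"]
    }
  }
}
```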
Test Your MCP Server with Claude Desktop
- Open a Chat: Ask Claude: “Show me the list of top chatters.”
- Grant Permissions: The query requires an external tool. Approve the prompt.
- Get Results: The response should list top chatters.
Frequently Asked Questions
- What is the purpose of the MCP server? The Python MCP server queries the sample SQLite database and returns the chatters' names and message counts to the client for display.
- How do I integrate my MCP server with Claude Desktop? Add the server config to claude_desktop_config.json, which you can open from Claude Desktop's Developer settings.
- Can this tutorial serve as a foundation for more advanced MCP applications? Yes. It is a solid starting point for custom tools and integrations, and can be extended to SMS and other services.
The Power of Connection: Your LLM Unleashed
This guide showed you how to build a basic Python MCP server and connect it to Cursor and Claude. With this knowledge, you can now:
- Connect LLMs to any data source.
- Automate tasks and workflows.
- Build personalized AI experiences.
The Model Context Protocol opens a world of possibilities. This simple SQLite integration is just the beginning. Experiment, explore, and unlock the full potential of your LLMs!