Model Context Protocol (MCP): The USB-C of AI Data Connectivity

by Huseyn Gorbani, March 5th, 2025
Too Long; Didn't Read

Model Context Protocol (MCP) is an open standard for AI that enables models to dynamically connect to tools, files, and APIs. MCP is a single universal interface for feeding context to AI, allowing it to integrate smoothly with external data sources.

Featured image created with Google ImageFX.

Why do we need something like MCP for AI systems? 🤔

Nowadays, AI assistants can help you write code, schedule meetings, and answer questions. The challenge? Out of the box, LLM-powered agents have no access to external information: your files, emails, project documents, or APIs. As a result, each integration with an AI system has to be tailored to a specific API, database, or other setup.


Understanding this challenge, Anthropic released the Model Context Protocol (MCP)—an open standard for AI that enables models to dynamically connect to tools, files, and APIs.


MCP is like USB-C for AI connectivity: a single universal interface for feeding context to AI, allowing it to integrate smoothly with external data sources. This means developers and end users no longer need a different integration for every AI system they connect to, whether that system is an IDE, chatbot, agent, or another application. As a result, fetching and acting on real-time information is becoming more standardised than ever before.

What is Model Context Protocol (MCP)?

MCP is an open, standardised way for AI models to communicate with external tools, resources, and prompt templates, and to fetch contextual information on demand. It enables AI assistants, agents, and LLM-based applications to easily connect to:


  • Documents & Files (e.g., Google Drive, Notion, local files)
  • APIs & Databases (e.g., PostgreSQL, REST APIs, JSON data)
  • Developer Tools (e.g., GitHub, IDEs, Slack, Jira)
  • Other Systems (e.g., Salesforce, Google Calendar, and more)

Why is MCP important?

Before the release of MCP, there was no truly standardised way to connect AI systems to external data. Each system needed custom integrations for every tool or resource it interacted with, which creates scalability issues because every external service requires a different connection. MCP promises a unified standard: any AI system that understands MCP can instantly connect to any tool or service that supports it.

How MCP works (the basics)

MCP uses a client-server model:

  • MCP Server: Provides structured context to AI models. It could be a database, file system, or API that the AI can query.
  • MCP Host/Client: The AI application (e.g., Claude for Desktop or Cursor IDE) that requests context from the MCP Server.

For more details, please refer to the official documentation.
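
To make this split concrete, here is a minimal discovery sketch using the official mcp Python SDK (my_server.py is a placeholder for any MCP server script): the client launches the server as a subprocess, talks to it over stdio, performs the MCP handshake, and asks which tools the server exposes.

# discovery_sketch.py - minimal example with the official `mcp` Python SDK.
# `my_server.py` is a placeholder for any MCP server script on disk.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def discover_tools():
    server_params = StdioServerParameters(command="python", args=["my_server.py"])

    # The client starts the server as a subprocess and communicates over stdio
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # ask the server what it exposes
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

if __name__ == "__main__":
    asyncio.run(discover_tools())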

Where Can MCP Be Used?

MCP is a useful layer for AI-powered systems that require real-time data and automation. Practical Use Cases:

  1. AI Coding Assistants → Let AI fetch relevant documentation dynamically in an IDE like Cursor or VS Code.
  2. AI Chatbots for Business → Enable AI to query customer databases before responding.
  3. AI-powered Research Tools → Fetch live data from APIs for research analysis.
  4. Automated Workflows → Connect AI to Slack, Google Calendar, or Jira to automate tasks.

MCP in action: a simple Python example

Implementing a simple MCP server

In the following server code, we create a simple MCP server that exposes only tools. There are two: calculate_profit and calculate_cost.


# finance_server.py
from mcp.server.fastmcp import FastMCP
import random

# Create a simple finance MCP server
mcp = FastMCP("finance")

@mcp.tool()
def calculate_profit(revenue: float, expenses: float, tax_rate: float = 0.2) -> dict:
    """Calculate profit using a simple equation with random factor"""
    # Add a random market factor between -5% and +10%
    market_factor = 1 + random.uniform(-0.05, 0.1)
    
    # Basic profit calculation with tax and market factor
    gross_profit = revenue - expenses
    taxed_profit = gross_profit * (1 - tax_rate)
    final_profit = taxed_profit * market_factor
    
    return {
        "profit": round(final_profit, 2),
        "formula": "Profit = (Revenue - Expenses) * (1 - Tax Rate) * Market Factor",
        "market_factor": round(market_factor, 2)
    }

@mcp.tool()
def calculate_cost(base_cost: float, quantity: int) -> dict:
    """Calculate total cost with bulk discount"""
    # Apply a bulk discount: 0.1% per unit, capped at 30%
    discount = min(0.3, quantity / 1000)
    
    # Basic cost calculation with discount
    total_cost = base_cost * quantity * (1 - discount)
    
    return {
        "cost": round(total_cost, 2),
        "formula": "Cost = Base Cost * Quantity * (1 - Discount)",
        "discount_percent": round(discount * 100, 1)
    }

if __name__ == "__main__":
    mcp.run()
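
Before wiring the server into a host, you can sanity-check the tool logic directly: in current versions of the Python SDK, the @mcp.tool() decorator returns the original function unchanged, so the functions remain callable as plain Python. This is an informal check that bypasses MCP entirely, and the exact profit output will vary because of the random market factor.

# check_tools.py - informal local check of the tool logic (no MCP involved)
from finance_server import calculate_profit, calculate_cost

print(calculate_profit(revenue=100000, expenses=75000, tax_rate=0.2))
# e.g. {'profit': 20800.0, 'formula': '...', 'market_factor': 1.04}  (varies per run)

print(calculate_cost(base_cost=50, quantity=500))
# {'cost': 17500.0, 'formula': 'Cost = Base Cost * Quantity * (1 - Discount)', 'discount_percent': 30.0}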


You can make this server available on your local machine. For example, to use it in the Windsurf IDE, add the following configuration to Windsurf's mcp_config.json file:

{
  "mcpServers": {
    "finance": {
      "command": "python3",
      "args": ["/Users/huseyngorbani/Desktop/dev/hackernoon/mcp/app/finance_server.py"],
      "cwd": "/Users/huseyngorbani/Desktop/dev/hackernoon/mcp/app"
    }
  }
}

Make sure to provide the correct paths for your own local server. Also note that, as of this writing, Windsurf only supports MCP tools.

A screenshot from Windsurf IDE exploring the MCP component.
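
Tools are only one of the primitives MCP defines; a server can also expose resources (read-only context the host can load on demand) and prompt templates. As a rough sketch, and assuming a host that supports these primitives, the same FastMCP server could be extended like this (the URI scheme and the returned content below are purely illustrative):

# Illustrative additions to finance_server.py: a resource and a prompt template.
# The URI and the returned content are made up for demonstration purposes.
@mcp.resource("finance://quarterly-summary")
def quarterly_summary() -> str:
    """Read-only context the host can pull in on demand."""
    return "Q1 revenue: 100000, expenses: 75000"

@mcp.prompt()
def profit_review(company: str) -> str:
    """Reusable prompt template the host can offer to the user."""
    return f"Review the latest profit figures for {company} and flag any anomalies."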

Implementing a simple MCP client

The following is a simple client-side script that calls the tools exposed by finance_server.py.


# finance_client.py
import asyncio
import json
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Configure server connection
    server_params = StdioServerParameters(
        command="python",
        args=["finance_server.py"],
        cwd=""
    )

    # Start the server and create read/write streams
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            
            print("Connected to Finance MCP server!")
            
            # Calculate profit
            profit_result = await session.call_tool("calculate_profit", arguments={
                "revenue": 100000,
                "expenses": 75000,
                "tax_rate": 0.2
            })
            
            # Extract the response data - it's in the first TextContent object's text field as JSON
            profit_json = profit_result.content[0].text
            profit_data = json.loads(profit_json)
            
            print(f"\nProfit: ${profit_data['profit']}")
            print(f"Formula: {profit_data['formula']}")
            print(f"Market Factor: {profit_data['market_factor']}")
            
            # Calculate cost
            cost_result = await session.call_tool("calculate_cost", arguments={
                "base_cost": 50,
                "quantity": 500
            })
            
            # Extract the response data
            cost_json = cost_result.content[0].text
            cost_data = json.loads(cost_json)
            
            print(f"\nCost: ${cost_data['cost']}")
            print(f"Formula: {cost_data['formula']}")
            print(f"Discount: {cost_data['discount_percent']}%")

if __name__ == "__main__":
    asyncio.run(main())


A screenshot of the finance_client.py script output.

For more details such as environment setup, please refer to the official Python SDK documentation.

Final Thoughts

MCP, or an MCP-like standard, will only gain traction as demand in the industry grows. It is up to developers to adopt and use these standards, which encourages others to follow and motivates the maintainers to keep improving the project. There are already third-party marketplaces offering MCP integrations that you can leverage, and a list of the most common integrations is available in the GitHub repository.


Lastly, it is important to note that this project is still in its early stages, and relying on it fully at this point may be premature. Nevertheless, it is undoubtedly a game-changer for the AI domain.