
On 30 November 2022, OpenAI released ChatGPT, fine-tuned from a model in the GPT-3.5 series. That marked the turning point when the AI revolution truly took off, fundamentally transforming the way we access and interact with information. Suddenly we had immense knowledge at our fingertips. I already know some of you will argue this was already the case with something called the internet, but there you had to be good at Google search and use the right keywords. With an LLM you simply ask a question and get a proper answer, as long as it doesn't hallucinate.

However, despite this revolutionary access to information, LLMs come with a fundamental limitation: given your input, they can produce text, images, audio or video as output, and nothing more. Don't get me wrong, the breakthrough is immense, but it is limited in what you can achieve with it. This limitation became evident to many developers, which is why it was such a game-changer when OpenAI introduced its function calling API. All of a sudden you could make the LLM do amazing things: you could define custom functions that ChatGPT was able to call, and the number of use cases felt unlimited.
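To make this concrete, here is a minimal sketch of how a custom function can be declared with the OpenAI Chat Completions API. The get_weather function, its parameters, and the model name are placeholders chosen for illustration:

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Describe the custom function so the model knows when and how to call it.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical function for this example
        "description": "Get the current weather for a city.",
        "parameters": {  # JSON Schema describing the expected arguments
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Ljubljana?"}],
    tools=tools,
)

# If the model decides to use the function, the call shows up here; your code
# runs the function and sends the result back in a follow-up message.
print(response.choices[0].message.tool_calls)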

Function calling paved the way for more sophisticated frameworks like the Model Context Protocol (MCP). While function calling opened the door, MCP, introduced by Anthropic in November 2024, took it further by providing a more formalized and open standard for these interactions. The rapid adoption of MCP wasn't accidental; its success can be attributed to several key factors:

  1. MCP provides a standardized, secure, and extensible framework for connecting large language models and AI systems with real-world data, APIs, and tools. This addresses major integration challenges and fragmentation in AI workflows, enabling seamless, dynamic retrieval of live enterprise-specific data to improve response accuracy and personalization.
  2. The open nature of MCP allowed developers to create interoperable servers and clients working across multiple AI platforms, fostering a rapidly expanding ecosystem and greater adoption. This openness helped avoid vendor lock-in and encouraged a unified approach to AI context management.
  3. MCP supports compliance, auditability, data privacy, and security policies important to enterprise adoption, making it suitable for regulated environments and large-scale deployments.
  4. MCP enables better efficiency, flexibility, safety, and traceability in AI model interactions with external tools and data sources, which traditional model serving methods struggled to provide.

What is MCP

As already stated, MCP is an open standard for connecting AI applications to external data sources and systems through a single, standardized interface. Instead of hardcoding a connection between an LLM and each system, MCP establishes a common language that they all can speak.

MCP architecture and components

Five components define MCP, and each plays a critical role in connecting AI models to external systems while keeping security and performance at the heart of the design:

  1. Host/application is the user-facing part of the system, where users interact and initiate tasks. It coordinates user input, manages permissions, and orchestrates communication between the LLM, clients, and external tools.
  2. MCP client serves as the connection manager and translator within the host, establishing secure sessions with one or more servers. It ensures protocol compatibility, maintains isolation between servers, handles capability negotiation, and routes requests and responses as needed.
  3. MCP server exposes specific functions, tools, and resources to AI models, often acting as an interface to external data sources like APIs or ticketing systems. Each server operates independently, advertises its capabilities, and enforces security boundaries in line with protocol requirements.
  4. Transport and messaging layer carries the communication between the client and the server. Messages use the JSON-RPC 2.0 format (a sketch of such a message follows this list) and can be transported over STDIO for direct, low-latency local connections, or over streamable HTTP for remote, distributed environments. Server-Sent Events are also supported for remote calls, but that transport is deprecated and discouraged. The transport layer is also responsible for authentication and message framing, keeping data exchanges reliable and secure.
  5. Protocol messages are structured requests, responses and notifications to enable two-way communication, with rigorous schema validation built in. Error handling and contract enforcement ensure that failures or mismatches are handled gracefully, reducing operational risk for security-sensitive workflows.
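To give a feel for the wire format, here is an illustrative sketch of a JSON-RPC 2.0 tools/call exchange, written as Python dictionaries and using the helloWorld tool we will build further down. Real messages carry additional fields negotiated during initialization, so treat this as a simplification:

# Client -> server: ask the server to invoke one of its tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "helloWorld",            # tool advertised by the server
        "arguments": {"name": "Alice"},  # placeholder arguments
    },
}

# Server -> client: the tool's output, wrapped in content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "Hello Alice"}],
        "isError": False,
    },
}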

Hello, World!

Now that we understand what MCP is and its architecture, let's put this knowledge into practice. As with every guide, I will help you create your own MCP server in just a few lines. Before we dive into the code, it's worth noting that there are several frameworks available for implementing MCP:

  • EasyMCP (TypeScript)
  • FastMCP (Python & TypeScript)
  • Spring (Java)
  • Quarkus (Java)
  • LangChain MCP (Python & TypeScript)
  • CrewAI (Python)
  • Vercel AI SDK (TypeScript)
  • Anthropic MCP Core (TypeScript & Rust)

For this example, I will pick the same framework we use ourselves: FastMCP. Create your own Python project, add the FastMCP dependency (for example with pip install fastmcp), and as a last step copy this code:

from fastmcp import FastMCP

# Create the MCP server and give it a human-readable name.
mcp = FastMCP("Demo 🚀")

# Registering the function as a tool exposes it to any connected MCP client.
@mcp.tool
def helloWorld(name: str) -> str:
    """Greet the given name, or the whole world if no name is provided."""
    if name:
        return f"Hello {name}"

    return "Hello World!"

if __name__ == "__main__":
    # Start the server; by default FastMCP serves over the STDIO transport.
    mcp.run()
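Before wiring the server into a host application, you can sanity-check it with the client class that ships with FastMCP. The snippet below is a quick sketch that assumes the code above is saved as server.py; the exact shape of the returned result may differ between FastMCP versions:

import asyncio

from fastmcp import Client

from server import mcp  # assumes the server code above lives in server.py

async def main():
    # Passing the server object itself uses an in-memory transport,
    # so no separate process or network setup is needed for a quick test.
    async with Client(mcp) as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])

        result = await client.call_tool("helloWorld", {"name": "MCP"})
        print(result)

asyncio.run(main())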

Within a few lines we were able to create our own MCP server. Congratulations! Throughout the series, we will build on this example and examine three features that are part of the MCP specification (previewed briefly right after the list):

  • tools (which we already got a glimpse of through this example)
  • resources (as the name suggests, data the server exposes to give the LLM further context)
  • prompts (reusable prompt templates)
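As a small taste of what is coming, here is a hedged sketch of how resources and prompts look in FastMCP, extending the server above. The URI, resource content, and prompt wording are made up for illustration, and decorator details may vary slightly between FastMCP versions:

from fastmcp import FastMCP

mcp = FastMCP("Demo 🚀")

# A resource: read-only context a client can fetch by URI.
@mcp.resource("resource://greeting-style")
def greeting_style() -> str:
    """Describe how greetings should be phrased."""
    return "Greetings should be short, friendly, and end with an exclamation mark."

# A prompt: a reusable template the client can render and hand to the LLM.
@mcp.prompt
def greet_prompt(name: str) -> str:
    """Build a prompt asking the model to greet someone."""
    return f"Please write a warm, one-sentence greeting for {name}."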

Published by Jernej Klancic