Transform your Rails app into an MCP server and embrace the AI revolution.


With the advent of artificial intelligence and the rise of LLMs, you may have already heard of the MCP (Model Context Protocol). If you don’t understand what it is and what it does, you’ve come to the right place!

Together, we’ll explore a practical use case and an implementation example in Rails. By the end, you’ll be able to make your apps AI-native, that’s a promise!


To write this article, I drew heavily from the superb presentation by Paweł Strzałkowski at the RailsWorld 2025 conference. You’ll find a link to the video at the end.

Let’s start by understanding what the Model Context Protocol is

Problem framing

One of the major problems when you want to develop a modern application that leverages LLMs (Large Language Models) is its connection to the outside world. Let’s take the example proposed by Pawel:


A user asks an AI to book a weekend in Spain with a very specific set of requirements: around mid-July, at a 5-star hotel with a romantic setting and favorable weather.

The AI has been trained on substantial amounts of information and can probably infer what ‘romance’ means. However, that information is time-bound. How does it know whether the right hotels are available for the given period? Or whether the weather will be favorable? It’s a closed system that doesn’t natively have real-time information!


There is a real wall between LLMs and traditional applications; by default they do not have access to the outside world.


A first approach would be to build bridges between LLMs and applications via APIs. This can work, but we quickly run into a scalability problem: with M models and N applications, you must hand-build up to M × N connections, each with its own request plumbing and translation layer between your apps and the LLM, and of course maintain all of it over time. Very quickly this becomes a nightmare!



This is exactly what the giants in the field understood, notably Anthropic, who in 2024 proposed an approach that would simplify our developer lives. They introduced the Model Context Protocol.

What is the MCP (Model Context Protocol)?

Concretely, this protocol standardizes how additional context is provided to an LLM. Since it’s a standard, everyone uses it: you only need to create a connector once, and every LLM that knows how to use MCP servers can connect to it. Instead of M × N bespoke integrations, you maintain roughly M + N. This dramatically reduces development AND maintenance costs, in addition to improving our developer mental health (or at least not contributing to its deterioration!).
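To make the scaling difference concrete, here is a tiny back-of-the-envelope sketch (the counts are purely illustrative):

```ruby
# Point-to-point integrations grow multiplicatively: one bespoke
# connector per (model, application) pair. With a shared protocol,
# each side implements MCP once, so the work grows additively.
models = 5   # LLM providers you want to support
apps   = 20  # applications to expose

point_to_point = models * apps  # bespoke connectors to build and maintain
with_mcp       = models + apps  # one MCP client per model + one server per app

puts point_to_point # prints 100
puts with_mcp       # prints 25
```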


And it’s so effective and practical that it was quickly adopted by the other giants in the industry, Google and OpenAI leading the way.


Technically, it resembles an API but optimized for talking effectively with an LLM.

The architecture splits into three components.

  1. On one side there is the HOST: this is your application!
  2. In the middle there is the MCP server, which provides additional context for each prompt (over HTTP if the MCP server is remote, or over a local stdio transport if it runs on the same machine as the host).
  3. On the other side there is the LLM. The host communicates with it through prompt-response round trips.
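Under the hood, host and server exchange JSON-RPC 2.0 messages. For example, a tools/call request from the host might look like this (the tool name and arguments are invented for illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "weather-tool",
    "arguments": { "city": "Madrid", "month": "July" }
  }
}
```

The server answers with a JSON-RPC response whose result contains the tool’s output, which the host then feeds back into its conversation with the LLM.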

Zooming in on the heart of an MCP

MCPs are made up of “primitives”, a term often used in programming for an elementary building block.


In an MCP there are three types of primitives:

  1. Tools: functions the server exposes, which the host (on behalf of the LLM) can invoke.
  2. Resources: data the MCP server has access to and can share (logs, images, etc.).
  3. Prompts: standardized instruction templates the server can provide to the user so they can communicate effectively with the server.
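To make the “tools” primitive concrete, here is a minimal pure-Ruby sketch (no gems, all names invented for illustration) of how a server can map tool names to handlers and dispatch a tools/call request:

```ruby
# Minimal sketch of the "tools" primitive: a registry that maps tool
# names to handlers, the way an MCP server dispatches a "tools/call"
# request. Pure Ruby; every name here is illustrative.
class ToolRegistry
  def initialize
    @tools = {}
  end

  # Register a tool under a name, with a description and a handler block.
  def register(name, description:, &handler)
    @tools[name] = { description: description, handler: handler }
  end

  # What the server would answer to a "tools/list" request.
  def list
    @tools.map { |name, tool| { name: name, description: tool[:description] } }
  end

  # What the server would do for a "tools/call" request.
  def call(name, **arguments)
    tool = @tools[name]
    return { error: "unknown tool #{name}" } unless tool

    { result: tool[:handler].call(**arguments) }
  end
end

registry = ToolRegistry.new
registry.register("weather-tool", description: "Fake weather lookup") do |city:|
  "Sunny in #{city}"
end

puts registry.call("weather-tool", city: "Madrid")[:result] # prints "Sunny in Madrid"
```

Real MCP servers add schemas, error codes, and change notifications on top, but the core dispatch loop is this simple.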


Example on the travel booking application

Why is it more efficient than an API?

The Model Context Protocol is a standard that all modern LLMs understand! So if you build an MCP server for your application, by default all LLMs will understand it—no extra adaptation needed.

You no longer have to write custom connectors and prompts, because MCP clients are already able to communicate with your application natively, by construction.


By adding an MCP server to your application, you are AI-native, since your application knows how to talk to all LLMs. It’s a very powerful feature, and in Rails it’s very easy to set up, as you’ll see in the example.

Implementing an AI-native blog application with Rails and the GEM “ruby-sdk”.


Rails is a wonderful ecosystem if you want to build a modern app that works natively with artificial intelligence. Why?


Because

  1. Rails was built on a solid foundation and has under-the-hood essential features that let you do amazing things without having to manage a complex configuration yourself
  2. Rails has an incredible community of enthusiasts who develop improvements, including powerful GEMs.


Think my adjectives are exaggerated? Read to the end and you’ll see it’s really not the case…

The GEM “ruby-sdk”

When you want to add extra features to your Rails app you inevitably think of GEMs. And to add MCP features to your app there is already an official GEM, named ruby-sdk, maintained by the community in collaboration with Shopify. Suffice it to say, it’s solid.


In a few lines of code, you can add an MCP server to your application


server = MCP::Server.new(name: "my_server")
transport = MCP::Server::Transports::StreamableHTTPTransport.new(server)
server.transport = transport
# When tools change, notify clients
server.define_tool(name: "new_tool") { |**args| { result: "ok" } }
server.notify_tools_list_changed

Setting up the demonstration application

To illustrate the MCP server setup we’re going to create a project that will allow us to automate blog post creation. To do this we’ll use the template provided by Pawel


git clone https://github.com/pstrzalk/mcp-on-rails.git
rails new blog-mcp -m mcp-on-rails/mcp
cd blog-mcp


This will initialize a new project “blog-mcp” and configure several things for you, including notably:

  1. Add several routes
Rails.application.routes.draw do
  resources :comments
  resources :posts

  # Model Context Protocol
  post "/mcp", to: "mcp#handle"
  get "/mcp", to: "mcp#handle"

  # Define your application routes per the DSL in https://guides.rubyonrails.org/routing.html

  # Reveal health status on /up that returns 200 if the app boots with no exceptions, otherwise 500.
  # Can be used by load balancers and uptime monitors to verify that the app is live.
  get "up" => "rails/health#show", as: :rails_health_check

  # Render dynamic PWA files from app/views/pwa/* (remember to link manifest in application.html.erb)
  # get "manifest" => "rails/pwa#manifest", as: :pwa_manifest
  # get "service-worker" => "rails/pwa#service_worker", as: :pwa_service_worker

  # Defines the root path route ("/")
  # root "posts#index"
end
  2. Create a controller for the MCP
class McpController < ActionController::API
  def handle
    if params[:method] == "notifications/initialized"
      head :accepted
    else
      render(json: mcp_server.handle_json(request.body.read))
    end
  end

  private

  def mcp_server
    MCP::Server.new(
      name: "rails_mcp_server",
      version: "1.0.0",
      tools: MCP::Tool.descendants
    )
  end
end
  3. And a generator that will help you create your first tools
class McpToolGenerator < Rails::Generators::NamedBase
  source_root File.expand_path("templates", __dir__)
  argument :attributes, type: :array, default: [], banner: "field:type field:type"

  def create_tool_file
    template "tool.rb.tt", File.join("app", "tools", "#{file_name}.rb")
  end

  private

  def tool_class_name
    file_name.classify
  end

  def map_attribute_type(type)
    case type.to_sym
    when :references, :belongs_to, :timestamp, :integer
      :integer
    when :boolean
      :boolean
    else
      :string
    end
  end
end


Now if you run “rails server” you’ll see your application start on http://localhost:3000/. First step done!

Installing MCP Inspector

In addition to our Rails app, we’ll need a second tool for debugging, called “MCP Inspector”. This tool is not part of the Ruby ecosystem; it runs on Node.js. You can launch it with:

# Make sure you have Node installed
# You will need at least Node version 22.7.5
# If you don't have Node, or need to update it,
# go to https://nodejs.org/en
> node -v
v22.20.0
> npx -v
10.9.3
# Then run the MCP Inspector tool
> npx @modelcontextprotocol/inspector

On launch you should see this:


Once installed:

  1. Go to the link provided by the tool
  2. Choose “transport type: streamable http” and provide your local server’s MCP endpoint (http://localhost:3000/mcp)
  3. Click “Connect”


Of course, your Rails server must be running in parallel for the connection to work. If a token is requested, provide the one created at startup of the tool in the “Proxy Session Token” field of the settings menu. Normally, if you opened the link with the token pre-filled, it should already be set, but you never know. ;)



Congratulations you’re connected and your Rails app is recognized as an MCP server. That was easy, right? And wait, you haven’t seen anything yet!

Creating our first tools

All well and good, but how do we fill our blog now? At its heart, our blog is just a standard CRUD application. So the first step is to create the models, controllers and views we need to manage our posts.


In the example, we’ll use Rails’ scaffold method to quickly create everything we need. Don’t forget the migration afterward ;)


rails g scaffold post title:string body:text
rails db:migrate


Since we’re using the MCP template, you’ll notice that the command generates extra files we’re not used to seeing.


Why? Because the template includes a dedicated generator for the task


# frozen_string_literal: true

require "rails/generators/resource_helpers"

module Rails
  module Generators
    class McpGenerator < NamedBase
      include Rails::Generators::ResourceHelpers

      source_root File.expand_path("templates", __dir__)
      argument :attributes, type: :array, default: [], banner: "field:type field:type"

      def create_mcp_tools
        template "show_tool.rb", File.join("app", "tools", controller_file_path, "show_tool.rb")
        template "index_tool.rb", File.join("app", "tools", controller_file_path, "index_tool.rb")
        template "create_tool.rb", File.join("app", "tools", controller_file_path, "create_tool.rb")
        template "update_tool.rb", File.join("app", "tools", controller_file_path, "update_tool.rb")
        template "delete_tool.rb", File.join("app", "tools", controller_file_path, "delete_tool.rb")
      end

      private

      def map_attribute_type(type)
        case type.to_sym
        when :references, :belongs_to, :timestamp, :integer
          :integer
        when :boolean
          :boolean
        else
          :string
        end
      end
    end
  end
end


And this generator will automatically create dedicated tools for each CRUD action. For example, for the post creation action we’ll have a “post-create-tool”.


module Posts
  class CreateTool < MCP::Tool
    tool_name "post-create-tool"
    description "Create a new Post entity"

    input_schema(
      properties: {
        title: { type: "string" },
        body: { type: "string" }
      },
      required: []
    )

    def self.call(title: nil, body: nil, server_context:)
      post = Post.new(title: title, body: body)

      if post.save
        MCP::Tool::Response.new([ { type: "text", text: "Created #{post.to_mcp_response}" } ])
      else
        MCP::Tool::Response.new([ { type: "text", text: "Post was not created due to the following errors: #{post.errors.full_messages.join(', ')}" } ])
      end
    rescue StandardError => e
      MCP::Tool::Response.new([ { type: "text", text: "An error occurred, what happened was #{e.message}" } ])
    end
  end
end
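The input_schema block advertises which arguments the tool accepts and which are required. Before dispatching, a server can check incoming arguments against such a schema; here is a minimal pure-Ruby sketch of that idea (illustrative only, not the gem’s actual implementation):

```ruby
# Illustrative pre-dispatch check of tool arguments against a
# JSON-Schema-like hash, similar in spirit to what an MCP server can
# do before invoking a tool. All names are invented for illustration.
def validate_arguments(schema, arguments)
  missing = schema.fetch(:required, []).reject { |key| arguments.key?(key) }
  unknown = arguments.keys - schema.fetch(:properties, {}).keys
  { missing: missing, unknown: unknown, valid: missing.empty? && unknown.empty? }
end

schema = {
  properties: { title: { type: "string" }, body: { type: "string" } },
  required: [:title]
}

p validate_arguments(schema, { title: "Hello" })         # valid
p validate_arguments(schema, { body: "No title here" })  # missing :title
```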


Does this mean we’ll be able to create posts via MCP? Yes! And not only “create”. We’ll be able to perform all CRUD actions via MCP!

You can list all available tools by running the command “rails mcp:tools”.


Or by querying your MCP server from MCP Inspector. You can then launch the tool in question and see the response directly in the interface. Here I ask it to run the “post-index-tool” and it returns the list of the most recently created posts.

Of course, I had created a post by navigating to “http://localhost:3000/posts/new”. But since we’re here, try the “post-create-tool” directly! 😊
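For reference, the JSON-RPC message a client sends when it invokes that tool looks roughly like this (the arguments are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "post-create-tool",
    "arguments": {
      "title": "Hello from MCP",
      "body": "A post created through our Rails MCP server."
    }
  }
}
```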

Finalizing the setup

We’ll also add the ability to leave a comment for each post. To do this, run the following command:

rails g scaffold comment post:references content:text
rails db:migrate


As with the posts, the command will also automatically create tools for creating comments

And we’ll add the ability to update the posts index view in real time via Turbo Streams.

  1. Modify the index to add turbo_stream
<p style="color: green"><%= notice %></p>

<% content_for :title, "Posts" %>

<h1>Posts</h1>

<%= render "posts" %>
<%= turbo_stream_from "posts" %>

<%= link_to "New post", new_post_path %>
  2. Create a _posts partial
<div id="posts">
  <% Post.all.each do |post| %>
    <div id="<%= dom_id(post) %>">
      <h2><%= post.title %></h2>
      <p><%= post.body %></p>

      <% if post.comments.any? %>
        <div>
          <h3>Comments</h3>
          <% post.comments.each do |comment| %>
            <p>
              <strong><%= comment.created_at %></strong> - <%= comment.content %>
            </p>
          <% end %>
        </div>
      <% end %>
    </div>
  <% end %>
</div>
  3. Modify the Post model
class Post < ApplicationRecord
  has_many :comments, dependent: :destroy

  # Re-render the whole posts list whenever a post is created, updated or touched
  after_commit :replace_posts

  private

  def replace_posts
    broadcast_replace_to "posts", target: "posts", partial: "posts/posts"
  end
end
  4. Modify the Comment model
class Comment < ApplicationRecord
  belongs_to :post, touch: true
end

Thanks to touch: true, creating or updating a comment also updates the parent post’s updated_at, which fires the post’s broadcast and refreshes the list.

You can test the functionality right away!


Test our live application!

There are several possible solutions to connect your MCP servers, or any MCP server, to an LLM. For simplicity, we’ll use Claude Desktop. It’s Anthropic’s official app available at https://www.claude.com/download.


To connect an MCP to Claude you typically need at least a Pro account or be resourceful. I chose the latter :)


Next, you’ll need a Node.js runtime. If, like me, you’re on Windows (WSL2) with Ubuntu (no insults, please), you’ll need to install Node directly on Windows. If you’re on Linux or Mac and already have Node installed, you can skip to the next part.

Install Node.js for Windows

Very simple, you’ll see. Go directly to the official site https://nodejs.org/fr/download and download the .msi installer.

Run the installation by double-clicking the .msi and follow the prompts to the end.

Once the installation is complete, open a Windows PowerShell terminal. You can easily find it via the Windows search menu. Check that Node is installed.


> node -v
v22.20.0

Configure Claude Desktop

Install and run Claude Desktop. Then go to the Developer settings. Click the “Edit config” button.


Open the file “claude_desktop_config.json” in your favorite text editor (VSCode, Notepad++, it doesn’t matter) and add the following configuration:


{
  "mcpServers": {
    "blog-mcp": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}


At startup, Claude Desktop will run an “mcp-remote” command to connect to your MCP server:


npx mcp-remote http://localhost:3000/mcp


Quit Claude Desktop and relaunch it. Note: clicking the cross does not close the application but minimizes it! To close it, go to File > Quit. This step is essential so that Claude launches the MCP server connection at startup.


Before relaunching Claude, make sure your application is started with “rails server”.

If this worked you should see your MCP server and have access to the list of all the tools it offers:

Create blog posts with Claude

Awesome! Now for the fun. Start a new conversation with Claude and, for example, ask it to create three blog posts. Then ask it to comment on the second article.

And observe the power of the implementation! Thanks to the tools we set up, it can perfectly manage context. It’s almost magic at this point, isn’t it?



Conclusions

For my part I find this example incredible! I hope you do too and that the adjectives used earlier now seem relevant to you.

The future of the internet will probably be a mix between what we know today and a world of MCP-ready or AI-Native applications. Users will no longer need to interact manually but will use LLMs to converse in natural language, or via audio. This will completely change how we consume the internet! So if you don’t want to fall behind, integrate an MCP server into your app. If you’re using Rails, that’s great because it’s very simple as we’ve seen:


To recap:

  1. Do not create APIs to talk to LLMs but use the MCP protocol. It has become the standard in the industry.
  2. You can easily connect your application to modern LLMs and make your app AI-native with MCP integration.
  3. The Rails ecosystem moves very fast and keeps up with modern technologies thanks to its incredible community.


Feel free to share this article and subscribe so you don’t miss anything.




Pawel’s reference video can be viewed here:




