01.06.2025 · LLM · Ruby on Rails

MCP Client in Rails using ruby-mcp-client gem

Paweł Strzałkowski
Chief Technology Officer

Let's build a helpful assistant! Haven't you seen too many articles like this? This one, I promise, will be special. It will add Model Context Protocol (MCP) capabilities to your Ruby on Rails application and allow YOUR assistant to be smarter, better, and even more helpful!

Implementing an LLM-driven assistant in Rails

We start by creating a new Ruby on Rails application and adding a minimal implementation of an LLM-driven assistant. Each prompt coming from the frontend will be a separate message, with no context of the previous ones (basically, it's nowhere near being production-ready).

It will use the ruby-openai gem. For simplicity, most of the needed logic lives in the controller. You may find the full implementation in the repository linked at the end of the article.
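
Note that OpenAI::Client.new in the controller below assumes the gem has been configured globally. A minimal sketch of that configuration, assuming your API key lives in an OPENAI_ACCESS_TOKEN environment variable (the initializer name is arbitrary):

# config/initializers/openai.rb
OpenAI.configure do |config|
  config.access_token = ENV["OPENAI_ACCESS_TOKEN"]
end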

class AssistantController < ApplicationController
  def show; end

  def chat
    system_message = {
      "role" => "system",
      "content" => "You are a helpful assistant."
    }
    user_message = {
      "role" => "user",
      "content" => params[:prompt]
    }

    openai_client = OpenAI::Client.new
    response = openai_client.chat(
      parameters: {
        model: "gpt-4.1-mini",
        messages: [ system_message, user_message ],
        temperature: 0.7
      }
    )

    assistant_message = response.dig("choices", 0, "message")

    respond_to do |format|
      format.turbo_stream do
        render(
          turbo_stream: turbo_stream.append(
                          "messages",
                          partial: "assistant/message",
                          locals: { message: user_message }
                        ) +
                        turbo_stream.append(
                          "messages",
                          partial: "assistant/message",
                          locals: { message: assistant_message }
                        ) +
                        turbo_stream.update(
                          "chat_input",
                          partial: "assistant/chat_form"
                        )
        )
      end
    end
  end
end

The frontend is a simple form with a prompt textarea. The chat UI displays the prompt and the answers provided by the LLM.
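
For reference, a minimal view matching the Turbo Stream targets used in the controller could look like the sketch below. The messages container, the chat_input wrapper, and the assistant_chat_path route name are assumptions based on the append and update calls above:

<%# app/views/assistant/show.html.erb — a minimal sketch %>
<div id="messages"></div>

<div id="chat_input">
  <%= render "assistant/chat_form" %>
</div>

<%# app/views/assistant/_chat_form.html.erb — assistant_chat_path is an assumed route %>
<%= form_with url: assistant_chat_path do |form| %>
  <%= form.text_area :prompt %>
  <%= form.submit "Send" %>
<% end %>

Since form_with submits through Turbo by default, the turbo_stream response from the chat action is applied without a full page reload.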

It works well but has very limited capabilities. You can't use it to trigger any interactions or ask about real-world information (e.g., the current weather). Even though you may be used to asking for those things via the ChatGPT web UI, you can't do it through the API alone. The core model doesn't have access to such features.

But we can add MCP capabilities!

Adding MCP Client to a Rails app

To keep the focus on the task, I will show an example using the capability of greeting users, which is a feature of the FastMCP-backed server we created in the article Create an MCP Server using Ruby on Rails and FastMCP.
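
For context, the tool on the FastMCP side looks roughly like this; see the linked article for the real implementation, as the class and argument names here are only illustrative:

# A fast-mcp tool definition, sketched from the earlier article
class GreetUserTool < FastMcp::Tool
  description "Greet a user by name"

  arguments do
    required(:name).filled(:string).description("Name of the user to greet")
  end

  def call(name:)
    "Hello, #{name}! Welcome aboard!"
  end
end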

Let's now add the ruby-mcp-client gem, written by Szymon Kurcab, available at https://github.com/simonx1/ruby-mcp-client. Add it to the application by including

gem 'ruby-mcp-client'

in the Gemfile and running

$ bundle install

This gem provides the MCPClient::Client class, which acts as an interface between your application and MCP servers, allowing your LLM to interact with the tools those servers provide.
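
In a console, the whole flow fits in a few lines. A quick sketch, where the URL is a placeholder and the greet_user tool name is an assumption based on the FastMCP example:

mcp_client = MCPClient::Client.new(
  mcp_server_configs: [
    MCPClient.sse_config(base_url: "http://localhost:3000/mcp/sse")
  ]
)

# Tools discovered on the server, translated to the OpenAI function-calling format
tools = mcp_client.to_openai_tools

# Execute a server-side tool directly and read its result
result = mcp_client.call_tool("greet_user", { "name" => "Paweł" })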

Basically, it's the same approach I described in my Kraków talk, The joy of creativity in the age of AI. You provide tools to the LLM and react to their usage in the LLM's responses. However, this time, you don't have to implement the tools within your Rails application. That is the job of the MCP servers. You can concentrate on building your service and let the MCP servers provide additional functionality.

Using ruby-mcp-client with Rails

It's time to extend the application with MCP-provided features. The implementation below is quite naive, but it will give you the picture of how to use the gem:

  • Configure the MCPClient::Client object with server data. You may use any server that implements the STDIO or SSE transport; a STDIO configuration is sketched after this list.
  • Load the tools from the configured MCP server and put them in the proper format for your LLM (e.g., OpenAI, Anthropic, or Google).
  • Provide these tools in an LLM call.
  • With every response from the LLM, check if it requests any tool usage.
  • If there are tool calls, iterate through them, execute each tool using the mcp_client, and send the results back to the LLM in a subsequent call.
  • Continue this process until the LLM responds with a final message without any further tool calls.
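
The chat action below connects over SSE, but a STDIO-backed server would be configured along these lines; the command is only a placeholder for whatever MCP server you want to spawn:

mcp_client = MCPClient::Client.new(
  mcp_server_configs: [
    # Runs the server as a subprocess and talks to it over STDIN/STDOUT
    MCPClient.stdio_config(
      command: "npx -y @modelcontextprotocol/server-filesystem /tmp"
    )
  ]
)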

Here’s how the chat action in AssistantController evolves:

def chat
  openai_messages = []
  ui_messages = []

  openai_messages << {
    "role" => "system",
    "content" => "You are a helpful assistant."
  }

  user_message = {
    "role" => "user",
    "content" => params[:prompt]
  }
  openai_messages << user_message
  ui_messages << user_message

  mcp_client = MCPClient::Client.new(
    mcp_server_configs: [
      MCPClient.sse_config(
        base_url: "http://localhost:3000/mcp/sse",
        read_timeout: 30,
        retries: 3,
        retry_backoff: 1
      )
    ]
  )
  tools = mcp_client.to_openai_tools

  openai_client = OpenAI::Client.new
  response = openai_client.chat(
    parameters: {
      model: "gpt-4.1-mini",
      messages: openai_messages,
      temperature: 0.7,
      tools: tools,
      tool_choice: "auto"
    }
  )

  assistant_message = response.dig("choices", 0, "message")
  tool_calls = assistant_message["tool_calls"] || []

  while tool_calls.any?
    # Echo the assistant's request back into the history, then answer
    # every requested call with a matching "tool" message.
    openai_messages << {
      "role" => "assistant",
      "tool_calls" => tool_calls
    }

    tool_calls.each do |tool_call|
      function_details = tool_call["function"]
      function_name = function_details["name"]
      function_arguments = JSON.parse(
        function_details["arguments"]
      )

      tool_call_result = mcp_client.call_tool(
        function_name, function_arguments
      )

      openai_messages << {
        "role" => "tool",
        "tool_call_id" => tool_call["id"],
        "name" => function_name,
        "content" => tool_call_result.to_json
      }
      ui_messages << {
        "role" => "assistant",
        "content" => "Calling tool <strong>#{function_name}</strong>, with #{function_arguments}"
      }
    end

    response = openai_client.chat(
      parameters: {
        model: "gpt-4.1-mini",
        messages: openai_messages,
        tools: tools,
        tool_choice: "auto",
        temperature: 0.7
      }
    )

    assistant_message = response.dig("choices", 0, "message")
    tool_calls = assistant_message["tool_calls"] || []
  end

  ui_messages << assistant_message

  respond_to do |format|
    format.turbo_stream do
      turbo_stream_response = ActiveSupport::SafeBuffer.new

      ui_messages.each do |message|
        turbo_stream_response += turbo_stream.append(
          "messages",
          partial: "assistant/message",
          locals: { message: }
        )
      end

      turbo_stream_response += turbo_stream.update(
        "chat_input",
        partial: "assistant/chat_form"
      )

      render turbo_stream: turbo_stream_response
    end
  end
end

What's important: throughout a single chain of interactions, we provide the LLM with the full history of tool usage and assistant messages. This allows the LLM to understand which tools have already been called and what the effect of each call was.
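
Concretely, after a single round-trip with one tool call, openai_messages looks roughly like this (IDs, names, and arguments are illustrative):

[
  { "role" => "system", "content" => "You are a helpful assistant." },
  { "role" => "user", "content" => "Greet Paweł" },
  # The assistant's request, echoed back so the model remembers making it
  { "role" => "assistant", "tool_calls" => [
    { "id" => "call_abc123", "type" => "function",
      "function" => { "name" => "greet_user", "arguments" => "{\"name\":\"Paweł\"}" } }
  ] },
  # The tool's output (JSON-encoded), linked to the request via tool_call_id
  { "role" => "tool", "tool_call_id" => "call_abc123",
    "name" => "greet_user", "content" => "\"Hello, Paweł!\"" }
]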

The result is:

The assistant was asked to greet two users. It used the same GreetUserTool (from our FastMCP server example) twice in a row and then ended the interaction with a short summary.

Optimizations

What could surely be optimized in this implementation:

  • Your host application should allow the user to choose to accept or decline tool usage, even if it's a switch like "always use tools and don't ask".
  • You should limit the number of tool calls in a single run; otherwise, your application may be caught in an endless (or at least a very long) loop of actions. A guard is sketched after this list.
  • And, of course, make the UI much nicer.
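
A minimal guard for the second point could simply cap the number of rounds inside the chat action; the limit is arbitrary and should be tuned to your use case:

MAX_TOOL_ROUNDS = 5

rounds = 0
while tool_calls.any?
  rounds += 1
  raise "Tool call limit exceeded" if rounds > MAX_TOOL_ROUNDS

  # ... execute the requested tools and call the LLM again,
  # exactly as in the chat action above ...
end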

There is much more

The ruby-mcp-client gem has many more features. Please review its README at https://github.com/simonx1/ruby-mcp-client for the full list. It is actively maintained, and new features come often. There are several implementations of MCP servers, but very few clients, which makes this project unique enough to give it a try and see how it fits your needs. If you have any questions for the author of the ruby-mcp-client gem, rumor has it that he can be found on the Ruby AI Builders Discord.

If you'd like to check the full code used in this article, check out the repository at https://github.com/pstrzalk/rails-with-ruby-mcp-client.

Summary

Integrating the ruby-mcp-client gem provides a direct method for enhancing Ruby on Rails assistants with MCP capabilities, allowing them to connect with external tools and services. And there you have it: a straightforward approach to building advanced, tool-aware AI assistants directly within Ruby on Rails applications!
