No complex protocols, no integration headaches, no compatibility issues: just beautiful, expressive Ruby code.
AI models are powerful, but they need to interact with your applications to be truly useful. Traditional approaches mean wrestling with:
- Complex communication protocols and custom JSON formats
- Integration challenges with different model providers
- Compatibility issues between your app and AI tools
- Managing state between AI interactions and your data
Fast MCP solves all these problems by providing a clean, Ruby-focused implementation of the Model Context Protocol, making AI integration a joy, not a chore.
- Tools API - Let AI models call your Ruby functions securely, with in-depth argument validation through Dry-Schema
- Resources API - Share data between your app and AI models
- Prompts API - Define structured prompt templates for LLM interactions
- Multiple Transports - Choose from STDIO, HTTP, or SSE based on your needs
- Framework Integration - Works seamlessly with Rails, Sinatra, and Hanami
- Authentication Support - Secure your AI endpoints with ease
- Real-time Updates - Subscribe to changes for interactive applications
```ruby
# Define tools for AI models to use
server = MCP::Server.new(name: 'recipe-ai', version: '1.0.0')

# Define a tool by inheriting from MCP::Tool
class GetRecipesTool < MCP::Tool
  description "Find recipes based on ingredients"

  # These arguments generate the JSON schema presented to the MCP client,
  # and they are validated at runtime. Validation is based on Dry-Schema,
  # extended with a description for each argument.
  arguments do
    required(:ingredients).array(:string).description("List of ingredients")
    optional(:cuisine).filled(:string).description("Type of cuisine")
  end

  def call(ingredients:, cuisine: nil)
    Recipe.find_by_ingredients(ingredients, cuisine: cuisine)
  end
end

# Register the tool with the server
server.register_tool(GetRecipesTool)
```
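To make the validation concrete, here is a plain-Ruby sketch of the checks the `arguments` block above implies. This is an illustration only, not fast-mcp code; the real validation is performed by Dry-Schema inside the library.

```ruby
# Plain-Ruby illustration of the checks implied by the arguments block.
# Not fast-mcp code; the real validation is handled by Dry-Schema.
def validate_get_recipes_args(args)
  errors = []

  # required(:ingredients).array(:string)
  ingredients = args[:ingredients]
  unless ingredients.is_a?(Array) && ingredients.all? { |i| i.is_a?(String) }
    errors << "ingredients must be an array of strings"
  end

  # optional(:cuisine).filled(:string) -- only checked when the key is present
  if args.key?(:cuisine)
    cuisine = args[:cuisine]
    errors << "cuisine must be a filled string" unless cuisine.is_a?(String) && !cuisine.empty?
  end

  errors
end

validate_get_recipes_args(ingredients: ["tomato", "basil"])  # => []
validate_get_recipes_args(ingredients: "tomato")             # => ["ingredients must be an array of strings"]
```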
```ruby
# Share data resources with AI models by inheriting from MCP::Resource
class IngredientsResource < MCP::Resource
  uri "food/popular_ingredients"
  name "Popular Ingredients"
  mime_type "application/json"

  def default_content
    JSON.generate(Ingredient.popular.as_json)
  end
end

# Register the resource with the server
server.register_resource(IngredientsResource)

# Access the resource through the server
server.read_resource("food/popular_ingredients")

# Update the resource content through the server
server.update_resource("food/popular_ingredients", JSON.generate({ id: 1, name: 'tomato' }))
```
```ruby
# Easily integrate with web frameworks
# config/application.rb (Rails)
config.middleware.use MCP::RackMiddleware.new(
  name: 'recipe-ai',
  version: '1.0.0'
) do |server|
  # Register tools and resources here
  server.register_tool(GetRecipesTool)
end
```
```ruby
# Secure your AI endpoints
config.middleware.use MCP::AuthenticatedRackMiddleware.new(
  name: 'recipe-ai',
  version: '1.0.0',
  token: ENV['MCP_AUTH_TOKEN']
)
```
```ruby
# Build real-time applications
server.on_resource_update do |resource|
  ActionCable.server.broadcast("recipe_updates", resource.metadata)
end
```
Add the gem to your Gemfile:

```ruby
gem 'fast-mcp'
```

Then run:

```shell
bundle install
```

Or install it yourself:

```shell
gem install fast-mcp
```
```ruby
require 'fast_mcp'

# Create an MCP server
server = MCP::Server.new(name: 'my-ai-server', version: '1.0.0')

# Define a tool by inheriting from MCP::Tool
class SummarizeTool < MCP::Tool
  description "Summarize a given text"

  arguments do
    required(:text).filled(:string).description("Text to summarize")
    optional(:max_length).filled(:integer).description("Maximum length of summary")
  end

  def call(text:, max_length: 100)
    # Your summarization logic here
    text.split('.').first(3).join('.') + '...'
  end
end

# Register the tool with the server
server.register_tool(SummarizeTool)

# Create a resource by inheriting from MCP::Resource
class StatisticsResource < MCP::Resource
  uri "data/statistics"
  name "Usage Statistics"
  description "Current system statistics"
  mime_type "application/json"

  def default_content
    JSON.generate({
      users_online: 120,
      queries_per_minute: 250,
      popular_topics: ["Ruby", "AI", "WebDev"]
    })
  end
end

# Register the resource with the server
server.register_resource(StatisticsResource)

# Start the server
server.start
```
```ruby
# config/application.rb
module YourApp
  class Application < Rails::Application
    # ...
    config.middleware.use MCP::RackMiddleware.new(
      name: 'my-ai-server',
      version: '1.0.0'
    ) do |server|
      # Register tools and resources here
      server.register_tool(SummarizeTool)
    end
  end
end
```
The MCP team has developed a very useful inspector that you can use to validate your implementation. The examples provided with this project make an easy boilerplate: clone the project, then give it a go!

```shell
npx @modelcontextprotocol/inspector examples/server_with_stdio_transport.rb
```
Or to test with an SSE transport using a rack middleware:
```shell
npx @modelcontextprotocol/inspector examples/rack_middleware.rb
```
Or to test over SSE with an authenticated rack middleware:
```shell
npx @modelcontextprotocol/inspector examples/authenticated_rack_middleware.rb
```
You can test your custom implementation with the official MCP inspector by using:
```shell
# Test with a stdio transport:
npx @modelcontextprotocol/inspector path/to/your_ruby_file.rb

# Test with an HTTP / SSE server. In the UI, select SSE and input your address.
npx @modelcontextprotocol/inspector
```
```ruby
# app.rb
require 'sinatra'
require 'fast_mcp'

use MCP::RackMiddleware.new(name: 'my-ai-server', version: '1.0.0') do |server|
  # Register tools and resources here
  server.register_tool(SummarizeTool)
end

get '/' do
  'Hello World!'
end
```
Add your server to your Claude Desktop configuration at:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "my-great-server": {
      "command": "ruby",
      "args": [
        "/Users/path/to/your/awesome/fast-mcp/server.rb"
      ]
    }
  }
}
```
| Feature | Status |
|---|---|
| JSON-RPC 2.0 | Full implementation for communication |
| Tool Definition & Calling | Define and call tools with rich argument types |
| Resource Management | Create, read, update, and subscribe to resources |
| Prompt Templates | Define and share prompt templates with arguments |
| Transport Options | STDIO, HTTP, and SSE for flexible integration |
| Framework Integration | Rails, Sinatra, Hanami, and any Rack-compatible framework |
| Authentication | Secure your AI endpoints with token authentication |
| Schema Support | Full JSON Schema for tool arguments with validation |
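Under the hood, every tool invocation travels as a JSON-RPC 2.0 request, as defined by the MCP specification. As a sketch, a client calling the summarize tool from the Quick Start might send a payload like the one below; the tool name `"summarize"` is an assumption here, since the exact name depends on how fast-mcp derives it from the tool class.

```ruby
require 'json'

# A JSON-RPC 2.0 "tools/call" request, per the MCP specification.
# The tool name "summarize" is assumed for illustration.
request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "summarize",
    arguments: { text: "First. Second. Third. Fourth.", max_length: 50 }
  }
}

puts JSON.generate(request)
```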
- AI-powered Applications: Connect LLMs to your Ruby app's functionality
- Real-time Dashboards: Build dashboards with live AI-generated insights
- Microservice Communication: Use MCP as a clean protocol between services
- Interactive Documentation: Create AI-enhanced API documentation
- Chatbots and Assistants: Build AI assistants with access to your app's data
- Getting Started Guide
- Integration Guide
- Rails Integration
- Sinatra Integration
- Hanami Integration
- Resources
- Tools
- Prompts
- Transports
- API Reference
Check out the examples directory for more detailed examples:

- Basic Examples
- Web Integration
- Ruby 3.2+
We welcome contributions to Fast MCP! Here's how you can help:

- Fork the repository
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request
Please read our Contributing Guide for more details.
This project is available as open source under the terms of the MIT License.
- The Model Context Protocol team for creating the specification
- The Dry-Schema team for argument validation
- All contributors to this project