A Ruby gem for integrating Mastra AI agents into your Rails application. It provides a Rails generator to create and manage agent clients seamlessly within your Ruby code.
The gem acts as a bridge between Mastra AI agents and your Rails application: it generates client code for your agents and provides a consistent, typed interface for calling them from your services.
Before you can use an agent in your Ruby application, you need to create it in Mastra first:
- Create a new agent with your desired configuration
- Note the agent name (this will be used in the next step)
- Ensure the Mastra service is running and accessible
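Before generating clients, it can help to verify that the Mastra service actually answers. A minimal stdlib-only sketch; the default URL, the `MASTRA_LOCATION` fallback, and the bare HEAD request against `/` are assumptions to adapt to your deployment:

```ruby
require 'net/http'
require 'uri'

# Returns true if something answers HTTP at the given endpoint.
# The default URL and the plain HEAD request are illustrative assumptions;
# point this at whatever route your Mastra deployment actually serves.
def mastra_reachable?(endpoint = ENV.fetch('MASTRA_LOCATION', 'http://localhost:4111'))
  uri = URI.parse(endpoint)
  Net::HTTP.start(uri.host, uri.port, open_timeout: 2, read_timeout: 2) do |http|
    http.head('/').is_a?(Net::HTTPResponse)
  end
rescue StandardError
  false
end
```

Returning `false` instead of raising keeps this usable in boot-time checks that should degrade gracefully.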
Once your agent is created in Mastra, generate the corresponding Ruby agent using the Rails generator:
```shell
# Generate all agents from Mastra
bin/rails generate ai:agent --all
```

This command will:
- Fetch the agent configuration from Mastra
- Generate the corresponding Ruby client file in `app/generated/ai/agents/`
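After running the generator, you can sanity-check what was produced. A small stdlib-only sketch; the directory path is the default noted above:

```ruby
# List generated agent client files; returns [] if the
# directory does not exist yet.
def generated_agents(dir = 'app/generated/ai/agents')
  Dir.glob(File.join(dir, '*.rb')).sort
end
```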
After generating the agent client, you can use it in your Rails application services:
Create a service that defines an output class using `T::Struct`:

```ruby
class MyAgentService
  extend T::Sig

  # Define the expected output structure
  class Output < T::Struct
    const :result, String
    const :confidence, Float
    const :metadata, T::Hash[String, T.untyped], default: {}
  end

  sig { params(input: String).returns(Output) }
  def self.call(input)
    messages = [Ai.user_message(input)]

    # Initialize the agent (typically done once and reused)
    agent = Ai::Agent.new(agent_name: 'my_custom_agent', client: Ai::Client.new)

    # Generate structured output and return the typed result
    result = agent.generate_object(
      messages: messages,
      output_class: Output
    )
    result.object
  end
end
```

Structure your messages according to your agent's expected format:
```ruby
messages = [
  Ai.system_message("You are a helpful assistant that..."),
  Ai.user_message("User's question or request")
]
```

Sending Images:
For vision-capable agents, you can include images in your messages:
```ruby
# Read image from file
image_data = File.binread('path/to/image.png')

# Create a message with text and image
message = Ai.user_message_with_image(
  "What objects are in this image?",
  image_data,
  "image/png"
)

messages = [message]
```

For multiple images in one message, use manual construction:
```ruby
image1 = File.binread('photo1.jpg')
image2 = File.binread('photo2.png')

message = Ai::Message.new(
  role: Ai::MessageRole::User,
  content: [
    Ai::TextPart.new(text: "Compare these images:"),
    Ai::ImagePart.new(image_data: image1, media_type: "image/jpeg"),
    Ai::ImagePart.new(image_data: image2, media_type: "image/png")
  ]
)
```

Using Image URLs:
Instead of sending image data, you can send a URL for the agent to fetch the image:
```ruby
# Create a message with text and image URL
message = Ai.user_message_with_image_url(
  "What objects are in this image?",
  "https://example.com/photo.jpg",
  "image/jpeg"
)

messages = [message]
```

For multiple image URLs or mixing URLs with text:
```ruby
message = Ai::Message.new(
  role: Ai::MessageRole::User,
  content: [
    Ai::TextPart.new(text: "Compare these images from the web:"),
    Ai::ImagePart.new(image_url: "https://example.com/image1.jpg", media_type: "image/jpeg"),
    Ai::ImagePart.new(image_url: "https://example.com/image2.png", media_type: "image/png")
  ]
)
```

Then call the service from your application:

```ruby
result = MyAgentService.call("What is the weather like today?")

puts result.result      # => Agent's response
puts result.confidence  # => Confidence score
puts result.metadata    # => Additional metadata
```

The `generate_object` method supports additional options for fine-tuning:
```ruby
agent = Ai::Agent.new(agent_name: 'my_agent', client: Ai::Client.new)

result = agent.generate_object(
  messages: messages,
  output_class: Output,
  runtime_context: { user_id: 123, session: 'abc' },  # Optional context
  max_retries: 3,  # Retry attempts (default: 2)
  max_steps: 10    # Max processing steps (default: 5)
)

# Access the structured output
output = result.object
puts output.result
```

For simple text generation without structured output:
```ruby
agent = Ai::Agent.new(agent_name: 'my_agent', client: Ai::Client.new)

result = agent.generate_text(
  messages: messages,
  runtime_context: {},  # Optional context
  max_retries: 2,  # Retry attempts (default: 2)
  max_steps: 5     # Max processing steps (default: 5)
)

puts result.text  # Generated text response
```

The gem supports OpenTelemetry integration for monitoring and observability. You can configure telemetry settings to control what data is recorded and add metadata for better tracing:
Telemetry Options:
- `enabled`: Enable/disable telemetry (default: true)
- `record_inputs`: Record input messages (default: false; disable for sensitive data)
- `record_outputs`: Record output responses (default: false; disable for sensitive data)
- `function_id`: Identifier for grouping telemetry data by function
- `metadata`: Additional metadata for OpenTelemetry traces (agent identification, service info, etc.)
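To make the `record_inputs`/`record_outputs` semantics concrete, here is a dependency-free sketch of how such switches typically gate what lands in a trace. The attribute keys (`ai.input`, `ai.output`, `ai.function_id`) and the hash-based settings are illustrative, not the gem's internals:

```ruby
# Sketch: assemble the attributes a span might carry, honoring the
# record_inputs / record_outputs switches. Keys are illustrative.
def span_attributes(messages:, response:, settings:)
  attrs = (settings[:metadata] || {}).dup
  attrs['ai.function_id'] = settings[:function_id] if settings[:function_id]
  attrs['ai.input']  = messages.inspect if settings[:record_inputs]
  attrs['ai.output'] = response.to_s    if settings[:record_outputs]
  attrs
end
```

With `record_inputs: false` the user's messages never reach the exporter at all, which is why the conservative defaults make sense for sensitive data.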
```ruby
# Create telemetry settings
telemetry_settings = Ai::TelemetrySettings.new(
  enabled: true,
  record_inputs: false,  # Disable for sensitive data
  record_outputs: true,  # Enable for monitoring
  function_id: 'user-chat-session',
  metadata: {
    'agent.name' => 'customer-support',
    'service.name' => 'mastra',
    'service.namespace' => 'customer-service',
    'cx.application.name' => 'ai-tracing',
    'cx.subsystem.name' => 'mastra-agents'
  }
)

# Use with text generation
result = agent.generate_text(
  messages: messages,
  telemetry: telemetry_settings
)

# Use with structured output
result = agent.generate_object(
  messages: messages,
  output_class: Output,
  telemetry: telemetry_settings
)
```

Mastra "workflows" let you orchestrate multiple agents to solve a task.
The generator creates a lightweight Ruby wrapper that exposes typed Input and Output structs and a convenience .call method.
```shell
# Generate a specific workflow
bin/rails generate ai:workflow --name="testWorkflow"

# Generate all workflows present in Mastra
bin/rails generate ai:workflow --all
```

The generator will create files in `app/generated/ai/workflows/`.
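The wrapper's job is essentially to coerce the raw JSON response into the typed Output struct. A dependency-free sketch of that idea, using a plain `Struct` in place of `T::Struct`/`TypeCoerce` (the `sumOfNumbers` field matches the testWorkflow example; everything else here is illustrative):

```ruby
require 'json'

# Plain-Ruby stand-in for a generated Output struct.
WorkflowOutput = Struct.new(:sumOfNumbers, keyword_init: true)

# Coerce a raw JSON workflow response into WorkflowOutput,
# failing loudly when the field is missing or non-numeric --
# roughly the role TypeCoerce plays for the real T::Struct.
def coerce_workflow_output(json)
  value = JSON.parse(json).fetch('sumOfNumbers')
  raise ArgumentError, "sumOfNumbers is not numeric: #{value.inspect}" unless value.is_a?(Numeric)
  WorkflowOutput.new(sumOfNumbers: Float(value))
end
```

Failing loudly at the boundary means a schema drift in Mastra surfaces as one clear error instead of a `nil` propagating through your service.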
Call the generated wrapper with a typed input:

```ruby
input = Ai::Workflows::TestWorkflow::Input.new(
  first_number: 2.0,
  second_number: 3.0
)

# Run the workflow
result = Ai::Workflows::TestWorkflow.call(input: input)

puts result.sumOfNumbers # => 5.0
```

Example generated wrapper:
```ruby
module Ai
  module Workflows
    class TestWorkflow
      extend T::Sig

      class Input < T::Struct
        const :first_number, Float
        const :second_number, Float
      end

      class Output < T::Struct
        const :sumOfNumbers, Float
      end

      sig { params(input: Input).returns(Output) }
      def self.call(input:)
        response = Ai.client.run_workflow('testWorkflow', input:)
        TypeCoerce[Output].from(response)
      rescue TypeCoerce::CoercionError, ArgumentError => e
        raise Ai::Error, "Workflow 'testWorkflow' output could not be coerced: #{e.message}"
      end
    end
  end
end
```

Both `ai:agent` and `ai:workflow` generators accept the same set of command-line flags:
- `--endpoint URL` – Mastra API endpoint URL (default: value from the `MASTRA_LOCATION` environment variable).
- `--all` – Generate all agents/workflows found in Mastra.
- `--name NAME` – Name of the agent/workflow to generate (required unless `--all` is provided).
- `--force` – Overwrite existing files.
- `--output PATH` – Output directory for generated files. Agents default to `app/generated/ai/agents`; workflows default to `app/generated/ai/workflows`.

Usage:

```shell
rails generate ai:agent AGENT_NAME [options]
rails generate ai:workflow WORKFLOW_NAME [options]
```
```shell
# Generate a specific agent
bin/rails generate ai:agent my_agent --endpoint http://localhost:4111

# Generate all agents from Mastra
bin/rails generate ai:agent --all --endpoint http://localhost:4111

# Generate with custom output directory
bin/rails generate ai:agent my_agent --endpoint http://localhost:4111 --output lib/custom/agents

# Force overwrite existing files
bin/rails generate ai:agent my_agent --endpoint http://localhost:4111 --force

# Generate all agents with custom settings
bin/rails generate ai:agent --all --endpoint http://localhost:4111 --output app/ai/agents --force
```
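Since every invocation above depends on the endpoint, a guard that fails fast when `MASTRA_LOCATION` is unset can save a confusing generator error. A small sketch; the error wording is ours, not the gem's:

```ruby
# Return the configured Mastra endpoint, or fail with a clear message.
# Reads the same MASTRA_LOCATION variable the generators default to.
def require_mastra_location!(env = ENV)
  endpoint = env['MASTRA_LOCATION'].to_s
  if endpoint.empty?
    raise ArgumentError,
          'Set MASTRA_LOCATION (e.g. http://localhost:4111) or pass --endpoint explicitly'
  end
  endpoint
end
```

A check like this fits naturally in a Rake task or CI step that runs before the generators.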