
Commit 2f82701

chore (docs): update MCP examples (vercel#5373)
1 parent 559ac1a commit 2f82701

File tree: 4 files changed, +216 −2 lines changed
Lines changed: 107 additions & 0 deletions
---
title: Model Context Protocol (MCP) Tools
description: Learn how to use MCP tools with the AI SDK and Next.js
tags: ['next', 'tool use', 'agent', 'mcp']
---

# MCP Tools

The AI SDK supports Model Context Protocol (MCP) tools by offering a lightweight client that exposes a `tools` method for retrieving tools from an MCP server. After use, the client should always be closed to release resources.

## Server

Let's create a route handler for `/api/completion` that generates text based on the input prompt and MCP tools that can be called at any time during a generation. The route calls the `streamText` function from the `ai` module, which generates text based on the input prompt and streams it to the client.

```ts filename="app/api/completion/route.ts"
import { experimental_createMCPClient, streamText } from 'ai';
import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { prompt }: { prompt: string } = await req.json();

  try {
    // Initialize an MCP client to connect to a `stdio` MCP server:
    const transport = new Experimental_StdioMCPTransport({
      command: 'node',
      args: ['src/stdio/dist/server.js'],
    });
    const stdioClient = await experimental_createMCPClient({
      transport,
    });

    // Alternatively, you can connect to a Server-Sent Events (SSE) MCP server:
    const sseClient = await experimental_createMCPClient({
      transport: {
        type: 'sse',
        url: 'https://actions.zapier.com/mcp/[YOUR_KEY]/sse',
      },
    });

    // As with the stdio example, you can pass in your own custom transport,
    // as long as it implements the `MCPTransport` interface:
    const customTransport = new MyCustomTransport({
      // ...
    });
    const customTransportClient = await experimental_createMCPClient({
      transport: customTransport,
    });

    const toolSetOne = await stdioClient.tools();
    const toolSetTwo = await sseClient.tools();
    const toolSetThree = await customTransportClient.tools();
    const tools = {
      ...toolSetOne,
      ...toolSetTwo,
      ...toolSetThree, // note: this approach causes subsequent tool sets to override tools with the same name
    };

    const response = await streamText({
      model: openai('gpt-4o'),
      tools,
      prompt,
      // When streaming, the clients should be closed after the response is finished:
      onFinish: async () => {
        await stdioClient.close();
        await sseClient.close();
        await customTransportClient.close();
      },
    });

    return response.toDataStreamResponse();
  } catch (error) {
    return new Response('Internal Server Error', { status: 500 });
  }
}
```
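The comment on the merged `tools` object above notes that later tool sets override earlier ones with the same name. That is plain object-spread semantics; a minimal sketch with hypothetical stand-in tool sets (illustrative shapes, not the AI SDK's actual tool type):

```typescript
// Stand-ins for the objects returned by `client.tools()`:
const toolSetOne = { search: { description: 'search via server one' } };
const toolSetTwo = { search: { description: 'search via server two' } };

// Spread merges left to right, so on a name collision the later set wins:
const tools = { ...toolSetOne, ...toolSetTwo };

console.log(tools.search.description); // 'search via server two'
```

If both servers expose a `search` tool, only the second one remains callable, so prefer distinct tool names (or selective merging) when combining servers.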
## Client

Let's create a simple React component that imports the `useCompletion` hook from the `@ai-sdk/react` module. The `useCompletion` hook will call the `/api/completion` endpoint when a button is clicked. The endpoint will generate text based on the input prompt and stream it to the client.

```tsx filename="app/page.tsx"
'use client';

import { useCompletion } from '@ai-sdk/react';

export default function Page() {
  const { completion, complete } = useCompletion({
    api: '/api/completion',
  });

  return (
    <div>
      <button
        type="button"
        onClick={async () => {
          await complete(
            'Please schedule a call with Sonny and Robby for tomorrow at 10am ET for me!',
          );
        }}
      >
        Schedule a call
      </button>

      {completion}
    </div>
  );
}
```

content/docs/03-ai-sdk-core/15-tools-and-tool-calling.mdx

Lines changed: 24 additions & 2 deletions
````diff
@@ -693,15 +693,37 @@ const mcpClient = await createMCPClient({

 #### Closing the MCP Client

-After initialization, always close the MCP client when you're done to prevent resource leaks. Use try/finally or cleanup functions in your framework:
+After initialization, you should close the MCP client based on your usage pattern:
+
+- For short-lived usage (e.g., single requests), close the client when the response is finished
+- For long-running clients (e.g., command line apps), keep the client open but ensure it's closed when the application terminates
+
+When streaming responses, you can close the client when the LLM response has finished. For example, when using `streamText`, you should use the `onFinish` callback:
+
+```typescript
+const mcpClient = await experimental_createMCPClient({
+  // ...
+});
+
+const result = await streamText({
+  model: openai('gpt-4o'),
+  tools: await mcpClient.tools(),
+  prompt: 'What is the weather in Brooklyn, New York?',
+  onFinish: async () => {
+    await mcpClient.close();
+  },
+});
+```
+
+When generating responses without streaming, you can use try/finally or cleanup functions in your framework:

 ```typescript
 let mcpClient: MCPClient | undefined;
+
 try {
   mcpClient = await experimental_createMCPClient({
     // ...
   });
-  // ...
 } finally {
   await mcpClient?.close();
 }
````
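To make the try/finally guarantee in the non-streaming snippet concrete, here is a runnable sketch with a fake client standing in for the AI SDK's `MCPClient`; it shows `close()` running whether or not the work in between throws:

```typescript
type FakeClient = { closed: boolean; close: () => Promise<void> };

async function generateWithCleanup(shouldThrow: boolean): Promise<FakeClient> {
  let client: FakeClient | undefined;
  try {
    client = {
      closed: false,
      close: async () => {
        client!.closed = true;
      },
    };
    if (shouldThrow) throw new Error('generation failed');
  } catch {
    // handle or report the error; this demo just swallows it
  } finally {
    // runs on both the success and the error path:
    await client?.close();
  }
  return client!;
}
```

On both paths the returned client ends up closed; the optional chaining (`client?.close()`) also covers the case where client creation itself failed and there is nothing to close.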
Lines changed: 33 additions & 0 deletions
```ts
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient, streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const mcpClient = await experimental_createMCPClient({
    transport: {
      type: 'sse',
      url: 'https://actions.zapier.com/mcp/[YOUR_KEY]/sse',
    },
  });

  try {
    const zapierTools = await mcpClient.tools();

    const result = streamText({
      model: openai('gpt-4o'),
      messages,
      tools: zapierTools,
      onFinish: async () => {
        await mcpClient.close();
      },
      maxSteps: 10,
    });

    return result.toDataStreamResponse();
  } catch (error) {
    return new Response('Internal Server Error', { status: 500 });
  }
}
```
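The route above relies on `streamText`'s `onFinish` firing only after the full response has streamed, so the MCP client is never closed while tools may still be called. A runnable sketch of that ordering, using a hypothetical `fakeStreamText` (not an AI SDK export) that emits chunks and then invokes the callback:

```typescript
async function demo(): Promise<string[]> {
  const events: string[] = [];

  // Stand-in for `streamText`: emit every chunk, then call onFinish.
  async function fakeStreamText(opts: { onFinish: () => Promise<void> }) {
    for (const chunk of ['Hel', 'lo']) events.push(`chunk:${chunk}`);
    await opts.onFinish();
  }

  await fakeStreamText({
    onFinish: async () => {
      events.push('client closed'); // stands in for `await mcpClient.close()`
    },
  });

  return events; // ['chunk:Hel', 'chunk:lo', 'client closed']
}
```

Closing the client any earlier (e.g., right after `streamText` returns) would race the stream, since tool calls can still happen while chunks are being produced.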
Lines changed: 52 additions & 0 deletions
```tsx
'use client';

import { useChat } from '@ai-sdk/react';

export default function Page() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/mcp-zapier',
  });

  return (
    <div className="flex flex-col items-center justify-end h-screen gap-4">
      <h1 className="text-xl p-4">My AI Assistant</h1>

      <div className="flex flex-col gap-2 p-4 mt-auto">
        {messages.map(message => (
          <div key={message.id}>
            <strong>{`${message.role}: `}</strong>
            {message.parts.map((part, index) => {
              switch (part.type) {
                case 'text':
                  return <span key={index}>{part.text}</span>;
                case 'tool-invocation': {
                  return (
                    <pre key={index}>
                      {JSON.stringify(part.toolInvocation, null, 2)}
                    </pre>
                  );
                }
              }
            })}
          </div>
        ))}
      </div>

      <div className="flex flex-col items-center gap-2 p-4">
        <textarea
          value={input}
          onChange={handleInputChange}
          placeholder="Start chatting"
          className="border-2 border-gray-300 rounded-md p-2 w-96 h-32"
        />
        <button
          className="bg-blue-500 text-white p-2 rounded-md w-full px-4"
          type="button"
          onClick={handleSubmit}
        >
          Send
        </button>
      </div>
    </div>
  );
}
```
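The `switch` in the component above is, at heart, a pure mapping from message parts to display strings: text parts are shown verbatim, tool invocations as pretty-printed JSON. A sketch of that mapping with a simplified, hypothetical `Part` type (the real part types come from `@ai-sdk/react`):

```typescript
type Part =
  | { type: 'text'; text: string }
  | { type: 'tool-invocation'; toolInvocation: unknown };

function renderPart(part: Part): string {
  switch (part.type) {
    case 'text':
      return part.text;
    case 'tool-invocation':
      // Tool calls are shown as pretty-printed JSON, like the <pre> above:
      return JSON.stringify(part.toolInvocation, null, 2);
  }
}

renderPart({ type: 'text', text: 'hi' }); // 'hi'
```

Because the union is exhausted, TypeScript checks that every part kind is handled; adding a new part type to the union would make the function a compile error until it is rendered somewhere.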
