## Defining Prompts

Prompts can be defined in your configuration files using the `prompts` key:

```yaml
prompts:
  - id: generate-controller
```

Each prompt contains:

- **id**: Unique identifier for the prompt
- **description**: Human-readable description
- **schema** (optional): Defines input parameters with descriptions and required fields
- **messages**: The sequence of conversation messages that make up the prompt template

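
Putting these fields together, a complete definition might look like the sketch below. Only the `generate-controller` id comes from the snippet above; the description text, the `controllerName` parameter, the JSON-Schema-style `properties`/`required` layout, and the message wording are assumptions made for illustration.

```yaml
prompts:
  - id: generate-controller
    description: "Generate a controller class for a given entity" # assumed wording
    schema:
      # JSON-Schema-style layout is an assumption based on the field description above
      properties:
        controllerName:
          description: "Name of the controller class to generate" # assumed parameter
      required:
        - controllerName
    messages:
      - role: user
        content: "Create a controller named {{controllerName}} with standard CRUD actions."
```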

## Variable Substitution

Prompts support variable substitution in message content using the format `{{variableName}}`. When the LLM requests a prompt with arguments, the MCP server replaces these placeholders with the provided values.

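
As a sketch of the mechanism (the `controllerName` variable and its value are assumed for illustration):

```yaml
# Message content as written in the prompt template:
- content: "Create a controller named {{controllerName}}."
# Content returned once the server substitutes controllerName: "UserController":
- content: "Create a controller named UserController."
```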

## Prompt Message Structure

Each message in the `messages` array must include:

- **role**: The role of the message sender (`user` or `assistant`)
- **content**: The content of the message (can include variable placeholders)

Valid role values are defined in the `Mcp\Types\Role` enum and include:

- `user`
- `assistant`

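
A prompt template can mix both roles; the exchange below is an assumed illustration rather than an example from the project:

```yaml
messages:
  - role: user
    content: "Generate a controller named {{controllerName}}."
  - role: assistant
    content: "I will scaffold {{controllerName}} with standard CRUD actions."
```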

## Available Prompt Tools

When connected via MCP, the LLM has access to the following prompt-related tools:

### Prompts Tools

- `prompts-list`: List all available prompts defined in the configuration
- `prompt-get`: Get a specific prompt by ID

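
Conceptually, a `prompt-get` call names the prompt and supplies values for its schema parameters, and the server responds with the substituted messages. The field names below are assumptions for illustration only:

```yaml
# Hypothetical arguments for a prompt-get call (field names assumed, not a documented payload):
id: generate-controller
controllerName: UserController
# The server would return the prompt's messages with {{controllerName}}
# replaced by "UserController".
```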

## Example Usage

Here's how the LLM might use prompts during a conversation:

1. **Listing available prompts**:
   The LLM can request a list of all available prompts to discover what templates are available.

2. **Using a prompt with arguments**:
   The LLM can request a specific prompt with arguments, which will return the prompt messages with variables substituted.

3. **Custom workflows**: