README.md (+23 −32)
@@ -9,7 +9,7 @@ This chapter helps you to quickly set up a new Python chat module function using
> [!NOTE]
> To develop this function further, you will require the following environment variables in your `.env` file:
```bash
-> If you use azureopenai:
+> If you use azure-openai:
AZURE_OPENAI_API_KEY
AZURE_OPENAI_ENDPOINT
AZURE_OPENAI_API_VERSION
@@ -30,53 +30,29 @@ LANGCHAIN_API_KEY
LANGCHAIN_PROJECT
```

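As a quick sanity check before running anything, the sketch below (an editor's illustration, not part of the template) loads a `.env` file and verifies that the variables listed above are set; the use of the `python-dotenv` package is an assumption.

```python
# check_env.py -- hypothetical helper script; the variable names come from the
# README above, while the use of python-dotenv is an assumption.
import os

from dotenv import load_dotenv  # pip install python-dotenv

REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_VERSION",
    "LANGCHAIN_API_KEY",
    "LANGCHAIN_PROJECT",
]

load_dotenv()  # read the .env file into the process environment

missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")
```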
-#### 1. Create a new repository
+#### 1. Clone the repository

-- In GitHub, choose `Use this template` > `Create a new repository` in the repository toolbar.
-
-- Choose the owner, and pick a name for the new repository.
-
-> [!IMPORTANT]
-> If you want to deploy the evaluation function to Lambda Feedback, make sure to choose the Lambda Feedback organization as the owner.
-
-- Set the visibility to `Public` or `Private`.
-
-> [!IMPORTANT]
-> If you want to use GitHub [deployment protection rules](https://docs.github.com/en/actions/deployment/targeting-different-environments/using-environments-for-deployment#deployment-protection-rules), make sure to set the visibility to `Public`.
-
-- Click on `Create repository`.
-
-#### 2. Clone the new repository
-
-Clone the new repository to your local machine using the following command:
+Clone this repository to your local machine using the following command:
You're ready to start developing your chat function. Head over to the [Development](#development) section to learn more.

-#### 4. Update the README
+#### 3. Update the README

In the `README.md` file, change the title and description so it fits the purpose of your chat function.

Also, don't forget to update or delete the Quickstart chapter from the `README.md` file after you've completed these steps.

-## Run the Script
-
-You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.
-
-```bash
-python src/module.py
-```
-
## Development

-You can create your own invokation to your own agents hosted anywhere. You can add the new invokation in the `module.py` file. Then you can create your own agent script in the `src/agents` folder.
+You can create your own invocation to your own agents hosted anywhere. Copy the `base_agent` from `src/agents/` and edit it to match your LLM agent requirements. Import the new invocation in the `module.py` file.

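To make the added workflow concrete, here is a minimal sketch of a copied-and-edited agent; the file name `my_agent.py`, the function name, and the return shape are editor's assumptions rather than the template's actual interface.

```python
# src/agents/my_agent.py -- hypothetical agent adapted from base_agent
def invoke_my_agent(message: str, params: dict) -> dict:
    """Send the student's message to your hosted LLM agent and return its reply."""
    # ... call your hosted agent / LLM of choice here ...
    return {"chatbot_response": f"Echo: {message}"}  # placeholder logic
```

In `module.py`, the new invocation would then be imported (e.g. `from agents.my_agent import invoke_my_agent`) and called wherever the chat response is produced.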
-You agent can be based on an LLM hosted anywhere, you have available currenlty OpenAI, AzureOpenAI, and Ollama models but you can introduce your own API call in the `src/agents/llm_factory.py`.
+Your agent can be based on an LLM hosted anywhere. OpenAI, AzureOpenAI, and Ollama models are currently available, but you can introduce your own API call in `src/agents/llm_factory.py`.

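To show how a custom API call might slot in, here is a rough sketch of the factory pattern this describes; the function name `get_llm`, the `MyHostedLLM` wrapper, and the `MY_LLM_ENDPOINT` variable are assumptions, and the real `src/agents/llm_factory.py` may be organised differently.

```python
# Hypothetical sketch of extending an llm_factory-style module with your own provider.
import os

class MyHostedLLM:
    """Example wrapper around your own LLM HTTP API (assumed, not part of the template)."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint

    def invoke(self, prompt: str) -> str:
        # ... POST the prompt to self.endpoint and return the completion ...
        return f"[response from {self.endpoint}]"

def get_llm(provider: str):
    """Return an LLM client for the requested provider name."""
    if provider in ("openai", "azure-openai", "ollama"):
        raise NotImplementedError("Handled by the factory's existing branches.")
    if provider == "my-hosted-llm":
        # Your own API call, configured via an (assumed) environment variable.
        return MyHostedLLM(endpoint=os.environ["MY_LLM_ENDPOINT"])
    raise ValueError(f"Unknown LLM provider: {provider}")
```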
### Prerequisites

@@ -93,6 +69,21 @@ You agent can be based on an LLM hosted anywhere, you have available currenlty O

src/module.py # chat_module function implementation
src/module_test.py # chat_module function tests
+src/agents/ # all agents developed for the chat functionality
+src/agents/utils/test_prompts.py # test any LLM agent on example inputs containing Lambda Feedback questions and synthetic student conversations
+```
+
+## Run the Chat Script
+
+You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.
+
+```bash
+python src/module.py
+```
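For orientation, the kind of main entry point the run step above expects might look like the sketch below; the `chat_module` signature, return shape, and example message are editor's assumptions.

```python
# src/module.py -- hypothetical entry point; the real chat_module signature may differ.
def chat_module(message: str, params: dict) -> dict:
    """Minimal placeholder implementation of the chat_module function."""
    return {"chatbot_response": f"You said: {message}"}

if __name__ == "__main__":
    # Lets `python src/module.py` exercise the function directly.
    print(chat_module("Can you explain the chain rule?", {}))
```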
+
+You can also use the `test_prompts.py` script to test the agents with example inputs from Lambda Feedback questions and synthetic conversations.