Commit eabbf93

update readme
1 parent 833360b commit eabbf93

File tree

1 file changed

README.md

Lines changed: 23 additions & 32 deletions
@@ -9,7 +9,7 @@ This chapter helps you to quickly set up a new Python chat module function using
> [!NOTE]
> To develop this function further, you will require the following environment variables in your `.env` file:
```bash
- > If you use azureopenai:
+ > If you use azure-openai:
AZURE_OPENAI_API_KEY
AZURE_OPENAI_ENDPOINT
AZURE_OPENAI_API_VERSION
@@ -30,53 +30,29 @@ LANGCHAIN_API_KEY
LANGCHAIN_PROJECT
```
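
For local development, these variables are typically read from the `.env` file. The following is only a generic sketch of how they might be loaded in Python, assuming the `python-dotenv` package; this repository may load its configuration differently.

```python
# Generic sketch only: load .env and read the Azure OpenAI settings listed above.
# Assumes the python-dotenv package is installed; the repository may load its
# configuration differently.
import os

from dotenv import load_dotenv

load_dotenv()  # copies variables from .env into the process environment

azure_api_key = os.environ["AZURE_OPENAI_API_KEY"]
azure_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]
azure_api_version = os.environ["AZURE_OPENAI_API_VERSION"]
```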

- #### 1. Create a new repository
+ #### 1. Clone the repository

- - In GitHub, choose `Use this template` > `Create a new repository` in the repository toolbar.
- 
- - Choose the owner, and pick a name for the new repository.
- 
- > [!IMPORTANT]
- > If you want to deploy the evaluation function to Lambda Feedback, make sure to choose the Lambda Feedback organization as the owner.
- 
- - Set the visibility to `Public` or `Private`.
- 
- > [!IMPORTANT]
- > If you want to use GitHub [deployment protection rules](https://docs.github.com/en/actions/deployment/targeting-different-environments/using-environments-for-deployment#deployment-protection-rules), make sure to set the visibility to `Public`.
- 
- - Click on `Create repository`.
- 
- #### 2. Clone the new repository
- 
- Clone the new repository to your local machine using the following command:
+ Clone this repository to your local machine using the following command:

```bash
- git clone <repository-url>
+ git clone https://github.com/lambda-feedback/lambda-chat
```

- #### 3. Develop the chat function
+ #### 2. Develop the chat function

You're ready to start developing your chat function. Head over to the [Development](#development) section to learn more.

- #### 4. Update the README
+ #### 3. Update the README

In the `README.md` file, change the title and description so it fits the purpose of your chat function.

Also, don't forget to update or delete the Quickstart chapter from the `README.md` file after you've completed these steps.

- ## Run the Script
- 
- You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.
- 
- ```bash
- python src/module.py
- ```
- 
## Development

- You can create your own invokation to your own agents hosted anywhere. You can add the new invokation in the `module.py` file. Then you can create your own agent script in the `src/agents` folder.
+ You can create your own invocation for your own agents hosted anywhere. Copy the `base_agent` from `src/agents/`, edit it to match your LLM agent's requirements, and import the new invocation in `module.py`.
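
The `base_agent` interface itself is not shown in this diff, so the following is only an illustrative sketch of the general pattern; `my_agent.py`, `invoke_my_agent`, and the message/parameter names are hypothetical, not the repository's actual API.

```python
# src/agents/my_agent.py -- hypothetical example of a new agent module created by
# copying base_agent; the real base_agent in this repository may expose a
# different interface.
from typing import Any


def invoke_my_agent(message: str, params: dict[str, Any] | None = None) -> str:
    """Toy invocation: echo the incoming message. Replace with a real LLM call,
    e.g. one obtained from src/agents/llm_factory.py."""
    params = params or {}
    return f"Agent reply to: {message}"


# In src/module.py (sketch), the new invocation would then be imported and called:
# from agents.my_agent import invoke_my_agent
```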

- You agent can be based on an LLM hosted anywhere, you have available currenlty OpenAI, AzureOpenAI, and Ollama models but you can introduce your own API call in the `src/agents/llm_factory.py`.
+ Your agent can be based on an LLM hosted anywhere. OpenAI, AzureOpenAI, and Ollama models are currently available, but you can introduce your own API call in `src/agents/llm_factory.py`.

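The factory in `src/agents/llm_factory.py` is not shown here, so this is only a rough sketch of what a provider switch could look like, assuming LangChain-style chat model wrappers (`langchain_openai`, `langchain_ollama`); the repository's actual factory may differ.

```python
# Rough sketch only -- the real llm_factory.py in this repository may look different.
# Assumes the langchain_openai and langchain_ollama packages are installed.
import os

from langchain_ollama import ChatOllama
from langchain_openai import AzureChatOpenAI, ChatOpenAI


def get_llm(provider: str, model: str):
    """Return a chat model for the requested provider."""
    if provider == "openai":
        return ChatOpenAI(model=model)  # reads OPENAI_API_KEY from the environment
    if provider == "azure-openai":
        # AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT are read from the environment.
        return AzureChatOpenAI(
            azure_deployment=model,
            api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        )
    if provider == "ollama":
        return ChatOllama(model=model)  # expects a local Ollama server
    raise ValueError(f"Unknown provider: {provider}")
```
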
### Prerequisites

@@ -93,6 +69,21 @@ You agent can be based on an LLM hosted anywhere, you have available currenlty O

src/module.py # chat_module function implementation
src/module_test.py # chat_module function tests
+ src/agents/ # all agents developed for the chat functionality
+ src/agents/utils/test_prompts.py # test any LLM agent on example inputs containing Lambda Feedback questions and synthetic student conversations
+ ```
+ 
+ ## Run the Chat Script
+ 
+ You can run the Python function itself. Make sure to have a main function in either `src/module.py` or `index.py`.
+ 
+ ```bash
+ python src/module.py
+ ```
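
The exact `chat_module` signature is not shown in this diff, so the following is only a minimal illustrative sketch of the kind of main function the text asks for in `src/module.py`; the placeholder `chat_module` below is hypothetical.

```python
# Minimal sketch of a runnable main guard for src/module.py; the placeholder
# chat_module below is hypothetical and should be replaced by the real function.
def chat_module(message: str, params: dict) -> dict:
    """Placeholder chat function that simply echoes the message."""
    return {"chat_response": f"Echo: {message}"}


def main() -> None:
    # Quick local smoke test of the chat function.
    print(chat_module("Hello!", {}))


if __name__ == "__main__":
    main()
```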
+ 
+ You can also use the `test_prompts.py` script to test the agents with example inputs from Lambda Feedback questions and synthetic conversations.
+ ```bash
+ python src/agents/utils/test_prompts.py
```

### Building the Docker Image
