
Commit 2e61cd0

minor updates readme

1 parent: 27a513a

3 files changed: +6 -7 lines changed


.dockerignore

Lines changed: 1 addition & 1 deletion

````diff
@@ -131,7 +131,7 @@ dmypy.json
 # VSCode configuration
 .vscode
 
-# Evaluation function config
+# Chat function config
 config.json
 
 # README
````

.gitignore

Lines changed: 0 additions & 1 deletion

````diff
@@ -132,7 +132,6 @@ dmypy.json
 .vscode
 
 .DS_Store
-evaluation_function/db_analytics
 
 # Synthetic data conversations
 src/agents/utils/synthetic_conversations/*.json
````

README.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -110,7 +110,7 @@ docker run -e OPENAI_API_KEY={your key} -e OPENAI_MODEL={your LLM chosen model n
 docker run --env-file .env -it --name my-lambda-container -p 8080:8080 llm_chat
 ```
 
-This will start the evaluation function and expose it on port `8080` and it will be open to be curl:
+This will start the chat function and expose it on port `8080` and it will be open to be curl:
 
 ```bash
 curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' --header 'Content-Type: application/json' --data '{"message":"hi","params":{"conversation_id":"12345Test","conversation_history": [{"type":"user","content":"hi"}]}}'
````
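The endpoint in the hunk above is the AWS Lambda runtime interface emulator's local invocation URL, so the request can be scripted like any other HTTP call. A minimal sketch of the same invocation, assuming the container from the diff is running locally and that `jq` is installed for pretty-printing (both assumptions, not part of this commit):

```bash
# Invoke the locally running chat function container (started as in the README diff).
curl -s 'http://localhost:8080/2015-03-31/functions/function/invocations' \
  --header 'Content-Type: application/json' \
  --data '{
    "message": "hi",
    "params": {
      "conversation_id": "12345Test",
      "conversation_history": [{"type": "user", "content": "hi"}]
    }
  }' | jq .   # pretty-print the JSON response
```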
````diff
@@ -156,19 +156,19 @@ Body with optional Params:
 
 Deploying the chat function to Lambda Feedback is simple and straightforward, as long as the repository is within the [Lambda Feedback organization](https://github.com/lambda-feedback).
 
-After configuring the repository, a [GitHub Actions workflow](.github/workflows/main.yml) will automatically build and deploy the evaluation function to Lambda Feedback as soon as changes are pushed to the main branch of the repository. For development, the [GitHub Actions Dev workflow](.github/workflows/dev.yml) also deploys a dev version of the function onto AWS.
+After configuring the repository, a [GitHub Actions workflow](.github/workflows/main.yml) will automatically build and deploy the chat function to Lambda Feedback as soon as changes are pushed to the main branch of the repository. For development, the [GitHub Actions Dev workflow](.github/workflows/dev.yml) also deploys a dev version of the function onto AWS.
 
 ## Troubleshooting
 
 ### Containerized Function Fails to Start
 
-If your evaluation function is working fine when run locally, but not when containerized, there is much more to consider. Here are some common issues and solution approaches:
+If your chat function is working fine when run locally, but not when containerized, there is much more to consider. Here are some common issues and solution approaches:
 
 **Run-time dependencies**
 
 Make sure that all run-time dependencies are installed in the Docker image.
 
 - Python packages: Make sure to add the dependency to the `requirements.txt` or `pyproject.toml` file, and run `pip install -r requirements.txt` or `poetry install` in the Dockerfile.
 - System packages: If you need to install system packages, add the installation command to the Dockerfile.
-- ML models: If your evaluation function depends on ML models, make sure to include them in the Docker image.
-- Data files: If your evaluation function depends on data files, make sure to include them in the Docker image.
+- ML models: If your chat function depends on ML models, make sure to include them in the Docker image.
+- Data files: If your chat function depends on data files, make sure to include them in the Docker image.
````
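The run-time dependency checklist in the README hunk above maps directly onto Dockerfile instructions. A minimal sketch of how those items might look, assuming a `requirements.txt` workflow and an AWS Lambda Python base image; the model/data paths and the handler name are hypothetical, not taken from this repository:

```dockerfile
# Sketch only: base image, paths, and handler are assumptions, not from this commit.
FROM public.ecr.aws/lambda/python:3.11

# System packages: install via the base image's package manager (yum on this image).
RUN yum install -y git && yum clean all

# Python packages: copy the manifest first so this layer caches well.
COPY requirements.txt .
RUN pip install -r requirements.txt

# ML models and data files: copy them into the image explicitly,
# and make sure they are NOT excluded by .dockerignore.
COPY models/ ${LAMBDA_TASK_ROOT}/models/
COPY data/ ${LAMBDA_TASK_ROOT}/data/

# Application code, then the Lambda handler as module.function (hypothetical name).
COPY src/ ${LAMBDA_TASK_ROOT}/src/
CMD ["src.module.handler"]
```

A quick way to verify such an image before deploying is to run it locally and hit the invocation endpoint shown earlier; missing packages or files then surface in the container logs rather than only in AWS.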
