Commit 02ac92e

Modernized code to use latest langchain versions and MI

Signed-off-by: Paul Yuknewicz <[email protected]>

1 parent 86fd753 · commit 02ac92e

File tree

3 files changed (+72 −32 lines)

README.md

Lines changed: 36 additions & 10 deletions

````diff
@@ -26,23 +26,28 @@ This sample shows how to take a human prompt as HTTP Get or Post input, calculat
 ### Pre-reqs
 1) [Python 3.8+](https://www.python.org/)
 2) [Azure Functions Core Tools](https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Cmacos%2Ccsharp%2Cportal%2Cbash#install-the-azure-functions-core-tools)
-3) [Azure OpenAI API key, endpoint, and deployment](https://portal.azure.com)
-4) Add this `local.settings.json` file to this folder to simplify local development and include the key from step 3
+3) [Azure Developer CLI](https://aka.ms/azd)
+4) Once you have your Azure subscription, run the following in a new terminal window to create Azure OpenAI and the other resources needed:
+```bash
+azd provision
+```
+
+Take note of the value of `AZURE_OPENAI_ENDPOINT`, which can be found in `./.azure/<env name from azd provision>/.env`. It will look something like:
+```bash
+AZURE_OPENAI_ENDPOINT="https://cog-<unique string>.openai.azure.com/"
+```
 
-`./local.settings.json`
+5) Add this `local.settings.json` file to the root of the repo folder to simplify local development. Replace `AZURE_OPENAI_ENDPOINT` with your value from step 4. Optionally, you can choose a different model deployment in `CHAT_MODEL_DEPLOYMENT_NAME`. This file is gitignored to keep secrets from being committed to your repo; by default the sample uses Entra identity (user identity and managed identity), so it is secretless.
 ```json
 {
   "IsEncrypted": false,
   "Values": {
     "FUNCTIONS_WORKER_RUNTIME": "python",
     "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
-    "AzureWebJobsStorage": "",
-    "AZURE_OPENAI_KEY": "...",
-    "AZURE_OPENAI_ENDPOINT": "https://<service_name>.openai.azure.com/",
-    "AZURE_OPENAI_SERVICE": "...",
-    "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "...",
-    "OPENAI_API_VERSION": "2023-05-15",
-    "USE_LANGCHAIN": "True"
+    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
+    "AZURE_OPENAI_ENDPOINT": "https://<your deployment>.openai.azure.com/",
+    "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "chat",
+    "OPENAI_API_VERSION": "2023-05-15"
   }
 }
 ```
````
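When running locally, the Functions host loads each entry under `Values` into the process environment, which is how `os.environ.get("AZURE_OPENAI_ENDPOINT")` in the function code picks these settings up. A minimal stdlib-only sketch of that mapping (the `load_local_settings` helper is hypothetical, for illustration only, not part of the sample):

```python
import json
import os

def load_local_settings(text: str) -> None:
    """Mimic how the Functions host exports local.settings.json Values as env vars."""
    settings = json.loads(text)
    for key, value in settings.get("Values", {}).items():
        os.environ.setdefault(key, value)

sample = """
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "chat",
    "OPENAI_API_VERSION": "2023-05-15"
  }
}
"""

load_local_settings(sample)
print(os.environ["AZURE_OPENAI_CHATGPT_DEPLOYMENT"])  # chat
```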
````diff
@@ -120,3 +125,24 @@ To provision and deploy:
 ```bash
 azd up
 ```
+
+## Source Code
+
+The key code that makes the prompting and completion work is in [function_app.py](function_app.py). The `/api/ask` function and route expect a prompt to come in the POST body via a standard HTTP trigger in Python. Once the environment variables are set to configure the OpenAI and LangChain frameworks, we can leverage our favorite aspects of LangChain. In this simple example we take a prompt, build a better prompt from a template, and then invoke the LLM. By default the LLM deployment is `gpt-35-turbo`, as defined in [./infra/main.parameters.json](./infra/main.parameters.json), but you can experiment with other models.
+
+```python
+llm = AzureChatOpenAI(
+    deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
+    temperature=0.3,
+    openai_api_key=AZURE_OPENAI_KEY
+)
+llm_prompt = PromptTemplate.from_template(
+    "The following is a conversation with an AI assistant. " +
+    "The assistant is helpful.\n\n" +
+    "A:How can I help you today?\nHuman: {human_prompt}?"
+)
+formatted_prompt = llm_prompt.format(human_prompt=prompt)
+
+response = llm.invoke(formatted_prompt)
+logging.info(response.content)
+```
````
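The template step in the added code simply interpolates the human prompt into a fixed preamble. A stdlib-only sketch of what `PromptTemplate.from_template(...).format(...)` produces, assuming LangChain's default f-string-style placeholders (no LangChain install required):

```python
# Mimic PromptTemplate.from_template(...).format(human_prompt=...) with str.format.
template = (
    "The following is a conversation with an AI assistant. "
    "The assistant is helpful.\n\n"
    "A:How can I help you today?\nHuman: {human_prompt}?"
)

# Hypothetical example prompt, standing in for the POST body's 'prompt' value.
formatted_prompt = template.format(human_prompt="What is an Azure Function")
print(formatted_prompt)
```

The formatted string is what actually gets sent to the model via `llm.invoke(...)`.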

function_app.py

Lines changed: 33 additions & 21 deletions

```diff
@@ -2,12 +2,23 @@
 import logging
 import os
 import openai
-from langchain.prompts import PromptTemplate
-from langchain.llms.openai import AzureOpenAI
+from langchain_core.prompts import PromptTemplate
+from langchain_openai import AzureChatOpenAI
+from azure.identity import DefaultAzureCredential
 
 app = func.FunctionApp()
 
 
+# Use the Entra ID DefaultAzureCredential to get the token
+credential = DefaultAzureCredential()
+# Set the API type to `azure_ad`
+os.environ["OPENAI_API_TYPE"] = "azure_ad"
+# Set the API key to the token from the Azure credential
+os.environ["OPENAI_API_KEY"] = credential.get_token(
+    "https://cognitiveservices.azure.com/.default"
+).token
+
+
 @app.function_name(name='ask')
 @app.route(route='ask', auth_level='anonymous', methods=['POST'])
 def main(req):
```
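One thing to note in the module-level setup above: `credential.get_token(...)` is called once at import time, and Entra ID access tokens expire (the token object azure-identity returns carries an `expires_on` Unix timestamp). A hedged, stdlib-only sketch of a cache-and-refresh wrapper; `fetch_token` is a stand-in for `credential.get_token(scope)`, and the helper is illustrative, not part of this commit:

```python
import time
from typing import Callable, List, NamedTuple

class AccessToken(NamedTuple):
    """Shape of the token object azure-identity returns (token, expires_on)."""
    token: str
    expires_on: int  # Unix timestamp

def make_token_provider(fetch_token: Callable[[], AccessToken],
                        skew_seconds: int = 300) -> Callable[[], str]:
    """Return a callable that caches a token and refreshes it near expiry."""
    cached: List[AccessToken] = []

    def provider() -> str:
        now = time.time()
        if not cached or cached[0].expires_on - skew_seconds <= now:
            cached[:] = [fetch_token()]  # refresh when missing or nearly expired
        return cached[0].token

    return provider

# Demo with a fake fetcher standing in for credential.get_token(scope):
calls = []
def fake_fetch() -> AccessToken:
    calls.append(1)
    return AccessToken(token=f"tok-{len(calls)}", expires_on=int(time.time()) + 3600)

get_key = make_token_provider(fake_fetch)
assert get_key() == "tok-1"
assert get_key() == "tok-1"   # cached; no second fetch
assert len(calls) == 1
```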
```diff
@@ -18,41 +29,42 @@ def main(req):
         req_body = req.get_json()
     except ValueError:
         raise RuntimeError("prompt data must be set in POST.")
-    else:
+    else:
         prompt = req_body.get('prompt')
         if not prompt:
             raise RuntimeError("prompt data must be set in POST.")
 
-    # init OpenAI: Replace these with your own values, either in env vars
-    AZURE_OPENAI_KEY = os.environ.get("AZURE_OPENAI_KEY")
+    # Init OpenAI: configure these using env variables
     AZURE_OPENAI_ENDPOINT = os.environ.get("AZURE_OPENAI_ENDPOINT")
+    AZURE_OPENAI_KEY = credential.get_token(
+        "https://cognitiveservices.azure.com/.default"
+    ).token
     AZURE_OPENAI_CHATGPT_DEPLOYMENT = os.environ.get(
         "AZURE_OPENAI_CHATGPT_DEPLOYMENT") or "chat"
-    if 'AZURE_OPENAI_KEY' not in os.environ:
-        raise RuntimeError("No 'AZURE_OPENAI_KEY' env var set.")
+    OPENAI_API_VERSION = os.environ.get(
+        "OPENAI_API_VERSION") or "2023-05-15"
 
     # configure azure openai for langchain and/or llm
     openai.api_key = AZURE_OPENAI_KEY
     openai.api_base = AZURE_OPENAI_ENDPOINT
     openai.api_type = 'azure'
 
     # this may change in the future
-    # set this version in environment variables using OPENAI_API_VERSION
-    openai.api_version = '2023-05-15'
-
-    logging.info('Using Langchain')
+    openai.api_version = OPENAI_API_VERSION
 
-    llm = AzureOpenAI(
+    llm = AzureChatOpenAI(
         deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
         temperature=0.3,
         openai_api_key=AZURE_OPENAI_KEY
     )
-    llm_prompt = PromptTemplate(
-        input_variables=["human_prompt"],
-        template="The following is a conversation with an AI assistant. " +
-            "The assistant is helpful.\n\n" +
-            "A:How can I help you today?\nHuman: {human_prompt}?",
-    )
-    from langchain.chains import LLMChain
-    chain = LLMChain(llm=llm, prompt=llm_prompt)
-    return chain.run(prompt)
+    llm_prompt = PromptTemplate.from_template(
+        "The following is a conversation with an AI assistant. " +
+        "The assistant is helpful.\n\n" +
+        "A:How can I help you today?\nHuman: {human_prompt}?"
+    )
+    formatted_prompt = llm_prompt.format(human_prompt=prompt)
+
+    response = llm.invoke(formatted_prompt)
+    logging.info(response.content)
+
+    return func.HttpResponse(response.content)
```
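The handler's input validation (parse the JSON POST body, require a non-empty `prompt`) can be exercised without the Functions runtime. A stdlib-only sketch, where `extract_prompt` is a hypothetical helper mirroring the `req.get_json()` / `req_body.get('prompt')` logic above:

```python
import json

def extract_prompt(body: bytes) -> str:
    """Mirror the handler's checks: valid JSON with a non-empty 'prompt' key."""
    try:
        req_body = json.loads(body)  # like req.get_json(); raises ValueError on bad JSON
    except ValueError:
        raise RuntimeError("prompt data must be set in POST.")
    prompt = req_body.get('prompt')
    if not prompt:
        raise RuntimeError("prompt data must be set in POST.")
    return prompt

# Valid body passes through.
assert extract_prompt(b'{"prompt": "hello"}') == "hello"

# Malformed JSON, missing key, and empty prompt all raise the same error.
for bad in (b'not json', b'{}', b'{"prompt": ""}'):
    try:
        extract_prompt(bad)
    except RuntimeError as e:
        assert str(e) == "prompt data must be set in POST."
    else:
        raise AssertionError("expected RuntimeError")
```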

requirements.txt

Lines changed: 3 additions & 1 deletion

```diff
@@ -3,5 +3,7 @@
 # Manually managing azure-functions-worker may cause unexpected issues
 
 azure-functions
+azure-identity
 openai
-langchain
+langchain-core
+langchain-openai
```
