[JS] fix: Update citations sample (#1732)
## Linked issues

#minor

Co-authored-by: Corina Gum <>
corinagum authored Jun 13, 2024
1 parent e64642c commit 1d469b3
Showing 3 changed files with 26 additions and 17 deletions.
2 changes: 2 additions & 0 deletions js/samples/04.ai-apps/h.datasource-azureOpenAI/README.md
@@ -67,6 +67,8 @@ This sample shows how to integrate your search index as a data source into promp
1. Fill the `AZURE_OPENAI_ENDPOINT`, `AZURE_SEARCH_ENDPOINT`, and `AZURE_SEARCH_INDEX` variables appropriately.
1. Verify you are logged into azure cli. This is required because this sample uses managed identity. You can download and install Azure CLI from [Azure CLI docs](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli). For more information on setting up environment variables, see the [Azure SDK documentation](https://github.com/Azure/azure-sdk-for-go/wiki/Set-up-Your-Environment-for-Authentication).
1. Follow the [use your data quickstart instructions](https://learn.microsoft.com/en-us/azure/ai-services/openai/use-your-data-quickstart?tabs=command-line%2Cpython-new&pivots=programming-language-studio#add-your-data-using-azure-openai-studio) to add your data using Azure OpenAI Studio. Select `Upload files` as the data source. You can upload the `nba.pdf` file. Take note of the index name.
## Testing the sample
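The setup steps above hinge on three environment variables: `AZURE_OPENAI_ENDPOINT`, `AZURE_SEARCH_ENDPOINT`, and `AZURE_SEARCH_INDEX`. As a minimal illustration (not part of the sample), here is a fail-fast check in TypeScript, assuming the values are read from `process.env` the same way `src/app.ts` reads the search settings; the `requireEnv` helper is hypothetical:

```ts
// Illustrative only: a fail-fast check for the variables the README asks you to fill.
// `requireEnv` is a hypothetical helper, not part of the sample's code.
function requireEnv(name: string): string {
    const value = process.env[name];
    if (!value) {
        throw new Error(`Missing required environment variable: ${name}`);
    }
    return value;
}

const azureOpenAIEndpoint = requireEnv('AZURE_OPENAI_ENDPOINT');
const azureSearchEndpoint = requireEnv('AZURE_SEARCH_ENDPOINT');
const azureSearchIndex = requireEnv('AZURE_SEARCH_INDEX');
```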
40 changes: 23 additions & 17 deletions js/samples/04.ai-apps/h.datasource-azureOpenAI/src/app.ts
@@ -41,23 +41,27 @@ const planner = new ActionPlanner({
     prompt.config.completion.model = 'gpt-4o';

     if (process.env.AZURE_SEARCH_ENDPOINT) {
-        (prompt.config.completion as any).data_sources = [{
-            type: 'azure_search',
-            parameters: {
-                endpoint: process.env.AZURE_SEARCH_ENDPOINT,
-                index_name: process.env.AZURE_SEARCH_INDEX,
-                semantic_configuration: 'default',
-                query_type: 'simple',
-                fields_mapping: { },
-                in_scope: true,
-                strictness: 3,
-                top_n_documents: 5,
-                role_information: fs.readFileSync(path.join(__dirname, '../src/prompts/chat/skprompt.txt')).toString('utf-8'),
-                authentication: {
-                    type: 'system_assigned_managed_identity'
+        (prompt.config.completion as any).data_sources = [
+            {
+                type: 'azure_search',
+                parameters: {
+                    endpoint: process.env.AZURE_SEARCH_ENDPOINT,
+                    index_name: process.env.AZURE_SEARCH_INDEX,
+                    semantic_configuration: 'default',
+                    query_type: 'simple',
+                    fields_mapping: {},
+                    in_scope: true,
+                    strictness: 3,
+                    top_n_documents: 5,
+                    role_information: fs
+                        .readFileSync(path.join(__dirname, '../src/prompts/chat/skprompt.txt'))
+                        .toString('utf-8'),
+                    authentication: {
+                        type: 'system_assigned_managed_identity'
+                    }
                 }
             }
-        }];
+        ];
     }

     return prompt;
@@ -83,12 +87,14 @@ export const app = new Application<ApplicationTurnState>({
 });

 app.conversationUpdate('membersAdded', async (context) => {
-    await context.sendActivity('Welcome! I\'m a conversational bot that can tell you about your data. You can also type `/clear` to clear the conversation history.');
+    await context.sendActivity(
+        "Welcome! I'm a conversational bot that can tell you about your data. You can also type `/clear` to clear the conversation history."
+    );
 });

 app.message('/clear', async (context, state) => {
     state.deleteConversationState();
-    await context.sendActivity('New chat session started: Previous messages won\'t be used as context for new queries.');
+    await context.sendActivity("New chat session started: Previous messages won't be used as context for new queries.");
 });

 app.error(async (context: TurnContext, err: any) => {
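For readers skimming the diff above: the `data_sources` entry configures Azure OpenAI's "on your data" integration with Azure AI Search. Because the sample assigns it through an `as any` cast, the expected shape is easy to lose track of. The interface below is an illustrative sketch that simply mirrors the fields used in this commit; it is not a type exported by the Teams AI library:

```ts
// Illustrative shape of the azure_search data source entry used above.
// Not exported by the Teams AI library; the sample assigns the object
// through an `any` cast, so these names simply mirror the diff.
interface AzureSearchDataSource {
    type: 'azure_search';
    parameters: {
        endpoint: string;                 // AZURE_SEARCH_ENDPOINT
        index_name: string;               // AZURE_SEARCH_INDEX
        semantic_configuration: string;   // 'default' in the sample
        query_type: string;               // 'simple' in the sample; other query types exist
        fields_mapping: Record<string, unknown>;
        in_scope: boolean;                // limit answers to the retrieved documents
        strictness: number;               // relevance filtering; 3 in the sample
        top_n_documents: number;          // documents passed to the model; 5 in the sample
        role_information: string;         // system prompt text loaded from skprompt.txt
        authentication: { type: string }; // 'system_assigned_managed_identity' in the sample
    };
}
```

Typing the object locally like this would catch field-name typos at compile time, at the cost of keeping the interface in sync with the service contract by hand.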
@@ -57,3 +57,4 @@ deploy:
       AZURE_OPENAI_ENDPOINT: ${{AZURE_OPENAI_ENDPOINT}}
       AZURE_SEARCH_ENDPOINT: ${{AZURE_SEARCH_ENDPOINT}}
       AZURE_SEARCH_INDEX: ${{AZURE_SEARCH_INDEX}}
+      AZURE_TENANT_ID: ${{TEAMS_APP_TENANT_ID}}
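The added `AZURE_TENANT_ID` mapping fits the README's note that the sample relies on managed identity and a local Azure CLI login. How the app itself builds its credential is not shown in this diff; the sketch below is an assumption using `DefaultAzureCredential` from `@azure/identity`, which typically picks up `AZURE_TENANT_ID` from the environment and can fall back to the Azure CLI login when running locally:

```ts
import { DefaultAzureCredential } from '@azure/identity';

// Assumption: the app authenticates through a credential chain rather than an API key.
// DefaultAzureCredential typically reads AZURE_TENANT_ID from the environment and,
// on a developer machine, can fall back to the Azure CLI login the README asks you to verify.
const credential = new DefaultAzureCredential();

async function getAzureOpenAIToken(): Promise<string> {
    // Standard token scope for Azure OpenAI / Cognitive Services.
    const accessToken = await credential.getToken('https://cognitiveservices.azure.com/.default');
    return accessToken.token;
}
```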
