# Deploying Azure OpenAI Service via Bicep

To use the `openai-gpt` agent, you need either a public OpenAI API key or an Azure OpenAI
deployment. We recommend the Azure OpenAI Service for its additional features and manageability.
This document provides step-by-step instructions for deploying the Azure OpenAI Service using
Bicep files.

Two resources are needed to begin a chat experience with Azure OpenAI Service: an Azure OpenAI
Service account and an Azure OpenAI deployment. The account is a resource that can contain multiple
model deployments. A deployment is a hosted model that can be called through an API to generate
responses.

## Prerequisites

Before you begin, ensure you have the following:

- An active Azure subscription
- Azure CLI or Azure PowerShell installed
- Proper permissions to create resources in your Azure subscription

## Steps to Deploy

### 1. Getting and modifying the Bicep file

Clone the repository and navigate to the `./docs/development/AzureOAIDeployment` directory:

```sh
git clone https://github.com/PowerShell/AIShell
cd AIShell/docs/development/AzureOAIDeployment
```

Modify the parameters at the top of the `./main.bicep` file to use your own values.

```bicep
@description('This is the name of your AI Service Account')
param aiserviceaccountname string = '<Insert own account name>'

@description('Custom domain name for the endpoint')
param customDomainName string = '<Insert own unique domain name>'

@description('Name of the deployment')
param modeldeploymentname string = '<Insert own deployment name>'

@description('The model being deployed')
param model string = 'gpt-4'

@description('Version of the model being deployed')
param modelversion string = 'turbo-2024-04-09'

@description('Capacity for specific model used')
param capacity int = 80

@description('Location for all resources.')
param location string = resourceGroup().location

@allowed([
  'S0'
])
param sku string = 'S0'
```

The defaults above use your resource group's location for the account and deploy `gpt-4` version
`turbo-2024-04-09`. You can change these values to whichever model best fits your needs; see
[Azure OpenAI Service models][03] for the available models. You may also need to adjust the
deployment's capacity depending on the model you choose; see
[Azure OpenAI Service quotas and limits][04] for details.
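For Standard deployments, the `capacity` value is commonly expressed in units of 1,000 tokens per
minute (TPM), so the Bicep default of `80` would correspond to roughly 80,000 TPM. The sketch below
illustrates that conversion; the unit size is an assumption here, so confirm it against the quotas
and limits documentation for your model:

```python
# Assumed unit size for Standard deployments; verify against the
# Azure OpenAI quotas and limits documentation for your model.
TPM_PER_CAPACITY_UNIT = 1000

def capacity_to_tpm(capacity: int) -> int:
    """Convert a Bicep `capacity` value to approximate tokens per minute."""
    return capacity * TPM_PER_CAPACITY_UNIT

print(capacity_to_tpm(80))  # the default of 80 -> 80000 TPM
```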

### 2. Deploy the Azure OpenAI Service

Now that you have modified the Bicep file's parameters, you are ready to deploy your own Azure
OpenAI instance! Use either Azure CLI or Azure PowerShell to deploy the Bicep file.

#### Using Azure CLI

```sh
az deployment group create --resource-group <resource group name> --template-file ./main.bicep

# Get the endpoint and key of the deployment
az cognitiveservices account show --name <account name> --resource-group <resource group name> | jq -r .properties.endpoint

az cognitiveservices account keys list --name <account name> --resource-group <resource group name> | jq -r .key1
```

#### Using Azure PowerShell

```powershell
New-AzResourceGroupDeployment -ResourceGroupName <resource group name> -TemplateFile ./main.bicep

# Get the endpoint and key of the deployment
Get-AzCognitiveServicesAccount -ResourceGroupName <resource group name> -Name <account name> | Select-Object -Property Endpoint

Get-AzCognitiveServicesAccountKey -ResourceGroupName <resource group name> -Name <account name> | Select-Object -Property Key1
```
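Once you have the endpoint and key, requests are sent to the deployment's chat-completions route.
The sketch below shows how that URL is typically assembled; the default `api-version` value and the
example account and deployment names are assumptions for illustration, so substitute your own:

```python
def chat_completions_url(endpoint: str, deployment: str,
                         api_version: str = "2024-02-01") -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment.

    The path shape follows the public Azure OpenAI REST API; the default
    api-version here is an assumption -- check the service docs for the
    current value.
    """
    base = endpoint.rstrip("/")
    return (f"{base}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

# Hypothetical account and deployment names, for illustration only.
url = chat_completions_url("https://myaccount.openai.azure.com/", "ps-az-gpt4")
print(url)
```

Requests to that URL carry the key in an `api-key` header; the `openai-gpt` agent handles this for
you once the configuration in the next step is in place.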

### 3. Configuring the agent to use the deployment

Now that you have the endpoint and key of the deployment, open the `openai-gpt` agent and run
`/agent config` to edit the JSON configuration file with the details of the deployment. The example
below shows the default system prompt and the fields that need to be updated.

```jsonc
{
  // Declare GPT instances.
  "GPTs": [
    {
      "Name": "ps-az-gpt4",
      "Description": "<insert description here>",
      "Endpoint": "<insert endpoint here>",
      "Deployment": "<insert deployment name here>",
      "ModelName": "gpt-4",
      "Key": "<insert key here>",
      "SystemPrompt": "1. You are a helpful and friendly assistant with expertise in PowerShell scripting and command line.\n2. Assume user is using the operating system `osx` unless otherwise specified.\n3. Use the `code block` syntax in markdown to encapsulate any part in responses that is code, YAML, JSON or XML, but not table.\n4. When encapsulating command line code, use '```powershell' if it's PowerShell command; use '```sh' if it's non-PowerShell CLI command.\n5. When generating CLI commands, never ever break a command into multiple lines. Instead, always list all parameters and arguments of the command on the same line.\n6. Please keep the response concise but to the point. Do not overexplain."
    }
  ],
  // Specify the default GPT instance to use for user query.
  // For example: "ps-az-gpt4"
  "Active": "ps-az-gpt4"
}
```
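Because the file is JSON with `//` comments, a quick way to sanity-check your edits is to strip the
comment lines and confirm the placeholder fields have been replaced. The sketch below is purely
illustrative and not part of the agent; the field names match the example above, and the sample
config is hypothetical:

```python
import json

# Fields the doc above says must be filled in before the agent can connect.
REQUIRED_FIELDS = ["Name", "Endpoint", "Deployment", "ModelName", "Key"]

def load_agent_config(text: str) -> dict:
    """Parse the jsonc config by dropping whole-line // comments.

    This simple strip is enough for the layout shown above; it does not
    handle a bare // inside a string value at the start of a line.
    """
    lines = [ln for ln in text.splitlines() if not ln.lstrip().startswith("//")]
    return json.loads("\n".join(lines))

def unfilled_fields(config: dict) -> list:
    """Return required fields still holding an <insert ...> placeholder."""
    gpt = config["GPTs"][0]
    return [f for f in REQUIRED_FIELDS
            if not gpt.get(f) or gpt[f].startswith("<insert")]

# Hypothetical, partially filled-in config for demonstration.
sample = """
{
  // Declare GPT instances.
  "GPTs": [
    {
      "Name": "ps-az-gpt4",
      "Description": "work gpt-4",
      "Endpoint": "<insert endpoint here>",
      "Deployment": "ps-az-gpt4",
      "ModelName": "gpt-4",
      "Key": "<insert key here>"
    }
  ],
  "Active": "ps-az-gpt4"
}
"""
print(unfilled_fields(load_agent_config(sample)))  # ['Endpoint', 'Key']
```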

## Conclusion

You have successfully deployed the Azure OpenAI Service and configured your `openai-gpt` agent to
communicate with it! If you would like to go further with model training, filters, and settings,
you can find more information about Azure OpenAI deployments in the
[Azure OpenAI Service documentation][02].

A big thank you to Sebastian Jensen's Medium article,
[Deploy an Azure OpenAI service with LLM deployments via Bicep][01], which inspired the Bicep code
and the guidance on deploying the Azure OpenAI Service using Bicep files. Check out his blog for
more great AI content!

[01]: https://medium.com/medialesson/deploy-an-azure-openai-service-with-llm-deployments-via-bicep-244411472d40
[02]: https://docs.microsoft.com/azure/cognitive-services/openai/
[03]: https://learn.microsoft.com/azure/ai-services/openai/concepts/models?tabs=global-standard%2Cstandard-chat-
[04]: https://learn.microsoft.com/azure/ai-services/openai/quotas-limits