**content/admin/configuration/configuring-your-enterprise/deploying-github-ae.md** (+3 −5)

```diff
@@ -27,12 +27,11 @@ You must have permission to perform the `/register/action` operation for the res
 The {% data variables.actions.azure_portal %} allows you to deploy the {% data variables.product.product_name %} account in your Azure resource group.
 
 1. Click one of the following two links to begin deployment of {% data variables.product.product_name %}. The link you should click depends on the Azure cloud where you plan to deploy {% data variables.product.product_name %}. For more information about Azure Government, see [What is Azure Government?](https://docs.microsoft.com/en-us/azure/azure-government/documentation-government-welcome) in the Microsoft documentation.
-
+
 - [Deploy {% data variables.product.product_name %} to Azure Commercial](https://aka.ms/create-github-ae-instance)
 - [Deploy {% data variables.product.product_name %} to Azure Government](https://aka.ms/create-github-ae-instance-gov)
 1. To begin the process of adding a new {% data variables.product.product_name %} account, click **Create GitHub AE account**.
 1. Complete the "Project details" and "Instance details" fields.
-
 - **Account name:** The hostname for your enterprise
 - **Administrator username:** A username for the initial enterprise owner that will be created in {% data variables.product.product_name %}
 - **Administrator email:** The email address that will receive the login information
```
```diff
@@ -53,14 +52,13 @@ You can use the {% data variables.actions.azure_portal %} to navigate to your {%
 
 1. On the {% data variables.actions.azure_portal %}, in the left panel, click **All resources**.
 1. From the available filters, click **All types**, then deselect **Select all** and select **GitHub AE**:
-
 
 ## Next steps
 
 - Once your deployment has been provisioned, the next step is to initialize {% data variables.product.product_name %}. For more information, see "[Initializing {% data variables.product.product_name %}](/github-ae@latest/admin/configuration/configuring-your-enterprise/initializing-github-ae)."
 - If you're trying {% data variables.product.product_name %}, you can upgrade to a full license at any time during the trial period by contacting {% data variables.contact.contact_enterprise_sales %}. If you haven't upgraded by the last day of your trial, then the deployment is automatically deleted. If you need more time to evaluate {% data variables.product.product_name %}, contact {% data variables.contact.contact_enterprise_sales %} to request an extension.
 
-## Further reading
+## Further reading
 
 - "[Enabling {% data variables.product.prodname_advanced_security %} features on {% data variables.product.product_name %}](/github/getting-started-with-github/about-github-advanced-security#enabling-advanced-security-features-on-github-ae)"
-- "[{% data variables.product.product_name %} release notes](/github-ae@latest/admin/overview/github-ae-release-notes)"
+- "[{% data variables.product.product_name %} release notes](/github-ae@latest/admin/overview/github-ae-release-notes)"
```
**content/admin/github-actions/enabling-github-actions-for-github-enterprise-server/enabling-github-actions-with-azure-blob-storage.md** (+2 −2)

```diff
@@ -55,10 +55,10 @@ To configure {% data variables.product.prodname_ghe_server %} to use OIDC with a
 1. Register a new application in Azure Active Directory. For more information, see [Register an application](https://learn.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app#register-an-application) in the Azure documentation.
 1. In your Azure application, under "Essentials", take note of the values for "Application (client) ID" and "Directory (tenant) ID". These values are used later.
 
-
+
 1. In your Azure application, under "Manage", click **Certificates & secrets**, select the **Federated credentials** tab, then click **Add credential**.
 
-
+
 1. Enter the following details for the credential:
 1. For "Federated credential scenario", select **Other issuer**.
 1. For "Issuer", enter `https://HOSTNAME/_services/token`, where `HOSTNAME` is the public hostname for {% data variables.location.product_location_enterprise %}. For example, `https://my-ghes-host.example.com/_services/token`.
```
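The issuer value in the last step of that hunk is purely mechanical, so a small sketch may help; the hostname is the example from the docs text itself, and the helper function name is ours:

```python
# Sketch: build the OIDC issuer URL a GHES instance presents to Azure AD,
# per the "Issuer" step above. issuer_url is a made-up helper name.
def issuer_url(hostname: str) -> str:
    """Return https://HOSTNAME/_services/token for a GHES hostname."""
    return f"https://{hostname}/_services/token"

print(issuer_url("my-ghes-host.example.com"))
# → https://my-ghes-host.example.com/_services/token
```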
**content/admin/monitoring-activity-in-your-enterprise/reviewing-audit-logs-for-your-enterprise/streaming-the-audit-log-for-your-enterprise.md** (+29 −40)

```diff
@@ -68,7 +68,7 @@ You can set up streaming to S3 with access keys or, to avoid storing long-lived
 #### Setting up streaming to S3 with access keys
 {% endif %}
 
-To stream audit logs to Amazon's S3 endpoint, you must have a bucket and access keys. For more information, see [Creating, configuring, and working with Amazon S3 buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-buckets-s3.html) in the AWS documentation. Make sure to block public access to the bucket to protect your audit log information.
+To stream audit logs to Amazon's S3 endpoint, you must have a bucket and access keys. For more information, see [Creating, configuring, and working with Amazon S3 buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-buckets-s3.html) in the AWS documentation. Make sure to block public access to the bucket to protect your audit log information.
 
 To set up audit log streaming from {% data variables.product.prodname_dotcom %} you will need:
 * The name of your Amazon S3 bucket
```
````diff
@@ -117,7 +117,7 @@ For information on creating or accessing your access key ID and secret key, see
 ```
 For more information, see [Creating IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html) in the AWS documentation.
 1. Configure the role and trust policy for the {% data variables.product.prodname_dotcom %} IdP. For more information, see [Creating a role for web identity or OpenID Connect Federation (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-idp_oidc.html) in the AWS documentation.
-
+
    - Add the permissions policy you created above to allow writes to the bucket.
    - Edit the trust relationship to add the `sub` field to the validation conditions, replacing `ENTERPRISE` with the name of your enterprise.
 ```
````
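The diff collapses the trust-policy JSON itself, so as a purely illustrative sketch of where a `sub` condition sits in an AWS web-identity trust policy: every value below is a placeholder, not GitHub's actual issuer domain or claim format.

```json
{
  "Effect": "Allow",
  "Principal": {
    "Federated": "arn:aws:iam::ACCOUNT_ID:oidc-provider/OIDC_PROVIDER_DOMAIN"
  },
  "Action": "sts:AssumeRoleWithWebIdentity",
  "Condition": {
    "StringEquals": {
      "OIDC_PROVIDER_DOMAIN:sub": "SUBJECT_VALUE_FOR_YOUR_ENTERPRISE"
    }
  }
}
```

The `StringEquals` block is the "validation conditions" the step refers to; the real provider domain and subject format come from the policy shown in the full file.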
```diff
@@ -154,42 +154,34 @@ You can consolidate your audit logs from {% data variables.product.product_name
 
 ### Setting up streaming to Azure Blob Storage
 
-Before setting up a stream in {% data variables.product.prodname_dotcom %}, you must first have created a storage account and a container in Microsoft Azure. For details, see the Microsoft documentation, "[Introduction to Azure Blob Storage](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction)."
+Before setting up a stream in {% data variables.product.prodname_dotcom %}, you must first have created a storage account and a container in Microsoft Azure. For details, see the Microsoft documentation, "[Introduction to Azure Blob Storage](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction)."
 
 To configure the stream in {% data variables.product.prodname_dotcom %} you need the URL of a SAS token.
 
 **On Microsoft Azure portal**:
 1. On the Home page, click **Storage Accounts**.
-2. Click the name of the storage account you want to use, then click **Containers**.
-
-
-
+1. Under "Name", click the name of the storage account you want to use.
+1. Under "Data storage", click **Containers**.
 1. Click the name of the container you want to use.
-1. Click **Shared access tokens**.
-
-
-
-1. In the **Permissions** drop-down menu, change the permissions to only allow `Create` and `Write`.
+1. In the left sidebar, under "Settings", click **Shared access tokens**.
+1. Select the **Permissions** dropdown menu, then select `Create` and `Write` and deselect all other options.
 1. Set an expiry date that complies with your secret rotation policy.
 1. Click **Generate SAS token and URL**.
 1. Copy the value of the **Blob SAS URL** field that's displayed. You will use this URL in {% data variables.product.prodname_dotcom %}.
 
 **On {% data variables.product.prodname_dotcom %}**:
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Click **Configure stream** and select **Azure Blob Storage**.
-
+
 
 
 1. On the configuration page, enter the blob SAS URL that you copied in Azure. The **Container** field is auto-filled based on the URL.
 
 
-
+
 1. Click **Check endpoint** to verify that {% data variables.product.prodname_dotcom %} can connect and write to the Azure Blob Storage endpoint.
-
+
 
 
 {% data reusables.enterprise.verify-audit-log-streaming-endpoint %}
 
```
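The "auto-filled based on the URL" behavior in the hunk above is straightforward to illustrate, since the container name is the first path segment of a Blob SAS URL. A minimal sketch, with made-up account and container names:

```python
# Sketch: read the container name out of an Azure Blob SAS URL, mirroring
# the auto-fill behavior described above. URL values are fabricated examples.
from urllib.parse import urlparse

def container_from_sas_url(sas_url: str) -> str:
    """Return the first path segment of a Blob SAS URL (the container)."""
    path = urlparse(sas_url).path          # e.g. "/audit-logs"
    return path.strip("/").split("/")[0]

url = "https://example.blob.core.windows.net/audit-logs?sp=cw&sv=2022-11-02&sig=REDACTED"
print(container_from_sas_url(url))  # → audit-logs
```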
```diff
@@ -196,5 +188,5 @@
 ### Setting up streaming to Azure Event Hubs
 
-Before setting up a stream in {% data variables.product.prodname_dotcom %}, you must first have an event hub namespace in Microsoft Azure. Next, you must create an event hub instance within the namespace. You'll need the details of this event hub instance when you set up the stream. For details, see the Microsoft documentation, "[Quickstart: Create an event hub using Azure portal](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create)."
+Before setting up a stream in {% data variables.product.prodname_dotcom %}, you must first have an event hub namespace in Microsoft Azure. Next, you must create an event hub instance within the namespace. You'll need the details of this event hub instance when you set up the stream. For details, see the Microsoft documentation, "[Quickstart: Create an event hub using Azure portal](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-create)."
 
-You need two pieces of information about your event hub: its instance name and the connection string.
+You need two pieces of information about your event hub: its instance name and the connection string.
@@ -221,17 +210,17 @@
 **On {% data variables.product.prodname_dotcom %}**:
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Click **Configure stream** and select **Azure Event Hubs**.
-
+
 
 
 1. On the configuration page, enter:
 * The name of the Azure Event Hubs instance.
 * The connection string.
-
+
 
-
+
 1. Click **Check endpoint** to verify that {% data variables.product.prodname_dotcom %} can connect and write to the Azure Event Hubs endpoint.
-
+
 
 
 {% data reusables.enterprise.verify-audit-log-streaming-endpoint %}
```
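Both values the Event Hubs configuration page asks for can be read from a connection string. As a sketch only: the `Endpoint=...;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=...` shape is the common Azure format, and all values here are fabricated.

```python
# Sketch: split an Azure Event Hubs connection string into its parts.
# The instance name is carried in EntityPath. All values are made up.
def parse_connection_string(conn: str) -> dict:
    """Return the key=value pairs of a semicolon-delimited connection string."""
    parts = (p.split("=", 1) for p in conn.rstrip(";").split(";"))
    return {k: v for k, v in parts}

conn = ("Endpoint=sb://example-ns.servicebus.windows.net/;"
        "SharedAccessKeyName=RootManageSharedAccessKey;"
        "SharedAccessKey=REDACTED;EntityPath=audit-log-hub")
fields = parse_connection_string(conn)
print(fields["EntityPath"])  # → audit-log-hub
```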
```diff
@@ -247,7 +236,7 @@ After you set up streaming to Datadog, you can see your audit log data by filter
 1. In Datadog, generate a client token or an API key, then click **Copy key**. For more information, see [API and Application Keys](https://docs.datadoghq.com/account_management/api-app-keys/) in Datadog Docs.
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Select the **Configure stream** dropdown menu and click **Datadog**.
-
+
 
 1. Under "Token", paste the token you copied earlier.
```
```diff
@@ -256,7 +245,7 @@ After you set up streaming to Datadog, you can see your audit log data by filter
 
 1. To verify that {% data variables.product.prodname_dotcom %} can connect and write to the Datadog endpoint, click **Check endpoint**.
-
+
 
 {% data reusables.enterprise.verify-audit-log-streaming-endpoint %}
 1. After a few minutes, confirm that audit log data is appearing on the **Logs** tab in Datadog. If audit log data is not appearing, confirm that your token and site are correct in {% data variables.product.prodname_dotcom %}.
```
```diff
@@ -283,7 +272,7 @@ To set up streaming to Google Cloud Storage, you must create a service account i
 
 
 
-1. To verify that {% data variables.product.prodname_dotcom %} can connect and write to the Google Cloud Storage bucket, click **Check endpoint**.
+1. To verify that {% data variables.product.prodname_dotcom %} can connect and write to the Google Cloud Storage bucket, click **Check endpoint**.
 
 
 
```
```diff
@@ -295,20 +284,20 @@ To stream audit logs to Splunk's HTTP Event Collector (HEC) endpoint you must ma
 
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Click **Configure stream** and select **Splunk**.
-
+
 
 
 1. On the configuration page, enter:
 * The domain on which the application you want to stream to is hosted.
-
-    If you are using Splunk Cloud, `Domain` should be `http-inputs-<host>`, where `host` is the domain you use in Splunk Cloud. For example: `http-inputs-mycompany.splunkcloud.com`.
+
+    If you are using Splunk Cloud, `Domain` should be `http-inputs-<host>`, where `host` is the domain you use in Splunk Cloud. For example: `http-inputs-mycompany.splunkcloud.com`.
 
 * The port on which the application accepts data.<br>
 
    If you are using Splunk Cloud, `Port` should be `443` if you haven't changed the port configuration. If you are using the free trial version of Splunk Cloud, `Port` should be `8088`.
 
 * A token that {% data variables.product.prodname_dotcom %} can use to authenticate to the third-party application.
-
+
 
 
 1. Leave the **Enable SSL verification** check box selected.
```
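The Domain and Port fields above combine into the URL that events are posted to. A minimal sketch: the `/services/collector/event` path is standard for Splunk HEC but is an assumption here, not stated in this diff, and the domain is the doc's own example.

```python
# Sketch: assemble a Splunk HEC endpoint from the Domain and Port fields.
# The /services/collector/event path is an assumption (standard Splunk HEC),
# not taken from the diff above.
def hec_endpoint(domain: str, port: int = 443) -> str:
    return f"https://{domain}:{port}/services/collector/event"

print(hec_endpoint("http-inputs-mycompany.splunkcloud.com"))
# → https://http-inputs-mycompany.splunkcloud.com:443/services/collector/event
```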
```diff
@@ -329,7 +318,7 @@ Datadog only accepts logs from up to 18 hours in the past. If you pause a stream
 
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Click **Pause stream**.
-
+
 
 
 1. A confirmation message is displayed. Click **Pause stream** to confirm.
```
```diff
@@ -341,7 +330,7 @@ When the application is ready to receive audit logs again, click **Resume stream
 
 {% data reusables.enterprise.navigate-to-log-streaming-tab %}
 1. Click **Delete stream**.
-
+
 
 
 1. A confirmation message is displayed. Click **Delete stream** to confirm.
```