
Commit c85a635

istranical authored and gitbook-bot committed Jan 24, 2024
GITBOOK-27: change request with no subject merged in GitBook
1 parent c0a7987 commit c85a635

File tree

1 file changed: +1 −1


‎storage-and-credentials/storage-options.md

+1 −1
@@ -8,7 +8,7 @@ description: >-
 
 **Deep Lake datasets can be stored locally, or on several cloud storage providers including Deep Lake Storage, AWS S3, Microsoft Azure, and Google Cloud Storage.** Datasets are accessed by choosing the correct prefix for the dataset `path` that is passed to methods such as `deeplake.load(path)`, and `deeplake.empty(path)`. The path prefixes are:
 
-<table data-header-hidden><thead><tr><th width="222.76694359979138">Storage</th><th>Path</th><th>Notes</th></tr></thead><tbody><tr><td><strong>Storage Location</strong></td><td><strong>Path</strong></td><td><strong>Notes</strong></td></tr><tr><td><strong>Local</strong></td><td><code>/local_path</code></td><td></td></tr><tr><td><strong>Deep Lake Storage</strong></td><td><code>hub://org_id/dataset_name</code></td><td></td></tr><tr><td><strong>Deep Lake Managed DB</strong></td><td><code>hub://org_id/dataset_name</code></td><td>Specify <code>runtime = {"tensor_db": True}</code> when creating the dataset</td></tr><tr><td><strong>AWS S3</strong></td><td><code>s3://bucket_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr><tr><td><strong>Microsoft Azure (Gen2 DataLake Only)</strong></td><td><code>azure://account_name/container_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr><tr><td><strong>Google Cloud</strong></td><td><code>gcs://bucket_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr></tbody></table>
+<table data-header-hidden><thead><tr><th width="222.76694359979138">Storage</th><th>Path</th><th>Notes</th></tr></thead><tbody><tr><td><strong>Storage Location</strong></td><td><strong>Path</strong></td><td><strong>Notes</strong></td></tr><tr><td><strong>Local</strong></td><td><code>/local_path</code></td><td></td></tr><tr><td><strong>Deep Lake Storage</strong></td><td><code>hub://org_id/dataset_name</code></td><td></td></tr><tr><td><strong>Deep Lake Managed DB</strong></td><td><code>hub://org_id/dataset_name</code></td><td>Specify <code>runtime = {"tensor_db": True}</code> when creating the dataset</td></tr><tr><td><strong>AWS S3</strong></td><td><code>s3://bucket_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr><tr><td><strong>Microsoft Azure (Gen2 DataLake Only)</strong></td><td><code>azure://account_name/container_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr><tr><td><strong>Google Cloud</strong></td><td><code>gcs://bucket_name/dataset_name</code></td><td>Dataset can be connected to Deep Lake via <a href="managed-credentials/">Managed Credentials</a></td></tr><tr><td><strong>In-Memory</strong></td><td><code>mem://dataset_name</code></td><td></td></tr></tbody></table>
 
 {% hint style="info" %}
 Connecting Deep Lake datasets stored in your own cloud via Deep Lake [Managed Credentials](managed-credentials/) is required for accessing enterprise features, and it significantly simplifies dataset access.
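The path-prefix scheme documented in this diff can be sketched in plain Python. Note that `make_dataset_path` and the `PREFIXES` mapping below are hypothetical helpers invented for illustration, not part of the Deep Lake API; they only build the path string that the docs say is passed to `deeplake.load(path)` or `deeplake.empty(path)`.

```python
# Hypothetical helper (not part of Deep Lake) that builds dataset paths
# using the prefixes from the storage table above.

PREFIXES = {
    "local": "/{name}",
    "deep_lake": "hub://{org_id}/{name}",       # also Managed DB, with runtime={"tensor_db": True}
    "s3": "s3://{bucket}/{name}",
    "azure": "azure://{account}/{container}/{name}",
    "gcs": "gcs://{bucket}/{name}",
    "memory": "mem://{name}",                   # row added by this commit
}

def make_dataset_path(storage: str, **parts: str) -> str:
    """Return the dataset path string for the given storage backend."""
    return PREFIXES[storage].format(**parts)

print(make_dataset_path("s3", bucket="my-bucket", name="my-dataset"))
# s3://my-bucket/my-dataset
```

The resulting string would then be handed to `deeplake.load(...)` or `deeplake.empty(...)` as described in the paragraph above.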
