Merged
1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -12,6 +12,7 @@
### Dependency updates

### Bundles
* Added support for the `--bind` flag in `bundle generate` ([#3782](https://github.com/databricks/cli/pull/3782))
* Add `pydabs` template replacing `experimental-jobs-as-code` template ([#3806](https://github.com/databricks/cli/pull/3806))
* You can now use `python` section instead of `experimental/python` ([#3540](https://github.com/databricks/cli/pull/3540))

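The changelog entries above can be illustrated with a short sketch of the new one-step flow. This is a hypothetical walkthrough, not output from the PR: the `databricks` CLI is replaced by a stub that echoes its arguments so the sequence runs without a live workspace, and the job ID `123` and key `my_job` are illustrative.

```shell
# Sketch of the one-step generate-and-bind flow added in this PR.
# 'databricks' is a stub that echoes its arguments; against a real
# workspace you would drop this function and use the actual CLI.
databricks() { echo "databricks $*"; }

# Generate config for an existing job and bind it in a single command:
databricks bundle generate job --existing-job-id 123 --key my_job --bind

# The generated resource is already bound, so a deploy picks it up directly:
databricks bundle deploy
```

Previously the bind was a separate `databricks bundle deployment bind my_job 123` step between generate and deploy.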
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/alert/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind my_alert [UUID] --auto-approve
Updating deployment state...
Successfully bound alert with an id '[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound alert with an id '[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/cluster/output.txt
@@ -6,7 +6,8 @@

>>> [CLI] bundle deployment bind cluster1 [CLUSTER-ID] --auto-approve
Updating deployment state...
Successfully bound cluster with an id '[CLUSTER-ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound cluster with an id '[CLUSTER-ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deployment unbind cluster1
Updating deployment state...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/dashboard/output.txt
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind dashboard1 [DASHBOARD_ID] --auto-approve
Updating deployment state...
Successfully bound dashboard with an id '[DASHBOARD_ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound dashboard with an id '[DASHBOARD_ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind dashboard1 [DASHBOARD_ID] --auto-approve
Updating deployment state...
Successfully bound dashboard with an id '[DASHBOARD_ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound dashboard with an id '[DASHBOARD_ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> errcode [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind database_instance1 [UUID] --auto-approve
Updating deployment state...
Successfully bound database_instance with an id '[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound database_instance with an id '[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/experiment/output.txt
@@ -3,7 +3,8 @@
=== Substitute variables in the template
=== Create a pre-defined experiment
=== Bind experiment: Updating deployment state...
Successfully bound experiment with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound experiment with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Deploy bundle: Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
Deploying resources...
@@ -18,7 +18,8 @@ test.py

>>> [CLI] bundle deployment bind test_job_key [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-generate-bind-[UNIQUE_NAME]/files...
@@ -5,7 +5,8 @@ Created job with ID: [NUMID]
=== Bind job:
>>> [CLI] bundle deployment bind foo [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Remove .databricks directory to simulate fresh deployment:
>>> rm -rf .databricks
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/job/noop-job/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind job_1 [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
@@ -3,7 +3,8 @@

>>> uv run --with [DATABRICKS_BUNDLES_WHEEL] -q [CLI] bundle deployment bind job_1 [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound job with an id '[NUMID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> uv run --with [DATABRICKS_BUNDLES_WHEEL] -q [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
@@ -29,7 +29,8 @@ resources:

>>> [CLI] bundle deployment bind endpoint1 test-endpoint-[UUID]
Updating deployment state...
Successfully bound model_serving_endpoint with an id 'test-endpoint-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound model_serving_endpoint with an id 'test-endpoint-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
@@ -13,7 +13,8 @@

>>> [CLI] bundle deployment bind monitor1 catalog.schema.table
Updating deployment state...
Successfully bound quality_monitor with an id 'catalog.schema.table'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound quality_monitor with an id 'catalog.schema.table'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/bind-quality-monitor-test-localonly/default/files...
@@ -15,7 +15,8 @@ resources:

>>> [CLI] bundle deployment bind model1 main.test-schema-rmodel-[UUID].test-registered-model-[UUID]
Updating deployment state...
Successfully bound registered_model with an id 'main.test-schema-rmodel-[UUID].test-registered-model-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound registered_model with an id 'main.test-schema-rmodel-[UUID].test-registered-model-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/schema/output.txt
@@ -7,7 +7,8 @@
}

=== Bind schema: Updating deployment state...
Successfully bound schema with an id 'main.test-schema-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound schema with an id 'main.test-schema-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

=== Deploy bundle: Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
Deploying resources...
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/secret-scope/output.txt
@@ -3,7 +3,8 @@

>>> [CLI] bundle deployment bind secret_scope1 test-secret-scope-[UUID] --auto-approve
Updating deployment state...
Successfully bound secret_scope with an id 'test-secret-scope-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound secret_scope with an id 'test-secret-scope-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/bind-secret-scope-test-[UNIQUE_NAME]/default/files...
@@ -1,7 +1,8 @@

>>> [CLI] bundle deployment bind sql_warehouse1 [SQL-WAREHOUSE-ID] --auto-approve
Updating deployment state...
Successfully bound sql_warehouse with an id '[SQL-WAREHOUSE-ID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound sql_warehouse with an id '[SQL-WAREHOUSE-ID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle summary
Name: test-bundle-$UNIQUE_NAME
3 changes: 2 additions & 1 deletion acceptance/bundle/deployment/bind/volume/output.txt
@@ -8,7 +8,8 @@
=== Create a pre-defined volume:
>>> [CLI] bundle deployment bind volume1 main.test-schema-[UUID].volume-[UUID] --auto-approve
Updating deployment state...
Successfully bound volume with an id 'main.test-schema-[UUID].volume-[UUID]'. Run 'bundle deploy' to deploy changes to your workspace
Successfully bound volume with an id 'main.test-schema-[UUID].volume-[UUID]'
Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-[UNIQUE_NAME]/default/files...
8 changes: 8 additions & 0 deletions acceptance/bundle/generate/auto-bind/databricks.yml.tmpl
@@ -0,0 +1,8 @@
bundle:
name: auto-bind-test

workspace:
root_path: /tmp/${UNIQUE_NAME}

include:
- resources/*.yml
5 changes: 5 additions & 0 deletions acceptance/bundle/generate/auto-bind/out.test.toml

Some generated files are not rendered by default.

48 changes: 48 additions & 0 deletions acceptance/bundle/generate/auto-bind/output.txt
@@ -0,0 +1,48 @@

=== Create a pre-defined job:
Created job with ID: [NUMID]

>>> [CLI] workspace mkdirs /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]

>>> [CLI] workspace import /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test --file test.py --language PYTHON

=== Generate and bind in one step:
>>> [CLI] bundle generate job --key test_job --existing-job-id [NUMID] --config-dir resources --source-dir src --bind
File successfully saved to src/test.py
Job configuration successfully saved to resources/test_job.job.yml
Updating deployment state...
Successfully bound job with an id '[NUMID]'

>>> ls src/
test.py

>>> cat resources/test_job.job.yml
name: auto-bind-job-[UNIQUE_NAME]

=== Deploy the bound job:
>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/tmp/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Destroy the bundle:
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete job test_job

All files and directories at the following location will be deleted: /Workspace/tmp/[UNIQUE_NAME]

Deleting files...
Destroy complete!

=== Check that job is bound and does not exist after bundle is destroyed:
>>> errcode [CLI] jobs get [NUMID] --output json
Error: Job [NUMID] does not exist.

Exit code: 1

=== Delete the tmp folder:
>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test

>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]
51 changes: 51 additions & 0 deletions acceptance/bundle/generate/auto-bind/script
@@ -0,0 +1,51 @@
title "Create a pre-defined job:\n"

PYTHON_NOTEBOOK_DIR="/Workspace/Users/${CURRENT_USER_NAME}/python-${UNIQUE_NAME}"
PYTHON_NOTEBOOK="${PYTHON_NOTEBOOK_DIR}/test"

JOB_ID=$($CLI jobs create --json '
{
"name": "auto-bind-job-'${UNIQUE_NAME}'",
"tasks": [
{
"task_key": "test",
"new_cluster": {
"spark_version": "'${DEFAULT_SPARK_VERSION}'",
"node_type_id": "'${NODE_TYPE_ID}'",
"num_workers": 1
},
"notebook_task": {
"notebook_path": "'${PYTHON_NOTEBOOK}'"
}
}
]
}' | jq -r '.job_id')

echo "Created job with ID: $JOB_ID"

envsubst < databricks.yml.tmpl > databricks.yml

cleanup() {
title "Delete the tmp folder:"
trace $CLI workspace delete ${PYTHON_NOTEBOOK}
trace $CLI workspace delete ${PYTHON_NOTEBOOK_DIR}
}
trap cleanup EXIT

trace $CLI workspace mkdirs "${PYTHON_NOTEBOOK_DIR}"
trace $CLI workspace import "${PYTHON_NOTEBOOK}" --file test.py --language PYTHON

title "Generate and bind in one step:"
trace $CLI bundle generate job --key test_job --existing-job-id $JOB_ID --config-dir resources --source-dir src --bind
trace ls src/
# The output of the job is different per cloud so we only check the name.
trace cat resources/test_job.job.yml | grep "name: auto-bind-job-${UNIQUE_NAME}"

title "Deploy the bound job:"
trace $CLI bundle deploy

title "Destroy the bundle:"
trace $CLI bundle destroy --auto-approve

title "Check that job is bound and does not exist after bundle is destroyed:"
trace errcode $CLI jobs get "${JOB_ID}" --output json
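The script above captures the new job's ID by piping `databricks jobs create` output through `jq -r '.job_id'`. As a stand-alone sketch of that extraction step, here is the same idea with a canned response body and plain `sed` in place of `jq` (the JSON and the job ID `12345` are made up for illustration):

```shell
# Canned stand-in for the JSON that `databricks jobs create` returns.
RESPONSE='{"job_id": 12345}'

# Pull the numeric job_id out of the response; the real script does the
# equivalent with `jq -r '.job_id'`.
JOB_ID=$(printf '%s' "$RESPONSE" | sed -n 's/.*"job_id": *\([0-9]*\).*/\1/p')
echo "Created job with ID: $JOB_ID"
```

The captured ID is then passed straight to `bundle generate job --existing-job-id $JOB_ID ... --bind`.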
2 changes: 2 additions & 0 deletions acceptance/bundle/generate/auto-bind/test.py
@@ -0,0 +1,2 @@
# Databricks notebook source
print("Test notebook")
26 changes: 26 additions & 0 deletions acceptance/bundle/generate/auto-bind/test.toml
@@ -0,0 +1,26 @@
# This test uses the workspace import API to load a notebook file.
# That API has logic for accepting notebook files and distinguishing them from regular Python files.
# To succeed locally we would need to replicate this logic in the fake_workspace.
[Review comment] Contributor: Worth doing some time?

[Reply] Contributor Author: Certainly, there are a bunch of other tests that need this, worth doing this as a separate PR
Local = false
Cloud = true

Ignore = [
"databricks.yml",
"resources/*",
"src/*",
".databricks",
]

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["terraform"]


[Env]
# MSYS2 automatically converts absolute paths like /Users/$username/$UNIQUE_NAME to
# C:/Program Files/Git/Users/$username/UNIQUE_NAME before passing it to the CLI
# Setting this environment variable prevents that conversion on windows.
MSYS_NO_PATHCONV = "1"

[[Repls]]
Old = '\\'
New = '/'
@@ -36,6 +36,7 @@ Usage:
databricks bundle generate dashboard [flags]

Flags:
-b, --bind automatically bind the generated dashboard config to the existing dashboard
-s, --dashboard-dir string directory to write the dashboard representation to (default "src")
--existing-id string ID of the dashboard to generate configuration for
--existing-path string workspace path of the dashboard to generate configuration for
4 changes: 4 additions & 0 deletions acceptance/bundle/help/bundle-generate-job/output.txt
@@ -13,6 +13,9 @@ Examples:
databricks bundle generate job --existing-job-id 67890 \
--key data_pipeline --config-dir resources --source-dir src

# Generate and automatically bind to the existing job
databricks bundle generate job --existing-job-id 12345 --key my_etl_job --bind

What gets generated:
- Job configuration YAML file in the resources directory
- Any associated notebook or Python files in the source directory
@@ -25,6 +28,7 @@ Usage:
databricks bundle generate job [flags]

Flags:
-b, --bind automatically bind the generated resource to the existing resource
-d, --config-dir string Dir path where the output config will be stored (default "resources")
--existing-job-id int Job ID of the job to generate config for
-f, --force Force overwrite existing files in the output directory
4 changes: 4 additions & 0 deletions acceptance/bundle/help/bundle-generate-pipeline/output.txt
@@ -14,6 +14,9 @@ Examples:
databricks bundle generate pipeline --existing-pipeline-id def456 \
--key data_transformation --config-dir resources --source-dir src

# Generate and automatically bind to the existing pipeline
databricks bundle generate pipeline --existing-pipeline-id abc123 --key etl_pipeline --bind

What gets generated:
- Pipeline configuration YAML file with settings and libraries
- Pipeline notebooks downloaded to the source directory
@@ -25,6 +28,7 @@ Usage:
databricks bundle generate pipeline [flags]

Flags:
-b, --bind automatically bind the generated resource to the existing resource
-d, --config-dir string Dir path where the output config will be stored (default "resources")
--existing-pipeline-id string ID of the pipeline to generate config for
-f, --force Force overwrite existing files in the output directory
16 changes: 12 additions & 4 deletions acceptance/bundle/help/bundle-generate/output.txt
@@ -4,16 +4,24 @@

Common patterns:
databricks bundle generate job --existing-job-id 123 --key my_job
databricks bundle generate pipeline --existing-pipeline-id abc123 --key etl_pipeline
databricks bundle generate dashboard --existing-path /my-dashboard --key sales_dash
databricks bundle generate dashboard --resource my_dashboard --watch --force # Keep local copy in sync. Useful for development.
databricks bundle generate dashboard --resource my_dashboard --force # Do a one-time sync.

Complete migration workflow:
1. Generate: databricks bundle generate job --existing-job-id 123 --key my_job
2. Bind: databricks bundle deployment bind my_job 123
3. Deploy: databricks bundle deploy
Migration workflows:

Two-step workflow (manual bind):
1. Generate: databricks bundle generate job --existing-job-id 123 --key my_job
2. Bind: databricks bundle deployment bind my_job 123
3. Deploy: databricks bundle deploy

One-step workflow (automatic bind):
1. Generate and bind: databricks bundle generate job --existing-job-id 123 --key my_job --bind
2. Deploy: databricks bundle deploy

Use --key to specify the resource name in your bundle configuration.
Use --bind to automatically bind the generated resource to the existing workspace resource.

Usage:
databricks bundle generate [command]