Fix oss-source pipeline errors when using the Databricks CLI #3889
New test bundle configuration (20 lines added):

```yaml
# This test verifies that bundles can use the 'definitions' field for YAML anchors
# without triggering the OSS Spark Pipelines validation error. This was a regression
# patched in https://github.com/databricks/cli/pull/3889

definitions:
  common_cluster_settings: &common_cluster_settings
    spark_version: "13.3.x-scala2.12"
    node_type_id: "i3.xlarge"
    num_workers: 2

resources:
  jobs:
    my_job:
      name: "test job with anchors"
      tasks:
        - task_key: "main"
          new_cluster:
            <<: *common_cluster_settings
          notebook_task:
            notebook_path: "/some/notebook"
```
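For readers unfamiliar with the merge key: `<<: *anchor` splices the anchored mapping into the current mapping when the YAML is parsed. A rough, stdlib-only Python sketch of the resulting structure (illustrative only, no YAML parser involved) might look like:

```python
# What the '<<: *common_cluster_settings' merge key effectively produces
# after parsing -- modeled with plain dicts for illustration.
common_cluster_settings = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
}

# The merge key copies the anchored mapping into the current mapping;
# any keys defined locally alongside '<<' would override the anchored ones.
new_cluster = {**common_cluster_settings}

print(new_cluster["spark_version"])  # 13.3.x-scala2.12
print(new_cluster["num_workers"])    # 2
```

The bug this PR fixes was that the top-level `definitions` key, which exists only to hold such anchors, tripped the OSS Spark Pipelines validation in the regular CLI.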
New expected-output file (9 lines added):

```
>>> [CLI] bundle validate
Name: test-bundle
Target: default
Workspace:
  User: [USERNAME]
  Path: /Workspace/Users/[USERNAME]/.bundle/test-bundle/default

Validation OK!
```
New test script (1 line added):

```
trace $CLI bundle validate
```
Change to the deploy command:

```diff
@@ -44,11 +44,12 @@ func deployCommand() *cobra.Command {
 				b.Config.Bundle.Deployment.FailOnActiveRuns = failOnActiveRuns
 			}
 		},
-		Verbose:      verbose,
-		AlwaysPull:   true,
-		FastValidate: true,
-		Build:        true,
-		Deploy:       true,
+		Verbose:        verbose,
+		AlwaysPull:     true,
+		FastValidate:   true,
+		Build:          true,
+		Deploy:         true,
+		IsPipelinesCLI: true,
 	})
 	if err != nil {
 		return err
```

Contributor: This scopes it down to deploy, but it seems this is something that would be useful in plan & validate as well, no?

Author: This was the previous implementation; the error only triggered on deploy. Note the pipelines CLI does not have plan or validate, only a dryrun command.
Reviewer: It's a bit odd to have this here; this package orchestrates existing phases rather than having actual validation logic. What if we keep it in initialize.go and then detect the pipelines exe via argv[0]?

Author: Detecting via argv[0] feels hacky. I don't mind recording `IsPipelineCLI` in the bundle tree maybe and then recording this in initialize. IMO the current implementation is fine, but I'm open to any alternate suggestions.

Reviewer: One advantage is it's obvious which code is for pipelines only. You don't have to dive into the guts (the initialize phase here) to see whether some code is for the pipelines CLI only or not.