
Change default environment to cloud-agnostic#1269

Open
hectorcast-db wants to merge 3 commits into main from
hectorcast-db/single-path-2

Conversation


@hectorcast-db hectorcast-db commented Feb 17, 2026

Summary

This PR makes the SDK cloud-agnostic by removing cloud-specific detection and configuration. It updates the default environment handling and unifies the Private Link error messages so that they apply to all cloud providers without requiring cloud detection.

Changes

Environment Handling

  • Add Cloud.UNKNOWN enum value for cloud-agnostic hosts
  • Update DEFAULT_ENVIRONMENT to use Cloud.UNKNOWN with no dns_zone
  • Make dns_zone optional in DatabricksEnvironment dataclass
  • Update environment detection logic to handle None dns_zone values
  • Move DEFAULT_ENVIRONMENT to end of ALL_ENVS to ensure it's used as a fallback only
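The environment-handling changes above can be sketched as follows. This is a minimal illustration, not the SDK's exact code: the concrete `dns_zone` suffixes and the suffix-matching loop are assumptions made for the example; the names `Cloud.UNKNOWN`, `DEFAULT_ENVIRONMENT`, `DatabricksEnvironment`, `dns_zone`, and `ALL_ENVS` come from the PR.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class Cloud(Enum):
    AWS = "AWS"
    AZURE = "AZURE"
    GCP = "GCP"
    UNKNOWN = "UNKNOWN"  # new: cloud-agnostic / unrecognized hosts


@dataclass
class DatabricksEnvironment:
    cloud: Cloud
    dns_zone: Optional[str] = None  # now optional; None for the default


DEFAULT_ENVIRONMENT = DatabricksEnvironment(Cloud.UNKNOWN, dns_zone=None)

# Cloud-specific environments come first; the UNKNOWN default sits at the
# end so suffix matching only falls through to it when nothing else matches.
ALL_ENVS: List[DatabricksEnvironment] = [
    DatabricksEnvironment(Cloud.AWS, ".cloud.databricks.com"),
    DatabricksEnvironment(Cloud.AZURE, ".azuredatabricks.net"),
    DatabricksEnvironment(Cloud.GCP, ".gcp.databricks.com"),
    DEFAULT_ENVIRONMENT,
]


def get_environment_for_hostname(hostname: str) -> DatabricksEnvironment:
    """Return the first environment whose dns_zone suffix matches,
    skipping entries with no dns_zone (None-safety for the default)."""
    for env in ALL_ENVS:
        if env.dns_zone is not None and hostname.endswith(env.dns_zone):
            return env
    return DEFAULT_ENVIRONMENT
```

With this ordering, every host that matched a cloud suffix before still resolves to the same cloud; only hosts that match nothing now get `Cloud.UNKNOWN` instead of an AWS default.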

Private Link Error Handling

  • Unify Private Link error messages across all clouds (AWS, Azure, GCP)
  • Remove the cloud detection dependency from _get_private_link_validation_error()
  • Provide a single error message with guidance for all three cloud providers
  • Remove the cloud-specific _PrivateLinkInfo classes and mapping

Motivation

This change enables better support for:

  • Cloud-agnostic Databricks endpoints (e.g., unified hosts, aliased hosts)
  • Unknown or custom host configurations that don't match standard cloud patterns
  • Future host types that don't fit AWS/Azure/GCP categorization
  • Simplified error handling that doesn't require knowing the user's cloud provider upfront

Backward Compatibility

✅ This change is backward compatible:

  • All existing AWS/Azure/GCP hosts continue to be detected correctly
  • Only unknown/unrecognized hosts are affected
  • The is_azure, is_gcp, and is_aws properties work correctly with Cloud.UNKNOWN (all return False)
  • azure_workspace_resource_id still correctly sets is_azure=True regardless of detected cloud
  • Private Link error messages now include information for all clouds, making them more helpful regardless of the user's environment
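The property semantics in the bullets above can be illustrated with a small sketch. The property names and the resource-ID override rule come from the PR; the `Config` constructor shown here is a simplified stand-in, not the SDK's real signature.

```python
from enum import Enum
from typing import Optional


class Cloud(Enum):
    AWS = "AWS"
    AZURE = "AZURE"
    GCP = "GCP"
    UNKNOWN = "UNKNOWN"


class Config:
    """Simplified stand-in for the SDK Config, showing only the
    cloud-detection properties described above."""

    def __init__(self, cloud: Cloud, azure_workspace_resource_id: Optional[str] = None):
        self._cloud = cloud
        self.azure_workspace_resource_id = azure_workspace_resource_id

    @property
    def is_aws(self) -> bool:
        return self._cloud == Cloud.AWS

    @property
    def is_azure(self) -> bool:
        # Setting azure_workspace_resource_id forces is_azure=True even
        # when the detected cloud is UNKNOWN.
        return self._cloud == Cloud.AZURE or self.azure_workspace_resource_id is not None

    @property
    def is_gcp(self) -> bool:
        return self._cloud == Cloud.GCP
```

For a `Cloud.UNKNOWN` host, all three properties return `False` unless the Azure resource ID override is set.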

Testing

  • Added test_unknown_cloud_environment_properties - verifies UNKNOWN cloud behavior
  • Added test_azure_resource_id_sets_is_azure_with_unknown_environment - verifies Azure resource ID override works
  • Updated test_get_api_error[private_link_validation_error] - verifies unified Private Link error message
  • All existing tests pass
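The two new tests might look roughly like this pytest-style sketch. The test names come from the PR; the `_StubConfig` stand-in and its constructor are hypothetical, written only so the sketch runs on its own.

```python
from enum import Enum
from typing import Optional


class Cloud(Enum):
    AWS = "AWS"
    AZURE = "AZURE"
    GCP = "GCP"
    UNKNOWN = "UNKNOWN"


class _StubConfig:
    """Hypothetical stand-in for the SDK Config, just enough for the tests."""

    def __init__(self, cloud: Cloud, azure_workspace_resource_id: Optional[str] = None):
        self._cloud = cloud
        self.azure_workspace_resource_id = azure_workspace_resource_id

    @property
    def is_aws(self) -> bool:
        return self._cloud == Cloud.AWS

    @property
    def is_azure(self) -> bool:
        # The Azure resource ID override wins regardless of detected cloud.
        return self._cloud == Cloud.AZURE or self.azure_workspace_resource_id is not None

    @property
    def is_gcp(self) -> bool:
        return self._cloud == Cloud.GCP


def test_unknown_cloud_environment_properties():
    cfg = _StubConfig(Cloud.UNKNOWN)
    assert cfg.is_aws is False
    assert cfg.is_azure is False
    assert cfg.is_gcp is False


def test_azure_resource_id_sets_is_azure_with_unknown_environment():
    cfg = _StubConfig(Cloud.UNKNOWN, azure_workspace_resource_id="/subscriptions/abc")
    assert cfg.is_azure is True
```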

This change updates the DEFAULT_ENVIRONMENT from AWS-specific to Cloud.UNKNOWN,
enabling better support for cloud-agnostic and unknown Databricks hosts.

Key changes:
- Add Cloud.UNKNOWN enum value
- Change DEFAULT_ENVIRONMENT to use Cloud.UNKNOWN with no dns_zone
- Make dns_zone Optional in DatabricksEnvironment
- Add None-safety checks in is_azure/is_gcp/is_aws properties
- Move DEFAULT_ENVIRONMENT to end of ALL_ENVS list for proper fallback
- Add tests for UNKNOWN cloud environment behavior

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
@github-actions

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-py

Inputs:

  • PR number: 1269
  • Commit SHA: a2c1959185523333e06aaf992a4f159593fb8358

Checks will be approved automatically on success.
