Commit aacbf20

[Exporter] Added support for Database Instance resource (aka Lakebase) (#5212)
## Changes

Also adds more generic functions, e.g., handling of effective fields.

## Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [x] using Go SDK
- [x] using TF Plugin Framework
- [x] has entry in `NEXT_CHANGELOG.md` file
1 parent ebc3d92 commit aacbf20

13 files changed: +814, -214 lines

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -21,5 +21,6 @@
* Fix typo in the name of environment variable ([#5158](https://github.com/databricks/terraform-provider-databricks/pull/5158)).
* Export permission assignments on workspace level ([#5169](https://github.com/databricks/terraform-provider-databricks/pull/5169)).
* Added support for Databricks Apps resources ([#5208](https://github.com/databricks/terraform-provider-databricks/pull/5208)).
* Added support for Database Instance resource (aka Lakebase) ([#5212](https://github.com/databricks/terraform-provider-databricks/pull/5212)).

### Internal Changes

docs/guides/experimental-exporter.md

Lines changed: 2 additions & 0 deletions
@@ -184,6 +184,7 @@ Services could be specified in combination with predefined aliases (`all` - for
* `groups` - **listing** [databricks_group](../data-sources/group.md) with [membership](../resources/group_member.md) and [data access](../resources/group_instance_profile.md). If Identity Federation is enabled on the workspace (when UC Metastore is attached), then account-level groups are exposed as data sources because they are defined on account level, and only workspace-level groups are exposed as resources. See the note above on how to perform migration between workspaces with Identity Federation enabled.
* `idfed` - **listing** [databricks_mws_permission_assignment](../resources/mws_permission_assignment.md) (account-level) and [databricks_permission_assignment](../resources/permission_assignment.md) (workspace-level). When listing is done on account level, you can filter assignment only to specific workspace IDs as specified by `-match`, `-matchRegex`, and `-excludeRegex` options. I.e., to export assignments only for two workspaces, use `-matchRegex '^1688808130562317|5493220389262917$'`.
* `jobs` - **listing** [databricks_job](../resources/job.md). Usually, there are more automated workflows than interactive clusters, so they get their own file in this tool's output. *Please note that workflows deployed and maintained via [Databricks Asset Bundles](https://docs.databricks.com/en/dev-tools/bundles/index.html) aren't exported!*
* `lakebase` - **listing** [databricks_database_instance](../resources/database_instance.md).
* `mlflow-webhooks` - **listing** [databricks_mlflow_webhook](../resources/mlflow_webhook.md).
* `model-serving` - **listing** [databricks_model_serving](../resources/model_serving.md).
* `mounts` - **listing** works only in combination with `-mounts` command-line option.

@@ -252,6 +253,7 @@ Exporter aims to generate HCL code for most of the resources within the Databric
| [databricks_connection](../resources/connection.md) | Yes | Yes | Yes | No |
| [databricks_credential](../resources/credential.md) | Yes | Yes | Yes | No |
| [databricks_dashboard](../resources/dashboard.md) | Yes | No | Yes | No |
| [databricks_database_instance](../resources/database_instance.md) | Yes | No | Yes | No |
| [databricks_data_quality_monitor](../resources/data_quality_monitor.md) | Yes | Yes | Yes | No |
| [databricks_dbfs_file](../resources/dbfs_file.md) | Yes | No | Yes | No |
| [databricks_external_location](../resources/external_location.md) | Yes | Yes | Yes | No |

exporter/AGENTS.md

Lines changed: 40 additions & 0 deletions
@@ -84,3 +84,43 @@ unifiedDataToHcl()
**Key Differences**:
- SDKv2 generates nested structures as **blocks**: `evaluation { ... }`
- Plugin Framework generates nested structures as **attributes**: `evaluation = { ... }`

## Helper Functions for Field Omission Logic

### `shouldOmitWithEffectiveFields`

A reusable helper function (`exporter/util.go`) for resources that have input-only fields with corresponding `effective_*` fields. This pattern is common in resources where the API returns `effective_*` versions of input fields (e.g., `effective_node_count` for `node_count`).

**When to Use**:
- Your resource has input-only fields that are not returned by the API
- The API returns corresponding `effective_*` fields with the actual values
- You want to generate HCL with non-zero values from the `effective_*` fields

**Usage**:
```go
"databricks_database_instance": {
	// ... other fields ...
	ShouldOmitFieldUnified: shouldOmitWithEffectiveFields,
},
```

**How it Works**:
1. Checks if the field has a corresponding `effective_*` field in the schema
2. If found, applies smart filtering:
   - Always includes required fields (even if zero value)
   - Omits fields with zero values (`false`, `0`, `""`, etc.)
   - Omits fields that match their default value
   - Includes fields with non-zero values
3. Uses `reflect.ValueOf(v).IsZero()` for proper zero-value detection (important because `wrapper.GetOk()` returns `nonZero=true` even for `false` booleans); see the sketch after this list
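
To make the filtering rules above concrete, here is a minimal, runnable sketch of the decision logic. The `shouldOmit` function, its parameters, and the example field names are invented for illustration; they do not match the actual `ShouldOmitFieldUnified` callback signature in `exporter/util.go`, and default-value handling is left out.

```go
package main

import (
	"fmt"
	"reflect"
)

// shouldOmit reports whether a field should be left out of the generated HCL:
// required fields are always kept, optional fields only when their value is non-zero.
func shouldOmit(required bool, value any) bool {
	if required {
		return false
	}
	rv := reflect.ValueOf(value)
	// reflect-based detection treats false, 0, and "" uniformly as zero values.
	return !rv.IsValid() || rv.IsZero()
}

func main() {
	fmt.Println(shouldOmit(false, 2))     // false -> keep node_count = 2
	fmt.Println(shouldOmit(false, false)) // true  -> omit enable_readable_secondaries = false
	fmt.Println(shouldOmit(true, ""))     // false -> required fields stay even when empty
}
```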

**Prerequisites**:
Your resource's `Import` function must call `copyEffectiveFieldsToInputFieldsWithConverters[TfType](ic, r, GoSdkType{})` to copy values from `effective_*` fields to their input counterparts. See `exporter/impl_lakebase.go` for an example.

**Example**:
For a resource with `node_count` (input-only) and `effective_node_count` (API-returned):
- API returns: `{"effective_node_count": 2, "effective_enable_readable_secondaries": false}`
- Import function copies: `node_count = 2`, `enable_readable_secondaries = false`
- Generated HCL includes: `node_count = 2` (non-zero)
- Generated HCL omits: `enable_readable_secondaries = false` (zero value)

For more details, see `exporter/EFFECTIVE_FIELDS_PATTERN.md`.

exporter/abstractions.go

Lines changed: 123 additions & 0 deletions
@@ -3,10 +3,12 @@ package exporter
import (
	"context"
	"fmt"
	"log"
	"reflect"
	"strconv"
	"strings"

	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
	"github.com/hashicorp/terraform-plugin-framework/attr"
	"github.com/hashicorp/terraform-plugin-framework/path"
	frameworkschema "github.com/hashicorp/terraform-plugin-framework/resource/schema"

@@ -717,3 +719,124 @@ func convertGoToPluginFrameworkType(value interface{}) attr.Value {
		return types.StringValue(fmt.Sprintf("%v", value))
	}
}

// copyEffectiveFieldsToInputFieldsWithConverters automatically copies values from effective_* fields
// to their corresponding input fields (e.g., effective_node_count -> node_count).
// This is useful for Plugin Framework resources where the API returns effective_* fields but doesn't
// return the input fields that were originally set.
//
// NOTE: This function only works with Plugin Framework resources. The effective_* field pattern
// is not used by SDKv2 resources.
//
// This function works by converting the TF state to a Go SDK struct, copying fields
// using reflection, and then converting back to TF state. This approach:
// - Handles complex types (lists, maps, nested objects) automatically via converters
// - Leverages existing converter infrastructure for type safety
// - Works for all field types including custom_tags (lists of objects)
//
// Type parameters:
// - TTF: The Terraform Plugin Framework struct type
// - TGo: The Go SDK struct type
//
// Example usage in an import function:
//
//	func importDatabaseInstance(ic *importContext, r *resource) error {
//		copyEffectiveFieldsToInputFieldsWithConverters[database_instance_resource.DatabaseInstance](
//			ic, r, database.DatabaseInstance{})
//		return nil
//	}
func copyEffectiveFieldsToInputFieldsWithConverters[TTF any, TGo any](
	ic *importContext,
	r *resource,
	_ TGo,
) {
	if r.DataWrapper == nil {
		return
	}

	wrapper := r.DataWrapper
	ctx := ic.Context

	// Effective fields pattern is only applicable to Plugin Framework resources
	if !wrapper.IsPluginFramework() {
		log.Printf("[DEBUG] copyEffectiveFieldsToInputFieldsWithConverters called on non-Plugin Framework resource %s, skipping", r.ID)
		return
	}

	// Step 1: Convert TF state to Go SDK struct
	var goSdkStruct TGo
	var tfStruct TTF
	if err := wrapper.GetTypedStruct(ctx, &tfStruct); err != nil {
		log.Printf("[WARN] Failed to extract TF struct for %s: %v", r.ID, err)
		return
	}

	diags := converters.TfSdkToGoSdkStruct(ctx, tfStruct, &goSdkStruct)
	if diags.HasError() {
		log.Printf("[WARN] Failed to convert TF to Go SDK struct for %s: %v", r.ID, diags)
		return
	}

	// Step 2: Copy effective_* fields to their input counterparts using reflection
	goSdkValue := reflect.ValueOf(&goSdkStruct).Elem()
	goSdkType := goSdkValue.Type()

	copiedFields := []string{}
	for i := 0; i < goSdkValue.NumField(); i++ {
		field := goSdkType.Field(i)
		fieldName := field.Name

		// Check if this is an effective_* field
		if !strings.HasPrefix(fieldName, "Effective") {
			continue
		}

		// Derive the input field name (e.g., "EffectiveNodeCount" -> "NodeCount")
		inputFieldName := strings.TrimPrefix(fieldName, "Effective")

		// Check if the corresponding input field exists
		inputField := goSdkValue.FieldByName(inputFieldName)
		if !inputField.IsValid() || !inputField.CanSet() {
			continue
		}

		// Get the effective field value
		effectiveField := goSdkValue.Field(i)
		if !effectiveField.IsValid() {
			continue
		}

		// Check if types match
		if effectiveField.Type() != inputField.Type() {
			log.Printf("[DEBUG] Type mismatch for %s: effective=%v, input=%v", inputFieldName, effectiveField.Type(), inputField.Type())
			continue
		}

		// Copy the value
		inputField.Set(effectiveField)
		copiedFields = append(copiedFields, fmt.Sprintf("%s->%s", fieldName, inputFieldName))
	}

	if len(copiedFields) > 0 {
		log.Printf("[TRACE] Copied effective fields for %s: %s", r.ID, strings.Join(copiedFields, ", "))
	}

	// Step 3: Convert back to TF state
	var tfStruct2 TTF
	diags = converters.GoSdkToTfSdkStruct(ctx, goSdkStruct, &tfStruct2)
	if diags.HasError() {
		log.Printf("[WARN] Failed to convert Go SDK to TF struct for %s: %v", r.ID, diags)
		return
	}

	// Step 4: Write back to the state using Set method on Plugin Framework state
	// Access the underlying state from PluginFrameworkResourceData
	if pfWrapper, ok := wrapper.(*PluginFrameworkResourceData); ok {
		diags := pfWrapper.state.Set(ctx, &tfStruct2)
		if diags.HasError() {
			log.Printf("[WARN] Failed to write TF struct back to state for %s: %v", r.ID, diags)
		}
	} else {
		log.Printf("[WARN] Unable to write TF struct back to state: wrapper is not PluginFrameworkResourceData for %s", r.ID)
	}
}
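
The reflection copy in Step 2 can also be seen in isolation. The following standalone sketch uses an invented `databaseInstance` struct and `copyEffective` helper purely for illustration; the real function above additionally converts between TF and Go SDK structs via the converters package and logs what it copied.

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// databaseInstance is a made-up struct standing in for a Go SDK type.
type databaseInstance struct {
	NodeCount          int
	EffectiveNodeCount int
	Stopped            bool
	EffectiveStopped   bool
}

// copyEffective copies every Effective<Name> field into <Name> when the
// corresponding field exists, is settable, and has the same type.
func copyEffective(v any) {
	rv := reflect.ValueOf(v).Elem()
	rt := rv.Type()
	for i := 0; i < rv.NumField(); i++ {
		name := rt.Field(i).Name
		if !strings.HasPrefix(name, "Effective") {
			continue
		}
		dst := rv.FieldByName(strings.TrimPrefix(name, "Effective"))
		src := rv.Field(i)
		if dst.IsValid() && dst.CanSet() && dst.Type() == src.Type() {
			dst.Set(src)
		}
	}
}

func main() {
	inst := databaseInstance{EffectiveNodeCount: 2}
	copyEffective(&inst)
	fmt.Println(inst.NodeCount) // 2
}
```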

exporter/abstractions_test.go

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
package exporter

exporter/codegen.go

Lines changed: 8 additions & 0 deletions
@@ -399,6 +399,14 @@ func (ic *importContext) extractFieldsForGeneration(imp importable, path []strin
		shouldSkip = false
	}

	// For Plugin Framework, also check for zero values in primitives
	if !shouldSkip && wrapper.IsPluginFramework() && nonZero && fieldSchema.IsOptional() {
		rv := reflect.ValueOf(raw)
		if rv.IsValid() && rv.IsZero() {
			shouldSkip = true
		}
	}

	// Check if ShouldGenerateField forces generation
	if shouldSkip {
		forceGenerate := false
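
For context on why this hunk checks `rv.IsValid() && rv.IsZero()` rather than comparing against `nil`: an `interface{}` holding `false`, `0`, or `""` is non-nil, so only a reflect-based zero check catches it. A tiny standalone demonstration (not the exporter's actual types):

```go
package main

import (
	"fmt"
	"reflect"
)

func main() {
	var raw any = false
	fmt.Println(raw == nil)                    // false: the interface itself is not nil
	fmt.Println(reflect.ValueOf(raw).IsZero()) // true: the wrapped value is the zero bool
}
```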
