Refactor apps-mcp to use CLI-based approach (#4003)
This refactors the apps-mcp server to use CLI commands instead of direct
API providers, significantly simplifying the architecture and leveraging
existing bundle command functionality.
## Changes
**New CLI-based provider:**
- Add experimental/apps-mcp/lib/providers/clitools package
- Implement workspace exploration via CLI commands
- Add invoke_databricks_cli helper for executing CLI commands (illustrative invocations after this list)
- Update prompts to support apps exploration workflow
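
As context for the new provider, here is a minimal sketch of the kind of invocations a helper like `invoke_databricks_cli` passes through to the `databricks` binary. The specific commands are illustrative, not taken from this PR, and assume the CLI is installed and authenticated:

```bash
# Illustrative invocations a CLI-backed MCP tool can run on the user's behalf
# (assumes the `databricks` CLI is installed and a profile is configured).
databricks current-user me --output json   # confirm which identity/profile is active
databricks jobs list --output json         # any other CLI command group is reachable the same way
```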
**Removed providers and templates:**
- Remove databricks provider (replaced by CLI invocation)
- Remove IO provider (scaffolding/validation, now handled by bundle
commands)
- Remove deployment provider (superseded by bundle deploy commands)
- Remove entire templates system including trpc template
**Clean up old development features:**
- Remove cmd/workspace/apps/dev.go and vite bridge
- Remove vite development server integration
- Drop experimental development workflow in favor of bundle-based
approach
## Why
This change reduces code complexity while providing a more maintainable
architecture that reuses existing CLI commands rather than duplicating
API logic.
## Tests
---------
Co-authored-by: Arseny Kravchenko <[email protected]>
Co-authored-by: Fabian Jakobs <[email protected]>
README diff:

```diff
-A Model Context Protocol (MCP) server for generating production-ready Databricks applications with testing,
-linting and deployment setup from a single prompt. This agent relies heavily on scaffolding and
-extensive validation to ensure high-quality outputs.
+A Model Context Protocol (MCP) server for working with Databricks through natural language. This server provides tools for data exploration, workspace management, and executing Databricks CLI commands through AI-powered conversations.
 
 ## TL;DR
 
-**Primary Goal:** Create and deploy production-ready Databricks applications from a single natural language prompt. This MCP server combines scaffolding, validation, and deployment into a seamless workflow that goes from idea to running application.
+**Primary Goal:** Interact with Databricks workspaces, manage Databricks Asset Bundles (DABs), deploy Databricks Apps, and query data through natural language conversations.
 
 **How it works:**
 1. **Explore your data** - Query Databricks catalogs, schemas, and tables to understand your data
```
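
As a concrete illustration of the "explore your data" step, the same exploration can be done with plain Databricks CLI calls. This is a sketch: the catalog and schema names are placeholders, and flags can differ across CLI versions:

```bash
# Explore Unity Catalog from the CLI (names are placeholders).
databricks catalogs list --output json
databricks schemas list main --output json        # schemas in the `main` catalog
databricks tables list main sales --output json   # tables in the `main.sales` schema
```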
```diff
@@ -16,11 +14,11 @@ extensive validation to ensure high-quality outputs.
-- **Speed**: Go from concept to deployed Databricks app in minutes, not hours or days
-- **Quality**: Extensive validation ensures your app builds, passes tests, and is production-ready
-- **Simplicity**: One natural language conversation handles the entire workflow
+- **Conversational interface**: Work with Databricks using natural language instead of memorizing CLI commands
+- **Context-aware**: Get relevant command suggestions based on your workspace configuration
+- **Unified workflow**: Combine data exploration, bundle management, and app deployment in one tool
 
-Perfect for data engineers and developers who want to build Databricks apps without the manual overhead of project setup, configuration, testing infrastructure, and deployment pipelines.
+Perfect for data engineers and developers who want to streamline their Databricks workflows with AI-powered assistance.
 
 ---
```
````diff
@@ -52,16 +50,18 @@ Perfect for data engineers and developers who want to build Databricks apps with
 
 Try this in your MCP client:
 ```
-Create a Databricks app that shows sales data from main.sales.transactions
-with a chart showing revenue by region. Deploy it as "sales-dashboard".
+Explore my Databricks workspace and show me what catalogs are available
 ```
 
-The AI will:
-- Explore your Databricks tables
-- Generate a full-stack application
-- Customize it based on your requirements
-- Validate it passes all tests
-- Deploy it to Databricks Apps
+```
+Initialize a new Databricks Asset Bundle for a data pipeline project
+```
+
+```
+Query the main.sales.transactions table and show me the top 10 customers by revenue
+```
+
+The AI will use the appropriate Databricks tools to help you complete these tasks.
 
 ---
````
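
Roughly, the new example prompts map onto CLI calls like the following. This is a sketch: the warehouse ID and SQL column names are assumptions, and exact flags depend on your CLI version:

```bash
# Explore catalogs
databricks catalogs list --output json

# Initialize a new Databricks Asset Bundle from a template (interactive)
databricks bundle init

# Run a SQL query through the Statement Execution API.
# The warehouse ID and column names are placeholders.
databricks api post /api/2.0/sql/statements --json '{
  "warehouse_id": "<warehouse-id>",
  "statement": "SELECT customer_id, SUM(revenue) AS revenue FROM main.sales.transactions GROUP BY customer_id ORDER BY revenue DESC LIMIT 10"
}'
```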
```diff
@@ -92,210 +92,143 @@ Then restart your MCP client for changes to take effect
 
 ## Features
 
-All features are designed to support the end-to-end workflow of creating production-ready Databricks applications:
-
-### 1. Data Exploration (Foundation)
-
-Understand your Databricks data before building:
-
-- **`databricks_list_catalogs`** - Discover available data catalogs
-- **`databricks_list_schemas`** - Browse schemas in a catalog
-- **`databricks_find_tables`** - Find tables in a schema
-- **`databricks_describe_table`** - Get table details, columns, and sample data
-- **`databricks_execute_query`** - Test queries and preview data
-
-*These tools help the AI understand your data structure so it can generate relevant application code.*
-
-### 2. Application Generation (Core)
+The Databricks MCP server provides CLI-based tools for workspace interaction:
 
-Create the application structure:
+Execute Databricks CLI commands and explore workspace resources:
 
-- **`scaffold_databricks_app`** - Generate a full-stack TypeScript application
-  - Modern stack: Node.js, TypeScript, React, tRPC
-  - Pre-configured build system, linting, and testing
-  - Production-ready project structure
-  - Databricks SDK integration
+- **`explore`** - Discover workspace resources and get CLI command recommendations
+  - Lists workspace URL, SQL warehouse details, and authentication profiles
+  - Provides command examples for jobs, clusters, catalogs, tables, and workspace files
+  - Gives workflow guidance for Databricks Asset Bundles and Apps
 
-*This is the foundation of your application - a working, tested template ready for customization.*
+- **`invoke_databricks_cli`** - Execute any Databricks CLI command
```
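
To ground the guidance the `explore` tool gives, here is a sketch of the bundle workflow those recommendations typically point at; the target name and resource key below are placeholders:

```bash
# Typical Databricks Asset Bundle flow (placeholder target and resource key).
databricks auth profiles                 # list configured authentication profiles
databricks bundle validate               # check databricks.yml and declared resources
databricks bundle deploy --target dev    # deploy bundle resources to the workspace
databricks bundle run my_job             # run a resource (e.g. a job) defined in the bundle
```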