A Model Context Protocol (MCP) server for working with Databricks through natural language. This server provides tools for data exploration, workspace management, and executing Databricks CLI commands through AI-powered conversations.
## TL;DR

**Primary Goal:** Interact with Databricks workspaces, manage Databricks Asset Bundles (DABs), deploy Databricks Apps, and query data through natural language conversations.

**How it works:**

1. **Explore your workspace** - Discover workspace resources, get CLI command examples, and workflow recommendations
2. **Query your data** - Browse catalogs, schemas, and tables; execute SQL queries via CLI commands
3. **Manage bundles** - Initialize, validate, deploy, and run Databricks Asset Bundles
4. **Deploy apps** - Deploy and manage Databricks Apps through CLI commands
5. **Execute any CLI command** - Run the full Databricks CLI through the `invoke_databricks_cli` tool (a raw tool-call sketch follows this list)

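If you want to see what a raw tool call looks like outside of a chat client, here is a minimal sketch using the MCP Python SDK. The server launch command and the argument schema for `invoke_databricks_cli` are assumptions for illustration; list the server's tools to confirm the actual interface.

```python
# Minimal sketch: calling the server's invoke_databricks_cli tool directly with
# the MCP Python SDK. The launch command and argument key below are assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Hypothetical launch command; use however you normally start the server.
    server = StdioServerParameters(command="databricks-mcp-server")
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # confirm invoke_databricks_cli is exposed
            result = await session.call_tool(
                "invoke_databricks_cli",
                {"command": "catalogs list --output json"},  # assumed argument schema
            )
            print(result.content)

asyncio.run(main())
```
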
**Why use it:**

- **Conversational interface**: Work with Databricks using natural language instead of memorizing CLI commands
- **Context-aware**: Get relevant command suggestions based on your workspace configuration
- **Unified workflow**: Combine data exploration, bundle management, and app deployment in one tool

Perfect for data engineers and developers who want to streamline their Databricks workflows with AI-powered assistance.

---
}
```

3. **Start using Databricks with natural language:**

Restart your MCP client and try:
```
Explore my Databricks workspace and show me what catalogs are available
```
```
Initialize a new Databricks Asset Bundle for a data pipeline project
```
```
Query the main.sales.transactions table and show me the top 10 customers by revenue
```
The AI will use the appropriate Databricks tools to help you complete these tasks.
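
For context, the underlying commands look roughly like the following. This is an illustrative sketch, not the server's exact behavior: it assumes the Databricks CLI is installed and authenticated, the `<warehouse-id>` placeholder must be replaced with a real SQL warehouse ID, and the column names in the SQL are assumed for the example table.

```python
# Illustrative sketch of the Databricks CLI calls behind the example prompts.
# Assumes the CLI is installed and authenticated (e.g. `databricks auth login`).
import json
import subprocess

def databricks(*args: str) -> str:
    """Run a Databricks CLI command and return its stdout."""
    return subprocess.run(
        ["databricks", *args], capture_output=True, text=True, check=True
    ).stdout

# Prompt 1: "show me what catalogs are available"
catalogs = json.loads(databricks("catalogs", "list", "--output", "json"))
print([c["name"] for c in catalogs])

# Prompt 2: scaffold a new Asset Bundle (interactive template selection)
# subprocess.run(["databricks", "bundle", "init"], check=True)

# Prompt 3: run SQL via the Statement Execution API (needs a SQL warehouse ID;
# customer_id and revenue column names are assumed for illustration)
statement = json.dumps({
    "warehouse_id": "<warehouse-id>",
    "statement": (
        "SELECT customer_id, SUM(revenue) AS total_revenue "
        "FROM main.sales.transactions "
        "GROUP BY customer_id ORDER BY total_revenue DESC LIMIT 10"
    ),
})
print(databricks("api", "post", "/api/2.0/sql/statements", "--json", statement))
```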