Deployment
Skill: databricks-app-python
What You Can Build
You can deploy Python apps to Databricks using two paths: the CLI for quick single-environment deploys, and Asset Bundles (DABs) for version-controlled multi-environment pipelines (dev/staging/prod). Both paths produce a running app with a URL, automatic HTTPS, and managed service principal credentials.
In Action
“Deploy a Streamlit app to Databricks using the CLI with a SQL warehouse resource.”
```sh
# Create the app
databricks apps create sales-dashboard

# Upload source code to the workspace
databricks workspace mkdirs /Workspace/Users/you@company.com/apps/sales-dashboard
databricks workspace import-dir . /Workspace/Users/you@company.com/apps/sales-dashboard

# Deploy
databricks apps deploy sales-dashboard \
  --source-code-path /Workspace/Users/you@company.com/apps/sales-dashboard

# Verify status and get the app URL
databricks apps get sales-dashboard
```

Key decisions:
- Use the CLI path when you need a quick deployment or are iterating on a prototype
- Add resources (SQL warehouse, Lakebase, etc.) through the Databricks Apps UI after creating the app
- The `app.yaml` file controls the start command and environment variables — make sure the command matches your framework
- After uploading new code, redeploy with the same `databricks apps deploy` command
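As a concrete starting point, a minimal `app.yaml` for the Streamlit app above might look like the sketch below — the command and the environment variable are illustrative, not required by the platform:

```yaml
# Hypothetical app.yaml for a Streamlit app — adjust the command to your framework.
command: ["streamlit", "run", "app.py"]
env:
  - name: STREAMLIT_SERVER_HEADLESS   # illustrative variable, not required
    value: "true"
```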
More Patterns
Multi-environment deployment with Asset Bundles
“Set up a DABs pipeline to deploy the same app to dev and prod targets.”
```sh
# Generate bundle config from an existing app
databricks bundle generate app \
  --existing-app-name sales-dashboard \
  --key sales_dashboard

# Validate the dev target
databricks bundle validate -t dev

# Deploy and start
databricks bundle deploy -t dev
databricks bundle run sales_dashboard -t dev

# Promote to production
databricks bundle deploy -t prod
databricks bundle run sales_dashboard -t prod
```

`bundle deploy` pushes configuration and code. `bundle run` starts the app. Both steps are required — deploy alone does not apply configuration changes. Environment variables go in `src/app/app.yaml`, not in `databricks.yml`.
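The dev and prod targets referenced by `-t` are declared in `databricks.yml`. A minimal sketch, assuming placeholder workspace hosts:

```yaml
# Hypothetical databricks.yml fragment — bundle name and hosts are placeholders.
bundle:
  name: sales_dashboard

targets:
  dev:
    mode: development
    workspace:
      host: https://dev-workspace.cloud.databricks.com
  prod:
    mode: production
    workspace:
      host: https://prod-workspace.cloud.databricks.com
```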
Checking deployment logs
“How do I diagnose a failed deployment?”
```sh
databricks apps logs sales-dashboard
```

Look for these patterns in the output: `[SYSTEM]` lines show deployment status and dependency installation; `[APP]` lines show your application’s stdout/stderr; `Deployment successful` and `App started successfully` confirm a healthy state. If you see a stack trace, the error is usually a missing dependency in `requirements.txt` or a misconfigured start command in `app.yaml`.
Redeploying after code changes
“Push updated code to an existing app.”
```sh
# Delete old files, upload new ones, redeploy
databricks workspace delete /Workspace/Users/you@company.com/apps/sales-dashboard --recursive
databricks workspace import-dir . /Workspace/Users/you@company.com/apps/sales-dashboard
databricks apps deploy sales-dashboard \
  --source-code-path /Workspace/Users/you@company.com/apps/sales-dashboard
```

This three-step cycle — delete, upload, deploy — ensures the workspace copy matches your local directory exactly. The app restarts automatically after deployment.
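To avoid retyping the long paths on every cycle, the three steps can be wrapped in a small shell function — a sketch using the example app name and workspace path from this page; substitute your own:

```sh
#!/bin/sh
# Sketch of a redeploy helper. APP_NAME and WS_PATH reuse the example
# values from this page — replace them with your app's values.
APP_NAME="sales-dashboard"
WS_PATH="/Workspace/Users/you@company.com/apps/${APP_NAME}"

redeploy() {
  # 1. Delete the old workspace copy so removed local files don't linger
  databricks workspace delete "${WS_PATH}" --recursive
  # 2. Upload the current directory
  databricks workspace import-dir . "${WS_PATH}"
  # 3. Redeploy; the app restarts automatically
  databricks apps deploy "${APP_NAME}" --source-code-path "${WS_PATH}"
}
```

Run `redeploy` from your project root after each round of local changes.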
Service principal permission setup via DABs
“In a databricks.yml bundle, declare a SQL warehouse resource with auto-granted permissions.”
```yaml
resources:
  apps:
    sales_dashboard:
      resources:
        - name: my-warehouse
          sql_warehouse:
            id: ${var.warehouse_id}
            permission: CAN_USE
```

When you declare a resource with a `permission` field, the platform auto-grants that permission to the app’s service principal on deployment. This eliminates the manual step of granting SP access through the UI.
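The `${var.warehouse_id}` reference assumes a matching variable declaration elsewhere in `databricks.yml` — a sketch, with a placeholder description and no hard-coded ID:

```yaml
# Hypothetical variable declaration backing ${var.warehouse_id};
# supply the actual value per target or at deploy time.
variables:
  warehouse_id:
    description: ID of the SQL warehouse the app queries
```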
Watch Out For
- Running `bundle deploy` without `bundle run` — deployment pushes code and config, but the app does not start or apply new configuration until you also run `bundle run`. This is different from CLI deploys, which start automatically.
- Putting environment variables in `databricks.yml` — for Databricks Apps, environment variables belong in `src/app/app.yaml`, not the bundle config. The bundle manages the app resource; `app.yaml` manages the runtime.
- Exceeding the 10 MB file limit — the platform rejects individual files larger than 10 MB. If a dependency is too large, put it in `requirements.txt` and let the runtime install it instead of bundling it with your source.
- Skipping log review after deployment — a “deployed” status does not mean your app is healthy. Always run `databricks apps logs` and verify the `App started successfully` message before sharing the URL.
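For the 10 MB case, moving a heavy dependency into `requirements.txt` looks like the sketch below — the packages and version pins are illustrative, not recommendations:

```
# requirements.txt — illustrative pins; the runtime installs these at deploy time
streamlit==1.32.0
pandas==2.2.2
```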