Workspace Config
Skill: databricks-config
What You Can Build
You can manage connections to multiple Databricks workspaces — dev, staging, production — from a single machine. The workspace config skill handles profile management, OAuth authentication, and live workspace switching so your AI coding assistant always targets the right environment. No more deploying to prod when you meant dev.
In Action
“Set up Databricks CLI authentication for my dev and prod workspaces using OAuth, then switch to the prod profile.”
```sh
# Authenticate to each workspace (interactive OAuth flow)
databricks auth login \
  --host https://dbc-abc123.cloud.databricks.com \
  -p dev

databricks auth login \
  --host https://dbc-xyz789.cloud.databricks.com \
  -p prod
```

```ini
# ~/.databrickscfg (generated by auth login)
[dev]
host = https://dbc-abc123.cloud.databricks.com
auth_type = databricks-cli

[prod]
host = https://dbc-xyz789.cloud.databricks.com
auth_type = databricks-cli
```

```sh
# Verify which workspace is active
databricks auth env -p dev

# Switch profile for subsequent commands
export DATABRICKS_CONFIG_PROFILE=prod
databricks clusters list  # now hits prod
```

Key decisions:
- OAuth via `auth login` — generates and auto-refreshes tokens. Preferred over PATs, which expire and get committed to repos by accident.
- Named profiles — each workspace gets a short name (`dev`, `prod`) instead of a URL. Every CLI command accepts `-p <profile>` to target a specific workspace.
- `DATABRICKS_CONFIG_PROFILE` env var — sets the default profile for the current shell session. Avoids passing `-p` on every command.
- No manual `.databrickscfg` editing — `auth login` writes the config file correctly. Hand-editing risks malformed entries that produce confusing “invalid host” errors.
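The env-var default is scoped to the shell session, and a one-off assignment on a single command wins over it. A tiny self-contained demonstration of that scoping, with `sh -c 'echo …'` standing in for a real `databricks` invocation:

```shell
# Session default: child processes inherit it
export DATABRICKS_CONFIG_PROFILE=prod
sh -c 'echo "child sees: $DATABRICKS_CONFIG_PROFILE"'

# Per-invocation override: applies to that one command only
DATABRICKS_CONFIG_PROFILE=dev sh -c 'echo "override sees: $DATABRICKS_CONFIG_PROFILE"'

# The session default is untouched afterwards
echo "session default still: $DATABRICKS_CONFIG_PROFILE"
```

The same precedence applies when the child process is the `databricks` CLI: a prefix assignment targets one command, the exported variable covers the rest of the session.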
More Patterns
Check current workspace status
“Which workspace am I connected to right now?”
Your AI coding assistant uses the `manage_workspace` MCP tool with `action="status"` to report the active host, profile, and authenticated user. This runs without any CLI commands and works inside any coding session.
```
Active workspace:
  Host: https://dbc-abc123.cloud.databricks.com
  Profile: dev
  User: user@example.com
```

A profile switch made through the MCP tool is session-scoped — it resets when the MCP server restarts. For a permanent default, set the profile in your shell config.
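If the MCP tool is unavailable, the host configured for a profile can be read straight from the config file, since profiles are plain INI sections. A minimal sketch, using a stand-in copy at `./databrickscfg` (the real file lives at `~/.databrickscfg`):

```shell
# Stand-in copy of ~/.databrickscfg for illustration
cfg=./databrickscfg
cat > "$cfg" <<'EOF'
[dev]
host = https://dbc-abc123.cloud.databricks.com
auth_type = databricks-cli
[prod]
host = https://dbc-xyz789.cloud.databricks.com
auth_type = databricks-cli
EOF

# Print the host configured for one profile (here: dev)
awk -v p="dev" '
  $0 == "[" p "]" { in_section = 1; next }   # enter the requested section
  /^\[/           { in_section = 0 }         # any other header ends it
  in_section && $1 == "host" { print $3 }    # "host = <url>" -> field 3
' "$cfg"
```

This only reports what is configured, not whether the token is still valid; `databricks auth env -p dev` remains the authoritative check.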
List all configured workspaces
“Show me all the Databricks workspaces I have set up.”
Your AI coding assistant calls `manage_workspace` with `action="list"` to display every profile in `~/.databrickscfg` with its host URL and which one is currently active.
```
Profile   Host                                       Active
dev       https://dbc-abc123.cloud.databricks.com    *
prod      https://dbc-xyz789.cloud.databricks.com
staging   https://dbc-def456.cloud.databricks.com
```

Use this to verify your profiles before running deploys. If a workspace is missing, run `databricks auth login --host <url> -p <name>` to add it.
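The same profile listing can be recovered from the config file itself, since profile names are just INI section headers. A sketch against a stand-in copy at `./databrickscfg` (the real file is `~/.databrickscfg`):

```shell
# Stand-in copy of ~/.databrickscfg for illustration
cfg=./databrickscfg
cat > "$cfg" <<'EOF'
[dev]
host = https://dbc-abc123.cloud.databricks.com
[prod]
host = https://dbc-xyz789.cloud.databricks.com
[staging]
host = https://dbc-def456.cloud.databricks.com
EOF

# Profile names are the section headers: strip the brackets
grep -o '^\[[^]]*\]' "$cfg" | tr -d '[]'
```

Handy in scripts or CI where neither the MCP tool nor an interactive session is available.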
Configure cluster or serverless for a profile
“Set up my dev profile to default to serverless compute.”
```ini
[dev]
host = https://dbc-abc123.cloud.databricks.com
auth_type = databricks-cli
serverless_compute_id = auto
```

Adding `serverless_compute_id = auto` makes databricks-connect and notebook execution default to serverless for that profile. For cluster-based workflows, use `cluster_id = 0123-456789-abcdef12` instead. This avoids passing compute IDs on every command.
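For contrast, a cluster-pinned profile swaps the serverless line for `cluster_id` — a sketch using the placeholder cluster ID from the text above:

```ini
[prod]
host = https://dbc-xyz789.cloud.databricks.com
auth_type = databricks-cli
; Pin this profile to a specific all-purpose cluster (placeholder ID)
cluster_id = 0123-456789-abcdef12
```

Profiles are independent, so `dev` can default to serverless while `prod` stays pinned to a vetted cluster.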
Watch Out For
- Expired OAuth tokens with no visible error — OAuth tokens auto-refresh, but if the refresh token itself expires (typically after 90 days without use), CLI commands fail with vague “unauthorized” errors. Re-run `databricks auth login` to fix.
- `DATABRICKS_HOST` overrides profiles — if the `DATABRICKS_HOST` environment variable is set, it takes precedence over any profile you pass with `-p`, so commands hit the wrong workspace. Run `unset DATABRICKS_HOST` before switching profiles.
- Session-scoped MCP switches — the `manage_workspace` tool switches profiles for the current session only. Restarting your editor or MCP server resets to the default profile. For persistent defaults, export `DATABRICKS_CONFIG_PROFILE` in your shell rc file.
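The `DATABRICKS_HOST` pitfall is easy to guard against in a shell snippet run before any profile switch. A minimal sketch (the host value is illustrative):

```shell
# Simulate a stale host left over from an earlier session (illustrative value)
export DATABRICKS_HOST=https://dbc-old999.cloud.databricks.com

# Guard: if DATABRICKS_HOST is set, it would shadow any -p profile
if [ -n "${DATABRICKS_HOST:-}" ]; then
  echo "DATABRICKS_HOST=$DATABRICKS_HOST would override -p; unsetting it"
  unset DATABRICKS_HOST
fi

echo "DATABRICKS_HOST is now: ${DATABRICKS_HOST:-<unset>}"
```

Dropping the guard into the shell rc file (without the simulated export) makes every new session start clean.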