Builder App
The Builder App is a React + FastAPI web application that gives your team a browser-based interface for AI Dev Kit. Instead of using Claude Code or Cursor in the terminal, users interact with skills and MCP tools through a chat UI — ideal for team onboarding, demos, and exploration.
Prerequisites
- Python 3.12+
- Node.js / npm
- uv installed
- Databricks CLI with a profile configured
- A valid personal access token (PAT) or working CLI profile
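Before running setup, you can confirm these tools are on your PATH. This is a minimal sketch, not part of the kit; the exact executable names checked (`uv`, `databricks`, etc.) are assumptions based on the list above:

```python
import shutil

def missing_tools(required):
    """Return the subset of required executables not found on PATH."""
    return [tool for tool in required if shutil.which(tool) is None]

# Tools the Builder App setup expects (assumed from the prerequisites list)
required = ["python3", "node", "npm", "uv", "databricks"]
missing = missing_tools(required)
if missing:
    print("Install before continuing: " + ", ".join(missing))
else:
    print("All prerequisites found.")
```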
Local Setup
1. Clone and set up
```sh
git clone https://github.com/databricks-solutions/ai-dev-kit.git
cd ai-dev-kit/databricks-builder-app
./scripts/setup.sh
```

The setup script installs all Python and Node dependencies, creates a .venv virtual environment, and generates a .env.local file from the example template. Follow the interactive prompts.
2. Configure authentication and Lakebase
Edit the `.env.local` file in the `databricks-builder-app` directory:
```sh
# Required: Your Databricks workspace
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=dapi...

# Required: Database for project persistence (pick ONE option)

# Option 1: Dynamic OAuth via Databricks SDK (recommended for Databricks Apps deployment)
LAKEBASE_INSTANCE_NAME=your-lakebase-instance
LAKEBASE_DATABASE_NAME=databricks_postgres

# Option 2: Static connection URL (for local development)
# LAKEBASE_PG_URL=postgresql://user:password@host:5432/database?sslmode=require
```

The Builder App stores conversations and project settings in Lakebase, Databricks' OLTP Postgres database.
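The two options are mutually exclusive. A hedged sketch of how the precedence might be resolved (variable names come from the file above, but the resolution logic is an assumption, not the app's actual code):

```python
import os

def resolve_lakebase_config(env=os.environ):
    """Pick a Lakebase connection strategy from environment variables.

    Precedence is an assumption: a static LAKEBASE_PG_URL wins for local
    development; otherwise fall back to SDK-based OAuth via the instance name.
    """
    if env.get("LAKEBASE_PG_URL"):
        return {"mode": "static-url", "url": env["LAKEBASE_PG_URL"]}
    if env.get("LAKEBASE_INSTANCE_NAME"):
        return {
            "mode": "sdk-oauth",
            "instance": env["LAKEBASE_INSTANCE_NAME"],
            "database": env.get("LAKEBASE_DATABASE_NAME", "databricks_postgres"),
        }
    raise RuntimeError("Set LAKEBASE_PG_URL or LAKEBASE_INSTANCE_NAME in .env.local")
```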
3. Test locally
```sh
source .venv/bin/activate
./scripts/start_dev.sh
```

This starts two servers:
- Frontend: http://localhost:3000
- Backend: http://localhost:8000
Open http://localhost:3000 and verify the Builder App UI loads. If everything works, you’re ready to deploy.
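The smoke test can also be scripted by polling both dev servers until they answer. A sketch using only the standard library; no endpoint paths beyond the root are assumed:

```python
import time
import urllib.request
import urllib.error

DEV_SERVERS = {
    "frontend": "http://localhost:3000",
    "backend": "http://localhost:8000",
}

def is_up(url, timeout=2.0):
    """Return True if the URL answers any HTTP response at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return True   # the server responded, even if with an error status
    except OSError:
        return False  # connection refused, timeout, DNS failure, ...

def wait_for(url, retries=10, delay=1.0):
    """Poll until the server responds or retries are exhausted."""
    for _ in range(retries):
        if is_up(url):
            return True
        time.sleep(delay)
    return False
```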
Deploy to Databricks Apps
Once local testing passes, deploy to your workspace so your team can access it via a URL.
Install and authenticate the Databricks CLI

```sh
# Install the Databricks CLI (v0.205+ is needed for the `apps` commands;
# the legacy pip package does not include them)
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Authenticate (interactive browser login)
databricks auth login --host https://your-workspace.cloud.databricks.com

# Verify authentication
databricks auth describe
```
Create a Databricks App
```sh
# Create a new app
databricks apps create --name my-builder-app

# Verify it was created
databricks apps get my-builder-app
```
Create a Lakebase instance
In your Databricks workspace, go to Compute > Lakebase and create a new provisioned database. Note the instance name — you’ll need it in the next steps.
Add Lakebase as an app resource
```sh
databricks apps add-resource my-builder-app \
  --resource-type database \
  --resource-name lakebase \
  --database-instance <your-lakebase-instance-name>
```
Configure app.yaml
```sh
cp app.yaml.example app.yaml
```

Edit `app.yaml` with your configuration:

```yaml
command:
  - "uvicorn"
  - "server.app:app"
  - "--host"
  - "0.0.0.0"
  - "--port"
  - "$DATABRICKS_APP_PORT"
env:
  # Required: Your Lakebase instance name
  - name: LAKEBASE_INSTANCE_NAME
    value: "<your-lakebase-instance-name>"
  - name: LAKEBASE_DATABASE_NAME
    value: "databricks_postgres"
  # Skills to enable (comma-separated)
  - name: ENABLED_SKILLS
    value: "databricks-agent-bricks,databricks-python-sdk,databricks-spark-declarative-pipelines"
  # MLflow tracing (optional)
  - name: MLFLOW_TRACKING_URI
    value: "databricks"
  # - name: MLFLOW_EXPERIMENT_NAME
  #   value: "/Users/your-email@company.com/claude-code-traces"
  # Other settings
  - name: ENV
    value: "production"
  - name: PROJECTS_BASE_DIR
    value: "./projects"
```
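A pre-deploy sanity check can catch a missing env entry before the app fails to start. A minimal sketch over an already-parsed `app.yaml` dict (e.g. loaded with PyYAML); the required-key list is an assumption based on the template above, not an official schema:

```python
# Required env names: an assumption drawn from the app.yaml template above.
REQUIRED = {"LAKEBASE_INSTANCE_NAME", "LAKEBASE_DATABASE_NAME", "ENV"}

def missing_env_keys(config):
    """Return required env names absent from an app.yaml-style dict."""
    present = {entry["name"] for entry in config.get("env", [])}
    return sorted(REQUIRED - present)

# Example: this config forgets LAKEBASE_DATABASE_NAME
config = {
    "command": ["uvicorn", "server.app:app"],
    "env": [
        {"name": "LAKEBASE_INSTANCE_NAME", "value": "my-instance"},
        {"name": "ENV", "value": "production"},
    ],
}
print(missing_env_keys(config))  # ['LAKEBASE_DATABASE_NAME']
```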
Choose your LLM provider
The Builder App supports multiple LLM providers. Configure the one that fits your environment:
Databricks (default)

This is the default; no extra configuration is needed. The app uses models available in your workspace.

```sh
# Already set by default in app.yaml
# LLM_PROVIDER=DATABRICKS
# DATABRICKS_MODEL=databricks-claude-sonnet-4-6
# DATABRICKS_MODEL_MINI=databricks-claude-haiku-4-5
```

Anthropic

Add your Anthropic API key to the `env` section of `app.yaml`:

```yaml
- name: LLM_PROVIDER
  value: "ANTHROPIC"
- name: ANTHROPIC_API_KEY
  value: "your-anthropic-api-key"
```

Azure OpenAI

Add your Azure OpenAI configuration to the `env` section of `app.yaml`:

```yaml
- name: LLM_PROVIDER
  value: "AZURE"
- name: AZURE_OPENAI_API_KEY
  value: "your-azure-api-key"
- name: AZURE_OPENAI_ENDPOINT
  value: "https://your-resource.cognitiveservices.azure.com/"
- name: AZURE_OPENAI_API_VERSION
  value: "2024-08-01-preview"
- name: AZURE_OPENAI_DEPLOYMENT
  value: "gpt-4o"
- name: AZURE_OPENAI_DEPLOYMENT_MINI
  value: "gpt-4o-mini"
```
Deploy the app
```sh
./scripts/deploy.sh my-builder-app
```
Grant database permissions to the app’s service principal
Find the app’s service principal on the app’s Authorization tab in the Databricks workspace, then run these commands in the SQL Editor:
```sql
-- Replace <service-principal-id> with your app's service principal
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public
TO `<service-principal-id>`;

GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public
TO `<service-principal-id>`;

ALTER DEFAULT PRIVILEGES IN SCHEMA public
GRANT ALL ON TABLES TO `<service-principal-id>`;
```
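If you deploy several Builder Apps, the same three grants can be templated per service principal instead of edited by hand. A small sketch; identifier quoting follows the snippet above:

```python
# Template for the three grant statements; {principal} and {schema} are filled in.
GRANT_TEMPLATE = """\
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA {schema} TO `{principal}`;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA {schema} TO `{principal}`;
ALTER DEFAULT PRIVILEGES IN SCHEMA {schema} GRANT ALL ON TABLES TO `{principal}`;"""

def grant_statements(principal, schema="public"):
    """Render the grant statements for one app's service principal."""
    return GRANT_TEMPLATE.format(schema=schema, principal=principal)

print(grant_statements("1234-abcd"))
```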
Access your app
Your Builder App is now live at the Databricks-managed URL:

```
https://my-builder-app-1234567890.aws.databricksapps.com
```

The exact URL is shown in the app's Overview tab or in the output of `databricks apps get my-builder-app`.
Troubleshooting
| Error | Fix |
|---|---|
| `uvicorn: command not found` | Run `source .venv/bin/activate` first |
| Invalid access token | Regenerate your PAT — it may be expired |
| Permission denied on tables | Run the `GRANT ALL PRIVILEGES` SQL commands above for your app's service principal |
| App fails to start | Check logs with `databricks apps logs my-builder-app` |
Next Steps
- Skills Catalog — Browse all available skills to enable in your Builder App
- Deployment Patterns — Advanced deployment with Asset Bundles and CI/CD
- Lakebase Deep-Dive — Schema management, migrations, and connection patterns
- LLM Provider Details — Model routing and configuration