Prompt Library
A curated collection of 130+ ready-to-use prompts for working with Databricks through Claude Code and the AI Dev Kit. Each prompt is tested, copy-paste ready, and designed to leverage the right combination of skills and MCP tools automatically.
How to Use
- Browse a category below or use the sidebar to navigate
- Find a prompt that matches your task
- Copy it into Claude Code (or any agent with AI Dev Kit installed)
- Claude activates the appropriate skills and MCP tools to complete the task
Categories
Cross-Cutting Scenarios
These prompts span multiple skills and tools for end-to-end workflows. Each one triggers a multi-step build across different parts of the Databricks platform.
End-to-End Data Platform
Build a complete data platform:

1. Create a medallion pipeline (bronze/silver/gold) using Spark Declarative Pipelines
2. Schedule it with a Databricks Job that runs every hour
3. Create an AI/BI dashboard that visualizes the gold layer
4. Package everything as a Databricks Asset Bundle for deployment

Skills involved: databricks-spark-declarative-pipelines, databricks-jobs, databricks-aibi-dashboards, databricks-bundles
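The packaging step above produces a bundle configuration. A minimal sketch of what that `databricks.yml` might look like is shown below — the bundle name, notebook path, and resource keys are illustrative assumptions, not output of the prompt:

```yaml
# databricks.yml — illustrative sketch; names and paths are assumptions
bundle:
  name: medallion-platform

resources:
  pipelines:
    medallion:
      name: medallion-pipeline
      libraries:
        - notebook:
            path: ./pipelines/medallion  # hypothetical pipeline source
  jobs:
    hourly_refresh:
      name: hourly-medallion-refresh
      schedule:
        quartz_cron_expression: "0 0 * * * ?"  # top of every hour
        timezone_id: UTC
      tasks:
        - task_key: refresh
          pipeline_task:
            pipeline_id: ${resources.pipelines.medallion.id}
```

Deploying with `databricks bundle deploy` would then create the pipeline and its hourly job together as one versioned unit.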
RAG Application
Build an end-to-end RAG application:

1. Parse PDF documents from a volume using ai_parse_document
2. Chunk the text and create embeddings with a Vector Search index
3. Build a Knowledge Assistant that answers questions using the index
4. Deploy it as a Streamlit app on Databricks Apps

Skills involved: databricks-ai-functions, databricks-vector-search, databricks-agent-bricks, databricks-app-python
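The chunking in step 2 is worth understanding before indexing. A plain-Python sketch of fixed-size chunking with overlap is below — the function name and parameters are assumptions for illustration, not part of any Databricks API:

```python
# Illustrative sketch of step 2: fixed-size chunking with overlap before
# embedding. Names and defaults are assumptions, not a Databricks API.

def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into overlapping chunks so content that spans a
    chunk boundary still appears intact in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks
```

Overlap trades some index size for recall: a sentence cut at a boundary is still retrievable from the neighboring chunk.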
Real-Time Analytics
Build a real-time analytics system:

1. Ingest events using Zerobus Ingest
2. Process them with a Spark Structured Streaming pipeline
3. Write aggregated results to a Lakebase database for low-latency serving
4. Create a Genie Space for business users to ask questions about the data

Skills involved: databricks-zerobus-ingest, databricks-spark-structured-streaming, databricks-lakebase-autoscale, databricks-genie
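The aggregation in step 2 is typically a windowed group-by. The tumbling-window logic can be illustrated outside Spark in plain Python — event shape and window size here are assumptions; in the real pipeline this would be a Structured Streaming `groupBy` over a time window:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per (window_start, key) — the same output
    shape a streaming group-by over a time window produces."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

Writing these per-window rows to Lakebase keeps the serving table small and cheap to query at low latency.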
ML Ops Pipeline
Set up an ML pipeline:

1. Generate synthetic training data
2. Train a model and log it with MLflow
3. Evaluate the model using MLflow GenAI evaluation with custom scorers
4. Deploy the best model to a serving endpoint
5. Create a monitoring dashboard that tracks prediction quality

Skills involved: databricks-synthetic-data-gen, databricks-mlflow-evaluation, databricks-model-serving, databricks-aibi-dashboards
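A custom scorer in step 3 is, at its core, a function from model output and expectations to a score. A hypothetical pure-Python sketch is below — the function name, signature, and metric are assumptions for illustration; MLflow GenAI evaluation wraps functions of this general shape as scorers:

```python
def keyword_coverage_scorer(outputs: str, expectations: dict) -> float:
    """Hypothetical custom scorer: fraction of expected keywords that
    appear in the model output (case-insensitive). Name and signature
    are illustrative, not a specific MLflow interface."""
    keywords = expectations.get("keywords", [])
    if not keywords:
        return 1.0
    text = outputs.lower()
    hits = sum(1 for kw in keywords if kw.lower() in text)
    return hits / len(keywords)

score = keyword_coverage_scorer(
    "The refund policy allows returns within 30 days.",
    {"keywords": ["refund", "30 days", "warranty"]},
)
print(score)  # 2 of 3 keywords found
```

Scores like this, logged per evaluation run, are what the monitoring dashboard in step 5 would track over time.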
Multi-Agent System
Build a multi-agent customer service system:

1. Create a Genie Space for SQL-based product and order queries
2. Create a Knowledge Assistant for policy and FAQ documents
3. Build a Supervisor Agent that routes customer questions to the right agent
4. Deploy the supervisor as a serving endpoint
5. Build a Gradio chat app as the customer-facing interface

Skills involved: databricks-genie, databricks-agent-bricks, databricks-model-serving, databricks-app-python
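The routing decision in step 3 can be sketched in plain Python. This keyword-based router is a deliberately simplified stand-in — the route names and terms are assumptions, and a deployed Supervisor Agent would typically use an LLM classifier rather than keyword matching:

```python
def route_question(question: str) -> str:
    """Hypothetical supervisor routing rule: send data-lookup questions
    to the Genie Space agent and policy/FAQ questions to the Knowledge
    Assistant. Route names and keywords are illustrative assumptions."""
    data_terms = ("order", "product", "inventory", "price")
    q = question.lower()
    if any(term in q for term in data_terms):
        return "genie_space"
    return "knowledge_assistant"

print(route_question("Where is my order #1234?"))     # genie_space
print(route_question("What is your refund policy?"))  # knowledge_assistant
```

Keeping routing as a single function makes the supervisor easy to unit-test before it is deployed behind a serving endpoint.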