# Governance & Catalog
## Data Exploration & Unity Catalog
Skills: databricks-unity-catalog, databricks-agent-skill-databricks
MCP Tools: manage_uc_objects, get_table_details, execute_sql
- List all catalogs in my workspace.
- Show me all schemas in the main catalog.
- List all tables in main.default and show their row counts.
- Get detailed schema information for the table main.sales.transactions — show me column names, types, and any comments.
- What tables exist in main.analytics? Show me the schema for each one.
- Query the system.access.audit table to show who accessed the main.production.customers table in the last 7 days.
- Query system.information_schema.tables to find the 20 largest tables by size across all catalogs.
- Show me the column-level lineage for main.gold.revenue_summary — what upstream tables feed into it?
- Query system.billing.usage to show my workspace's DBU consumption by SKU for the past 30 days.
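The exploration prompts above map directly to Unity Catalog SQL that the execute_sql tool can run. A minimal sketch, assuming the system schemas (access, billing) are enabled in your workspace; `request_params.full_name_arg` is one common audit-log field for table actions, not the only one:

```sql
-- Catalog / schema / table discovery
SHOW CATALOGS;
SHOW SCHEMAS IN main;
SHOW TABLES IN main.default;
DESCRIBE TABLE EXTENDED main.sales.transactions;

-- Who accessed main.production.customers in the last 7 days (audit system table)
SELECT event_time, user_identity.email, action_name
FROM system.access.audit
WHERE event_time >= current_timestamp() - INTERVAL 7 DAYS
  AND request_params.full_name_arg = 'main.production.customers';

-- DBU consumption by SKU over the past 30 days (billing system table)
SELECT sku_name, SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_date >= date_sub(current_date(), 30)
GROUP BY sku_name
ORDER BY dbus DESC;
```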
## Permissions & Security

MCP Tools: manage_uc_grants, manage_uc_security_policies
- Grant SELECT permission on the main.analytics schema to the group "data-analysts".
- Show me all grants on the table main.production.customers.
- Set up row-level security on main.hr.employees so users can only see records from their own department.
- Create a column mask on main.finance.payroll to redact the salary column for non-finance users.
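In SQL terms, these prompts correspond to GRANT statements, row filters, and column masks. A sketch under stated assumptions: `main.hr.user_departments` is a hypothetical user-to-department mapping table, and the group names are placeholders:

```sql
-- Read access on a schema for a group (USE privileges are prerequisites)
GRANT USE CATALOG ON CATALOG main TO `data-analysts`;
GRANT USE SCHEMA ON SCHEMA main.analytics TO `data-analysts`;
GRANT SELECT ON SCHEMA main.analytics TO `data-analysts`;

-- Inspect existing grants
SHOW GRANTS ON TABLE main.production.customers;

-- Row-level security: a filter function bound to the table
CREATE OR REPLACE FUNCTION main.hr.dept_filter(dept STRING)
RETURN is_account_group_member('hr-admins')
  OR EXISTS (SELECT 1 FROM main.hr.user_departments m
             WHERE m.user_email = current_user() AND m.department = dept);
ALTER TABLE main.hr.employees SET ROW FILTER main.hr.dept_filter ON (department);

-- Column mask: redact salary for anyone outside the finance group
CREATE OR REPLACE FUNCTION main.finance.salary_mask(salary DECIMAL(12, 2))
RETURN IF(is_account_group_member('finance'), salary, NULL);
ALTER TABLE main.finance.payroll ALTER COLUMN salary SET MASK main.finance.salary_mask;
```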
## Tags & Metadata

MCP Tools: manage_uc_tags
- Add tags {"pii": "true", "data_owner": "privacy-team"} to the table main.customers.profiles.
- List all tables tagged with "pii" across the main catalog.
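The equivalent Unity Catalog SQL is a SET TAGS statement plus a query over the tag views in information_schema. A sketch, assuming the system information_schema tag views are available in your workspace:

```sql
-- Attach governance tags to a table
ALTER TABLE main.customers.profiles
  SET TAGS ('pii' = 'true', 'data_owner' = 'privacy-team');

-- Find every pii-tagged table in the main catalog
SELECT catalog_name, schema_name, table_name, tag_value
FROM system.information_schema.table_tags
WHERE tag_name = 'pii' AND catalog_name = 'main';
```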
## Volume File Operations

MCP Tools: list_volume_files, upload_to_volume, download_from_volume, create_volume_directory, delete_volume_file, delete_volume_directory
- List all files in /Volumes/main/raw/incoming/ and show their sizes and modification dates.
- Upload my local file data/export.csv to /Volumes/main/staging/uploads/export.csv.
- Download the file /Volumes/main/reports/quarterly_summary.pdf to my local machine.
- Create a new directory /Volumes/main/raw/2024-01/ for organizing incoming data files.
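The listing prompt has a direct SQL equivalent; a minimal sketch:

```sql
-- Returns path, name, size, and modification time for each file in the volume directory
LIST '/Volumes/main/raw/incoming/';
```

Uploads, downloads, and directory management are not SQL operations; they go through the volume MCP tools listed above (or the Databricks Files API/CLI) rather than execute_sql.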
## Iceberg Tables & Interoperability

Skills: databricks-iceberg
MCP Tools: execute_sql, manage_uc_objects
- Create a managed Iceberg table in main.lake.events using the Iceberg table format with partitioning on event_date.
- Enable External Iceberg Reads (UniForm) on my existing Delta table main.analytics.daily_metrics so external engines can read it as Iceberg.
- Configure the Iceberg REST Catalog (IRC) so my Spark on EMR cluster can read Unity Catalog tables as Iceberg tables.
- Set up a Streaming Table with compatibility mode enabled so it can be read as Iceberg by external engines.
- Show me how to read my Databricks Delta table from Snowflake using Unity Catalog as an Iceberg REST Catalog.
- Create an Iceberg table and demonstrate reading it with PyIceberg from a local Python environment.
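The first two prompts have straightforward SQL forms. A sketch, assuming a runtime and Unity Catalog version with managed Iceberg support; the column list for main.lake.events is illustrative:

```sql
-- Managed Apache Iceberg table partitioned by event_date
CREATE TABLE main.lake.events (
  event_id   STRING,
  event_date DATE,
  payload    STRING)
USING ICEBERG
PARTITIONED BY (event_date);

-- UniForm: let external engines read an existing Delta table as Iceberg
ALTER TABLE main.analytics.daily_metrics SET TBLPROPERTIES (
  'delta.enableIcebergCompatV2' = 'true',
  'delta.universalFormat.enabledFormats' = 'iceberg');
```

The REST Catalog, Snowflake, and PyIceberg prompts are configuration tasks on the external engine's side rather than SQL run in Databricks.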
## Delta Sharing

MCP Tools: manage_uc_sharing
- Create a share called "partner_data" and add the tables main.analytics.product_metrics and main.analytics.daily_kpis to it.
- Add a recipient "partner-corp" to my Delta Share with token-based authentication.
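Both prompts map to Delta Sharing DDL. A sketch; the comments are placeholders, and creating a recipient without USING ID produces an open-sharing (token-based) recipient:

```sql
CREATE SHARE partner_data COMMENT 'Metrics shared with partners';
ALTER SHARE partner_data ADD TABLE main.analytics.product_metrics;
ALTER SHARE partner_data ADD TABLE main.analytics.daily_kpis;

-- Token-based recipient for a non-Databricks consumer
CREATE RECIPIENT `partner-corp` COMMENT 'External partner';
GRANT SELECT ON SHARE partner_data TO RECIPIENT `partner-corp`;
```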
## Data Quality Monitoring

MCP Tools: manage_uc_monitors
- Create a quality monitor on main.production.orders that tracks data drift, null rates, and column statistics on a daily schedule.
## External Connections

MCP Tools: manage_uc_connections
- Create a Lakehouse Federation connection to my external PostgreSQL database for cross-database querying.
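Lakehouse Federation is set up in two SQL steps: a connection object holding the credentials, then a foreign catalog that mirrors one remote database. A sketch; `pg_conn`, `pg_sales`, and all connection options are placeholder values:

```sql
-- Connection object for the external PostgreSQL instance
CREATE CONNECTION pg_conn TYPE postgresql
OPTIONS (
  host 'pg.example.com',
  port '5432',
  user 'readonly_user',
  password 'REDACTED');

-- Foreign catalog exposing one PostgreSQL database to Unity Catalog
CREATE FOREIGN CATALOG pg_sales USING CONNECTION pg_conn
OPTIONS (database 'sales');

-- Then query remote tables like any Unity Catalog table:
-- SELECT * FROM pg_sales.public.orders LIMIT 10;
```

In practice the password would come from a secret rather than a literal.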