SQL Query Files
Skill: databricks-app-appkit
What You Can Build
You can write parameterized SQL query files that AppKit automatically type-checks, connects to visualization components, and executes against a SQL warehouse. Store queries in `config/queries/`, reference them by filename, and get auto-generated TypeScript types — no manual fetch logic or type definitions needed.
In Action
“Using SQL and TypeScript, write a parameterized query that filters orders by date range and category, then wire it to an AppKit chart.”
```sql
-- config/queries/orders_by_category.sql
SELECT
  category,
  SUM(total_amount) AS revenue,
  COUNT(*) AS order_count
FROM catalog.schema.orders
WHERE order_date BETWEEN :start_date AND :end_date
  AND (:category = '' OR LOWER(category) = LOWER(:category))
GROUP BY category
ORDER BY revenue DESC
```

Key decisions:

- Parameters use a colon prefix (`:parameter_name`) — AppKit binds them at execution time.
- Types auto-regenerate during dev whenever SQL files change, appearing in `client/src/appKitTypes.d.ts`.
- Use the Databricks SQL dialect (based on Spark SQL), not PostgreSQL — function syntax differs for date math, string aggregation, and case-insensitive matching.
- Reference this query as `queryKey="orders_by_category"` in any visualization component.
More Patterns
Type-safe parameter binding
“Using TypeScript, pass typed parameters to a SQL query using the sql.* helpers.”
```typescript
import { sql } from "@databricks/appkit-ui/js";

// minValue, category, and optionalFilter come from app state
const params = {
  min_value: sql.number(minValue),
  category: sql.string(category),
  start_date: sql.date("2024-01-01"),
  optional_filter: sql.string(optionalFilter || ""),
};
```

Each `sql.*` helper ensures the parameter is sent with the correct type. Available helpers: `sql.string()`, `sql.number()`, `sql.boolean()`, `sql.date()` (YYYY-MM-DD), `sql.timestamp()`, and `sql.binary()`. There is no `sql.null()`, `sql.array()`, `sql.int()`, or `sql.float()`.
Handling optional date filters with sentinel values
“Using SQL, write a query with optional date bounds that default to ‘all time’ when not specified.”
```sql
-- config/queries/filtered_events.sql
SELECT event_name, event_date, attendees
FROM catalog.schema.events
WHERE event_date >= :start_date
  AND event_date <= :end_date
```

```typescript
// Use sentinel dates for "no filter" instead of empty strings
const params = {
  start_date: sql.date("1900-01-01"), // No lower bound
  end_date: sql.date("9999-12-31"),   // No upper bound
};
```

Empty strings break date parameters. Sentinel dates let the query run without conditional logic while returning all rows when no filter is active.
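The sentinel defaulting is easy to centralize so every query gets the same "all time" behavior. A minimal sketch, assuming date inputs arrive as optional YYYY-MM-DD strings; `dateRangeParams` is an illustrative name, not an AppKit API, and the returned strings would still be wrapped with `sql.date()` when binding:

```typescript
// Sketch: default optional date bounds to sentinel values instead of ''.
// The sentinel convention matches the query above; names are illustrative.
interface DateRangeParams {
  start_date: string;
  end_date: string;
}

function dateRangeParams(start?: string, end?: string): DateRangeParams {
  return {
    // Empty or missing input falls back to the "no bound" sentinel
    start_date: start && start.trim() !== "" ? start : "1900-01-01",
    end_date: end && end.trim() !== "" ? end : "9999-12-31",
  };
}
```

With this helper, a component never passes an empty string to `sql.date()` by accident: `dateRangeParams(userStart, userEnd)` always yields two valid date literals.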
Converting PostgreSQL syntax to Databricks SQL
“Translate common PostgreSQL functions to their Databricks SQL equivalents.”
```sql
-- PostgreSQL: DATEDIFF(date1, date2)
-- Databricks SQL: DATEDIFF(DAY, date2, date1) -- three arguments, unit first
SELECT DATEDIFF(DAY, start_date, end_date) AS days_elapsed
FROM catalog.schema.projects;
```

```sql
-- PostgreSQL: NOW()
-- Databricks SQL: CURRENT_TIMESTAMP()
SELECT * FROM catalog.schema.events
WHERE event_date > CURRENT_TIMESTAMP();
```

```sql
-- PostgreSQL: STRING_AGG(col, ',')
-- Databricks SQL: CONCAT_WS(',', COLLECT_LIST(col))
SELECT department, CONCAT_WS(',', COLLECT_LIST(employee_name)) AS members
FROM catalog.schema.employees
GROUP BY department;
```

```sql
-- PostgreSQL: ILIKE
-- Databricks SQL: LOWER(col) LIKE LOWER(pattern)
SELECT * FROM catalog.schema.products
WHERE LOWER(name) LIKE LOWER('%widget%');
```

These are the most frequent dialect mismatches your AI coding assistant will encounter when generating SQL for AppKit. Databricks SQL is based on Spark SQL, so aggregate functions and date operations use different signatures than PostgreSQL.
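These substitutions can also be mechanized as a rough lint pass over query text before it ships. A minimal sketch in TypeScript, covering only the four mismatches above; the pattern list and messages are illustrative, not exhaustive, and this is not part of AppKit:

```typescript
// Sketch: flag common PostgreSQL-isms that will fail or misbehave
// on Databricks SQL. Patterns are deliberately simple heuristics.
const dialectChecks: Array<{ pattern: RegExp; suggestion: string }> = [
  { pattern: /\bNOW\(\)/i, suggestion: "Use CURRENT_TIMESTAMP() instead of NOW()" },
  { pattern: /\bSTRING_AGG\s*\(/i, suggestion: "Use CONCAT_WS(',', COLLECT_LIST(col)) instead of STRING_AGG" },
  { pattern: /\bILIKE\b/i, suggestion: "Use LOWER(col) LIKE LOWER(pattern) instead of ILIKE" },
  // Two-argument DATEDIFF(a, b): Databricks SQL needs a unit as the first argument
  { pattern: /\bDATEDIFF\s*\(\s*[^,]+,\s*[^,)]+\)/i, suggestion: "DATEDIFF takes three arguments in Databricks SQL: DATEDIFF(DAY, start, end)" },
];

function lintDialect(sqlText: string): string[] {
  return dialectChecks
    .filter(({ pattern }) => pattern.test(sqlText))
    .map(({ suggestion }) => suggestion);
}
```

Running `lintDialect` over each file in `config/queries/` during review gives a cheap early warning before the warehouse rejects the query at execution time.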
Displaying numeric values from query results
“Using TypeScript, safely render a numeric value from a SQL query result.”
```tsx
// BIGINT, INT, and DECIMAL auto-map to TypeScript `number`,
// but they may arrive as strings at runtime. Always convert:
<span>{Number(row.total_amount).toFixed(2)}</span>
```

Runtime type coercion is a common source of rendering bugs. Wrapping the value in `Number()` before formatting ensures consistent display regardless of how the SQL connector serializes it.
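The coercion can live in one helper so every component formats consistently. A minimal sketch, assuming two-decimal display; `formatAmount` is an illustrative name, and the `"N/A"` fallback is a design choice here, not AppKit behavior:

```typescript
// Sketch: coerce a possibly-string numeric column once, in one place.
function formatAmount(value: number | string): string {
  const n = Number(value);
  if (Number.isNaN(n)) return "N/A"; // connector sent something non-numeric
  return n.toFixed(2);
}
```

In a component this becomes `<span>{formatAmount(row.total_amount)}</span>`, which behaves the same whether the connector delivered `1234.5` or `"1234.5"`.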
Watch Out For
- Using PostgreSQL syntax in query files — AppKit queries run on Databricks SQL (Spark SQL). Functions like `DATEDIFF`, `NOW()`, `STRING_AGG`, and `ILIKE` have different signatures or do not exist. Check the dialect translations above.
- Passing empty strings for date parameters — `sql.date('')` produces invalid SQL. Use sentinel dates (`1900-01-01` / `9999-12-31`) for optional date filters.
- Writing queries outside `config/queries/` — AppKit’s type generator only watches that directory. Queries placed elsewhere will not get auto-generated types and cannot be referenced via `queryKey`.
- Assuming numeric columns are numbers — auto-generated types say `number`, but the wire format may deliver a string. Always call `Number()` before arithmetic or formatting.