
Compute & Workspace

Execute code on Databricks compute, manage clusters and warehouses, and upload files to the workspace.

“Run this Python script on serverless compute”

“Create a single-node cluster with the latest ML runtime”

“Upload my local project folder to the workspace”


Description: Execute code on Databricks via serverless or cluster compute.

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `code` | `str` | No |
| `file_path` | `str` | No |
| `compute_type` | `str` | No |
| `cluster_id` | `str` | No |
| `context_id` | `str` | No |
| `language` | `str` | No |
| `timeout` | `int` | No |
| `destroy_context_on_completion` | `bool` | No |
| `workspace_path` | `str` | No |
| `run_name` | `str` | No |
| `job_extra_params` | `Dict[str, Any]` | No |
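As an illustration, a call to this execution tool might carry a payload like the sketch below. The parameter names come from the table above; the concrete values, and the assumption that `compute_type` accepts `"serverless"`, are illustrative only.

```python
# Hypothetical payload for the code-execution tool.
# Keys match the parameter table above; values are illustrative.
execute_payload = {
    "code": "print('hello from Databricks')",
    "compute_type": "serverless",           # assumed value; a cluster run would pass cluster_id instead
    "language": "python",
    "timeout": 600,                         # assumed to be seconds
    "destroy_context_on_completion": True,  # clean up the execution context afterwards
}
```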

Description: List compute resources: clusters, node types, or Spark versions.

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `resource` | `str` | No |
| `cluster_id` | `str` | No |
| `auto_select` | `bool` | No |
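A request to this listing tool might look like the following. The `resource` values are assumptions inferred from the description above (clusters, node types, Spark versions); the exact accepted strings are not documented here.

```python
# Hypothetical payloads for the resource-listing tool; the
# "resource" string values are assumed, not confirmed.
list_clusters = {"resource": "clusters"}
list_node_types = {"resource": "node_types", "auto_select": True}
```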

Description: Create, modify, start, terminate, or delete a cluster.

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `action` | `str` | Yes |
| `cluster_id` | `str` | No |
| `name` | `str` | No |
| `num_workers` | `int` | No |
| `spark_version` | `str` | No |
| `node_type_id` | `str` | No |
| `autotermination_minutes` | `int` | No |
| `data_security_mode` | `str` | No |
| `spark_conf` | `str` | No |
| `autoscale_min_workers` | `int` | No |
| `autoscale_max_workers` | `int` | No |
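For example, creating a small development cluster might use a payload like this. The `action` value `"create"` follows the verbs in the description above; the runtime string and node type are illustrative placeholders, not recommendations.

```python
# Hypothetical payload for the cluster-management tool.
# "create" is one of the actions named in the description;
# spark_version and node_type_id are illustrative examples.
create_cluster = {
    "action": "create",
    "name": "dev-single-node",
    "num_workers": 0,            # 0 workers approximates a single-node setup
    "spark_version": "15.4.x-cpu-ml-scala2.12",
    "node_type_id": "i3.xlarge",
    "autotermination_minutes": 30,
}
```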

Description: Create, modify, or delete a SQL warehouse.

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `action` | `str` | Yes |
| `warehouse_id` | `str` | No |
| `name` | `str` | No |
| `size` | `str` | No |
| `min_num_clusters` | `int` | No |
| `max_num_clusters` | `int` | No |
| `auto_stop_mins` | `int` | No |
| `warehouse_type` | `str` | No |
| `enable_serverless` | `bool` | No |
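A warehouse-creation call might be shaped like the sketch below. Again, `"create"` follows the description's verbs, and the size string is an assumption about the accepted t-shirt sizes.

```python
# Hypothetical payload for the SQL-warehouse tool.
# "Small" is an assumed size value; check your workspace's accepted sizes.
create_warehouse = {
    "action": "create",
    "name": "analytics-wh",
    "size": "Small",
    "auto_stop_mins": 15,        # stop after 15 idle minutes
    "enable_serverless": True,
}
```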

Description: Manage active Databricks workspace connection (session-scoped).

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `action` | `str` | Yes |
| `profile` | `Optional[str]` | No |
| `host` | `str` | No |
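Switching the active connection might look like the following; the `action` value `"connect"` and the profile name are assumptions, since the accepted action strings are not listed here.

```python
# Hypothetical payload for the connection-management tool.
# The "connect" action string and profile name are assumed.
switch_connection = {
    "action": "connect",
    "profile": "DEFAULT",   # a profile from ~/.databrickscfg
}
```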

Description: Manage workspace files: upload, delete.

Parameters:

| Parameter | Type | Required |
|---|---|---|
| `action` | `str` | Yes |
| `workspace_path` | `str` | Yes |
| `local_path` | `Optional[str]` | No |
| `max_workers` | `int` | No |
| `overwrite` | `bool` | No |
| `recursive` | `bool` | No |
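Uploading a local project folder might use a payload like this. The `"upload"` action follows the description above; the paths are illustrative placeholders.

```python
# Hypothetical payload for the workspace-files tool.
# "upload" is one of the actions named in the description;
# both paths are illustrative.
upload_folder = {
    "action": "upload",
    "workspace_path": "/Workspace/Users/me@example.com/project",
    "local_path": "./project",
    "recursive": True,     # include subdirectories
    "overwrite": False,    # fail rather than replace existing files
    "max_workers": 8,      # parallel upload threads
}
```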