Overview
This page describes how to use Databricks external Source and Target gems to read from or write to tables. Only use an external Source or Target gem when Databricks is not the configured SQL warehouse connection. Otherwise, use the Table gem.

If you're working with file types like CSV or Parquet from Databricks file storage, see File types for guidance. This page focuses only on catalog tables.
Create a Databricks gem
To create a Databricks Source or Target gem in your pipeline:

1. Add a Source or Target gem to the pipeline.
   - Open your pipeline in the Studio.
   - Click on Source/Target in the canvas.
   - Select Source or Target from the dropdown.
   - Click on the gem to open the configuration.
2. Select the Databricks format. In the Type tab, select Databricks under Table. Do not select Databricks under File. Then, click Next.
3. Set location details. In the Location tab, set your connection details and table location. To learn more, jump to Source location and Target location.
4. Set table properties. In the Properties tab, set the table properties. To learn more, jump to Source properties and Target properties.
5. Preview data (Source only). In the Preview tab, load a sample of the data and verify that it looks correct.
Source configuration
Use these settings to configure a Databricks Source gem for reading data.

Source location
| Parameter | Description |
|---|---|
| Format type | Table format for the source. For Databricks tables, set to databricks. |
| Select or create connection | Select or create a new Databricks connection in the Prophecy fabric you will use. |
| Database | Database containing the schema where the table is located. |
| Schema | Schema containing the table you want to read from. |
| Name | Exact name of the Databricks table to read data from. |
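The Database, Schema, and Name fields together identify one table. As an illustration, here is a minimal sketch of how the three fields compose into the dotted identifier Databricks uses to resolve a table; the function name and the backtick-quoting rule are assumptions for the example, not Prophecy internals.

```python
# Illustrative only: combine the three location fields into a dotted
# identifier, backtick-quoting any part that isn't a plain identifier.
def qualified_table_name(database: str, schema: str, name: str) -> str:
    def quote(part: str) -> str:
        # Quote parts with characters beyond letters, digits, underscores.
        return part if part.replace("_", "").isalnum() else f"`{part}`"
    return ".".join(quote(p) for p in (database, schema, name))

print(qualified_table_name("analytics", "sales", "orders"))
# analytics.sales.orders
```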
Target configuration
Use these settings to configure a Databricks Target gem for writing data.

Target location
| Parameter | Description |
|---|---|
| Format type | Table format for the target. For Databricks tables, set to databricks. |
| Select or create connection | Select or create a new Databricks connection in the Prophecy fabric you will use. |
| Database | Database containing the schema where the table is or will be located. |
| Schema | Schema where the target table will be created or updated. |
| Name | Name of the Databricks table to write data to. If the table doesn’t exist, it will be created automatically. |
Target properties
| Property | Description | Default |
|---|---|---|
| Description | Description of the table. | None |
| Write Mode | Whether to overwrite the table completely, append new data to the table, or throw an error if the table exists. | None |
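The three Write Mode options behave like Spark-style save modes. The sketch below illustrates those semantics with an in-memory dict standing in for a table store; it is an illustration of the behavior only, not Prophecy's implementation.

```python
# Minimal sketch of the three Write Mode behaviors, using a dict of
# table name -> list of rows in place of a real warehouse table.
tables: dict[str, list[dict]] = {}

def write_table(name: str, rows: list[dict], mode: str) -> None:
    if mode == "overwrite":
        tables[name] = list(rows)                 # replace any existing data
    elif mode == "append":
        tables.setdefault(name, []).extend(rows)  # add to existing data
    elif mode == "error":
        if name in tables:                        # fail if the table exists
            raise ValueError(f"table {name!r} already exists")
        tables[name] = list(rows)
    else:
        raise ValueError(f"unknown write mode: {mode}")

write_table("orders", [{"id": 1}], mode="overwrite")
write_table("orders", [{"id": 2}], mode="append")
print(len(tables["orders"]))  # 2
```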
Cross-workspace access
If your fabric uses Databricks as the SQL warehouse, you can't select Databricks in an external Source or Target gem. Instead, you must use Table gems, which are limited to the Databricks warehouse defined in the SQL warehouse connection. To work with tables from a different Databricks workspace, use Delta Sharing. Delta Sharing lets you access data across workspaces without creating additional Databricks connections.

Prophecy implements this guardrail to avoid using external connections when the data can be made available in your warehouse. External connections introduce an extra data transfer step, which slows down pipeline execution and adds unnecessary complexity. For best performance, Prophecy always prefers reading and writing directly within the warehouse.

