Overview
In Prophecy, datasets stored in the SQL Warehouse Connection defined in your fabric are accessed using Table gems. Unlike Source and Target gems, Table gems run directly within the data warehouse, eliminating extra orchestration steps and improving performance. Available configurations for Table gems vary based on your SQL warehouse provider. This page explains how to use the Table gem for a Snowflake SQL warehouse, including supported table types, configuration options, and guidance for managing Snowflake tables in your Prophecy pipelines.
Table types
The following table types are supported for Snowflake connections.
| Name | Description | Type |
|---|---|---|
| Table | Persistent storage of structured data in your SQL warehouse. Optimized for frequent queries and large datasets. | Source or Target |
| View | A virtual table that derives data dynamically from a query. Recomputed at runtime. | Source or Target |
| Seed | Small CSV-format files that you can write directly in Prophecy. | Source only |
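To illustrate the distinction between the first two types, they correspond roughly to the following Snowflake DDL (the database, schema, and object names here are hypothetical, not part of any Prophecy default):

```sql
-- A table: data is physically stored in the warehouse and
-- optimized for frequent queries over large datasets.
CREATE TABLE analytics.public.orders (
    order_id NUMBER,
    customer VARCHAR,
    amount   NUMBER(10, 2)
);

-- A view: no data is stored; the defining query is re-run
-- each time the view is read.
CREATE VIEW analytics.public.large_orders AS
SELECT order_id, customer, amount
FROM analytics.public.orders
WHERE amount > 1000;
```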
Create a new table
Once you create a Table gem, you can reuse the table throughout your project. All created tables appear in the Project tab in the left sidebar. To create a table in your pipeline:
Add a table gem to the pipeline
- Open your pipeline in the Studio.
- Click on Source/Target in the canvas.
- Select Table from the dropdown.
- Click on the gem to open the configuration.
Gem configuration
Tables
Tables are persistent storage objects used to store structured data in Snowflake.
Source parameters
| Parameter | Description |
|---|---|
| Location | Specify the table’s location using database, schema, and name. |
| Properties | Define or infer schema. Add a description if needed. |
| Preview | Load a sample of the data before saving. |
Target parameters
| Parameter | Description |
|---|---|
| Location | Choose the location where the table will be stored. You can create a new table by writing a new table name. |
| Properties | Define certain properties of the table. The schema cannot be changed for targets. |
| Write Options | Select how you want the data to be written each time you run the pipeline (Table only). Learn more in Write strategies. |
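The write option determines how each pipeline run modifies the target table. As a rough sketch (the exact statements Prophecy generates depend on the chosen write strategy; table and source names below are hypothetical), the two most common strategies map to SQL like this:

```sql
-- Overwrite: replace the table's contents with the incoming data.
CREATE OR REPLACE TABLE analytics.public.orders AS
SELECT * FROM incoming_orders;

-- Append: keep existing rows and insert the new ones.
INSERT INTO analytics.public.orders
SELECT * FROM incoming_orders;
```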
Views
Views are virtual tables recomputed at runtime from a query.
Source parameters
| Parameter | Description |
|---|---|
| Location | Enter the database, schema, and table (view) name. |
| Properties | Define or infer schema. Add a description if needed. |
| Preview | Load data based on the view’s underlying query. |
Target parameters
| Parameter | Description |
|---|---|
| Location | Define the name of the view to be created or replaced. |
| Properties | Define certain properties of the view. The schema cannot be changed for targets. |
| Preview | Load a preview of the resulting view. |
Every time the pipeline runs, the target view is replaced, because the view is recomputed
from its underlying query logic on each execution. No additional write modes are supported.
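This replace-on-run behavior matches Snowflake's `CREATE OR REPLACE VIEW` semantics. A sketch of the kind of statement issued on each run (the view name and query are hypothetical examples):

```sql
-- Each pipeline run re-issues a statement of this shape; the previous
-- view definition is replaced rather than updated in place.
CREATE OR REPLACE VIEW analytics.public.daily_totals AS
SELECT order_date, SUM(amount) AS total
FROM analytics.public.orders
GROUP BY order_date;
```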
Seeds
Seeds are lightweight CSV datasets defined in your project. Seeds are source-only.
| Parameter | Description |
|---|---|
| Properties | Copy-paste your CSV data and define certain properties of the table. |
| Preview | Load a preview of your seed in table format. |
Seeds are implemented as dbt seeds under the hood. The
CSV data you define is stored in your Prophecy project files and materialized as a table in your
Snowflake data warehouse. This table is created in the default database and schema specified in
your Snowflake connection.
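For example, a seed whose pasted CSV content has the columns `country_code,country_name` would be materialized by dbt roughly as follows (the seed name, database, and schema here are hypothetical; in practice the default database and schema come from your Snowflake connection):

```sql
-- Hypothetical seed named country_codes, materialized as a table.
CREATE TABLE analytics.public.country_codes (
    country_code VARCHAR,
    country_name VARCHAR
);

INSERT INTO analytics.public.country_codes VALUES
    ('US', 'United States'),
    ('FR', 'France');
```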
Reusing and sharing tables
After you create a table in Prophecy, you can reuse its configuration across your entire project. All created tables appear in the Project tab in the left sidebar. To make tables available to other teams, you can share your project as a package in the Package Hub. Other users will be able to use the shared table configuration, provided they have the necessary permissions in Snowflake to access the underlying data.
Limitations
Currently, Snowflake tables do not support:
- Case-sensitive identifiers.
- Creating new partitioned tables.
- Modifying partitioning of existing tables.

