When you create a SQL project in Prophecy, the project is initialized with a small dbt macro named
generate_schema_name under the macros/ directory. This macro controls how Prophecy resolves the final schema name for every table in your pipeline.
This page explains what the macro does, what happens if you delete it, and how to restore it.
Prophecy auto-generates this macro for every new SQL project. If you started a project before this auto-generation rolled out, or you removed the file at some point, you will need to add it back manually. See Restore the macro.
What happens if you delete the macro
If you delete macros/generate_schema_name.sql from the project, dbt falls back to its built-in version of the macro. The built-in version concatenates the fabric's default schema with whatever you set on the Table gem, joined by an underscore.
For example, with a Databricks fabric whose connection schema is default and a Table gem whose target schema is my_schema:
| Macro present | Resolved schema | Resulting fully qualified name |
|---|---|---|
| Prophecy generate_schema_name | my_schema | <catalog>.my_schema.<table> |
| dbt built-in (generate_schema_name macro deleted) | default_my_schema | <catalog>.default_my_schema.<table> |
- The preflight check in the Prophecy editor reports that the target table cannot be found, because the concatenated schema does not exist in the warehouse.
- A pipeline run creates a new schema named <fabric_default>_<your_target_schema> in your warehouse and writes data there, instead of the schema you configured.
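The difference between the two resolution behaviors can be sketched in plain Python. This is a simulation for illustration only; in practice dbt evaluates the real logic as a Jinja macro at compile time:

```python
def resolve_schema(default_schema, custom_schema_name, prophecy_macro=True):
    """Sketch of the two generate_schema_name variants.

    prophecy_macro=True mimics Prophecy's version, which uses the
    Table gem's target schema as-is; False mimics dbt's built-in
    version, which concatenates the fabric default and the target
    schema with an underscore.
    """
    if custom_schema_name is None:
        # No per-gem override: both variants fall back to the fabric default.
        return default_schema
    custom = custom_schema_name.strip()
    if prophecy_macro:
        return custom
    return f"{default_schema}_{custom}"

# Databricks fabric with connection schema "default",
# Table gem with target schema "my_schema":
print(resolve_schema("default", "my_schema", prophecy_macro=True))   # my_schema
print(resolve_schema("default", "my_schema", prophecy_macro=False))  # default_my_schema
```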
How schemas are resolved
Each Table gem in a SQL project has two schema-related inputs:

- The fabric default schema — set on the Databricks, Snowflake, or BigQuery connection used by the fabric. dbt exposes it as target.schema.
- The table target schema — an optional override set per Table gem. dbt passes it to the macro as custom_schema_name.
The generate_schema_name macro resolves these two inputs as follows:
macros/generate_schema_name.sql
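The macro's logic can be sketched as follows. This is the custom-schema override pattern documented by dbt, which matches the behavior described on this page; the exact body that Prophecy generates may differ slightly:

```sql
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {# No target schema on the Table gem: use the fabric default #}
        {{ default_schema }}
    {%- else -%}
        {# Target schema set on the Table gem: use it as-is, no prefix #}
        {{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```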
- If a Table gem does not set a target schema, the table is written to the fabric default schema.
- If a Table gem does set a target schema, the table is written to that schema exactly as written.
Restore the macro
If you have already deleted generate_schema_name, you can recreate it from the project sidebar.
Open the project sidebar
Open the project in the Studio. In the left sidebar, expand the Functions section and confirm that no entry named generate_schema_name exists. If prophecy_tmp_source is the only function listed, the schema-resolution macro has been removed.

Create a new function
- Click + Add Entity in the project sidebar.
- Select Function.
- Name the function generate_schema_name.
- Keep the function in the macros directory.
- Click Create.
Paste the macro body
Replace the default function body with the following content, then save the function:
macros/generate_schema_name.sql
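The body to paste can be sketched as follows. This is the custom-schema override pattern from dbt's documentation, which produces the resolution behavior this page describes; confirm it against a freshly generated Prophecy project if you need the exact original:

```sql
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {# No target schema on the Table gem: use the fabric default #}
        {{ default_schema }}
    {%- else -%}
        {# Target schema set on the Table gem: use it as-is, no prefix #}
        {{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```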
Verify resolution
Open any pipeline that writes to a Table gem with an explicit target schema. The fully
qualified name in the gem preview and in the preflight check should now use the target schema
directly, without the fabric default schema as a prefix.
A table with target schema my_schema resolves to <catalog>.my_schema.<table>, not <catalog>.default_my_schema.<table>.

Customizing the macro
The macro is a regular dbt macro, so you can customize it if you have specific routing rules. For example, you could prefix schemas by environment or by team. As long as the customized macro is named generate_schema_name and lives in the macros/ directory, dbt will use it instead of the built-in version.
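As an illustration of such a routing rule, the following hypothetical variant prefixes every schema with the dbt target name (for example dev_ or prod_); the prefixing scheme here is an example, not something Prophecy generates:

```sql
{% macro generate_schema_name(custom_schema_name, node) -%}
    {#- Hypothetical routing rule: prefix schemas by environment,
        using target.name (the name of the active dbt target) -#}
    {%- if custom_schema_name is none -%}
        {{ target.name }}_{{ target.schema }}
    {%- else -%}
        {{ target.name }}_{{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```

With this variant, a Table gem targeting my_schema on a target named prod would resolve to prod_my_schema.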
Related topics
- Functions — Other user-built macros in SQL projects.
- Databricks connection — Where the fabric default schema is configured for Databricks.
- Snowflake connection — Where the fabric default schema is configured for Snowflake.
- dbt’s documentation on custom schemas — Upstream documentation for the macro.