Available for Enterprise Edition only.
Integrate with GitHub Actions
PBT can be integrated with your own CI/CD solution to build, test, and deploy Prophecy code. The steps below describe how to set up PBT with GitHub Actions on a repository that contains a Prophecy project. View an example GitHub repository.
Prerequisites
- A Prophecy project that is currently hosted in a GitHub repository
Set up environment variables and secrets
PBT requires the environment variables DATABRICKS_HOST and DATABRICKS_TOKEN to be set for complete functionality. The DATABRICKS_TOKEN value can be stored as a secret inside the GitHub repository of the project:
- Go to Settings > Secrets > Actions from the GitHub repository menu.
- Click 'New repository secret'.
- Add a secret named DATABRICKS_TOKEN whose value is the Databricks token to be used by PBT.
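The same secret can also be created from the command line. This is a minimal sketch assuming the GitHub CLI (`gh`) is installed and authenticated against the repository, and that the token is available in a local environment variable:

```shell
# Create or update the DATABRICKS_TOKEN repository secret
# from a local environment variable of the same name.
gh secret set DATABRICKS_TOKEN --body "$DATABRICKS_TOKEN"
```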

Set up a GitHub Actions workflow on every push to the prod branch
We're now ready to set up CI/CD on the Prophecy project. To set up a workflow that builds the project, runs all unit tests, and then deploys the built .jar (Scala) or .whl (Python) to Databricks automatically on every push to the prod branch:
- Create a YAML file named exampleWorkflow.yml in the project repository under .github/workflows/ (relative to the repository root); GitHub Actions only picks up workflow files from this directory.
- Add the workflow definition to exampleWorkflow.yml. The resulting workflow:
- Triggers on every change that is pushed to the prod branch.
- Sets the environment variables required for PBT to run: DATABRICKS_HOST and DATABRICKS_TOKEN.
- Sets up JDK 11, Python 3, and other dependencies required for PBT to run.
- Validates that the pipeline code is free of syntax errors.
- Builds all the pipelines present in the project and generates a .jar/.whl file. If the build fails at any point, a non-zero exit code is returned, which stops the workflow and marks the run as a failure.
- Runs all the unit tests present in the project, using FABRIC_NAME (optional) as the configuration. If any unit test fails, a non-zero exit code is returned, which stops the workflow and marks the run as a failure.
- Deploys the built .jar/.whl to the Databricks location mentioned in databricks-job.json, located in the jobs directory of the project. If the job already exists in Databricks, it is updated with the new .jar/.whl.
- Deploys pipeline configurations, if present, to the DBFS path mentioned in databricks-job.json.
- If this process fails at any step, a non-zero exit code is returned, which stops the workflow and marks the run as a failure.
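The steps above can be sketched as a single workflow file. This is a minimal sketch, not a definitive implementation: the action versions, the prophecy-build-tool package name on PyPI, and the DATABRICKS_HOST and FABRIC_NAME values are assumptions to adapt to your own workspace and fabric:

```yaml
name: Example PBT workflow
on:
  push:
    branches:
      - prod
jobs:
  build-test-deploy:
    runs-on: ubuntu-latest
    env:
      # Replace with your workspace URL, or store it as a secret as well.
      DATABRICKS_HOST: "https://<your-workspace>.cloud.databricks.com"
      DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
      # Optional: fabric used as the configuration when running unit tests.
      FABRIC_NAME: "dev"
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: "11"
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - name: Install PBT
        run: pip3 install prophecy-build-tool
      - name: Validate          # fails the run on syntax errors
        run: pbt validate --path .
      - name: Build             # produces the .jar/.whl; non-zero exit stops the run
        run: pbt build --path .
      - name: Run unit tests    # any failing test stops the run
        run: pbt test --path .
      - name: Deploy to Databricks
        run: pbt deploy --path .
```

Each `pbt` step returns a non-zero exit code on failure, so GitHub Actions stops at the first failing step and marks the run as failed, matching the behavior described above.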

