Applicable to the Express and Enterprise Editions only.
- The SaaS deployment uses a Prophecy-managed OpenAI subscription with GPT-4.1 and GPT-4.1 mini.
- Dedicated SaaS deployments connect to customer-managed endpoints.
- Agents use only the `transform_agent` configuration in AI Model Providers Creds to determine how Claude is accessed.
- Copilot uses provider credentials and model selection to power features like expression generation.
Navigate to Copilot settings
- Log in to Prophecy as an Administrator.
- Go to Settings > Admin > Copilot Settings.
- AI Model Providers Creds: Add credentials for LLM providers.
- Available AI Models: Define models for Copilot.
- Available AI Speech Models: Define models for speech features.
How Copilot and Agents use these settings
- Agents use the `transform_agent` configuration in AI Model Providers Creds.
- Copilot uses the remaining provider credentials, together with the models defined in Available AI Models and Available AI Speech Models.

These configurations are independent and can be used together.
Configure AI Model Providers Creds (Agent and Copilot)
In the AI Model Providers Creds subtab, you provide the credentials to connect to your LLM provider. You can configure both Copilot providers and an Agent provider in the same credentials block. (It is common to use both.) The following YAML example shows the required fields for different LLM providers.

The `transform_agent` block is used only by Agents to configure access to Claude. Copilot does not use this configuration. For details on configuring endpoints for Agents, see Configure Claude providers.
Other providers in this section are used only by Copilot when selecting models. You can add multiple credentials here, but Copilot will only connect to the models defined in the Available AI Models subtab. Agents use the transform_agent configuration instead.
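The exact credentials schema may vary between Prophecy versions, so treat the sketch below as illustrative only: the provider names, key fields, and `transform_agent` layout are assumptions, not the authoritative format.

```yaml
# Illustrative sketch only -- field names are assumptions, not the official schema.
ai_model_providers_creds:
  openai:
    api_key: "****"               # masked with asterisks after saving
  azure_openai:
    api_key: "****"
    endpoint: "https://<your-resource>.openai.azure.com"
  transform_agent:                # used only by Agents (Claude access); ignored by Copilot
    provider: anthropic
    api_key: "****"
```

Copilot only connects to models declared in the Available AI Models subtab, even if additional credentials are listed here.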
When editing credentials in Prophecy, you will not be able to see previously-entered values. Likewise, once you save your credentials, values are masked with asterisks (`****`).

Set up Vertex AI
Vertex AI does not require an API key. Instead, you provide a Google Cloud service account file. Reach out to the Support team with the following information to add Vertex AI credentials to your deployment:
- The service account file that authenticates your connection to Vertex AI.
- (Optional) The Vertex AI region, if it differs from the service account region.
- (Optional) The Vertex AI project, if it differs from the service account project.
Available AI Models for Copilot
This configuration applies only to Copilot features. Agents do not use these models. In the Available AI Models subtab, you define two models:
- Smart model: Prophecy uses this model for complex tasks. Recommended model: gpt-4.1
- Fast model: Prophecy uses this model for easy and quick tasks. Recommended model: gpt-4.1-mini
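As a rough sketch of the smart/fast split, the configuration might look like the following; the key names here are assumptions for illustration, not the official schema.

```yaml
# Illustrative sketch only -- key names are assumptions.
available_ai_models:
  smart_model: gpt-4.1        # complex tasks
  fast_model: gpt-4.1-mini    # easy and quick tasks
```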
Available AI Speech Models
This configuration applies only to Copilot speech features. In the Available AI Speech Models subtab, you define different models for speech-to-text and text-to-speech operations. You can select models from any of the providers you configured in the AI Model Providers Creds subtab. The following YAML example shows how to format your speech-to-text (stt) and text-to-speech (tts) model configuration.
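Since the original example was not preserved here, the sketch below shows one plausible shape for the stt/tts configuration; the key names and model IDs are assumptions for illustration, not the authoritative format.

```yaml
# Illustrative sketch only -- keys and model IDs are assumptions.
available_ai_speech_models:
  stt:                        # speech-to-text
    provider: openai          # any provider configured in AI Model Providers Creds
    model: whisper-1
  tts:                        # text-to-speech
    provider: openai
    model: tts-1
```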
Prerequisites for configuring Copilot providers
These prerequisites apply only to Copilot features. To add your LLM provider and model details to Copilot settings:
- You must be logged in as a Prophecy cluster admin.
- Copilot must already be enabled in your Prophecy deployment. Reach out to the Support team to enable Copilot.

