Applicable to the Express and Enterprise Editions only.
By default, Prophecy is configured to use the public Anthropic API under Settings > Admin > Copilot Settings:
{
  "transform_agent": {
    "anthropic_key": "<anthropic API key>"
  }
}
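Before saving the settings, it can help to sanity-check the JSON. This small sketch (illustrative only, using just the field names shown above) parses the configuration and warns if the placeholder key was left in place:

```python
import json

# The Transform Agent settings as entered in Copilot Settings.
raw = """
{
  "transform_agent": {
    "anthropic_key": "<anthropic API key>"
  }
}
"""

config = json.loads(raw)  # raises json.JSONDecodeError on malformed JSON
agent = config["transform_agent"]

# Confirm the key field exists and is not still the placeholder.
assert "anthropic_key" in agent
if agent["anthropic_key"].startswith("<"):
    print("Replace the placeholder with your real Anthropic API key.")
```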
You can configure Prophecy to use Claude models through Azure AI Foundry by updating this setting. This setup allows Prophecy to route LLM requests through your Azure-managed deployment instead of the public Anthropic API. This configuration is useful when you want centralized credential management, controlled deployments, or enterprise governance through Azure AI Foundry.
  1. Go to Settings > Admin > Copilot Settings.
  2. Locate AI Model Provider Credentials and edit the Transform Agent JSON.
  3. Replace the existing configuration with the following:
{
  "transform_agent": {
    "foundry_config": {
      "use_foundry": "true",
      "foundry_api_key": "YOUR_API_KEY",
      "foundry_base_url": "YOUR_BASE_URL",
      "foundry_model": "claude-opus-4-5"
    }
  }
}

Configuration parameters

| Parameter | Description |
| --- | --- |
| `foundry_api_key` | Your Azure AI Foundry API key |
| `foundry_base_url` | Your Azure AI Foundry endpoint URL |
| `foundry_model` | The deployed Claude model name |

Important

  • Remove /v1/messages from the base URL provided by Azure AI Foundry.
  • The foundry_model value must match the deployment name in Azure AI Foundry.
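The base-URL rule is easy to get wrong when copying the endpoint out of the Azure portal. A small helper (hypothetical, not part of Prophecy) can normalize the value before you paste it into the JSON; the endpoint below is made up:

```python
def normalize_foundry_base_url(url: str) -> str:
    """Strip a trailing /v1/messages (and any trailing slash) from an
    Azure AI Foundry endpoint so it can be used as foundry_base_url."""
    url = url.rstrip("/")
    suffix = "/v1/messages"
    if url.endswith(suffix):
        url = url[: -len(suffix)]
    return url

# Example with a made-up endpoint:
print(normalize_foundry_base_url(
    "https://my-resource.services.ai.azure.com/v1/messages"
))
# → https://my-resource.services.ai.azure.com
```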

How it works

Prophecy uses the Transform Agent configuration to determine which LLM provider to call. Adding a foundry_config block enables Azure AI Foundry as the provider. Only one provider can be configured at a time: either Anthropic or Azure AI Foundry.
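The selection behavior described above can be pictured as a small sketch (illustrative only; Prophecy's internal implementation is not published here): a foundry_config block with use_foundry set to "true" routes requests to Azure AI Foundry, otherwise the anthropic_key is used.

```python
def select_provider(transform_agent: dict) -> str:
    """Illustrative provider selection based on the Transform Agent JSON:
    a foundry_config block with use_foundry "true" wins; otherwise fall
    back to the public Anthropic API."""
    foundry = transform_agent.get("foundry_config")
    if foundry and foundry.get("use_foundry") == "true":
        return "azure_ai_foundry"
    if transform_agent.get("anthropic_key"):
        return "anthropic"
    raise ValueError("No LLM provider configured in transform_agent")

# The two configurations on this page map to the two providers:
print(select_provider({"anthropic_key": "sk-..."}))
# → anthropic
print(select_provider({"foundry_config": {"use_foundry": "true",
                                          "foundry_model": "claude-opus-4-5"}}))
# → azure_ai_foundry
```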