- Smart LLM for complex tasks, such as gpt-4o.
- Fast LLM for lightweight tasks, such as gpt-4o-mini.
For Dedicated SaaS deployments, see Copilot settings to learn how to connect Prophecy to your LLM.
Supported providers
Prophecy supports model families from the following providers:

- OpenAI
- Azure OpenAI
- Google Gemini
- Vertex AI
- Anthropic
Supported models
While Prophecy can connect to all providers shown in the diagram, the following models are officially tested and supported:

- gpt-4o
- gpt-4o-mini
- gemini-2.5-flash
- gemini-2.5-flash-lite
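
Before entering your smart and fast model choices in Copilot settings, you may want to confirm that your API key can actually reach them. The following is a minimal sketch of such a pre-check, not part of Prophecy itself; it assumes the OpenAI Python SDK (v1+) and an `OPENAI_API_KEY` environment variable, and would need to be adapted for Azure OpenAI, Google Gemini, Vertex AI, or Anthropic endpoints.

```python
# Hypothetical pre-check: verify the models you plan to configure respond,
# using the OpenAI Python SDK. This is an assumption for illustration and is
# not Prophecy's own configuration mechanism.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Mirror the two roles described above: a smart LLM for complex tasks
# and a fast LLM for lightweight tasks.
models_to_check = {
    "smart": "gpt-4o",
    "fast": "gpt-4o-mini",
}

for role, model in models_to_check.items():
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Reply with OK."}],
        max_tokens=5,
    )
    print(f"{role} model {model} responded: {response.choices[0].message.content}")
```

If both calls return a response, the same credentials and model names should be safe to enter in Copilot settings.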

