How knowledge graphs work
When you interact with AI in Prophecy, the system enriches your prompt with context from the knowledge graph before sending it to the Large Language Model (LLM). This enrichment step adds metadata about your project's datasets, schemas, and other entities, which helps the LLM understand what you're referring to. The LLM then generates SQL code based on this enhanced context.

Knowledge graph generation
Prophecy generates one knowledge graph per fabric. Each knowledge graph indexes metadata from all data connections in the fabric. The indexer crawls your data storage using either your identity or a separately configured identity. You can schedule automatic refreshes or trigger manual indexing to keep the knowledge graph current.

Prophecy uses the knowledge graph of the fabric attached to your project. If you attach to a different fabric, AI features will use that fabric's knowledge graph.
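The enrichment flow described above can be sketched in a few lines. This is a hypothetical illustration, not Prophecy's actual implementation or API: it shows how metadata from an indexed knowledge graph might be prepended to a user prompt before the combined text is sent to an LLM.

```python
# Hypothetical sketch of prompt enrichment with knowledge-graph metadata.
# Function and variable names are illustrative, not Prophecy's real API.

def enrich_prompt(user_prompt: str, knowledge_graph: dict) -> str:
    """Prepend schema metadata for datasets the prompt mentions."""
    context_lines = []
    for dataset, schema in knowledge_graph.items():
        # Include a dataset's schema only if the prompt refers to it.
        if dataset.lower() in user_prompt.lower():
            cols = ", ".join(f"{name} {dtype}" for name, dtype in schema)
            context_lines.append(f"Table {dataset}: {cols}")
    context = "\n".join(context_lines)
    return f"Context:\n{context}\n\nRequest: {user_prompt}"

# Example: a tiny knowledge graph with one indexed dataset.
kg = {"orders": [("order_id", "INT"), ("amount", "DECIMAL(10,2)")]}
print(enrich_prompt("total amount per order in orders", kg))
```

The enriched text, rather than the raw prompt, is what the LLM sees, which is why the quality of the generated SQL depends on how current the knowledge graph's index is.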

