
PromptQL Configuration

Overview

You can manage PromptQL's LLM settings and system instructions from a single promptql-config.hml file, which is automatically created in the globals/metadata directory of your project when you initialize it with the --with-promptql flag using the CLI.
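For reference, initialization might look like this (a sketch assuming the DDN CLI's supergraph init command; the project name is hypothetical):

```sh
ddn supergraph init my-project --with-promptql
```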

Examples

Minimal configuration:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: hasura
```
Custom providers for the LLM & AI primitives, plus custom system instructions:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: openai
    model: o3-mini
    apiKey:
      valueFromEnv: OPENAI_API_KEY
  aiPrimitiveLlm:
    provider: openai
    model: gpt-4o
    apiKey:
      valueFromEnv: OPENAI_API_KEY
  systemInstructions: |
    You are a helpful AI Assistant.
```
With a schema selector for performance optimization:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: openai
    model: gpt-4o
    apiKey:
      valueFromEnv: OPENAI_API_KEY
  schemaSelector:
    type: llm_selector
    llm:
      provider: gemini
      model: gemini-2.5-flash
      apiKey:
        valueFromEnv: GEMINI_API_KEY
    maxItems: 25
    instructions: Focus on user-related tables and core business entities
```
Mapping environment variables

If you specify environment variables in your promptql-config.hml, don't forget to add them to the globals subgraph's subgraph.yaml under the envMapping section.
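For example, mapping the OPENAI_API_KEY referenced above might look like this (a sketch assuming a typical globals subgraph.yaml; adjust the name and generator settings to match your project):

```yaml
kind: Subgraph
version: v2
definition:
  name: globals
  generator:
    rootPath: .
  envMapping:
    # Maps the variable referenced via valueFromEnv in promptql-config.hml
    # to a value supplied from your environment at build/deploy time.
    OPENAI_API_KEY:
      fromEnv: OPENAI_API_KEY
```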

With the promptql-config.hml file, you can:

  • Set the LLM provider and model used across the application.
  • Define separate LLMs for AI primitives such as classification, summarization, and extraction.
  • Add system instructions that apply to every PromptQL interaction.
  • Configure schema selection to optimize the DDN metadata context available to the PromptQL agent.

Next steps