Feature description
AI modes in `waveai.json` let you configure model, endpoint, provider, and display settings, but there's no way to set a system prompt. Every chat starts with Wave's built-in context, and you can't change what the AI knows or how it behaves before you start talking to it.
Adding an `ai:systemprompt` field to the mode config would fix that. A few places where this matters:
- Right now the only thing differentiating modes is the model; the "personality" is always the same. With a system prompt you could have a DevOps mode, a code review mode, and a general assistant mode, and switch between them from the dropdown.
- Some people want code-only responses. Others want explanations. A system prompt per mode would let you set that once instead of repeating yourself every chat.
- If you're always working in the same stack (say, Go + sqlc + templ), you could bake the project context into the prompt so the AI doesn't start from zero each time.
- Non-English users could set "Always respond in Spanish" and not have to ask twice.
You could also pair this with model selection in interesting ways. Cheap model + opinionated system prompt for quick Q&A, frontier model + detailed system prompt for architecture review.
Implementation suggestion
Support both inline strings and file paths:
```json
{
  "devops-mode": {
    "display:name": "DevOps Assistant",
    "ai:provider": "openrouter",
    "ai:model": "anthropic/claude-sonnet-4",
    "ai:systemprompt": "You are a senior DevOps engineer. Focus on Kubernetes, Terraform, and CI/CD. Prefer kubectl commands and IaC over manual steps.",
    "ai:capabilities": ["tools"]
  },
  "project-mode": {
    "display:name": "Project Assistant",
    "ai:provider": "openrouter",
    "ai:model": "anthropic/claude-sonnet-4",
    "ai:systempromptfile": "~/.config/waveterm/prompts/project-context.md",
    "ai:capabilities": ["tools"]
  }
}
```
Looking at the codebase, `getSystemPrompt()` in `pkg/aiusechat/usechat.go` already returns a `[]string` that each API backend (OpenAI Chat, OpenAI Responses, Anthropic, Google) formats appropriately. The user-defined prompt could be appended to that array after the built-in system prompt is selected, so it layers on top of Wave's default context rather than replacing it.
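A minimal sketch of that layering step (the helper name, signature, and `~`-expansion behavior here are my assumptions, not existing Wave code; only `getSystemPrompt()` returning a `[]string` comes from the codebase):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// layerSystemPrompt appends a user-defined prompt (inline string and/or
// file contents) after the built-in prompts, so Wave's default context
// stays first. Hypothetical helper; names are illustrative.
func layerSystemPrompt(builtin []string, inline string, promptFile string) ([]string, error) {
	prompts := append([]string{}, builtin...) // copy so the built-in slice isn't mutated
	if inline != "" {
		prompts = append(prompts, inline)
	}
	if promptFile != "" {
		path := promptFile
		// Expand a leading "~/" so paths like the waveai.json example work.
		if strings.HasPrefix(path, "~/") {
			home, err := os.UserHomeDir()
			if err != nil {
				return nil, err
			}
			path = filepath.Join(home, path[2:])
		}
		data, err := os.ReadFile(path)
		if err != nil {
			return nil, fmt.Errorf("reading ai:systempromptfile: %w", err)
		}
		prompts = append(prompts, string(data))
	}
	return prompts, nil
}

func main() {
	out, err := layerSystemPrompt([]string{"wave built-in context"}, "Always respond in Spanish.", "")
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

Keeping the built-in prompt first also means a broken or empty user prompt degrades gracefully: the AI still gets Wave's default context.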
The changes would touch:
- `pkg/wconfig/settingsconfig.go` - add `SystemPrompt` and `SystemPromptFile` fields to `AIModeConfigType`
- `pkg/aiusechat/usechat.go` - read the field in `getSystemPrompt()` or just after it, resolve file paths, append to the `[]string`
- `schema/waveai.json` - add the new fields to the JSON schema
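For the schema change, something along these lines could go into the mode properties (the `description` text and the surrounding schema structure are guesses at how the existing `waveai.json` schema is laid out, not copied from it):

```json
{
  "ai:systemprompt": {
    "type": "string",
    "description": "Inline system prompt appended after Wave's built-in context."
  },
  "ai:systempromptfile": {
    "type": "string",
    "description": "Path to a file whose contents are appended as the system prompt; a leading ~ is expanded to the home directory."
  }
}
```

If both fields are set, one option is to append both in order (inline first, then file), which keeps the behavior predictable without needing a validation error.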
Anything else?
I'd be interested in taking a shot at implementing this. Wanted to check alignment first per the contributing guide.