Single source of truth for tiered prompt content (MNT-009).
Local models (Ollama / mistral:7b) have effective context windows of ~4K–8K tokens. Cloud models (Anthropic, OpenAI, Google) have 128K+. This module centralises ALL tier-dependent prompt text so other modules import from one place instead of scattering local/cloud variants across files.
Architecture
- `selfHealing.js` owns self-healing rules (`CORE_RULES`, `SELF_HEALING_PROMPT_RULES`) and exposes `getPromptRules(tier)`.
- `promptTiers.js` (this file) owns everything else that varies by tier: assertion rules, stability rules, code requirements, and config.
- `outputSchema.js` assembles the final system prompt by importing from both.
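A minimal sketch of this three-module split. The module boundaries and function names come from the docs above; the rule text and assembly order are placeholder assumptions, written as inline stand-ins rather than real imports so the example is self-contained.

```javascript
// stand-in for selfHealing.js (rule text is placeholder, not the real content)
const selfHealing = {
  getPromptRules: (tier) =>
    tier === "local" ? "Self-healing rules (terse)." : "Self-healing rules (full).",
};

// stand-in for promptTiers.js (this module)
const promptTiers = {
  getTier: () => "cloud", // assume a cloud provider is active
  getAssertionRules: (tier) => `Assertion rules (${tier}).`,
  getStabilityRules: (tier) => `Stability rules (${tier}).`,
  getCodeRequirements: (tier) => `Code requirements (${tier}).`,
};

// stand-in for outputSchema.js: assemble the final system prompt from both
function buildSystemPrompt() {
  const tier = promptTiers.getTier();
  return [
    selfHealing.getPromptRules(tier),
    promptTiers.getAssertionRules(tier),
    promptTiers.getStabilityRules(tier),
    promptTiers.getCodeRequirements(tier),
  ].join("\n\n");
}
```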
Exports
- `getTier` — Returns `"cloud"` or `"local"` based on the active provider.
- `TIER_CONFIG` — Per-tier configuration (`maxElements`).
- `getAssertionRules` — Tier-aware assertion rules.
- `getStabilityRules` — Tier-aware stability rules.
- `getCodeRequirements` — Tier-aware code requirement block.
Members
(static, constant) TIER_CONFIG :Object.<string, TierConfig>
Type:
- Object.<string, TierConfig>
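A sketch of the assumed shape of `TIER_CONFIG`. The only documented property is `maxElements`; the numeric values below are illustrative, not the module's real limits.

```javascript
// Per-tier configuration keyed by tier name ("cloud" | "local").
const TIER_CONFIG = {
  cloud: { maxElements: 150 }, // 128K+ context: room for many DOM elements
  local: { maxElements: 40 },  // ~4K–8K tokens: trim the DOM aggressively
};
```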
Methods
(static) getAssertionRules(tier) → {string}
Parameters:

| Name | Type | Description |
|---|---|---|
| `tier` | `"cloud" \| "local"` | |
Returns:
- Type: `string`
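One plausible shape for a tier-aware getter such as `getAssertionRules` (the same pattern would apply to `getStabilityRules` and `getCodeRequirements`): a verbose rule block for cloud models and a compressed one for small local models. The rule text here is placeholder, not the module's real content.

```javascript
// Placeholder rule text keyed by tier.
const ASSERTION_RULES = {
  cloud: [
    "Prefer user-visible text and roles over brittle CSS selectors.",
    "Assert on outcomes the user can observe, not implementation details.",
  ].join("\n"),
  local: "Assert on visible text only.",
};

function getAssertionRules(tier) {
  // Fall back to the cloud rules if an unknown tier slips through.
  return ASSERTION_RULES[tier] ?? ASSERTION_RULES.cloud;
}
```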
(static) getCodeRequirements(tier) → {string}
Parameters:

| Name | Type | Description |
|---|---|---|
| `tier` | `"cloud" \| "local"` | |
Returns:
- Type: `string`
(static) getStabilityRules(tier) → {string}
Parameters:

| Name | Type | Description |
|---|---|---|
| `tier` | `"cloud" \| "local"` | |
Returns:
- Type: `string`
(static) getTier() → {"cloud"|"local"}
Determine the prompt tier based on the active AI provider.
Returns:
The tier name.
- Type: `"cloud" | "local"`
Type Definitions
TierConfig
Type:
- Object
Properties:

| Name | Type | Description |
|---|---|---|
| `maxElements` | number | Max DOM elements to include in prompts. |
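The `TierConfig` typedef above corresponds to a JSDoc declaration along these lines (the `@typedef` syntax is standard; the sample value is illustrative only):

```javascript
/**
 * @typedef {Object} TierConfig
 * @property {number} maxElements Max DOM elements to include in prompts.
 */

/** @type {TierConfig} */
const sampleConfig = { maxElements: 50 }; // illustrative value, not a real limit
```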