# AI Builder
The AI Builder is a shared module that provides AI-powered generation and refinement of instructions. It supports multiple entity types (voice agents, text agents, plan templates, analyzers) through a unified API, component system, and prompt template structure.
## What Problem Does It Solve?

Writing effective agent instructions is difficult:
- Users struggle to articulate complex behavioral rules in natural language
- Instructions need to be comprehensive, covering multiple scenarios and edge cases
- Different agent types require different instruction formats (plain text vs. structured fields)
- Iterative refinement requires domain expertise
The AI Builder solves these problems by:
- Generating instructions from natural language — users describe what they want in plain English, and the AI produces structured, production-ready instructions
- Providing targeted editing actions — predefined transformations (improve tone, add examples, strengthen safety) applied via AI
- Supporting extensibility — new entity types can be added by following a clear pattern
## How It Works

The AI Builder uses OpenRouter models to transform user input into agent instructions. It supports two primary flows: plain text (voice agents, analyzers) and structured output (text agents).
### Voice Agents (Plain Text)

Voice agents use a single `instructions` field with markdown formatting.
Generation flow:
- User provides a natural language description (“A friendly receptionist that schedules appointments…”)
- `ai_builder_generate_api` sends the description + area (`VoiceAgent`) to the AI
- AI uses the `voice_agent_generate.md` prompt template to produce structured markdown
- Instructions are returned as plain text
Editing flow:
- User selects an action (e.g., “Improve tone”) or provides custom edit instruction
- `ai_builder_edit_api` or `ai_builder_custom_edit_api` sends current instructions + action/custom instruction
- AI uses the corresponding prompt template (e.g., `voice_agent_improve_tone.md`)
- Refined instructions are returned
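The plain-text edit flow boils down to a request carrying the area, action, and current text, and a response carrying the refined text. A minimal sketch with hypothetical, simplified mirrors of the module's types (the real types and handler are shown later in this document; the `edit_instructions` stand-in below replaces the actual AI call):

```rust
// Hypothetical, simplified mirrors of the AI Builder request/response types,
// shown only to illustrate the shape of the plain-text edit flow.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum AiBuilderArea { VoiceAgent, TextAgent, PlanTemplate, Analyzer }

#[derive(Debug, Clone, Copy, PartialEq)]
pub enum AiBuilderAction { Improve, ImproveTone }

pub struct AiBuilderEditRequest {
    pub area: AiBuilderArea,
    pub action: AiBuilderAction,
    pub current_instructions: String,
}

pub struct AiBuilderResponse {
    pub instructions: String,
}

// Stand-in for the AI call: the real handler sends the action's prompt
// template plus `current_instructions` to the model and returns its output.
pub fn edit_instructions(req: &AiBuilderEditRequest) -> AiBuilderResponse {
    AiBuilderResponse {
        instructions: format!(
            "{} (refined via {:?})",
            req.current_instructions, req.action
        ),
    }
}
```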
### Analyzers (Plain Text)

Analyzers use a single `prompt` field that defines how an LLM should analyze call transcriptions. The AI Builder generates structured analysis prompts with categories, output formats, and edge case handling.
Generation flow:
- User describes the analysis they need (“Analyze calls for sentiment, key issues, and resolution status”)
- `ai_builder_generate_api` sends the description + area (`Analyzer`) to the AI
- AI uses the `analyzer/analyzer_generate.md` prompt template to produce a structured analysis prompt
- The prompt includes: Role & Objective, Analysis Categories (with extraction targets, formats, fallbacks), Output Format, Instructions & Rules, and Edge Cases
Key differences from agent generation:
- No personality or tone section — analyzers are purely objective
- No conversation flow — analyzers process completed transcriptions
- Focus on structured data extraction with explicit fallback behaviors
- Edge cases cover transcription quality issues (short calls, poor audio, multiple speakers)
Editing flow: Same as voice agents — uses the shared toolbar with all 8 predefined actions plus custom edits, each backed by analyzer-specific prompt templates in `src/shared/ai_builder/prompts/analyzer/`.
### Text Agents (Structured Object)

Text agents use four distinct fields: `purpose`, `custom_instructions`, `escalation_instructions`, `restricted_topics`.
Generation flow:
- User provides a natural language description
- `text_agent_ai_generate_api` sends the description to the AI
- AI uses `.schema::<TextAgentAiResponse>()` to produce structured JSON matching the 4-field schema
- All four fields are populated atomically
Editing flow:
- User selects an action or provides custom instruction
- `text_agent_ai_edit_api` or `text_agent_ai_custom_edit_api` sends all 4 current fields + action
- AI refines all fields together, maintaining consistency
- Updated 4-field object is returned
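The atomic 4-field update can be sketched as follows. This is a hypothetical stand-in, not the real endpoint: the struct mirrors the field names of `TextAgentAiResponse`, and `apply_edit` replaces the actual AI call so the shape of the round trip is visible:

```rust
// Simplified mirror of TextAgentAiResponse: all four fields travel together,
// so an edit can never leave one field out of sync with the others.
#[derive(Clone, PartialEq)]
pub struct TextAgentFields {
    pub purpose: String,
    pub custom_instructions: String,
    pub escalation_instructions: String,
    pub restricted_topics: String,
}

// Stand-in for `text_agent_ai_edit_api`: the real endpoint sends all four
// current fields to the model and parses a complete replacement object.
pub fn apply_edit(current: &TextAgentFields, note: &str) -> TextAgentFields {
    TextAgentFields {
        purpose: format!("{} [{}]", current.purpose, note),
        custom_instructions: format!("{} [{}]", current.custom_instructions, note),
        escalation_instructions: format!("{} [{}]", current.escalation_instructions, note),
        restricted_topics: format!("{} [{}]", current.restricted_topics, note),
    }
}
```

Because the response replaces the whole object, the client never has to merge partial updates.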
## The Extensibility Pattern

Adding a new entity type to the AI Builder follows this step-by-step process:
### 1. Add an `AiBuilderArea` Variant

Edit `src/shared/ai_builder/types/ai_builder_area_type.rs`:

```rust
#[derive(Debug, Serialize, Deserialize, Clone, Copy, PartialEq)]
pub enum AiBuilderArea {
    VoiceAgent,
    TextAgent,
    PlanTemplate,
    Analyzer,
    NewEntityType, // ← Add your variant
}
```

### 2. Define a Structured Response Type (if needed)

If your entity uses multiple fields (like text agents), create a response type with `schemars::JsonSchema`:
```rust
use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize, Clone, PartialEq, schemars::JsonSchema)]
#[schemars(deny_unknown_fields)]
pub struct NewEntityAiResponse {
    pub field_one: String,
    pub field_two: String,
}
```

Register it in `src/shared/ai_builder/types/mod.rs`:

```rust
mod new_entity_ai_response_type;
pub use new_entity_ai_response_type::*;
```

When to use structured output:
- Entity has multiple distinct fields (like text agents: purpose, custom_instructions, etc.)
- Fields need to be populated atomically
- You want type-safe, validated output
When to use plain text:
- Single instruction field (like voice agents)
- Flexibility in formatting
- Markdown or free-form content
### 3. Add Prompt Templates

Create markdown files in `src/shared/ai_builder/prompts/` following the naming convention `{area}_{action}.md`.

Required templates:

- `new_entity_type_generate.md` — generation from scratch
- `new_entity_type_custom_edit.md` — custom user instruction
- One file per predefined action:
  - `new_entity_type_improve.md`
  - `new_entity_type_make_concise.md`
  - `new_entity_type_improve_tone.md`
  - `new_entity_type_add_examples.md`
  - `new_entity_type_fix.md`
  - `new_entity_type_strengthen_safety.md`
  - `new_entity_type_add_personality.md`
  - `new_entity_type_enhance_with_org_data.md`
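The `{area}_{action}.md` convention can be made concrete with a tiny helper. This is an illustrative sketch only — the real module resolves templates at compile time via `include_str!` keyed off `AiBuilderAction::template_suffix()`:

```rust
// Hypothetical helper showing how an area prefix and an action suffix
// combine into a prompt template filename.
fn template_filename(area_prefix: &str, action_suffix: &str) -> String {
    format!("{}_{}.md", area_prefix, action_suffix)
}
```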
Prompt template conventions:
- Start with role definition — “You are an expert prompt engineer specializing in…”
- Define output format — “Output ONLY the raw instructions — no explanations, no meta-commentary”
- Include structural requirements — headings, bullet points, formatting rules
- Provide concrete examples — show the expected structure
- Specify domain-specific constraints — safety rules, turn-taking, escalation thresholds
For structured output (like text agents), include JSON schema field definitions and constraints.
Update `src/shared/ai_builder/prompts/mod.rs`:

```rust
pub fn generation_prompt(area: &AiBuilderArea) -> &'static str {
    match area {
        AiBuilderArea::VoiceAgent => include_str!("voice_agent_generate.md"),
        AiBuilderArea::TextAgent => include_str!("text_agent_generate.md"),
        AiBuilderArea::NewEntityType => include_str!("new_entity_type_generate.md"),
    }
}

pub fn editing_prompt(area: &AiBuilderArea, action: &AiBuilderAction) -> &'static str {
    match (area, action) {
        // ... existing matches
        (AiBuilderArea::NewEntityType, AiBuilderAction::Improve) => {
            include_str!("new_entity_type_improve.md")
        }
        // ... add all actions
    }
}
```

### 4. Add API Endpoints
Section titled “4. Add API Endpoints”For plain text output, the existing ai_builder_generate_api, ai_builder_edit_api, and ai_builder_custom_edit_api handle the new area automatically once prompts are registered.
For structured output, create dedicated endpoints:
```rust
use dioxus::prelude::*;
use crate::shared::ai_builder::{NewEntityAiResponse, NewEntityGenerateRequest};

#[post("/api/ai-builder/new-entity/generate", session: crate::auth::Session)]
pub async fn new_entity_ai_generate_api(
    request: NewEntityGenerateRequest,
) -> Result<NewEntityAiResponse, HttpError> {
    _handler(session, request).await
}

#[cfg(not(target_family = "wasm"))]
pub(crate) async fn _handler(
    session: crate::auth::Session,
    request: NewEntityGenerateRequest,
) -> Result<NewEntityAiResponse, HttpError> {
    use aisdk::{
        core::{DynamicModel, LanguageModelRequest},
        providers::openrouter::Openrouter,
    };
    use crate::shared::ai_builder::prompts;

    super::check_area_permission(&session, &request.area)
        .or_forbidden("Not allowed")?;

    let openrouter_conf: crate::mods::ai::OpenRouterCoreConf =
        crate::bases::conf::get_core_conf()
            .await
            .or_internal_server_error("Failed to load configuration")?;

    let model = Openrouter::<DynamicModel>::builder()
        .api_key(openrouter_conf.openrouter_api_key)
        .model_name(crate::mods::ai::AiModels::GENERATE_INSTRUCTIONS)
        .build()
        .or_internal_server_error("Failed to initialize AI model")?;

    let system_prompt = prompts::new_entity_generation_prompt();

    let org_context = crate::mods::agent::build_organization_context(session.organization.id)
        .await
        .or_internal_server_error("Failed to load organization context")?;

    let user_message = if org_context.is_empty() {
        request.user_description.clone()
    } else {
        format!(
            "{}\n\n## Organization Data\n\n{}",
            request.user_description, org_context
        )
    };

    let response = LanguageModelRequest::builder()
        .model(model)
        .system(system_prompt)
        .prompt(&user_message)
        .schema::<NewEntityAiResponse>() // ← Structured output
        .build()
        .generate_text()
        .await
        .or_internal_server_error("AI generation failed")?;

    let result: NewEntityAiResponse = response
        .into_schema()
        .or_internal_server_error("Failed to parse structured output")?;

    Ok(result)
}
```

Add edit and custom edit endpoints following the same pattern (see `text_agent_edit_api.rs` and `text_agent_custom_edit_api.rs` for examples).
Register in `src/shared/ai_builder/api/mod.rs`.

Update `check_area_permission` to handle the new area:

```rust
pub(crate) fn check_area_permission(
    session: &crate::auth::Session,
    area: &super::AiBuilderArea,
) -> bool {
    use crate::bases::auth::Resource;

    match area {
        super::AiBuilderArea::VoiceAgent => { /* ... */ }
        super::AiBuilderArea::TextAgent => { /* ... */ }
        super::AiBuilderArea::NewEntityType => {
            crate::bases::auth::NewEntityResource::has_collection_permission(
                session,
                crate::bases::auth::NewEntityCollectionPermission::Create,
            )
        }
    }
}
```

### 5. Build the Entity-Specific Component

Create a component that wires the AI Builder into your entity's form.
For plain text (voice agent pattern):
```rust
use crate::shared::ai_builder::{
    AiBuilderArea, AiGenerateInstructions, AiInstructionToolbar,
    InstructionPreview, InstructionsSkeleton,
};

#[component]
pub fn NewEntityForm(
    instructions: Signal<String>,
) -> Element {
    let ai_loading = use_signal(|| false);

    rsx! {
        div { class: "space-y-4",
            // AI generation section
            AiGenerateInstructions {
                area: AiBuilderArea::NewEntityType,
                on_generated: move |generated: String| {
                    instructions.set(generated);
                },
                is_loading: ai_loading,
            }

            // Preview + editing toolbar
            if !instructions().is_empty() {
                if ai_loading() {
                    InstructionsSkeleton {}
                } else {
                    InstructionPreview {
                        instructions: instructions(),
                    }
                    AiInstructionToolbar {
                        area: AiBuilderArea::NewEntityType,
                        current_instructions: instructions(),
                        on_edited: move |edited: String| {
                            instructions.set(edited);
                        },
                    }
                }
            }
        }
    }
}
```

For structured output (text agent pattern):
Build a custom component that handles multiple fields. See `TextAgentInstructionBuilder` in `src/mods/text_agent/components/text_agent_instruction_builder_component.rs` for a complete example.
Key elements:
- Separate signals for each field
- Generate button that calls `new_entity_ai_generate_api`
- Action toolbar that calls `new_entity_ai_edit_api` for each action
- Custom edit input + button
- Undo stack for reverting changes
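The undo stack mentioned above can be as simple as a `Vec` of field snapshots, pushed before each AI edit and popped on revert. A minimal sketch (type and field names are illustrative, not the actual component's):

```rust
// Snapshot-based undo: push a clone of the fields before every AI edit,
// pop to restore the previous state.
#[derive(Clone, PartialEq, Debug)]
pub struct FieldsSnapshot {
    pub purpose: String,
    pub custom_instructions: String,
}

pub struct UndoStack {
    snapshots: Vec<FieldsSnapshot>,
}

impl UndoStack {
    pub fn new() -> Self {
        Self { snapshots: Vec::new() }
    }

    // Call before applying an AI edit.
    pub fn push(&mut self, current: &FieldsSnapshot) {
        self.snapshots.push(current.clone());
    }

    // Returns the previous state, or None if there is nothing to revert.
    pub fn undo(&mut self) -> Option<FieldsSnapshot> {
        self.snapshots.pop()
    }
}
```

In a Dioxus component the snapshots would live in a signal so the "Undo" button can enable or disable itself based on whether the stack is empty.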
## Key Types Reference

### AiBuilderArea

Enum identifying which entity type is being built.

```rust
pub enum AiBuilderArea {
    VoiceAgent,
    TextAgent,
    PlanTemplate,
    Analyzer,
}
```

Location: `src/shared/ai_builder/types/ai_builder_area_type.rs`
### AiBuilderAction

Predefined editing transformations available for all areas.

```rust
pub enum AiBuilderAction {
    Improve,
    MakeConcise,
    ImproveTone,
    AddExamples,
    Fix,
    StrengthenSafety,
    AddPersonality,
    EnhanceWithOrgData,
}
```

Location: `src/shared/ai_builder/types/ai_builder_action_type.rs`
Methods:
- `.label()` — UI button text
- `.template_suffix()` — filename suffix for prompt template lookup

Constant: `ALL_AI_BUILDER_ACTIONS` — array of all actions for toolbar iteration
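A hypothetical, trimmed-down mirror of the action enum shows how the toolbar iterates the constant and how the two methods relate. The real labels, suffixes, and full variant list live in `ai_builder_action_type.rs`; everything below is an assumption for illustration:

```rust
// Simplified stand-in for AiBuilderAction and its helper methods.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum AiBuilderAction { Improve, MakeConcise, ImproveTone }

impl AiBuilderAction {
    // UI button text.
    pub fn label(&self) -> &'static str {
        match self {
            Self::Improve => "Improve",
            Self::MakeConcise => "Make concise",
            Self::ImproveTone => "Improve tone",
        }
    }

    // Filename suffix used for prompt template lookup.
    pub fn template_suffix(&self) -> &'static str {
        match self {
            Self::Improve => "improve",
            Self::MakeConcise => "make_concise",
            Self::ImproveTone => "improve_tone",
        }
    }
}

// Analogous to ALL_AI_BUILDER_ACTIONS: the toolbar renders one button per entry.
pub const ACTIONS: [AiBuilderAction; 3] = [
    AiBuilderAction::Improve,
    AiBuilderAction::MakeConcise,
    AiBuilderAction::ImproveTone,
];
```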
### Request Types

#### AiBuilderGenerateRequest

Plain text generation from natural language description.

```rust
pub struct AiBuilderGenerateRequest {
    pub area: AiBuilderArea,
    pub user_description: String,
}
```

Location: `src/shared/ai_builder/types/generate_request_type.rs`
#### AiBuilderEditRequest

Plain text editing with predefined action.

```rust
pub struct AiBuilderEditRequest {
    pub area: AiBuilderArea,
    pub action: AiBuilderAction,
    pub current_instructions: String,
}
```

Location: `src/shared/ai_builder/types/edit_request_type.rs`
#### AiBuilderCustomEditRequest

Plain text editing with custom user instruction.

```rust
pub struct AiBuilderCustomEditRequest {
    pub area: AiBuilderArea,
    pub edit_instruction: String,
    pub current_instructions: String,
}
```

Location: `src/shared/ai_builder/types/custom_edit_request_type.rs`
#### TextAgentGenerateRequest

Structured generation for text agents.

```rust
pub struct TextAgentGenerateRequest {
    pub user_description: String,
}
```

Location: `src/shared/ai_builder/types/text_agent_generate_request_type.rs`
#### TextAgentEditRequest

Structured editing with predefined action.

```rust
pub struct TextAgentEditRequest {
    pub action: AiBuilderAction,
    pub current: TextAgentAiResponse,
}
```

Location: `src/shared/ai_builder/types/text_agent_edit_request_type.rs`
#### TextAgentCustomEditRequest

Structured editing with custom instruction.

```rust
pub struct TextAgentCustomEditRequest {
    pub edit_instruction: String,
    pub current: TextAgentAiResponse,
}
```

Location: `src/shared/ai_builder/types/text_agent_custom_edit_request_type.rs`
### Response Types

#### AiBuilderResponse

Plain text response.

```rust
pub struct AiBuilderResponse {
    pub instructions: String,
}
```

Location: `src/shared/ai_builder/types/ai_response_type.rs`
#### TextAgentAiResponse

Structured 4-field response for text agents.

```rust
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq, schemars::JsonSchema)]
#[schemars(deny_unknown_fields)]
pub struct TextAgentAiResponse {
    pub purpose: String,
    pub custom_instructions: String,
    pub escalation_instructions: String,
    pub restricted_topics: String,
}
```

Location: `src/shared/ai_builder/types/text_agent_ai_response_type.rs`
Note: the `schemars::JsonSchema` derive is required for `.schema::<T>()` in aisdk.
## Components

### AiGenerateInstructions

NLP input section for generating instructions from natural language.
Props:
- `area: AiBuilderArea` — which entity type
- `on_generated: EventHandler<String>` — callback when AI completes
- `is_loading: Signal<bool>` — loading state signal

Location: `src/shared/ai_builder/components/ai_generate_instructions_component.rs`

Usage:

```rust
AiGenerateInstructions {
    area: AiBuilderArea::VoiceAgent,
    on_generated: move |generated: String| {
        instructions.set(generated);
    },
    is_loading: ai_loading,
}
```

### AiInstructionToolbar
Action button toolbar for editing existing instructions.

Props:

- `area: AiBuilderArea`
- `current_instructions: String`
- `on_edited: EventHandler<String>`

Location: `src/shared/ai_builder/components/ai_instruction_toolbar_component.rs`

Usage:

```rust
AiInstructionToolbar {
    area: AiBuilderArea::VoiceAgent,
    current_instructions: instructions(),
    on_edited: move |edited: String| {
        instructions.set(edited);
    },
}
```

### InstructionPreview
Markdown preview panel for instructions.

Props:

- `instructions: String`

Location: `src/shared/ai_builder/components/instruction_preview_component.rs`
### InstructionsSkeleton

Shimmer loading skeleton shown during AI generation.

Location: `src/shared/ai_builder/components/ai_generate_instructions_component.rs`
## Prompt Template Conventions

All prompt templates follow these conventions:
### 1. Role Definition

Start with the AI's specialized role:

```
You are an expert prompt engineer specializing in production-ready voice AI agents...
```

### 2. Output Format Specification
Explicitly state what the AI should return:

```
IMPORTANT: Output ONLY the raw system prompt — no explanations, no meta-commentary, no wrapping.
```

For structured output:

```
IMPORTANT: You MUST respond with a valid JSON object matching the schema provided.
```

### 3. Structural Requirements
Define headings, sections, and formatting:

```
## Required output structure

Produce the system prompt with ALL of the following labeled sections:

## 1. Role & Objective
## 2. Personality & Tone
## 3. Conversation Flow
...
```

### 4. Domain-Specific Constraints
Include behavioral rules tailored to the entity type:

```
- ASK AT MOST ONE QUESTION PER TURN.
- DO NOT ASK MULTI-PART OR COMPOUND QUESTIONS.
- Keep each response to 2-3 sentences.
```

### 5. Tool Guidance
If the entity uses tools, include usage rules:

```
## 5. Tools

- **query_knowledge** — when to use, preamble, failure behavior
- **transfer_call** — when to use, preamble
```

### 6. Safety & Escalation
Define escalation thresholds and sensitive data rules:

```
- 2 failed tool attempts → escalate
- Caller requests human → IMMEDIATELY transfer
- NEVER solicit passwords or SSNs
```

### 7. Examples and Sample Phrases
Provide concrete examples of expected output:

```
Sample phrases:
- "Let me look that up for you."
- "One moment while I pull up your information."
```

## Organization Context
The AI Builder automatically enriches prompts with organization-specific context when available:
- Voice agents: Appended to user description before generation
- Text agents: Appended to user description in both generation and `EnhanceWithOrgData` editing
- Analyzers: Appended to user description, enabling domain-specific analysis categories
- Source: `crate::mods::agent::build_organization_context(org_id)`
Organization context may include:
- Business name, address, hours
- Services offered
- FAQs and knowledge base snippets
This allows the AI to generate domain-specific, contextually relevant instructions without requiring users to manually include this information.
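The enrichment step itself is straightforward: if the organization context is non-empty, it is appended to the user's description under an `## Organization Data` heading. A standalone sketch of that logic (the function name is illustrative; the same construction appears inline in the endpoint handlers):

```rust
// Append organization context to the user's description, matching the
// format used when building the AI prompt. Empty context leaves the
// description untouched.
fn enrich_with_org_context(user_description: &str, org_context: &str) -> String {
    if org_context.is_empty() {
        user_description.to_string()
    } else {
        format!(
            "{}\n\n## Organization Data\n\n{}",
            user_description, org_context
        )
    }
}
```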
## API Endpoints

### Plain Text (Voice Agents, Analyzers)

| Endpoint | Method | Purpose |
|---|---|---|
| `/api/ai-builder/generate` | POST | Generate instructions from description |
| `/api/ai-builder/edit` | POST | Edit instructions with predefined action |
| `/api/ai-builder/custom-edit` | POST | Edit instructions with custom instruction |
The `area` field in the request body (`VoiceAgent` or `Analyzer`) determines which prompt templates are used.
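That dispatch can be pictured as a simple match from area to template path. This is a sketch with a hypothetical two-variant enum; the real lookup lives in `prompts/mod.rs` and uses `include_str!` over the full `AiBuilderArea`:

```rust
// Hypothetical dispatch from area to the prompt template used for generation.
// Note the analyzer templates live in an `analyzer/` subdirectory.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Area { VoiceAgent, Analyzer }

fn generation_template(area: Area) -> &'static str {
    match area {
        Area::VoiceAgent => "voice_agent_generate.md",
        Area::Analyzer => "analyzer/analyzer_generate.md",
    }
}
```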
### Structured (Text Agents)

| Endpoint | Method | Purpose |
|---|---|---|
| `/api/ai-builder/text-agent/generate` | POST | Generate all 4 fields from description |
| `/api/ai-builder/text-agent/edit` | POST | Edit all 4 fields with predefined action |
| `/api/ai-builder/text-agent/custom-edit` | POST | Edit all 4 fields with custom instruction |
All endpoints require authentication and check area-specific permissions via `check_area_permission`.
## Next Steps

- Voice Agent Configuration — how voice agents use AI Builder
- Text Agent AI Instruction Builder — how text agents use AI Builder
- Analyzer Module — analyzer domain module
- Agent Module — voice agent domain module
- Messaging Module — text agent domain module