
# AI Services

Loquent routes all AI calls — background services, text agents, and the instruction builder — through OpenRouter. A single API key accesses models from OpenAI, Google, and Anthropic. Admins can override the model used for any AI area from the admin panel without code changes.

The `openrouter_api_key` lives in the `core_conf` database table. Load it via the standard config pattern:

```rust
use crate::bases::conf::get_core_conf;
use crate::mods::ai::OpenRouterCoreConf;

let conf: OpenRouterCoreConf = get_core_conf().await?;
```

Set the key during setup by adding `OPENROUTER_API_KEY` to `seed.env`. The migration seeds it into `core_conf` automatically.

Each background AI task maps to an `AiArea`: a named slot that resolves to an OpenRouter model ID at runtime. The `AiModels` struct holds the hardcoded defaults, and the `ai_model_config` table stores admin overrides.

| Area | Default Model | Tier |
| --- | --- | --- |
| Auto Tag Contact | `deepseek/deepseek-v3.2` | Simple |
| Identify Speakers | `deepseek/deepseek-v3.2` | Simple |
| Knowledge Query | `google/gemini-3.1-flash-lite-preview` | Simple |
| Summarize Call | `google/gemini-3.1-pro-preview` | Medium |
| Update System Note | `google/gemini-3.1-pro-preview` | Medium |
| Enrich Contact | `deepseek/deepseek-v3.2` | Medium |
| Enrich Contact from Messages | `google/gemini-3.1-flash-lite-preview` | Medium |
| Extract Todos | `google/gemini-3.1-pro-preview` | Medium |
| Analyze Call | `google/gemini-3.1-pro-preview` | Medium |
| Execute Todo | `anthropic/claude-sonnet-4.6` | Complex |
| Generate Instructions | `anthropic/claude-sonnet-4.6` | Complex |
| Edit Instructions | `anthropic/claude-sonnet-4.6` | Complex |
| Custom Edit Instructions | `anthropic/claude-sonnet-4.6` | Complex |
| Execute Plan | `anthropic/claude-sonnet-4.6` | Complex |
| Execute Plan (Fallback) | `google/gemini-3.1-pro-preview` | Complex |
| Create Plan from Call | `anthropic/claude-opus-4.6` | Plan Creator |
| Create Plan from SMS | `anthropic/claude-opus-4.6` | Plan Creator |
| Create Plan from Template | `anthropic/claude-opus-4.6` | Plan Creator |
| Text Agent Suggestions | `google/gemini-3.1-pro-preview` | Text Agent |
| Platform Assistant (Vernis) | `google/gemini-3.1-flash-lite-preview` | Assistant |

The `AiArea` enum identifies each configurable AI slot. Every variant has a stable string key (stored in the database), a human-readable label, a default model, and a complexity tier.

src/mods/ai/types/ai_models_type.rs

```rust
pub enum AiArea {
    AutoTagContact,   // key: "auto_tag_contact"
    IdentifySpeakers, // key: "identify_speakers"
    KnowledgeQuery,   // key: "knowledge_query"
    SummarizeCall,    // key: "summarize_call"
    // ... 16 more variants
}

impl AiArea {
    pub fn key(&self) -> &'static str { /* ... */ }
    pub fn label(&self) -> &'static str { /* ... */ }
    pub fn default_model(&self) -> &'static str { /* ... */ }
    pub fn tier(&self) -> &'static str { /* ... */ }
    pub fn from_key(key: &str) -> Option<Self> { /* ... */ }
}
```

`resolve_model()` is the single entry point every AI service uses to determine which model to call. It checks the `ai_model_config` table for an admin override; if none exists, it returns the hardcoded default.

src/mods/ai/services/resolve_model_service.rs

```rust
pub async fn resolve_model(area: AiArea) -> Result<String, AppError>
```

Behavior:

  1. Queries `ai_model_config` for a row matching `area.key()`
  2. If found → returns the stored `model_id`
  3. If not found → returns `area.default_model()`
  4. If the table doesn't exist yet (pre-migration) → warns and returns the default

Services use it like this:

```rust
use aisdk::core::DynamicModel;
use aisdk::providers::openrouter::Openrouter;
use crate::mods::ai::{AiArea, resolve_model};

let model_id = resolve_model(AiArea::SummarizeCall).await?;
let model = Openrouter::<DynamicModel>::builder()
    .api_key(conf.openrouter_api_key)
    .model_name(&model_id)
    .build()?;
```

Super admins can change the model for any AI area from Admin → AI Models.

GET /api/admin/ai-model-configs

Returns all 20 AI areas with their current model, default model, override status, and who last changed it. Requires super admin.

Response:

```rust
struct AiModelConfigData {
    entries: Vec<AiModelConfigEntry>,
}

struct AiModelConfigEntry {
    area: String,               // "summarize_call"
    label: String,              // "Summarize Call"
    tier: String,               // "Medium"
    model_id: String,           // Current active model
    default_model_id: String,   // Hardcoded default
    is_default: bool,           // true if no override exists
    updated_by: Option<String>, // Name of admin who last changed it
    updated_at: Option<String>, // When it was last changed
}
```
PUT /api/admin/ai-model-config

Sets the model for one AI area. If the new model matches the default, the override row is deleted (reset to default). Requires super admin.

Request:

```rust
struct UpdateAiModelConfig {
    area: String,     // "summarize_call"
    model_id: String, // "openai/gpt-4.1"
}
```

Both endpoints are audit-logged via `record_admin_audit_entry` with action types `config.ai_model.update` and `config.ai_model.reset`.
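The update-or-reset choice reduces to one string comparison against the area's default. A minimal sketch (the `ConfigChange` enum and `decide` helper are illustrative, not the real handler code):

```rust
/// Which persistence step a PUT /api/admin/ai-model-config request maps to.
#[derive(Debug, PartialEq)]
enum ConfigChange {
    /// Upsert the override row; audited as "config.ai_model.update".
    Update,
    /// Delete the override row; audited as "config.ai_model.reset".
    Reset,
}

// If the requested model matches the area's hardcoded default,
// the override row is removed instead of stored.
fn decide(requested_model: &str, default_model: &str) -> ConfigChange {
    if requested_model == default_model {
        ConfigChange::Reset
    } else {
        ConfigChange::Update
    }
}

fn main() {
    // "summarize_call" defaults to google/gemini-3.1-pro-preview.
    assert_eq!(
        decide("openai/gpt-4.1", "google/gemini-3.1-pro-preview"),
        ConfigChange::Update
    );
    // Setting it back to the default deletes the override.
    assert_eq!(
        decide("google/gemini-3.1-pro-preview", "google/gemini-3.1-pro-preview"),
        ConfigChange::Reset
    );
}
```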

The `ai_model_config` table stores only overrides; areas using the default have no row.

| Column | Type | Notes |
| --- | --- | --- |
| `id` | UUID | Primary key |
| `area` | TEXT | Unique; matches `AiArea::key()` |
| `model_id` | TEXT | OpenRouter model identifier |
| `updated_by` | UUID | FK to `user.id`, cascade delete |
| `updated_at` | TIMESTAMPTZ | Auto-set on upsert |

The upsert uses `INSERT ... ON CONFLICT (area) DO UPDATE`.
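Assuming a Postgres backend with `gen_random_uuid()` available, the statement can be sketched as a Rust constant (a hypothetical reconstruction from the schema above, not the migration's exact SQL):

```rust
// Hypothetical sketch of the ai_model_config upsert, reconstructed from the
// documented schema; the real service's SQL may differ.
const UPSERT_AI_MODEL_CONFIG: &str = "\
INSERT INTO ai_model_config (id, area, model_id, updated_by, updated_at)
VALUES (gen_random_uuid(), $1, $2, $3, now())
ON CONFLICT (area) DO UPDATE
SET model_id = EXCLUDED.model_id,
    updated_by = EXCLUDED.updated_by,
    updated_at = now()";

fn main() {
    // The conflict target is the unique `area` column.
    assert!(UPSERT_AI_MODEL_CONFIG.contains("ON CONFLICT (area) DO UPDATE"));
}
```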

Every AI service follows the same pattern of loading the config, resolving the model, and sending the request:

```rust
use aisdk::core::{DynamicModel, LanguageModelRequest};
use aisdk::providers::openrouter::Openrouter;
use crate::bases::conf::get_core_conf;
use crate::mods::ai::{AiArea, OpenRouterCoreConf, resolve_model};

let conf: OpenRouterCoreConf = get_core_conf().await?;
let model_id = resolve_model(AiArea::EnrichContact).await?;

let model = Openrouter::<DynamicModel>::builder()
    .api_key(conf.openrouter_api_key)
    .model_name(&model_id)
    .build()
    .map_err(|e| AppError::Internal(e.to_string()))?;

let response = LanguageModelRequest::builder()
    .model(model)
    .system("Your system prompt")
    .prompt("User input")
    .build()
    .generate_text()
    .await
    .map_err(|e| AppError::Internal(e.to_string()))?;
```

For type-safe responses, define a schema struct and pass it to `.schema::<T>()`:

```rust
#[derive(schemars::JsonSchema, serde::Deserialize, Debug)]
#[schemars(deny_unknown_fields)]
struct CallerInfo {
    first_name: String,
    last_name: String,
    email: Option<String>,
}

let response = LanguageModelRequest::builder()
    .model(model)
    .system("Extract caller information from the transcription.")
    .prompt(transcription)
    .schema::<CallerInfo>()
    .build()
    .generate_text()
    .await?;

let caller: CallerInfo = response.into_schema()?;
```

Text agents let users pick their model from the UI. The `TextAgentModel` enum in `src/mods/text_agent/types/text_agent_model_type.rs` defines the available options:

| Model | OpenRouter ID | Provider |
| --- | --- | --- |
| GPT-4.1 | `openai/gpt-4.1` | OpenAI |
| GPT-5.3 Chat | `openai/gpt-5.3-chat` | OpenAI |
| Gemini 3 Flash | `google/gemini-3-flash-preview` | Google |
| Gemini 3.1 Pro | `google/gemini-3.1-pro-preview` | Google |
| Gemini 3.1 Flash Lite | `google/gemini-3.1-flash-lite-preview` | Google |
| Claude Sonnet 4.6 | `anthropic/claude-sonnet-4.6` | Anthropic |

At runtime, `model.openrouter_id()` converts the enum to the OpenRouter model string. Old model keys (from before the migration) are mapped to their closest equivalents via `FromStr`.
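A trimmed sketch of that pair of conversions (the variant set is reduced to two, and the legacy key `"gpt-4"` is a hypothetical example, not necessarily one of the real pre-migration keys):

```rust
use std::str::FromStr;

#[derive(Debug, PartialEq, Clone, Copy)]
enum TextAgentModel {
    Gpt41,
    ClaudeSonnet46,
}

impl TextAgentModel {
    // Enum variant -> OpenRouter model string.
    fn openrouter_id(&self) -> &'static str {
        match self {
            TextAgentModel::Gpt41 => "openai/gpt-4.1",
            TextAgentModel::ClaudeSonnet46 => "anthropic/claude-sonnet-4.6",
        }
    }
}

impl FromStr for TextAgentModel {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            // Current keys round-trip exactly.
            "openai/gpt-4.1" => Ok(TextAgentModel::Gpt41),
            "anthropic/claude-sonnet-4.6" => Ok(TextAgentModel::ClaudeSonnet46),
            // Hypothetical legacy key mapped to its closest current equivalent.
            "gpt-4" => Ok(TextAgentModel::Gpt41),
            other => Err(format!("unknown text agent model: {other}")),
        }
    }
}

fn main() {
    assert_eq!(TextAgentModel::Gpt41.openrouter_id(), "openai/gpt-4.1");
    // Legacy key resolves to the nearest current model.
    assert_eq!("gpt-4".parse::<TextAgentModel>(), Ok(TextAgentModel::Gpt41));
    assert!("bogus".parse::<TextAgentModel>().is_err());
}
```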

| File | Purpose |
| --- | --- |
| `src/mods/ai/types/openrouter_core_conf_type.rs` | Config struct for API key |
| `src/mods/ai/types/ai_models_type.rs` | `AiModels` defaults + `AiArea` enum |
| `src/mods/ai/services/resolve_model_service.rs` | Runtime model resolution (DB override → default) |
| `src/mods/admin/api/get_ai_model_configs_api.rs` | List all model configs (super admin) |
| `src/mods/admin/api/update_ai_model_config_api.rs` | Update model for an area (super admin) |
| `src/mods/admin/services/admin_ai_model_config_service.rs` | Model config service (get/upsert/reset) |
| `src/mods/admin/components/admin_ai_models_tab_component.rs` | Admin UI for model selection |
| `migration/src/m20260320_130000_create_ai_model_config_table.rs` | `ai_model_config` table migration |
| `src/mods/text_agent/types/text_agent_model_type.rs` | User-facing model enum |