providers - composable LLM provider system
OVERVIEW
SLICC supports multiple LLM providers through a composable provider system built on pi-ai. Providers are auto-discovered at startup — most need no configuration files at all. Built-in support includes Anthropic, OpenAI, Google, AWS Bedrock, and others. Custom providers can be added for corporate proxies, OAuth-based access, or OpenAI-compatible endpoints.
MODEL ID ALIASES
Always use pi-ai aliases for model IDs, not dated snapshot names. For example, use claude-opus-4-6 instead of a versioned snapshot ID. Pi-ai resolves aliases to the correct underlying model automatically.
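As a quick illustration (the dated snapshot ID in the comment below is hypothetical, shown only as the pattern to avoid):

```typescript
// Prefer the pi-ai alias; pi-ai resolves it to the correct underlying model.
const model = 'claude-opus-4-6';
// Avoid hard-coding a dated snapshot ID (hypothetical example):
// const model = 'claude-opus-4-6-20260115';
```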
PROVIDER DISCOVERY
Providers come from three sources, in order of precedence:
1. Pi-ai auto-discovery: getProviders() returns all pi-ai providers automatically. No files needed. Filtered at build time by packages/dev-tools/providers.build.json (include: ["*"] = all, exclude: ["*"] = none). Provider-settings generates a fallback config (display name derived from the ID, requiresApiKey: true) so the provider appears in the Settings UI automatically.
2. Built-in extensions: packages/webapp/src/providers/built-in/*.ts — only for providers needing custom register() functions (e.g., bedrock-camp). Also filtered by providers.build.json.
3. External providers: packages/webapp/providers/*.ts (gitignored) — always included, never filtered. Used for custom OAuth providers, corporate proxies, etc.
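The exact schema of providers.build.json is not documented here; a plausible sketch, assuming only the include/exclude globs described above:

```json
{
  "include": ["*"],
  "exclude": ["some-unwanted-provider"]
}
```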
Each provider module exports a config: ProviderConfig and optionally a register(): void function for custom stream functions.
PROVIDER COMPOSITION
Model capabilities are resolved through a three-layer merge, each layer overriding the previous:
- Pi-ai defaults — base model metadata from the pi-ai registry.
- modelOverrides (static) — per-model capability overrides declared in the provider config. Useful for config-only providers that need custom context windows or capability flags.
- getModelIds (dynamic) — when present, replaces the default model list. Each ID is resolved against the pi-ai registry; unknown IDs get fallback model objects. Can return ModelMetadata fields per model. Set api: 'openai' to route a model through streamOpenAICompletions.
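The merge order can be sketched in plain TypeScript. The ModelMeta type and field names below are simplified assumptions for illustration, not the actual SLICC/pi-ai types; only the layering behavior is taken from the description above:

```typescript
// Simplified stand-in for pi-ai model metadata (assumed field names).
interface ModelMeta {
  contextWindow?: number;
  vision?: boolean;
  api?: 'anthropic' | 'openai';
}

// Layer 1: base metadata from the pi-ai registry (values are illustrative).
const piAiDefaults: ModelMeta = { contextWindow: 200_000, vision: true };

// Layer 2: static per-model override declared in the provider config.
const modelOverrides: ModelMeta = { contextWindow: 128_000 };

// Layer 3: dynamic metadata returned by getModelIds().
const dynamicMeta: ModelMeta = { api: 'openai' };

// Later layers override earlier ones, field by field.
const resolved: ModelMeta = { ...piAiDefaults, ...modelOverrides, ...dynamicMeta };
// resolved: { contextWindow: 128000, vision: true, api: 'openai' }
```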
OPENAI-COMPATIBLE MODELS
Models that use the OpenAI completions API (instead of Anthropic messages) are routed through streamOpenAICompletions when api: 'openai' is set in model metadata via getModelIds(). This allows a single provider to serve both Anthropic-style and OpenAI-style models through the same config.
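A sketch of such a mixed provider's getModelIds(); the return shape and model IDs below are assumptions based on the description above:

```typescript
type Api = 'anthropic' | 'openai';

// Assumed shape of a dynamic model entry.
interface DynamicModel {
  id: string;
  api?: Api; // 'openai' routes the model through streamOpenAICompletions
}

// One provider serving both API styles (model IDs are hypothetical).
function getModelIds(): DynamicModel[] {
  return [
    { id: 'my-anthropic-style-model' },             // default routing
    { id: 'my-openai-style-model', api: 'openai' }, // OpenAI completions path
  ];
}
```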
MODEL CAPABILITIES
Providers can override per-model capabilities including context window size, vision support, tool use, streaming behavior, and maximum output tokens. Override via modelOverrides (static, declared in config) or through getModelIds() metadata (dynamic, per-model).
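For the static form, a modelOverrides sketch; the capability field names and values are assumptions drawn from the capability list above:

```typescript
// Hypothetical per-model capability overrides, keyed by model ID.
const modelOverrides = {
  'my-model-large': { contextWindow: 1_000_000, maxOutputTokens: 64_000 },
  'my-model-mini': { vision: false, toolUse: true },
};
```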
OAUTH INTEGRATION
Providers that use OAuth set isOAuth: true in their config and implement onOAuthLogin and onOAuthLogout handlers. The OAuth launcher is created via createOAuthLauncher() from packages/webapp/src/providers/oauth-service.ts. Tokens retrieved via OAuth can also be used in the shell with oauth-token.
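A sketch of an OAuth provider config; the handler signatures are assumptions, since only the field names isOAuth, onOAuthLogin, and onOAuthLogout appear above:

```typescript
// Hypothetical OAuth provider config (handler bodies are placeholders).
const config = {
  id: 'my-oauth-provider',
  name: 'My OAuth Provider',
  isOAuth: true,
  onOAuthLogin: async (): Promise<void> => {
    // Launch the browser flow (e.g., via createOAuthLauncher()) and store tokens.
  },
  onOAuthLogout: async (): Promise<void> => {
    // Revoke and clear stored tokens.
  },
};
```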
REGISTRATION
Provider registration runs in both entry points to ensure providers are available in all runtime contexts:
- packages/webapp/src/ui/main.ts — standalone/CLI mode.
- packages/chrome-extension/src/offscreen.ts — extension mode (the agent engine runs in an offscreen document).
Both files import ../providers/index.js which auto-discovers and registers all built-in and external providers at module load time.
SWITCHING MODELS
The agent can use different models for different tasks, and switching providers or models mid-conversation is supported. Model selection is configured in the Settings UI, and agents may be directed to use specific models via conversation context.
ADDING A PROVIDER
Most providers need no files — pi-ai auto-discovery handles them. Only create a provider file if you need custom stream functions or OAuth. For external providers, create a file in packages/webapp/providers/:
// packages/webapp/providers/my-provider.ts
import type { ProviderConfig } from '../src/providers/types.js';

export const config: ProviderConfig = {
  id: 'my-provider',
  name: 'My Provider',
  description: 'Models via My Provider API',
  requiresApiKey: true,
  requiresBaseUrl: false,
};

// Optional: only needed for custom stream functions.
export function register(): void {
  // registerApiProvider({ api: ..., stream: ..., streamSimple: ... });
}
SEE ALSO
man skill — skill system for extending agent capabilities. man scoop — sub-agent isolation and orchestration.