# Providers
Provider orchestration and model loading as implemented in `packages/nikcli/src/provider/provider.ts`.
## Provider Stack

Providers are loaded via `Provider.list()` from bundled SDKs plus optional plugins. Custom loader rules are defined in `CUSTOM_LOADERS` and influence both autoloading and model resolution. The OpenAI and GitHub Copilot routes use the Responses API for GPT-5+ models when available.
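The loader registry can be pictured as a map from provider ID to per-provider rules. This is a hypothetical sketch of that shape; the real `CUSTOM_LOADERS` structure in `provider.ts` may differ:

```typescript
// Hypothetical sketch of a loader-rule registry; field names are assumptions.
type LoaderRule = {
  autoload?: boolean // may this provider load without explicit config?
  options?: (modelID: string) => Record<string, unknown> // extra SDK options
}

const CUSTOM_LOADERS: Record<string, LoaderRule> = {
  anthropic: { autoload: true },
  nikcli: { autoload: true },
}

// Providers without a rule fall back to default behavior.
function ruleFor(providerID: string): LoaderRule {
  return CUSTOM_LOADERS[providerID] ?? {}
}
```

The lookup falls back to an empty rule so that providers without custom behavior still load through the default path.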
## Bundled Providers
| Provider ID | Notes |
|---|---|
| `@ai-sdk/amazon-bedrock` | Amazon Bedrock |
| `@ai-sdk/anthropic` | Anthropic |
| `@ai-sdk/azure` | Azure OpenAI |
| `@ai-sdk/google` | Google Generative AI |
| `@ai-sdk/google-vertex` | Google Vertex AI |
| `@ai-sdk/google-vertex/anthropic` | Vertex Anthropic |
| `@ai-sdk/openai` | OpenAI |
| `@ai-sdk/openai-compatible` | OpenAI-Compatible |
| `@openrouter/ai-sdk-provider` | OpenRouter |
| `@ai-sdk/xai` | xAI |
| `@ai-sdk/mistral` | Mistral |
| `@ai-sdk/groq` | Groq |
| `@ai-sdk/deepinfra` | DeepInfra |
| `@ai-sdk/cerebras` | Cerebras |
| `@ai-sdk/cohere` | Cohere |
| `@ai-sdk/gateway` | AI Gateway |
| `@ai-sdk/togetherai` | Together AI |
| `@ai-sdk/perplexity` | Perplexity |
| `@ai-sdk/vercel` | Vercel AI Gateway |
| `@gitlab/gitlab-ai-provider` | GitLab AI Provider |
| `@ai-sdk/github-copilot` | GitHub Copilot (compat) |
## Custom Loaders
### anthropic
Injects Claude Code and interleaved-thinking beta headers into requests.
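The header injection amounts to merging an `anthropic-beta` header into whatever headers are already set. A minimal sketch, where the beta flag values are placeholders rather than the exact strings the loader sends:

```typescript
// Illustrative header merge for the anthropic loader. The beta flag values
// below are placeholders, not the exact strings provider.ts uses.
const ANTHROPIC_BETAS = ["claude-code-placeholder", "interleaved-thinking-placeholder"]

function withAnthropicHeaders(headers: Record<string, string> = {}): Record<string, string> {
  // Preserve caller-supplied headers; add the comma-joined beta flags.
  return { ...headers, "anthropic-beta": ANTHROPIC_BETAS.join(",") }
}
```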
### openai / github-copilot
Uses the Responses API for GPT-5+ models (except `gpt-5-mini`).
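The routing rule reduces to a small predicate on the model ID. This sketch (the helper name and version-parsing approach are ours, not the real implementation) captures the stated behavior:

```typescript
// Model-routing predicate: GPT-5 and later go through the Responses API,
// with gpt-5-mini explicitly excluded. Parsing scheme is an assumption.
function usesResponsesAPI(modelID: string): boolean {
  if (modelID === "gpt-5-mini") return false
  const match = modelID.match(/^gpt-(\d+)/)
  return match !== null && Number(match[1]) >= 5
}
```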
### azure
Chooses between the Responses API and Chat Completions endpoints based on `useCompletionUrls`.
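A sketch of that switch, assuming `useCompletionUrls: true` selects the deployment-scoped chat-completions URL; the config fields and URL shapes here are illustrative, not copied from `provider.ts`:

```typescript
// Hypothetical Azure endpoint selection keyed on useCompletionUrls.
interface AzureConfig {
  resourceName: string
  useCompletionUrls?: boolean
}

function azureEndpoint(cfg: AzureConfig, deployment: string): string {
  const base = `https://${cfg.resourceName}.openai.azure.com/openai`
  return cfg.useCompletionUrls
    ? `${base}/deployments/${deployment}/chat/completions` // legacy chat route
    : `${base}/responses` // Responses API route
}
```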
### amazon-bedrock
Supports bearer tokens and the AWS credential chain, with region and profile overrides.
### nikcli
Filters out paid models when no API key is present.
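The filter keeps only zero-cost models until a key is configured. A minimal sketch, where the `cost` field shape is an assumption about the model metadata:

```typescript
// Hide paid models when no API key is configured; field names are assumed.
interface ModelInfo {
  id: string
  cost?: { input: number; output: number }
}

function visibleModels(models: ModelInfo[], hasApiKey: boolean): ModelInfo[] {
  if (hasApiKey) return models
  // No key: keep models with no cost entry or an all-zero cost.
  return models.filter((m) => !m.cost || (m.cost.input === 0 && m.cost.output === 0))
}
```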
## Model Catalog
Model metadata is sourced from models.dev and cached under `Global.Path.cache/models.json`. Refresh it with `nikcli models --refresh`.
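The cache logic amounts to read-through caching with a forced-refresh escape hatch. A minimal sketch: a temp directory stands in for `Global.Path.cache`, and the fetcher is injected so the example runs offline; none of these names come from the real code.

```typescript
// Read-through cache sketch for the models.dev catalog.
import fs from "node:fs"
import os from "node:os"
import path from "node:path"

// Stand-in for Global.Path.cache/models.json.
const CACHE = path.join(fs.mkdtempSync(path.join(os.tmpdir(), "nikcli-")), "models.json")

async function loadModels(fetcher: () => Promise<unknown>, refresh = false): Promise<unknown> {
  // Serve from disk unless a refresh is forced or the cache is missing.
  if (!refresh && fs.existsSync(CACHE)) {
    return JSON.parse(fs.readFileSync(CACHE, "utf8"))
  }
  const data = await fetcher()
  fs.writeFileSync(CACHE, JSON.stringify(data))
  return data
}
```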
## Auth & Env
Credential storage is handled by `packages/nikcli/src/auth` and surfaced by `nikcli auth`.
Bedrock prioritizes bearer tokens, then falls back to the AWS credential chain (profile, access keys, IAM roles).
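That priority order can be sketched as a small resolver. The env var and type names below are illustrative (AWS does support `AWS_BEARER_TOKEN_BEDROCK` for Bedrock API keys, but whether the loader reads exactly these variables is an assumption):

```typescript
// Illustrative Bedrock credential resolution: bearer token wins, otherwise
// defer to the standard AWS credential chain. Names are assumptions.
type BedrockAuth =
  | { kind: "bearer"; token: string }
  | { kind: "aws-chain"; profile?: string; region?: string }

function resolveBedrockAuth(env: Record<string, string | undefined>): BedrockAuth {
  if (env.AWS_BEARER_TOKEN_BEDROCK) {
    return { kind: "bearer", token: env.AWS_BEARER_TOKEN_BEDROCK }
  }
  // Fall back to the chain: profile, access keys, IAM roles, with overrides.
  return { kind: "aws-chain", profile: env.AWS_PROFILE, region: env.AWS_REGION }
}
```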