# Providers and Models
Lectic speaks to several providers. You pick a provider and a model in your YAML header, or let Lectic choose a default based on which credentials are available in your environment.
## Picking a default provider
If you do not set `provider`, Lectic checks for keys in this order and uses the first one it finds:
Anthropic → Gemini → OpenAI → OpenRouter.
Set one of these environment variables before you run Lectic:
- `ANTHROPIC_API_KEY`
- `GEMINI_API_KEY`
- `OPENAI_API_KEY`
- `OPENROUTER_API_KEY`
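The selection order above can be sketched as a small shell function. This is an illustration of the documented order, not Lectic's actual implementation:

```shell
# Illustrative sketch of the documented key-check order
# (Anthropic -> Gemini -> OpenAI -> OpenRouter); not Lectic's code.
pick_default_provider() {
  if   [ -n "$ANTHROPIC_API_KEY" ];  then echo anthropic
  elif [ -n "$GEMINI_API_KEY" ];     then echo gemini
  elif [ -n "$OPENAI_API_KEY" ];     then echo openai
  elif [ -n "$OPENROUTER_API_KEY" ]; then echo openrouter
  else echo "no credentials found" >&2; return 1
  fi
}
```

So with only `OPENAI_API_KEY` set, the default is `openai`; with both `GEMINI_API_KEY` and `OPENAI_API_KEY` set, Gemini wins because it is checked first.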
AWS credentials for Bedrock are not used for auto-selection. If you want Anthropic via Bedrock, set `provider: anthropic/bedrock` explicitly and make sure your AWS environment is configured.
## ChatGPT subscription (no API key)
Lectic also supports `provider: codex`, which uses your ChatGPT subscription via the official Codex / ChatGPT OAuth flow.
Notes:

- This provider is not auto-selected (there is no API key environment variable to check). If you want it by default, set `provider: codex` in your `lectic.yaml` or in the `.lec` frontmatter.
- On first use, Lectic opens a browser window for login and stores tokens at `$LECTIC_STATE/codex_auth.json` (for example, `~/.local/state/lectic/codex_auth.json` on Linux).
- Login starts a local callback server on port 1455. If that port is in use, stop the other process and try again.
- If you want to "log out", delete that file.
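On Linux, for example, the "log out" step amounts to deleting the token file. This assumes the default state directory noted above; adjust the path if you have set `LECTIC_STATE`:

```shell
# Remove the cached Codex OAuth tokens ("log out").
# Falls back to the default Linux state directory if
# LECTIC_STATE is unset.
rm -f "${LECTIC_STATE:-$HOME/.local/state/lectic}/codex_auth.json"
```

The next `provider: codex` run will trigger the browser login flow again.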
## Discover models
Not sure which models are available? Run:
```shell
lectic models
```

This queries each provider you have credentials for and prints the available models.
The LSP can also autocomplete model names as you type in the YAML header. See Editor Integration.
`provider: codex` models only show up in `lectic models` after you have logged in at least once (since the login is browser-based, not an API key). If you have not logged in yet, run a `.lec` file with `provider: codex` first.
## OpenAI: two provider strings
OpenAI has two modes in Lectic today.
- `openai` selects the Responses API. Choose this when you want native tools like search and code.
- `openai/chat` selects the legacy Chat Completions API.
## Examples
These examples show the minimal configuration for each provider. You can omit `provider` and `model` if you want Lectic to pick defaults based on your environment.
Anthropic (direct API):

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic
```

Anthropic via Bedrock:

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic/bedrock
  model: anthropic.claude-3-haiku-20240307-v1:0
```

OpenAI (Responses API):

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: openai
```

OpenAI Chat Completions (legacy API):

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: openai/chat
```

Codex (via ChatGPT subscription):

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: codex
  model: gpt-5.1-codex
```

Gemini:

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: gemini
```

OpenRouter:

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: openrouter
  model: meta-llama/llama-3.1-70b-instruct
```

Ollama (local inference):

```yaml
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: ollama
  model: llama3.1
```

## Capabilities and media
Providers differ in what they accept as input. Here’s a rough guide:
| Provider | Text | Images | PDFs | Audio | Video |
|---|---|---|---|---|---|
| Anthropic | ✓ | ✓ | ✓ | ✗ | ✗ |
| Gemini | ✓ | ✓ | ✓ | ✓ | ✓ |
| OpenAI | ✓ | ✓ | ✓ | varies* | ✗ |
| Codex | ✓ | ✓ | ✓ | varies* | ✗ |
| OpenRouter | ✓ | varies by model | varies by model | varies by model | varies by model |
| Ollama | ✓ | varies | ✗ | ✗ | ✗ |
\* Audio support depends on the model. For OpenAI audio workflows you may need `provider: openai/chat` with an audio-capable model.
Support changes quickly. Consult each provider’s documentation for current limits on formats, sizes, and rate limits.
In Lectic, you attach external content by linking files in the user message body. Lectic packages these and sends them to the provider in a way that fits that provider’s API. See External Content for examples and tips.
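For instance, a `.lec` file might attach a PDF with an ordinary Markdown link in the message body. This is a minimal sketch, assuming the usual `---`-delimited YAML frontmatter; the filename is hypothetical, and the exact link handling is covered in External Content:

```markdown
---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: gemini
---

Please summarize [this report](./report.pdf).
```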