Getting Started with Lectic
This guide helps you install Lectic and run your first conversation. Along the way, you will verify your install, set an API key, and see a simple tool in action.
Installation
Choose the method that fits your system.
Homebrew
```bash
brew install gleachkr/lectic/lectic
```

Arch Linux (AUR)
```bash
yay -S lectic-bin
# or: paru -S lectic-bin
```

Linux / macOS quick install (GitHub Releases)
```bash
curl -fsSL https://raw.githubusercontent.com/gleachkr/lectic/main/install.sh \
  | sh
```

To update later, rerun the same installer command.
Linux / macOS manual install (release tarballs)
Download the matching release tarball from the GitHub Releases page, extract it, and put lectic on your PATH.
```bash
tar -xzf lectic-vX.Y.Z-<platform>-<arch>.tar.gz
mkdir -p ~/.local/bin
install -m 755 ./lectic ~/.local/bin/lectic
```

Linux AppImages are also published in the release assets.
Nix
If you use Nix, install directly from the repository:
```bash
nix profile install github:gleachkr/lectic
```

Verify the install

```bash
lectic --version
```

If you see a version number, you are ready to go.
Set up an API key
Lectic talks to LLM providers. Put at least one provider key in your environment:
```bash
export ANTHROPIC_API_KEY="your-api-key-here"
```

Lectic chooses a default provider by checking for keys in this order: Anthropic → Gemini → OpenAI → OpenRouter. You only need one.
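To see at a glance which provider keys your current shell exports, a quick sketch (only ANTHROPIC_API_KEY appears in this guide; the other variable names are assumptions based on each provider's usual convention):

```shell
# List provider keys in the order Lectic checks them.
check_keys() {
  for var in ANTHROPIC_API_KEY GEMINI_API_KEY OPENAI_API_KEY OPENROUTER_API_KEY; do
    eval "val=\${$var}"
    if [ -n "$val" ]; then
      echo "$var: set"
    else
      echo "$var: not set"
    fi
  done
}
check_keys
```

The first key reported as set is the one Lectic's default provider selection will pick up.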
You can also use your ChatGPT subscription with provider: codex. This does not use an API key environment variable, so it is not auto-selected. On first use, Lectic opens a browser window for login and stores tokens at $LECTIC_STATE/codex_auth.json.
The login flow starts a local callback server on port 1455.
Common issue: If you see “No API key found,” make sure you exported the key in the same shell session where you’re running Lectic. If you set it in .bashrc, you may need to restart your terminal or run source ~/.bashrc.
Your first conversation
The conversation format
A Lectic conversation is a markdown file with a YAML header. The header configures the LLM (which we call an “interlocutor”). Everything below the header is the conversation: your messages as plain text, and the LLM’s responses in special :::Name blocks.
Here’s a minimal example:
```
---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
---
What is a fun fact about the Rust programming language?
```

The name identifies who's speaking in the response blocks. The prompt is the system prompt. Lectic picks a default provider and model based on your API keys, so you don't need to specify them.
Create and run
Create a file called hello.lec with the content above, then run:
```bash
lectic -if hello.lec
```

The -i flag updates the file in place, and -f selects the conversation file. Lectic sends your message to the LLM and appends the response:
```
---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
---
What is a fun fact about the Rust programming language?

:::Assistant
Rust's mascot is a crab named Ferris! The name is a pun on
"ferrous," relating to iron (Fe), which connects to "rust." You'll
often see Ferris in Rust documentation and community materials.
:::
```

To continue the conversation, add your next message below the ::: block and run lectic -if hello.lec again.
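Continuations can also be scripted. The sketch below appends a follow-up question to hello.lec from the shell; the follow-up text is illustrative:

```shell
# Append a new message below the assistant's ::: block; the leading
# blank line keeps it separate from the previous response.
cat >> hello.lec <<'EOF'

Where does the name "Ferris" come from?
EOF
# Then rerun: lectic -if hello.lec
```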
Use a tool
Now let’s give the assistant a tool. Create a new file called tools.lec:
```
---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  tools:
    - exec: date
      name: get_date
---
What is today's date?
```

Run it:
```bash
lectic -if tools.lec
```

The response now includes an XML block showing the tool call and its results:
```
:::Assistant
<tool-call with="get_date">
<arguments><argv>[]</argv></arguments>
<results>
<result type="text">
<stdout>Fri Jun 13 10:42:17 PDT 2025</stdout>
</result>
</results>
</tool-call>
Today is Friday, June 13th, 2025.
:::
```

The <tool-call> block is part of the conversation record. It shows what the LLM requested, what arguments it passed, and what came back. Editor plugins typically fold these blocks to reduce clutter.
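Because calls are recorded inline as text, ordinary shell tools can skim them. A small sketch that greps a saved fragment for the with attribute (the sample content mirrors the output above; sample.lec is an illustrative filename):

```shell
# Write a fragment like the one above, then list which tools were invoked.
printf '%s\n' '<tool-call with="get_date">' '</tool-call>' > sample.lec
grep -o 'with="[^"]*"' sample.lec
# prints: with="get_date"
```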
You can load prompts from files or compute them with commands using file: and exec:. See External Prompts.
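The tools key is a list, so more than one command can be exposed to the same interlocutor. A sketch assuming additional entries follow the same exec/name shape shown above (get_os and uname are illustrative):

```
---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  tools:
    - exec: date
      name: get_date
    - exec: uname
      name: get_os
---
```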
Tab completion (optional)
Lectic has an extensible tab completion system that supports standard flags and Custom Subcommands.
To enable it, source the completion script in your shell configuration (e.g., ~/.bashrc):
```bash
# Adjust path to where you cloned/extracted Lectic
source /path/to/lectic/extra/tab_complete/lectic_completion.bash
```

Or place the script in ~/.local/share/bash-completion/completions/.
Troubleshooting
“No API key found” or similar error
Lectic needs at least one provider key in your environment. Make sure you’ve exported it in the same shell session:
```bash
export ANTHROPIC_API_KEY="your-api-key-here"
lectic -if hello.lec
```

If you set the key in .bashrc or .zshrc, restart your terminal or run source ~/.bashrc to pick it up.
Response is empty or tool calls aren’t working
Check that your YAML header is valid. Common mistakes:
- Indentation errors (YAML requires consistent spacing)
- Missing colons after keys
- Forgetting the closing --- after the frontmatter
The LSP server catches many of these. See Editor Integration to set it up.
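A quick sanity check for the last mistake can be done from the shell: count the delimiter lines. This only catches a missing ---, not other YAML errors; frontmatter_ok and check.lec are illustrative names:

```shell
# A .lec file should contain at least two lines that are exactly "---":
# one opening and one closing the YAML frontmatter.
frontmatter_ok() {
  [ "$(grep -c '^---$' "$1")" -ge 2 ]
}

# Example: write a well-formed file and check it.
printf '%s\n' '---' 'interlocutor:' '  name: Assistant' '---' 'Hello!' > check.lec
frontmatter_ok check.lec && echo "frontmatter delimiters look ok"
```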
“Model not found” errors
Model names vary by provider. Use lectic models to see what’s available for your configured API keys.
Tools aren’t being called
Make sure tools are defined under the tools key inside interlocutor, not at the top level:
```yaml
# Correct
interlocutor:
  name: Assistant
  prompt: You are helpful.
  tools:
    - exec: date
      name: get_date
```

```yaml
# Wrong — tools at top level won't work
interlocutor:
  name: Assistant
  prompt: You are helpful.
tools:
  - exec: date
```

Next steps
Now that you have Lectic working:
Set up your editor. The intended workflow is to run Lectic with a single keypress. See Editor Integration for Neovim, VS Code, and other editors.
Learn the configuration system. You can set global defaults, project-specific settings, and per-conversation overrides. See Configuration.
Explore the cookbook. The Cookbook has ready-to-use recipes for common workflows like coding assistants, commit message generation, and multi-perspective research.