Getting Started with Lectic

This short guide helps you install Lectic and run your first conversation. Along the way, you will verify your install, set an API key, and see a simple tool in action.

Installation

Choose the method that fits your system.

Nix

If you use Nix, install directly from the repository:

nix profile install github:gleachkr/lectic

Linux (AppImage)

Download the AppImage from the GitHub Releases page. Make it executable and put it on your PATH.

chmod +x lectic-*.AppImage
mv lectic-*.AppImage ~/.local/bin/lectic

macOS

Download the macOS binary from the GitHub Releases page and put it on your PATH.
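
For example, assuming the downloaded binary is named lectic-macos (the actual asset name may differ between releases):

chmod +x lectic-macos
mv lectic-macos ~/.local/bin/lectic

Make sure ~/.local/bin is on your PATH. If macOS refuses to run the binary because it was downloaded from the internet, clearing the quarantine attribute with xattr -d com.apple.quarantine ~/.local/bin/lectic may help.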

Verify the install

lectic --version

If you see a version number, you are ready to go.

Tab completion (optional)

Lectic has an extensible tab completion system that supports standard flags and Custom Subcommands.

To enable it, source the completion script in your shell configuration (e.g., ~/.bashrc):

# Adjust path to where you cloned/extracted Lectic
source /path/to/lectic/extra/tab_complete/lectic_completion.bash

Alternatively, place the script in ~/.local/share/bash-completion/completions/ or another standard location for completion scripts.
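
For example, assuming the repository is checked out at ~/src/lectic (adjust the path to your checkout), the completions-directory approach looks like:

mkdir -p ~/.local/share/bash-completion/completions
cp ~/src/lectic/extra/tab_complete/lectic_completion.bash ~/.local/share/bash-completion/completions/lectic

Naming the installed file lectic lets bash-completion load it on demand the first time you complete the command.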

Your first conversation

Set up an API key

Lectic talks to LLM providers. Put at least one provider key in your environment so Lectic can pick a default.

export ANTHROPIC_API_KEY="your-api-key-here"

Lectic chooses a default provider by checking for keys in this order: Anthropic, then Gemini, then OpenAI, then OpenRouter. If you need Bedrock, set provider: anthropic/bedrock explicitly in your file and make sure your AWS credentials are configured. Bedrock is not auto‑selected.

Finally, OpenAI has two provider choices. Use openai for the newer Responses API and native tools. Use openai/chat for the legacy Chat Completions API when you need it.
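
If you want a specific provider rather than the auto-selected default, set it explicitly in the header. A sketch (the model name here is illustrative; check what your account can use):

interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: openai          # Responses API with native tools
  model: gpt-4o-mini

Swap the provider line for openai/chat if you need the legacy Chat Completions API, or for anthropic/bedrock if your AWS credentials are set up for Bedrock.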

Create a conversation file

Make a new file, for example my_convo.lec. The .lec extension helps with editor integration.

Add a minimal YAML header and your first user message:

---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic
  model: claude-3-haiku-20240307
  # Optional thinking controls (Anthropic/Gemini):
  # thinking_budget: 1024     # integer token budget for reasoning
---

Hello, world! What is a fun fact about the Rust programming language?

Run Lectic

From your terminal, run Lectic on the file. The -i flag updates the file in place.

lectic -i my_convo.lec

Lectic sends your message to the model and appends its response in a new assistant block. You can add your next message under that block and run the command again to continue the conversation.
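
For example, after the first run the body of the file looks roughly like this, with the bracketed line standing in for the response block Lectic writes:

Hello, world! What is a fun fact about the Rust programming language?

[assistant response block written by Lectic]

Neat! Tell me another one.

Running lectic -i my_convo.lec again sends the whole conversation, including your new message, back to the model.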

Use a tiny tool

Now add a very small tool to see the tool flow. This one exposes the date command as a tool named get_date.

---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic
  model: claude-3-haiku-20240307
  tools:
    - exec: date
      name: get_date
---

What is today's date?

Run Lectic again. The assistant block will now include an XML tool call and the recorded results. You will see tags like <tool-call>, <arguments>, and <results> in the block.
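
The shape is roughly as sketched below; treat the nesting, attributes, and the sample output as illustrative only:

<tool-call>
  <arguments>...</arguments>
  <results>
  Tue Jan  7 10:15:00 UTC 2025
  </results>
</tool-call>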

Tip

You can load prompts from files or compute them with commands using file: and exec:. See External Prompts.
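
A sketch of the file-based form, assuming prompt accepts a file: key much as tools accept exec: (see External Prompts for the exact syntax and the exec: variant):

interlocutor:
  name: Assistant
  prompt:
    file: ./prompts/assistant.md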

Troubleshooting

Here are solutions to common issues when getting started.

“No API key found” or similar error

Lectic needs at least one provider key in your environment. Make sure you’ve exported it in the same shell session:

export ANTHROPIC_API_KEY="sk-ant-..."
lectic -i my_convo.lec

If you set the key in a config file (like .bashrc), you may need to restart your terminal or run source ~/.bashrc.

Response is empty or tool calls aren’t working

Check that your YAML header is valid. Common mistakes:

  • Indentation errors (YAML requires consistent spacing)
  • Missing colons after keys
  • Forgetting the closing --- after the frontmatter

The LSP server catches many of these. See Editor Integration to set it up.

“Model not found” errors

Model names vary by provider. Use lectic models to see what’s available for your configured API keys. Some common model names:

  • Anthropic: claude-sonnet-4-20250514, claude-3-haiku-20240307
  • OpenAI: gpt-4o, gpt-4o-mini
  • Gemini: gemini-2.5-flash, gemini-2.5-pro

The LSP server can autocomplete model names, so tab-complete is your friend here.
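
With at least one key exported, listing the available models from your shell is just:

lectic models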

Tools aren’t being called

Make sure tools are defined under the tools key inside interlocutor, not at the top level:

# Correct
interlocutor:
  name: Assistant
  prompt: You are helpful.
  tools:
    - exec: date
      name: get_date

# Wrong - tools at top level
interlocutor:
  name: Assistant
  prompt: You are helpful.
tools:  # This won't work
  - exec: date

Next steps

Now that you have Lectic working, you’ll want to:

  1. Set up your editor. The intended workflow is to run Lectic with a single keypress. See Editor Integration for Neovim, VS Code, and other editors.

  2. Learn the configuration system. You can set global defaults, project-specific settings, and per-conversation overrides. See Configuration.

  3. Explore the cookbook. The Cookbook has ready-to-use recipes for common workflows like coding assistants, commit message generation, and multi-perspective research.