Getting Started with Lectic

This short guide helps you install Lectic and run your first conversation. Along the way, you will verify your install, set an API key, and see a simple tool in action.

Installation

Choose the method that fits your system.

Nix

If you use Nix, install directly from the repository:

nix profile install github:gleachkr/lectic

Linux (AppImage)

Download the AppImage from the GitHub Releases page. Make it executable and put it on your PATH.

chmod +x lectic-*.AppImage
mkdir -p ~/.local/bin
mv lectic-*.AppImage ~/.local/bin/lectic

macOS

Download the macOS binary from the GitHub Releases page and put it on your PATH.
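For example, if the downloaded asset is named lectic-macos (an assumed name; check the Releases page for the actual one), the steps mirror the Linux ones:

chmod +x lectic-macos          # substitute the actual asset name
mkdir -p ~/.local/bin
mv lectic-macos ~/.local/bin/lectic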

Verify the install

lectic --version

If you see a version number, you are ready to go.
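If the shell reports that lectic cannot be found, check that the directory holding the binary (for example ~/.local/bin from the steps above) is on your PATH:

export PATH="$HOME/.local/bin:$PATH"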

Your first conversation

Set up an API key

Lectic talks to LLM providers. Put at least one provider key in your environment so Lectic can pick a default.

export ANTHROPIC_API_KEY="your-api-key-here"

Lectic chooses a default provider by checking for keys in this order: Anthropic, then Gemini, then OpenAI, then OpenRouter. If you need Bedrock, set provider: anthropic/bedrock explicitly in your file and make sure your AWS credentials are configured. Bedrock is not auto‑selected.

Finally, OpenAI has two provider choices. Use openai for the newer Responses API and native tools. Use openai/chat for the legacy Chat Completions API when you need it.
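All of these choices are expressed through the provider field in your conversation file's header, which the next section introduces. The lines below simply restate the options above:

provider: anthropic/bedrock   # requires AWS credentials; never auto-selected
provider: openai              # newer Responses API with native tools
provider: openai/chat         # legacy Chat Completions API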

Create a conversation file

Make a new file, for example my_convo.lec. The .lec extension helps with editor integration.

Add a minimal YAML header and your first user message:

---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic
  model: claude-3-haiku-20240307
---

Hello, world! What is a fun fact about the Rust programming language?

Run Lectic

From your terminal, run Lectic on the file. The -i flag updates the file in place.

lectic -i my_convo.lec

Lectic sends your message to the model and appends its response in a new assistant block. You can add your next message under that block and run the command again to continue the conversation.
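In practice you will type your reply in an editor, but the loop is easy to see from the shell: append a follow-up to the file and run the same command again.

echo "Can you give me another one?" >> my_convo.lec
lectic -i my_convo.lec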

Use a tiny tool

Now add a very small tool to see the tool flow. This one exposes the date command so the model can call it.

---
interlocutor:
  name: Assistant
  prompt: You are a helpful assistant.
  provider: anthropic
  model: claude-3-haiku-20240307
  tools:
    - exec: date
      name: get_date
---

What is today's date?

Run Lectic again. The assistant block will now record the tool call and its results as XML tags alongside the model's answer, so you can see exactly what was run and what came back.

Tip

You can load prompts from files or compute them with commands using file: and exec:. See External Prompts.

Editor integration

The intended workflow is to wire Lectic into your editor so you can update a conversation with a single key press.

Neovim

A full‑featured plugin is available in extra/lectic.nvim. It provides syntax highlighting, folding, and one‑shot submission.

VS Code

An extension is available in extra/lectic.vscode.

Other editors

Most editors support sending the current buffer to an external command. You can configure yours to replace the buffer with the output of:

cat your_file.lec | lectic
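In Vim-like editors, for instance, filtering the whole buffer through Lectic is a one-line mapping; this is only a minimal sketch, and the plugin in extra/lectic.nvim handles the same flow more gracefully:

nnoremap <leader>l :%!lectic<CR>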