
Autocomplete

tallow has two independent autocomplete systems that work together in the editor input:

| System | Trigger | What it does |
|---|---|---|
| Structural | `/`, `@`, file paths, Tab | Completes commands, file references, and paths deterministically |
| LLM ghost text | Typing 4+ characters | Calls a fast model to predict how you'll finish your sentence |

Structural autocomplete is built into the TUI editor. It uses no LLM — completions come from the filesystem, registered commands, and fuzzy matching. Results appear in a dropdown menu.

Type / at the start of a line to see all available commands. Keep typing to fuzzy-filter the list.

/rev → matches /rewind, /reviewer
/task → matches /tasks

After selecting a command and pressing Space, some commands offer argument completions (for example, /context-fork suggests branch names).
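The fuzzy filtering above can be sketched as a simple subsequence matcher — this is a hypothetical illustration of the technique, not tallow's actual source (tallow's matcher is evidently more tolerant, since `/rev` also matches `/rewind`):

```typescript
// Hypothetical command list and a minimal subsequence-style fuzzy filter.
const COMMANDS = ["/rewind", "/reviewer", "/tasks", "/context-fork"];

// True when every character of `query` appears in `candidate`, in order.
function fuzzyMatch(query: string, candidate: string): boolean {
  let i = 0;
  for (const ch of candidate) {
    if (i < query.length && ch === query[i]) i++;
  }
  return i === query.length;
}

function filterCommands(input: string): string[] {
  return COMMANDS.filter((cmd) => fuzzyMatch(input, cmd));
}

console.log(filterCommands("/task")); // ["/tasks"]
console.log(filterCommands("/rev"));  // ["/reviewer"]
```

A plain subsequence check like this is deterministic and needs no index; real fuzzy matchers usually add scoring so better matches sort first.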

Type @ followed by a filename to fuzzy-search your project. The search uses fd under the hood — it’s fast, respects .gitignore, and searches the full directory tree.

@login → matches src/auth/login.ts, tests/auth/login.test.ts
@pack → matches package.json, packages/
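An `@` query could be turned into an `fd` invocation roughly like this. The flags are real `fd` options, but the wiring is an assumption for illustration — tallow's actual argument list isn't documented here:

```typescript
// Build the argv for an fd search (assumed wiring, not tallow's source).
// fd respects .gitignore by default, which matches the behavior above.
function buildFdCommand(query: string): string[] {
  return [
    "fd",
    "--type", "f",   // include files
    "--type", "d",   // include directories
    "--full-path",   // match against the whole relative path, not just the basename
    query,
  ];
}

console.log(buildFdCommand("login").join(" "));
// fd --type f --type d --full-path login
```

The results would then be fuzzy-ranked and rendered into the dropdown described below.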

Each suggestion shows:

| Field | Content |
|---|---|
| Label | Filename (with trailing `/` for directories) |
| Description | Relative path from project root |
| Value | Full `@path` insertion text |

Directories are sorted first. Filenames with spaces are automatically quoted (@"path with spaces/file.ts").
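The quoting rule can be captured in a small helper — a hypothetical sketch of the behavior described above, not tallow's implementation:

```typescript
// Wrap the path in quotes after the @ only when it contains spaces.
function toAtReference(relPath: string): string {
  return relPath.includes(" ") ? `@"${relPath}"` : `@${relPath}`;
}

console.log(toAtReference("src/auth/login.ts"));        // @src/auth/login.ts
console.log(toAtReference("path with spaces/file.ts")); // @"path with spaces/file.ts"
```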

When you select a file, the file-reference extension reads it and inlines the contents into your prompt.

Paths are completed via directory listing whenever the input contains path-like patterns:

  • ./src/ — relative paths
  • ~/ — home directory paths
  • ../ — parent directory paths
  • Any text containing /

Press Tab to force file completion even when the input doesn’t look like a path yet.
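Completion by directory listing under these heuristics can be sketched with Node's standard library. This is assumed wiring for illustration (including the hypothetical `completePath` name), not tallow's source:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Given a partial input like "./src/co" or "~/proj", list the containing
// directory and keep entries matching the partial final segment.
function completePath(input: string): string[] {
  // Expand ~/ to the user's home directory.
  const expanded = input.startsWith("~/")
    ? path.join(os.homedir(), input.slice(2))
    : input;
  // Split into the directory to list and the partial name to match.
  const dir = expanded.endsWith("/") ? expanded : path.dirname(expanded);
  const prefix = expanded.endsWith("/") ? "" : path.basename(expanded);
  if (!fs.existsSync(dir)) return [];
  return fs
    .readdirSync(dir)
    .filter((name) => name.startsWith(prefix))
    .map((name) => path.join(dir, name));
}
```

Relative (`./`, `../`) and absolute paths fall out of `path.dirname`/`path.basename` directly; only `~/` needs explicit expansion.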

| Key | Behavior |
|---|---|
| Tab | Accept selected completion / force file completions |
| ↑ / ↓ | Navigate the completion list |
| Escape | Dismiss completions |
| Typing | Filters the completion list |

When you type 4 or more characters (and aren’t typing a / command), tallow calls a fast, cheap model to suggest how you might finish your sentence. The suggestion appears as dim ghost text after your cursor.

fix the login bug on the ← you typed this
authentication page ← ghost text

| Key | Behavior |
|---|---|
| Tab | Accept ghost text into the editor |
| Enter (empty input) | Accept idle suggestion and submit |
| Escape | Dismiss ghost text |
| Any character | Dismiss ghost text and type normally |

The default model is Groq Llama 3.1 8B at $0.05 / $0.08 per million tokens. If it’s unavailable, tallow falls back through a chain of cheap models. See the prompt-suggestions extension for the full fallback chain, conversation context details, and configuration.

Ghost text is capped at 200 API calls per session. At ~50 tokens per call, a full session of autocomplete costs roughly $0.004.
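The per-call prompt size isn't documented here, but the arithmetic is consistent if each call sends roughly 350 prompt tokens alongside the ~50 completion tokens — those input-token counts are an assumption, the prices are from the text above:

```typescript
// Back-of-envelope session cost at the cap.
const calls = 200;
const promptTokens = 350;         // ASSUMED average context sent per call
const completionTokens = 50;      // ~50 tokens per suggestion (from the text)
const pricePerMPrompt = 0.05;     // $ per 1M input tokens
const pricePerMCompletion = 0.08; // $ per 1M output tokens

const cost =
  (calls *
    (promptTokens * pricePerMPrompt + completionTokens * pricePerMCompletion)) /
  1e6;
console.log(cost.toFixed(4)); // 0.0043
```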

When the editor is empty and the agent is idle, a random prompt suggestion appears as ghost text (no model call — these are picked from a curated template list). Press Enter to accept and submit, or start typing to dismiss.

All LLM autocomplete settings live in ~/.tallow/settings.json:

{
  "prompt-suggestions.enabled": true,
  "prompt-suggestions.autocomplete": true,
  "prompt-suggestions.model": "groq/llama-3.1-8b-instant",
  "prompt-suggestions.debounceMs": 600
}

| Setting | Default | Description |
|---|---|---|
| prompt-suggestions.enabled | true | Enable/disable the entire extension (idle + LLM) |
| prompt-suggestions.autocomplete | true | Enable/disable LLM autocomplete only |
| prompt-suggestions.model | groq/llama-3.1-8b-instant | Model for autocomplete (provider/model-id) |
| prompt-suggestions.debounceMs | 600 | Delay in ms before calling the model |

Structural autocomplete has no configuration — it’s always available.
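The `debounceMs` delay corresponds to a standard trailing-edge debounce: the model is called only once typing pauses for the configured interval. A minimal sketch of the pattern (assumed, not tallow's implementation):

```typescript
// Trailing-edge debounce: `fn` fires once, `ms` after the most recent call.
function debounce<T extends unknown[]>(fn: (...args: T) => void, ms: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    clearTimeout(timer);           // rapid keystrokes reset the timer
    timer = setTimeout(() => fn(...args), ms);
  };
}

// With debounceMs = 600, the model is queried 600 ms after typing stops,
// never once per keystroke. (Hypothetical callback name.)
const requestGhostText = debounce((text: string) => {
  console.log(`calling model with: ${text}`);
}, 600);
```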