API Reference

Note

For detailed usage guides, see the REPL Interface, Tools, Configuration, Ollama Integration, and Claude Integration pages.

Zana.ClaudeConfig - Type
ClaudeConfig

Configuration for Anthropic Claude API calls.

  • api_key::String - Anthropic API key
  • model::String - Model name (default: "claude-sonnet-4-20250514")
  • base_url::String - API base URL (default: "https://api.anthropic.com")
  • max_tokens::Int - Maximum tokens in response (default: 8192)
  • temperature::Float64 - Sampling temperature (default: 0.1)
  • api_version::String - Anthropic API version header (default: "2023-06-01")
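
A minimal construction sketch, assuming a keyword-style constructor (the constructor form is not documented here); unspecified fields keep the defaults listed above:

using Zana

# Keyword construction assumed; only api_key is set explicitly.
config = Zana.ClaudeConfig(api_key = ENV["ANTHROPIC_API_KEY"])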
source
Zana.OllamaConfig - Type
OllamaConfig

Configuration for Ollama API calls.

Supports both the native Ollama API (/api/chat) and the OpenAI-compatible endpoint (/v1/chat/completions).

  • model::String - Model name (e.g., "llama3.1", "qwen2.5", "mistral")
  • host::String - Ollama server address (default: "http://localhost:11434")
  • max_tokens::Int - Maximum tokens in response (default: 32768)
  • temperature::Float64 - Sampling temperature (default: 0.1)
  • repeat_penalty::Float64 - Repetition penalty (default: 1.2; Ollama default is 1.1)
  • frequency_penalty::Float64 - Frequency penalty for OpenAI-compat mode (default: 0.0)
  • use_openai_compat::Bool - Use OpenAI-compatible endpoint (default: false)
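
A minimal construction sketch, assuming a keyword-style constructor; fields not set explicitly keep the defaults listed above:

using Zana

# Keyword construction assumed; host, max_tokens, etc. fall back to their defaults.
cfg = Zana.OllamaConfig(model = "llama3.1")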
source
Zana.Tool - Type
Tool

Represents a callable tool that the LLM can invoke.

source
Zana.ZanaAgent - Type
ZanaAgent

AI-powered coding copilot agent. Supports OpenAI-compatible, Ollama, and Anthropic Claude backends.

source
Zana.ZanaAgent - Method
ZanaAgent(config, working_dir)

Create a ZanaAgent with the given configuration and working directory.
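
For example, paired with an Ollama configuration (the keyword-style OllamaConfig construction is an assumption, see its entry above):

using Zana

cfg = Zana.OllamaConfig(model = "llama3.1")   # keyword construction assumed
agent = Zana.ZanaAgent(cfg, pwd())            # agent rooted at the current directory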

source
Zana._extract_tool_calls_from_text - Method
_extract_tool_calls_from_text(content)

Fallback parser: extract tool calls from text content when models embed them in their output instead of using the native tool-calling API. Handles common patterns such as <tool_call>...</tool_call> tags and json blocks of the form {"name": ...}.

Returns (cleaned_content, tool_calls), where tool_calls may be nothing.
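
A sketch of the fallback (internal helper; the exact JSON payload keys inside the tag are an assumption here):

using Zana

raw = "Reading it now. <tool_call>{\"name\": \"read_file\", \"arguments\": {\"path\": \"src/Zana.jl\"}}</tool_call>"
cleaned, calls = Zana._extract_tool_calls_from_text(raw)
# cleaned keeps the surrounding prose; calls is nothing when no pattern is found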

source
Zana._glob_to_regex - Method
_glob_to_regex(pattern)

Convert a glob pattern to a Regex. Supports: * (any non-/ chars), ** (any chars including /), ? (single char).
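
For example (internal helper, so behavior beyond the documented wildcards is not guaranteed):

using Zana

re = Zana._glob_to_regex("src/**/*.jl")
occursin(re, "src/tools/glob.jl")   # expected: true
occursin(re, "README.md")           # expected: false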

source
Zana._read_input - Method
_read_input(state)

Read user input with history navigation and multi-line support. Returns nothing on EOF or Ctrl-C.

source
Zana._read_line_raw - Method
_read_line_raw(prompt, color, history) -> Union{String, Nothing}

Read a single line from a TTY with arrow key history navigation. Supports: Up/Down (history), Left/Right (cursor), Home/End, Backspace, Delete, Ctrl-A/E/K/U/W/L, and UTF-8 input. Returns nothing on Ctrl-C or Ctrl-D.

source
Zana._resolve_path - Method
_resolve_path(state, path)

Resolve a path relative to the working directory. Absolute paths pass through.

source
Zana._strip_think_tags - Method
_strip_think_tags(text)

Remove <think>...</think> blocks from model output (common in reasoning models).
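
For example:

using Zana

Zana._strip_think_tags("<think>weighing two approaches</think>Use a Dict here.")
# expected: "Use a Dict here." (whitespace handling around the removed block may vary)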

source
Zana._truncate_repetition - Method
_truncate_repetition(text; max_repeats=3)

Detect and truncate repetitive LLM output. If any sentence (>= 10 chars) appears max_repeats or more times, the text is truncated after the 2nd occurrence and a warning is appended. Short responses (< 200 chars) bypass the check entirely.
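
A sketch of the intended behavior:

using Zana

# Over 200 chars total, with one sentence repeated 6 times (>= max_repeats).
looping = repeat("I will now open the file and report back. ", 6)
Zana._truncate_repetition(looping; max_repeats = 3)
# expected: output cut after the 2nd occurrence, with a warning appended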

source
Zana.call_claude - Method
call_claude(config, messages, tools)

Make an API call to the Anthropic Claude Messages API.

source
Zana.call_ollama - Method
call_ollama(config, messages, tools)

Make an API call to Ollama using the native /api/chat endpoint.

source
Zana.check_ollama_connection - Method
check_ollama_connection(; host="http://localhost:11434")

Check if the Ollama server is running and accessible.

  • Returns Bool - true if the server is reachable
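
For example:

using Zana

if !Zana.check_ollama_connection(host = "http://localhost:11434")
    @warn "Ollama is not reachable; start it with `ollama serve` and try again."
end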
source
Zana.list_ollama_models - Method
list_ollama_models(; host="http://localhost:11434")

List available models on the Ollama server.

  • Returns a Vector{String} of model names
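
For example:

using Zana

models = Zana.list_ollama_models(host = "http://localhost:11434")
foreach(println, models)   # e.g. "llama3.1:latest", depending on what has been pulled locally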
source
Zana.messages_to_claude_format - Method
messages_to_claude_format(messages) -> (system_text, claude_messages)

Convert Message objects to Anthropic Claude API format.

Key differences from OpenAI:

  • System messages are extracted and returned separately (top-level system param)
  • Tool results are sent as user role messages with tool_result content blocks
  • Assistant messages with tool calls become content block arrays
  • Consecutive tool results are grouped into a single user message
source
Zana.parse_claude_response - Method
parse_claude_response(response)

Parse Anthropic Claude response into a Message.

Iterates response["content"] blocks:

  • "text" blocks are collected into the content string
  • "tool_use" blocks are collected into ToolCall objects
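
A minimal sketch with a hand-built response Dict shaped like Anthropic's Messages API output; only the keys implied by the description above are included, and "read_file" is a hypothetical tool name:

using Zana

response = Dict(
    "content" => [
        Dict("type" => "text", "text" => "Let me check that file."),
        Dict("type" => "tool_use", "id" => "toolu_01", "name" => "read_file",
             "input" => Dict("path" => "src/Zana.jl")),
    ],
)
msg = Zana.parse_claude_response(response)
# msg collects the text content plus one ToolCall built from the tool_use block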
source
Zana.parse_ollama_response - Method
parse_ollama_response(config, response)

Parse Ollama response into Message and tool calls. Handles both native and OpenAI-compatible formats. Falls back to text parsing if the model embeds tool calls in content.

source
Zana.process_turn - Method
process_turn(agent, user_input; on_event=nothing) -> TurnResult

Process a single conversation turn. Sends the user message to the LLM, executes any tool calls, and returns the final response.

The optional on_event callback receives status updates:

  • (:thinking,) — LLM is generating a response
  • (:tool_start, name, args) — a tool is about to execute
  • (:tool_done, exec) — a tool finished (ToolExecution)
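
A sketch of driving a single turn with a logging callback (the OllamaConfig keyword construction is an assumption; whether events arrive as one tuple or as splatted arguments is not pinned down here):

using Zana

cfg = Zana.OllamaConfig(model = "llama3.1")   # keyword construction assumed
agent = Zana.ZanaAgent(cfg, pwd())

# Accepts either calling convention and simply echoes the event it receives.
log_event(event...) = println("event: ", event)

result = Zana.process_turn(agent, "Summarize src/Zana.jl"; on_event = log_event)
# result is a TurnResult holding the final assistant response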
source
Zana.tools_to_claude_format - Method
tools_to_claude_format(tools)

Convert Tool objects to Anthropic's tool format. Anthropic uses flat tool objects with input_schema instead of the OpenAI {"type":"function","function":{...}} wrapper.

source
Zana.zana - Method
zana(; api_key, model, base_url, ollama, claude, host, use_openai_compat, working_dir)

Start the Zana interactive coding copilot.

  • api_key::String - API key (default: ENV["OPENAI_API_KEY"] or ENV["ANTHROPIC_API_KEY"])
  • model::String - Model name (default: "gpt-4o" for OpenAI-compatible, "llama3.1" for Ollama, or "claude-sonnet-4-20250514" for Claude)
  • base_url::String - API base URL (default: "https://api.openai.com/v1")
  • ollama::Bool - Use Ollama backend (default: false)
  • claude::Bool - Use Anthropic Claude backend (default: false)
  • host::String - Ollama host (default: "http://localhost:11434")
  • use_openai_compat::Bool - Use Ollama's OpenAI-compatible endpoint (default: false)
  • working_dir::String - Working directory (default: pwd())
using Zana

zana(ollama=true, model="qwen3")

zana(claude=true)

zana(api_key="sk-...")

zana(api_key="key", base_url="https://api.example.com/v1", model="my-model")
source