API Reference
For detailed usage guides, see the REPL Interface, Tools, Configuration, and Ollama Integration pages.
Bilge.BilgeAgent — Type
BilgeAgent

AI-powered coding copilot agent. Supports both OpenAI-compatible and Ollama backends.
Bilge.BilgeAgent — Method
BilgeAgent(config, working_dir)

Create a BilgeAgent with the given configuration and working directory.
Bilge.BilgeConfig — Type
BilgeConfig

Configuration for the Bilge coding copilot.
Bilge.BilgeState — Type
BilgeState

Mutable state maintained across the coding session.
Bilge.LLMConfig — Type
LLMConfig

Configuration for LLM API calls.
Bilge.Message — Type
Message

A message in the conversation.
Bilge.OllamaConfig — Type
OllamaConfig

Configuration for Ollama API calls.
Supports both the native Ollama API (/api/chat) and the OpenAI-compatible endpoint (/v1/chat/completions).
Fields:
- model::String - Model name (e.g., "llama3.1", "qwen2.5", "mistral")
- host::String - Ollama server address (default: "http://localhost:11434")
- max_tokens::Int - Maximum tokens in the response (default: 32768)
- temperature::Float64 - Sampling temperature (default: 0.1)
- use_openai_compat::Bool - Use the OpenAI-compatible endpoint (default: false)
Bilge.Tool — Type
Tool

Represents a callable tool that the LLM can invoke.
Bilge.ToolCall — Type
ToolCall

Represents a tool call request from the LLM.
Bilge.ToolExecution — Type
ToolExecution

Record of a single tool execution.
Bilge.TurnResult — Type
TurnResult

Result of a single conversation turn.
Bilge._create_tools — Method
_create_tools(state, max_output_chars)

Create all coding tools for the agent.
Bilge._extract_tool_calls_from_text — Method
_extract_tool_calls_from_text(content)

Fallback parser: extract tool calls from text content when models embed them in their output instead of using the native tool-calling API. Handles common patterns such as <tool_call>...</tool_call> tags and JSON {"name": ...} blocks.

Returns (cleaned_content, tool_calls), where tool_calls may be nothing.
Bilge._glob_to_regex — Method
_glob_to_regex(pattern)

Convert a glob pattern to a Regex. Supports: * (any non-/ chars), ** (any chars, including /), ? (single char).
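The conversion can be sketched in plain Julia. The helper below is an illustrative reimplementation of the documented behavior, not the package's actual code:

```julia
# Sketch of a glob-to-Regex conversion: * matches any non-/ run,
# ** matches anything (including /), ? matches a single character.
function glob_to_regex(pattern::AbstractString)
    out = IOBuffer()
    i = firstindex(pattern)
    while i <= lastindex(pattern)
        c = pattern[i]
        if c == '*'
            nxt = nextind(pattern, i)
            if nxt <= lastindex(pattern) && pattern[nxt] == '*'
                print(out, ".*")        # ** - any chars, including /
                i = nextind(pattern, nxt)
                continue
            else
                print(out, "[^/]*")     # *  - any chars except /
            end
        elseif c == '?'
            print(out, '.')             # ?  - any single char
        elseif c in ".\\+()[]{}|^\$"
            print(out, '\\', c)         # escape regex metacharacters
        else
            print(out, c)
        end
        i = nextind(pattern, i)
    end
    Regex("^" * String(take!(out)) * "\$")
end
```

Anchoring with ^ and $ makes the pattern match whole paths, so "*.jl" matches "foo.jl" but not "src/foo.jl".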
Bilge._handle_slash_command — Method
_handle_slash_command(agent, input) -> Bool

Handle a slash command. Returns true to continue the REPL, false to exit.
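A dispatcher with this contract might look like the following. This is a hypothetical sketch: the command names and messages are assumptions, and the real method also takes the agent as its first argument:

```julia
# Hypothetical slash-command dispatcher: returns true to keep the
# REPL loop running, false to exit. Command names are assumptions.
function handle_slash_command(input::AbstractString)
    cmd = first(split(strip(input)))
    if cmd == "/exit" || cmd == "/quit"
        return false                    # stop the REPL loop
    elseif cmd == "/help"
        println("Available commands: /help, /exit")
        return true
    else
        println("Unknown command: ", cmd)
        return true                     # unknown commands don't end the session
    end
end
```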
Bilge._print_tool_summary — Method
_print_tool_summary(exec)

Print a brief summary of a tool execution.
Bilge._read_input — Method
_read_input(state)

Read user input with history navigation and multi-line support. Returns nothing on EOF or Ctrl-C.
Bilge._read_line_raw — Method
_read_line_raw(prompt, color, history) -> Union{String, Nothing}

Read a single line from a TTY with arrow-key history navigation. Supports: Up/Down (history), Left/Right (cursor), Home/End, Backspace, Delete, Ctrl-A/E/K/U/W/L, and UTF-8 input. Returns nothing on Ctrl-C or Ctrl-D.
Bilge._resolve_path — Method
_resolve_path(state, path)

Resolve a path relative to the working directory. Absolute paths pass through unchanged.
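The core of this resolution is a one-liner over Julia's filesystem primitives. The sketch below takes the working directory directly rather than the state object:

```julia
# Sketch of working-directory path resolution: relative paths are
# joined onto the working directory; absolute paths pass through.
resolve_path(working_dir::AbstractString, path::AbstractString) =
    isabspath(path) ? path : normpath(joinpath(working_dir, path))
```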
Bilge._show_history — Method
_show_history(agent)

Display a summary of the conversation history.
Bilge._strip_think_tags — Method
_strip_think_tags(text)

Remove <think>...</think> blocks from model output (common in reasoning models).
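Stripping such blocks is typically a single non-greedy regex replacement; a minimal sketch of the documented behavior:

```julia
# Sketch: drop <think>...</think> reasoning blocks with a non-greedy,
# dot-matches-newline regex (the s flag), then trim leftover whitespace.
strip_think_tags(text::AbstractString) =
    strip(replace(text, r"<think>.*?</think>"s => ""))
```

The `s` flag matters: reasoning blocks usually span multiple lines, and without it `.` would stop at newlines.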
Bilge._truncate_output — Method
_truncate_output(text, max_chars)

Truncate output to max_chars characters, appending a notice if the text was truncated.
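A character-aware truncation along these lines (the notice wording is an assumption):

```julia
# Sketch: cut output at max_chars and append a notice. first(s, n) is
# character-aware, so multi-byte UTF-8 text is never split mid-codepoint.
function truncate_output(text::AbstractString, max_chars::Int)
    length(text) <= max_chars && return text
    string(first(text, max_chars), "\n... [output truncated]")
end
```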
Bilge.bilge — Method
bilge(; api_key, model, base_url, ollama, host, use_openai_compat, working_dir)

Start the Bilge interactive coding copilot.
Keyword arguments:
- api_key::String - OpenAI API key (default: ENV["OPENAI_API_KEY"])
- model::String - Model name (default: "gpt-4o", or "llama3.1" for Ollama)
- base_url::String - API base URL (default: "https://api.openai.com/v1")
- ollama::Bool - Use the Ollama backend (default: false)
- host::String - Ollama host (default: "http://localhost:11434")
- use_openai_compat::Bool - Use Ollama's OpenAI-compatible endpoint (default: false)
- working_dir::String - Working directory (default: pwd())
using Bilge
bilge(ollama=true, model="qwen3")
bilge(api_key="sk-...")
bilge(api_key="key", base_url="https://api.example.com/v1", model="my-model")

Bilge.build_system_prompt — Method

build_system_prompt(working_dir, model_name)

Build the system prompt for the Bilge coding copilot.
Bilge.call_llm — Method
call_llm(config, messages, tools)

Make an API call to the LLM.
Bilge.call_ollama — Method
call_ollama(config, messages, tools)

Make an API call to Ollama using the native /api/chat endpoint.
Bilge.call_ollama_openai_compat — Method
call_ollama_openai_compat(config, messages, tools)

Make an API call using Ollama's OpenAI-compatible endpoint (/v1/chat/completions).
Bilge.check_ollama_connection — Method
check_ollama_connection(; host="http://localhost:11434")

Check whether the Ollama server is running and accessible.

Returns:
- Bool - true if the server is reachable
Bilge.execute_tool — Method
execute_tool(agent, name, args)

Execute a tool by name with the given arguments.
Bilge.format_tool_result — Method
format_tool_result(tool_call_id, result)

Create a tool result message.
Bilge.list_ollama_models — Method
list_ollama_models(; host="http://localhost:11434")

List the models available on the Ollama server.

Returns:
- Vector{String} of model names
Bilge.messages_to_ollama_format — Method
messages_to_ollama_format(messages)

Convert Message objects to Ollama's native message format.
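Ollama's native chat format expects a list of objects with "role" and "content" keys. The sketch below assumes a Message with just those two fields; the package's actual struct may carry more (e.g., tool-call metadata):

```julia
# Sketch of the conversion, assuming a minimal Message with role and
# content fields. Each message becomes a Dict in Ollama's chat format.
struct SimpleMessage
    role::String
    content::String
end

to_ollama_format(messages) =
    [Dict("role" => m.role, "content" => m.content) for m in messages]
```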
Bilge.messages_to_openai_format — Method
messages_to_openai_format(messages)

Convert Message objects to OpenAI API format.
Bilge.parse_llm_response — Method
parse_llm_response(response)

Parse an LLM response into a Message and tool calls.
Bilge.parse_ollama_response — Method
parse_ollama_response(config, response)

Parse an Ollama response into a Message and tool calls. Handles both native and OpenAI-compatible formats. Falls back to text parsing if the model embeds tool calls in the content.
Bilge.process_turn — Method
process_turn(agent, user_input; on_event=nothing) -> TurnResult

Process a single conversation turn. Sends the user message to the LLM, executes any tool calls, and returns the final response.
The optional on_event callback receives status updates:
- (:thinking,) - the LLM is generating a response
- (:tool_start, name, args) - a tool is about to execute
- (:tool_done, exec) - a tool finished (a ToolExecution)
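A callback for these events can dispatch on the first tuple element. The formatter below is a sketch; the message wording is an assumption:

```julia
# Sketch of an on_event handler: dispatch on the first tuple element,
# matching the documented event shapes.
function describe_event(event::Tuple)
    kind = event[1]
    kind == :thinking   && return "... model is thinking"
    kind == :tool_start && return "running tool $(event[2])"
    kind == :tool_done  && return "tool finished"
    return "unknown event"
end

# Hooked up as (sketch):
#   process_turn(agent, input; on_event = ev -> println(describe_event(ev)))
```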
Bilge.tools_to_ollama_format — Method
tools_to_ollama_format(tools)

Convert Tool objects to Ollama's native tool-calling format.
Bilge.tools_to_openai_format — Method
tools_to_openai_format(tools)

Convert Tool objects to OpenAI function-calling format.