Bilge.jl


Bilge is a REPL-based AI coding copilot for Julia. It connects to LLMs (OpenAI, Ollama, or any OpenAI-compatible API) and gives them tools to read, write, edit, and search your codebase — all from an interactive terminal session.

Bilge (Turkish for "wise") embodies the pursuit of intelligent assistance through language models. Like a knowledgeable companion at your side, Bilge brings the reasoning power of modern AI directly into your Julia development workflow.

This site documents the development version; the stable docs cover the latest tagged release.


About TAFS

TAFS (Time Series Analysis and Forecasting Society) is a non-profit association ("Verein") in Vienna, Austria. It connects academics, experts, practitioners, and students focused on time-series, forecasting, and decision science. Contributions remain fully open source. Learn more at taf-society.org.


Installation

Bilge is under active development. For the latest dev version:

using Pkg
Pkg.add(url="https://github.com/taf-society/Bilge.jl")

For local development:

] dev /path/to/Bilge.jl
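
Equivalently, outside the Pkg REPL you can use the functional Pkg API (the path below is a placeholder for your local clone):

using Pkg
Pkg.develop(path="/path/to/Bilge.jl")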

REPL Interface (Primary Usage)

Bilge provides an interactive REPL that gives an LLM full access to your codebase through 7 built-in tools. This is the recommended approach for most users.

Quick Example: Ollama

using Bilge

bilge(ollama=true, model="qwen3")
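
This assumes a local Ollama server is running and that the qwen3 model has already been pulled. For example, you can pull it once from Julia with a shell call (requires the ollama CLI to be installed):

# One-time download of the model via the ollama CLI
run(`ollama pull qwen3`)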

Quick Example: OpenAI

using Bilge

# Reads OPENAI_API_KEY from environment
bilge()

# Or pass the key directly
bilge(api_key="sk-...")
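
If you prefer not to pass the key as an argument, you can set the environment variable from Julia before starting the session; a minimal sketch, assuming the key is read when bilge() is called:

# Set the key for the current Julia process only
ENV["OPENAI_API_KEY"] = "sk-..."
bilge()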

Quick Example: Any OpenAI-Compatible API

using Bilge

bilge(api_key="your-key", base_url="https://api.example.com/v1", model="your-model")

Once inside the REPL, the LLM can autonomously read files, search code, run commands, and make edits to help you with any coding task.

Available Commands

Type /help inside the Bilge REPL to see all available commands. Use /exit to quit.


Programmatic Usage (Agent API)

For integration into scripts or custom workflows, use BilgeAgent and process_turn directly:

using Bilge

config = BilgeConfig(
    ollama = OllamaConfig(model="qwen3")
)

agent = BilgeAgent(config, pwd())
result = process_turn(agent, "List all Julia files in this project")

# Access the response
println(result.response)

# Inspect tool executions
for exec in result.tool_executions
    println("  $(exec.tool_name) ($(exec.duration_ms)ms)")
end
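
Calling process_turn again on the same agent continues the conversation (assuming, as in the REPL, that the agent keeps context across turns):

# Follow-up turn that builds on the previous answer
followup = process_turn(agent, "Now summarize what each of those files exports")
println(followup.response)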

Available Tools

Bilge equips the LLM with 7 coding tools:

Tool             Description                             Use Case
read_file        Read file contents with line numbers    Inspecting code, understanding structure
write_file       Create or overwrite files               Generating new modules, configs, scripts
edit_file        Exact string replacement                Targeted code modifications
run_bash         Execute shell commands with timeout     Running tests, git operations, builds
glob_files       Find files by glob pattern              Discovering project structure
grep_code        Regex search across files               Finding usages, patterns, definitions
list_directory   List directory contents with sizes      Exploring directory layout

For detailed documentation on each tool, see the Tools Guide.
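
As an illustration, a prompt like the one below would typically lead the model to call run_bash and then grep_code; the actual tool choices are made by the model, so this is only indicative:

result = process_turn(agent, "Run the test suite and point me to the failing functions")
println(result.response)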


Key Features

  • Multiple LLM Backends — OpenAI, Ollama, or any OpenAI-compatible API
  • 7 Built-in Tools — Read, write, edit, search, and run commands
  • Interactive REPL — Conversational coding with full context
  • Programmatic API — Use BilgeAgent directly in scripts
  • Minimal Dependencies — Only HTTP, JSON3, UUIDs
  • Multi-line Input — Backslash continuation for complex prompts
  • Token Tracking — Monitor LLM usage across the session

License

MIT License.


What's next