# Bilge.jl
Bilge is a REPL-based AI coding copilot for Julia. It connects to LLMs (OpenAI, Ollama, or any OpenAI-compatible API) and gives them tools to read, write, edit, and search your codebase — all from an interactive terminal session.
Bilge (Turkish for "wise") embodies the pursuit of intelligent assistance through language models. Like a knowledgeable companion at your side, Bilge brings the reasoning power of modern AI directly into your Julia development workflow.
This site documents the development version. Once a tagged release is published, see the stable docs for the latest release.
## About TAFS
TAFS (Time Series Analysis and Forecasting Society) is a non-profit association ("Verein") in Vienna, Austria. It connects academics, experts, practitioners, and students focused on time series, forecasting, and decision science. Contributions remain fully open source. Learn more at taf-society.org.
## Installation
Bilge is under active development. For the latest dev version:
```julia
using Pkg
Pkg.add(url="https://github.com/taf-society/Bilge.jl")
```

For local development:

```
] dev /path/to/Bilge.jl
```
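The `] dev` line is entered in the Pkg REPL (press `]` at the `julia>` prompt). If you prefer to do the same thing from a script, a minimal equivalent using the standard Pkg API looks like this; running the test suite is optional:

```julia
using Pkg

# Track a local checkout instead of the URL-installed version
Pkg.develop(path="/path/to/Bilge.jl")

# Optionally run the package's test suite to confirm the setup
Pkg.test("Bilge")
```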
## REPL Interface (Primary Usage)

Bilge provides an interactive REPL that gives an LLM full access to your codebase through 7 built-in tools. This is the recommended approach for most users.
### Quick Example: Ollama

```julia
using Bilge
bilge(ollama=true, model="qwen3")
```

### Quick Example: OpenAI
```julia
using Bilge

# Reads OPENAI_API_KEY from environment
bilge()

# Or pass the key directly
bilge(api_key="sk-...")
```
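If you'd rather not pass the key explicitly, you can set the environment variable from Julia before starting the session (or export `OPENAI_API_KEY` in your shell). The key value below is a placeholder:

```julia
# Placeholder key shown for illustration only
ENV["OPENAI_API_KEY"] = "sk-..."

using Bilge
bilge()   # picks the key up from the environment
```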
### Quick Example: Any OpenAI-Compatible API

```julia
using Bilge
bilge(api_key="your-key", base_url="https://api.example.com/v1", model="your-model")
```

Once inside the REPL, the LLM can autonomously read files, search code, run commands, and make edits to help you with any coding task.
Type `/help` inside the Bilge REPL to see all available commands. Use `/exit` to quit.
## Programmatic Usage (Agent API)
For integration into scripts or custom workflows, use `BilgeAgent` and `process_turn` directly:
```julia
using Bilge

config = BilgeConfig(
    ollama = OllamaConfig(model="qwen3")
)

agent = BilgeAgent(config, pwd())
result = process_turn(agent, "List all Julia files in this project")

# Access the response
println(result.response)

# Inspect tool executions
for exec in result.tool_executions
    println("  $(exec.tool_name) ($(exec.duration_ms)ms)")
end
```
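Because the agent object represents an ongoing session, you can issue follow-up turns against the same `BilgeAgent`. The sketch below assumes that `process_turn` keeps earlier turns in the conversation context; the prompts are illustrative:

```julia
# Follow-up turn on the same agent (assumes conversation state lives on the agent)
followup = process_turn(agent, "Now summarize what each of those files exports")
println(followup.response)

# Keep an eye on how much work each turn triggered
println("Tools used this turn: ", length(followup.tool_executions))
```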
## Available Tools
Bilge equips the LLM with 7 coding tools:
| Tool | Description | Use Case |
|---|---|---|
| `read_file` | Read file contents with line numbers | Inspecting code, understanding structure |
| `write_file` | Create or overwrite files | Generating new modules, configs, scripts |
| `edit_file` | Exact string replacement | Targeted code modifications |
| `run_bash` | Execute shell commands with timeout | Running tests, git operations, builds |
| `glob_files` | Find files by glob pattern | Discovering project structure |
| `grep_code` | Regex search across files | Finding usages, patterns, definitions |
| `list_directory` | List directory contents with sizes | Exploring directory layout |
For detailed documentation on each tool, see the Tools Guide.
## Key Features
- Multiple LLM Backends — OpenAI, Ollama, or any OpenAI-compatible API
- 7 Built-in Tools — Read, write, edit, search, and run commands
- Interactive REPL — Conversational coding with full context
- Programmatic API — Use `BilgeAgent` directly in scripts
- Minimal Dependencies — Only HTTP, JSON3, UUIDs
- Multi-line Input — Backslash continuation for complex prompts
- Token Tracking — Monitor LLM usage across the session
## License
MIT License.
## What's next
- Quick Start — Get started quickly with Ollama or OpenAI
- User Guide pages:
  - REPL Interface — Commands, multi-line input, and session management
  - Tools — Detailed documentation for all 7 coding tools
  - Configuration — `BilgeConfig`, `LLMConfig`, `OllamaConfig`
  - Ollama Integration — Local model setup, utilities, and tips
- API Reference — Complete API documentation