Build, manage, and run autonomous AI agents on your own hardware.
No cloud. No API costs. Your data, your agents, your rules.
$ git clone https://github.com/okhmat-anton/babber.git
$ cd babber
$ make install
▸ Copying .env.example → .env
▸ Starting Docker services...
▸ Pulling default Ollama model...
✓ Babber is running at http://localhost:4200
Babber is an open-source, self-hosted AI agent platform. Create intelligent agents with custom personalities, belief systems, and goals — then let them run autonomously using local LLMs via Ollama or any OpenAI-compatible API. All your data stays on your machine.
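Since Ollama serves local models through an OpenAI-compatible HTTP endpoint, talking to an agent's model can be as simple as posting a chat payload to localhost. The sketch below is illustrative, not Babber's internal client; the model name and prompts are examples.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(system_prompt: str, user_message: str,
                       model: str = "llama3.1:8b",
                       temperature: float = 0.4) -> dict:
    """Build an OpenAI-style chat payload for a local model."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

def chat(system_prompt: str, user_message: str) -> str:
    """Send one chat turn to the local model and return its reply text."""
    payload = build_chat_request(system_prompt, user_message)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Pointing `OLLAMA_URL` at any other OpenAI-compatible server works the same way, which is how a single code path can cover both local and hosted models.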
Agents with personalities, beliefs, aspirations, and long-term memory. Each agent is a unique entity that evolves over time.
Agents run autonomously using thinking protocols — making decisions, executing skills, and tracking their own TODO lists.
Everything runs on your hardware. MongoDB, Redis, ChromaDB, Ollama — all in Docker. Zero cloud dependencies.
Extend with addons that bundle backend routes, frontend pages, skills, and protocols in a single package.
Create agents with custom personalities, system prompts, and per-agent generation parameters. Each agent has a belief system with immutable core beliefs and mutable additional beliefs, plus aspirations — dreams, desires, and concrete goals with lock/unlock boundaries.
{
  "name": "Research Analyst",
  "personality": "Meticulous, curious, data-driven",
  "beliefs": {
    "core": ["Always verify sources", "Data over opinions"],
    "additional": ["Prefer academic papers"]
  },
  "aspirations": {
    "goals": ["Master market analysis"],
    "dreams": ["Build predictive models"]
  },
  "model": "llama3.1:70b",
  "temperature": 0.4,
  "autonomous": true
}
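As a rough sketch of how a config like this might be consumed, the snippet below parses it into a typed object. The field names mirror the JSON above, but the `Agent` class itself is illustrative, not Babber's actual data model.

```python
import json
from dataclasses import dataclass, field
from typing import List

@dataclass
class Agent:
    """Illustrative in-memory view of an agent config."""
    name: str
    personality: str
    core_beliefs: List[str]                      # immutable core beliefs
    additional_beliefs: List[str] = field(default_factory=list)  # mutable
    goals: List[str] = field(default_factory=list)
    dreams: List[str] = field(default_factory=list)
    model: str = "llama3.1:70b"
    temperature: float = 0.7
    autonomous: bool = False

    @classmethod
    def from_json(cls, raw: str) -> "Agent":
        cfg = json.loads(raw)
        return cls(
            name=cfg["name"],
            personality=cfg["personality"],
            core_beliefs=cfg["beliefs"]["core"],
            additional_beliefs=cfg["beliefs"].get("additional", []),
            goals=cfg.get("aspirations", {}).get("goals", []),
            dreams=cfg.get("aspirations", {}).get("dreams", []),
            model=cfg.get("model", "llama3.1:70b"),
            temperature=cfg.get("temperature", 0.7),
            autonomous=cfg.get("autonomous", False),
        )
```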
Define step-by-step reasoning workflows for your agents. Standard protocols for structured analysis, orchestrator protocols that delegate to child protocols, and loop protocols for autonomous work cycles.
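The three protocol kinds compose naturally: standard protocols run fixed steps, orchestrators route a task to a child, and loop protocols repeat a child per work cycle. This is a minimal sketch of that shape; the class and parameter names are hypothetical, not Babber's real protocol engine.

```python
from typing import Callable, Dict, List

Step = Callable[[str], str]  # one reasoning step: task text in, task text out

class StandardProtocol:
    """Runs a fixed, step-by-step reasoning workflow."""
    def __init__(self, steps: List[Step]):
        self.steps = steps
    def run(self, task: str) -> str:
        for step in self.steps:
            task = step(task)
        return task

class OrchestratorProtocol:
    """Delegates each task to a child protocol chosen by a router."""
    def __init__(self, children: Dict[str, StandardProtocol],
                 router: Callable[[str], str]):
        self.children, self.router = children, router
    def run(self, task: str) -> str:
        return self.children[self.router(task)].run(task)

class LoopProtocol:
    """Repeats a child protocol for N autonomous work cycles."""
    def __init__(self, child: StandardProtocol, cycles: int):
        self.child, self.cycles = child, cycles
    def run(self, task: str) -> str:
        for _ in range(self.cycles):
            task = self.child.run(task)
        return task
```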
A VS Code-style resizable chat panel with persistent sessions. Compare responses from different models side by side, or have multiple agents collaborate in a single conversation.
Let agents run independently — processing tasks, making decisions, invoking skills, and writing code. Two execution modes: continuous (until stopped) or cycle-based (N iterations).
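Both execution modes reduce to one driver loop: bounded iterations for cycle-based runs, a stop check for continuous ones. A minimal sketch of that assumed shape (not Babber's actual scheduler):

```python
from typing import Callable, Optional

def run_agent(step: Callable[[int], None],
              cycles: Optional[int] = None,
              should_stop: Callable[[], bool] = lambda: False) -> int:
    """Drive an agent: cycle-based when `cycles` is set, otherwise
    continuous (runs until `should_stop` returns True).
    Returns the number of iterations completed."""
    i = 0
    while (cycles is None or i < cycles) and not should_stop():
        step(i)  # one iteration: think, decide, invoke skills, update TODOs
        i += 1
    return i
```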
A rich personal profile that agents use to personalize their behavior. Your identity, strengths, goals with sub-goals, dreams, cities of interest — everything agents need to understand you.
Connect agents to Telegram with full auth flow, trusted user permissions, humanized responses, and typing indicators. Your agent becomes a real chat companion.
Everything runs in Docker. Backend and frontend can run locally for development with hot reload.
Frontend: Modern reactive frontend with Composition API, Pinia stores, and Material Design 3.
Backend: High-performance async Python backend with automatic validation and OpenAPI docs.
Database: All data stored in MongoDB via the Motor async driver. Flexible schema for agents, chats, skills, and more.
LLM: Run models locally with Ollama or connect any OpenAI-compatible API (GPT-4, Claude, Mistral).
Memory: Vector database for semantic long-term memory. Agents search and connect their own memories.
Cache: In-memory cache for fast session storage, rate limiting, and real-time pub/sub communication.
Default ports: 4200 Frontend, 4700 Backend API, 4717 MongoDB, 4379 Redis, 4800 ChromaDB, 11434 Ollama.
Agents invoke skills autonomously. Each skill is a modular tool that an agent can call during conversation or autonomous execution.
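A skill can be thought of as a named, registered callable the agent looks up and invokes by name. This sketch shows the general shape only; the registry, decorator, and `notes_add` skill are hypothetical, not Babber's API.

```python
from typing import Callable, Dict

# Hypothetical skill registry: maps a skill name to its implementation.
SKILLS: Dict[str, Callable[..., str]] = {}

def skill(name: str):
    """Decorator that registers a function as an agent-invokable skill."""
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        SKILLS[name] = fn
        return fn
    return register

@skill("notes_add")
def notes_add(text: str, category: str = "general") -> str:
    """Example skill: save a note and report what was saved."""
    return f"saved note [{category}]: {text}"

def invoke(name: str, **kwargs) -> str:
    """Invoke a skill by name, as an agent would during a work cycle."""
    if name not in SKILLS:
        return f"unknown skill: {name}"
    return SKILLS[name](**kwargs)
```

Because skills are plain registered functions, addons can contribute new ones simply by importing a module that runs the registrations.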
Add videos from YouTube, TikTok, Instagram, Facebook, Twitter. Auto-transcribe and discuss with agents.
Manage trusted sources with trust levels. Choose simple or deep research mode depending on how thorough you need to be.
Full note and idea management with categories, priorities, drag-and-drop reordering, and AI suggestions.
Built-in proxy management with bulk import, auto-testing, country filtering. Route agent traffic privately.
Manual and auto-backups with configurable intervals. Download, restore, or clean up old backups.
Select any text in the app and instantly save it as a note, goal, fact, belief, or analysis topic.
Create structured analysis topics, connect facts, and discuss findings with agents. Per-agent or global.
Text-to-speech with multiple voices and speech-to-text recognition. Per-agent voice selection with preview.
Create projects with file browsing, task management, and code editing. Agents write code to isolated directories.
Addons bundle backend routes, frontend pages, skills, and protocols into a single installable package.
Personal budget management: income, expenses, accounts, loans with monthly history and agent integration.
Skills: budget_view, budget_add_entry
Prediction market integration: view events, odds, place bets, and track betting history via the Polymarket API.
Skills: polymarket_events, polymarket_place_bet
See the full Installation Guide for detailed setup on macOS, Linux, and Windows.
git clone https://github.com/okhmat-anton/babber.git
cd babber && make install
Copies config, starts Docker services, pulls the default Ollama model.
make run
Starts all services in production mode. For dev mode with hot reload: make run-dev
http://localhost:4200
Login with admin / admin123, create your first agent, and start chatting.
Babber is free, open-source, and self-hosted. Star us on GitHub and get started today.