
General Bots - Enterprise-Grade LLM Orchestrator

General Bot Logo

A strongly-typed LLM conversational platform focused on convention over configuration and code-less approaches.

What is General Bots?

General Bots is a self-hosted AI automation platform that provides:

  • Multi-Vendor LLM API - Unified interface for OpenAI, Groq, and Anthropic Claude
  • MCP + LLM Tools Generation - Instant tool creation from existing code and functions
  • Semantic Caching - Intelligent response caching (up to 70% cost reduction); see the sketch after this list
  • Web Automation Engine - Browser automation combined with LLM-driven intelligence
  • Enterprise Data Connectors - Native integrations for CRM, ERP, and database systems
  • Git-like Version Control - Full history with rollback capabilities
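
To make the semantic caching idea concrete, here is a minimal sketch of the general technique in Rust: embed the incoming prompt, compare it against cached prompts by cosine similarity, and reuse the stored response when the match clears a threshold. This is an illustration only, not BotServer's implementation; the embed() function below is a toy stand-in for a real embedding model.

// Minimal semantic-cache sketch (illustration only, not the BotServer code).
// `embed` is a placeholder: a real system would call an embedding model.
fn embed(text: &str) -> Vec<f32> {
    // Toy embedding: ASCII character-frequency histogram, enough to demo the flow.
    let mut v = vec![0.0f32; 128];
    for b in text.bytes() {
        if (b as usize) < 128 {
            v[b as usize] += 1.0;
        }
    }
    v
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

struct SemanticCache {
    entries: Vec<(Vec<f32>, String)>, // (prompt embedding, cached response)
    threshold: f32,                   // similarity needed to count as a hit
}

impl SemanticCache {
    fn lookup(&self, prompt: &str) -> Option<&str> {
        let query = embed(prompt);
        self.entries
            .iter()
            .map(|(emb, resp)| (cosine(&query, emb), resp.as_str()))
            .filter(|(sim, _)| *sim >= self.threshold)
            .max_by(|a, b| a.0.partial_cmp(&b.0).unwrap())
            .map(|(_, resp)| resp)
    }

    fn insert(&mut self, prompt: &str, response: String) {
        self.entries.push((embed(prompt), response));
    }
}

fn main() {
    let mut cache = SemanticCache { entries: Vec::new(), threshold: 0.95 };
    cache.insert("What are your opening hours?", "We are open 9 to 18, Mon-Fri.".into());
    // A near-duplicate question is served from the cache instead of reaching the LLM.
    if let Some(hit) = cache.lookup("What are your opening hours") {
        println!("cache hit: {hit}");
    }
}

In practice the embedding model and the similarity threshold decide how aggressively near-duplicate prompts are answered from cache rather than being sent to the LLM.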

Quick Start

Prerequisites

  • Git and a recent Rust toolchain (cargo); the steps below clone the repository and build it from source

Installation

git clone https://github.com/GeneralBots/BotServer
cd BotServer
cargo run

On first run, BotServer automatically sets up PostgreSQL, S3 storage, Redis cache, and downloads AI models.

The server will be available at http://localhost:8080.

Documentation

docs/
├── api/                        # API documentation
│   ├── README.md               # API overview
│   ├── rest-endpoints.md       # HTTP endpoints
│   └── websocket.md            # Real-time communication
├── guides/                     # How-to guides
│   ├── getting-started.md      # Quick start
│   ├── deployment.md           # Production setup
│   └── templates.md            # Using templates
└── reference/                  # Technical reference
    ├── basic-language.md       # BASIC keywords
    ├── configuration.md        # Config options
    └── architecture.md         # System design

Key Features

4 Essential Keywords

USE KB "kb-name"        ' Load knowledge base into vector database
CLEAR KB "kb-name"      ' Remove KB from session
USE TOOL "tool-name"    ' Make tool available to LLM
CLEAR TOOLS             ' Remove all tools from session

Example Bot

' customer-support.bas
USE KB "support-docs"
USE TOOL "create-ticket"
USE TOOL "check-order"

SET CONTEXT "support" AS "You are a helpful customer support agent."

TALK "Welcome! How can I help you today?"

Command-Line Options

cargo run                    # Default: console UI + web server
cargo run -- --noconsole     # Background service mode
cargo run -- --desktop       # Desktop application (Tauri)
cargo run -- --tenant <name> # Specify tenant
cargo run -- --container     # LXC container mode

Environment Variables

Only directory service variables are required:

Variable                    Purpose
DIRECTORY_URL               Zitadel instance URL
DIRECTORY_CLIENT_ID         OAuth client ID
DIRECTORY_CLIENT_SECRET     OAuth client secret

All service credentials are managed automatically. See Configuration for details.
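
For example, a minimal .env could contain just these three entries (placeholder values shown; substitute your own Zitadel instance URL and OAuth client credentials):

DIRECTORY_URL=https://zitadel.example.com
DIRECTORY_CLIENT_ID=your-oauth-client-id
DIRECTORY_CLIENT_SECRET=your-oauth-client-secret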

Current Status

Version: 6.0.8
Build Status: SUCCESS
Production Ready: YES

Deployment

See Deployment Guide for:

  • Single server setup
  • Docker Compose
  • LXC containers
  • Kubernetes
  • Reverse proxy configuration (an example sketch follows this list)
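
As one illustrative reverse proxy setup (not the only supported option), an nginx server block can forward traffic to BotServer on its default port 8080 from the Quick Start; the domain below is a placeholder and TLS is omitted for brevity:

server {
    listen 80;
    server_name bots.example.com;   # placeholder domain

    location / {
        # Forward all traffic to the BotServer instance on its default port.
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Keep WebSocket upgrades working through the proxy.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

The Upgrade/Connection headers matter for the real-time communication described in docs/api/websocket.md.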

Contributing

We welcome contributions! Please read our contributing guidelines before submitting PRs.

Security

Security issues should be reported to: security@pragmatismo.com.br

License

General Bot Copyright (c) pragmatismo.com.br. All rights reserved.
Licensed under the AGPL-3.0.

According to our dual licensing model, this program can be used either under the terms of the GNU Affero General Public License, version 3, or under a proprietary license.

Support

Contributors


General Bots Code Name: Guaribas

"No one should have to do work that can be done by a machine." - Roberto Mangabeira Unger