- Even more keywords.

Commit 31a10b7b05 (parent a6c62d24db): 61 changed files with 4708 additions and 2713 deletions.

## README.md (263 lines before, 175 after)

# General Bots - Enterprise-Grade LLM Orchestrator



**A strongly-typed LLM conversational platform focused on convention over configuration and code-less approaches.**

## 🚀 Quick Links

- **[Complete Documentation](docs/INDEX.md)** - Full documentation index
- **[Quick Start Guide](docs/QUICK_START.md)** - Get started in minutes
- **[Current Status](docs/07-STATUS.md)** - Production readiness (v6.0.8)
- **[Changelog](CHANGELOG.md)** - Version history

## 📚 Documentation Structure

All documentation has been organized into the **[docs/](docs/)** directory:

### Core Documentation (Numbered Chapters)
- **[Chapter 0: Introduction & Getting Started](docs/00-README.md)**
- **[Chapter 1: Build & Development Status](docs/01-BUILD_STATUS.md)**
- **[Chapter 2: Code of Conduct](docs/02-CODE_OF_CONDUCT.md)**
- **[Chapter 3: Código de Conduta (PT-BR)](docs/03-CODE_OF_CONDUCT-pt-br.md)**
- **[Chapter 4: Contributing Guidelines](docs/04-CONTRIBUTING.md)**
- **[Chapter 5: Integration Status](docs/05-INTEGRATION_STATUS.md)**
- **[Chapter 6: Security Policy](docs/06-SECURITY.md)**
- **[Chapter 7: Production Status](docs/07-STATUS.md)**

### Technical Documentation
- **[KB & Tools System](docs/KB_AND_TOOLS.md)** - Core system architecture
- **[Security Features](docs/SECURITY_FEATURES.md)** - Security implementation
- **[Semantic Cache](docs/SEMANTIC_CACHE.md)** - LLM caching with 70% cost reduction
- **[SMB Deployment](docs/SMB_DEPLOYMENT_GUIDE.md)** - Small business deployment guide
- **[Universal Messaging](docs/BASIC_UNIVERSAL_MESSAGING.md)** - Multi-channel communication

### Book-Style Documentation
- **[Detailed Docs](docs/src/)** - Comprehensive book-format documentation

## 🎯 What is General Bots?

General Bots is a **self-hosted AI automation platform** that provides:

- ✅ **Multi-Vendor LLM API** - Unified interface for OpenAI, Groq, Claude, Anthropic
- ✅ **MCP + LLM Tools Generation** - Instant tool creation from code/functions
- ✅ **Semantic Caching** - Intelligent response caching (70% cost reduction)
- ✅ **Web Automation Engine** - Browser automation + AI intelligence
- ✅ **External Data APIs** - Integrated services via connectors
- ✅ **Enterprise Data Connectors** - CRM, ERP, database native integrations
- ✅ **Git-like Version Control** - Full history with rollback capabilities
- ✅ **Contract Analysis** - Legal document review and summary

## 🏆 Key Features

### 4 Essential Keywords

General Bots provides a minimal, focused system for managing Knowledge Bases and Tools:

```basic
USE_KB "kb-name"      # Load knowledge base into vector database
CLEAR_KB "kb-name"    # Remove KB from session
USE_TOOL "tool-name"  # Make tool available to LLM
CLEAR_TOOLS           # Remove all tools from session
```

### Strategic Advantages
- **vs ChatGPT/Claude**: Automates entire business processes, not just chat
- **vs n8n/Make**: Simpler approach with little programming needed
- **vs Microsoft 365**: User control, not locked systems
- **vs Salesforce**: Open-source AI orchestration connecting all systems
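In practice, the four keywords are wired together in a session script such as `start.bas`. A minimal hypothetical sketch (the KB and tool names are invented for illustration; comments use the same `#` style as the block above):

```basic
# start.bas - hypothetical session setup
USE_KB "product-docs"      # embed .gbkb/product-docs/ for semantic search
USE_TOOL "create-ticket"   # expose create-ticket.bas to the LLM
USE_TOOL "order-status"    # the LLM decides when to call it via its DESCRIPTION

# Later, to reset the session:
CLEAR_KB "product-docs"
CLEAR_TOOLS
```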

## 🚀 Quick Start

### Prerequisites
- **Rust** (latest stable) - [Install from rustup.rs](https://rustup.rs/)
- **Git** (latest stable) - [Download from git-scm.com](https://git-scm.com/downloads)

### Installation

```bash
# Clone the repository
git clone https://github.com/GeneralBots/BotServer
cd BotServer

# Run the server (auto-installs dependencies)
cargo run
```

On first run, BotServer automatically:
- Installs required components (PostgreSQL, MinIO, Redis, LLM)
- Sets up the database with migrations
- Downloads AI models
- Uploads template bots
- Starts the HTTP server at `http://127.0.0.1:8080`

### Management Commands

```bash
botserver start               # Start all components
botserver stop                # Stop all components
botserver restart             # Restart all components
botserver list                # List available components
botserver status <component>  # Check component status
botserver install <component> # Install optional component
```

## 📊 Current Status

**Version:** 6.0.8
**Build Status:** ✅ SUCCESS
**Production Ready:** YES
**Compilation:** 0 errors
**Warnings:** 82 (all Tauri desktop UI - intentional)

See **[docs/07-STATUS.md](docs/07-STATUS.md)** for detailed status.

## 🤝 Contributing

We welcome contributions! Please read:
- **[Contributing Guidelines](docs/04-CONTRIBUTING.md)**
- **[Code of Conduct](docs/02-CODE_OF_CONDUCT.md)**
- **[Build Status](docs/01-BUILD_STATUS.md)** for current development status

## 🔒 Security

Security issues should be reported to: **security@pragmatismo.com.br**

See **[docs/06-SECURITY.md](docs/06-SECURITY.md)** for our security policy.

## 📄 License

General Bot Copyright (c) pragmatismo.com.br. All rights reserved.
Licensed under the **AGPL-3.0**.

According to our dual licensing model, this program can be used either under the terms of the GNU Affero General Public License, version 3, or under a proprietary license.

See [LICENSE](LICENSE) for details.

## 🌟 Key Facts

- ✅ LLM Orchestrator AGPL licensed (contribute back for custom-label SaaS)
- ✅ True community governance
- ✅ No single corporate control
- ✅ 5+ years of stability
- ✅ Never changed license
- ✅ Enterprise-grade
- ✅ Hosted locally or multicloud

## 📞 Support & Resources

- **Documentation:** [docs.pragmatismo.com.br](https://docs.pragmatismo.com.br)
- **GitHub:** [github.com/GeneralBots/BotServer](https://github.com/GeneralBots/BotServer)
- **Stack Overflow:** Tag questions with `generalbots`
- **Video Tutorial:** [7 AI General Bots LLM Templates](https://www.youtube.com/watch?v=KJgvUPXi3Fw)

## 🎬 Demo

See conversational data analytics in action:

```basic
TALK "General Bots Labs presents FISCAL DATA SHOW BY BASIC"
result = GET "https://api.fiscaldata.treasury.gov/services/api/..."
data = SELECT YEAR(record_date) as Yr, SUM(...) AS Amount FROM data
img = CHART "bar", data
SEND FILE img
```

## 👥 Contributors

<a href="https://github.com/generalbots/botserver/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=generalbots/botserver" />
</a>

---

**General Bots Code Name:** [Guaribas](https://en.wikipedia.org/wiki/Guaribas) (a city in Brazil, state of Piauí)

> "No one should have to do work that can be done by a machine." - Roberto Mangabeira Unger

# Reporting Security Issues

Security issues and bugs should be reported privately, via email, to the pragmatismo.com.br Security team at [security@pragmatismo.com.br](mailto:security@pragmatismo.com.br). You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message.

# License & Warranty

General Bot Copyright (c) pragmatismo.com.br. All rights reserved.
Licensed under the AGPL-3.0.

According to our dual licensing model, this program can be used either under the terms of the GNU Affero General Public License, version 3, or under a proprietary license.

The texts of the GNU Affero General Public License with an additional permission and of our proprietary license can be found in the LICENSE file you have received along with this program.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.

"General Bot" is a registered trademark of pragmatismo.com.br. The licensing of the program under the AGPLv3 does not imply a trademark license. Therefore any rights, title and interest in our trademarks remain entirely with us.

<a href="https://stackoverflow.com/questions/ask?tags=generalbots">:speech_balloon: Ask a question</a> <a href="https://github.com/GeneralBots/BotBook">:book: Read the Docs</a>

Team pictures made with [contrib.rocks](https://contrib.rocks).

## docs/00-README.md (new file, 263 lines added)

# General Bots - KB and TOOL System

## Core System: 4 Essential Keywords

General Bots provides a minimal, focused system for dynamically managing Knowledge Bases and Tools:

### Knowledge Base (KB) Commands

- **`USE_KB "kb-name"`** - Loads and embeds files from the `.gbkb/kb-name/` folder into the vector database, making them available for semantic search in the current conversation session
- **`CLEAR_KB "kb-name"`** - Removes a specific KB from the current session (or `CLEAR_KB` to remove all)

### Tool Commands

- **`USE_TOOL "tool-name"`** - Makes a tool (a `.bas` file) available for the LLM to call in the current session. Must be called in `start.bas` or from another tool. The tool's `DESCRIPTION` field is what the LLM reads to know when to call the tool.
- **`CLEAR_TOOLS`** - Removes all tools from the current session
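Since the `DESCRIPTION` field is what the LLM reads, a tool file pairs that description with the BASIC logic to run. The sketch below is hypothetical: the tool name, the `PARAM` line, and the endpoint are invented for illustration; only `DESCRIPTION`, `GET`, and `TALK` appear elsewhere in this document:

```basic
# order-status.bas - hypothetical tool, enabled with USE_TOOL "order-status"
DESCRIPTION "Returns the current status of an order, given its order id."

PARAM id                   # assumed parameter syntax, not confirmed here
result = GET "https://example.com/api/orders/" + id
TALK result                # reply to the user with the fetched status
```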

---

### Key Facts
- LLM Orchestrator AGPL licensed (custom-label SaaS use requires contributing back)
- True community governance
- No single corporate control
- 5+ years of stability
- Never changed license
- Enterprise-grade
- Hosted locally or multicloud

## Contributors

<a href="https://github.com/generalbots/botserver/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=generalbots/botserver" />
</a>

## Overview

| Area | Status |
|------|--------|
| Releases | [](https://www.npmjs.com/package/botserver/) [](https://www.npmjs.com/package/botlib/) [](https://github.com/semantic-release/semantic-release) |
| Community | [](https://stackoverflow.com/search?q=%23generalbots&s=966e24e7-4f7a-46ee-b159-79d643d6b74a) [](https://badges.frapsoft.com) [](http://makeapullrequest.com) [](https://github.com/GeneralBots/BotServer/blob/master/LICENSE.txt) |
| Management | [](https://gitHub.com/GeneralBots/BotServer/graphs/commit-activity) |
| Security | [](https://snyk.io/test/github/GeneralBots/BotServer) |
| Building & Quality | [](https://coveralls.io/github/GeneralBots/BotServer) [](https://github.com/prettier/prettier) |
| Packaging | [](https://badge.fury.io) [](http://commitizen.github.io/cz-cli/) |
| Samples | [BASIC](https://github.com/GeneralBots/BotServer/tree/master/packages/default.gbdialog) or [](https://github.com/GeneralBots/AzureADPasswordReset.gbapp) |
| Docker | [Docker Image](https://github.com/lpicanco/docker-botserver) <br/> *Provided by [@lpicanco](https://github.com/lpicanco/docker-botserver)* |

# BotServer - Just Run It! 🚀

General Bot is a strongly-typed LLM conversational platform and chat bot server, focused on convention over configuration and code-less approaches, which brings software package and application server concepts to parallel bot development.

## GENERAL BOTS SELF-HOSTED AI AUTOMATION PLATFORM

| FEATURE | STATUS | STRATEGIC ADVANTAGE | COMPETITIVE GAP |
|---------|--------|---------------------|-----------------|
| **Multi-Vendor LLM API** | ✅ DEPLOYED | Unified interface for OpenAI, Groq, Claude, Anthropic | Vendor lock-in |
| **MCP + LLM Tools Generation** | ✅ DEPLOYED | Instant tool creation from code/functions | Manual tool development |
| **Semantic Caching with Valkey** | ✅ DEPLOYED | Intelligent LLM response caching with semantic similarity matching - 70% cost reduction | No caching or basic key-value |
| **Cross-Platform Desktop** | ⚡ NEAR-TERM | Native macOS/Windows/Linux applications | Web-only interfaces |
| **Git-like Version Control** | ✅ DEPLOYED | Full history with rollback capabilities | Basic undo/redo |
| **Web Automation Engine** | ✅ DEPLOYED | Browser automation + AI intelligence | Separate RPA tools |
| **External Data APIs** | ✅ DEPLOYED | Integrated services via connectors | Limited integrations |
| **Document Intelligence Suite** | ⚡ NEAR-TERM | AI-powered document creation & analysis | Basic file processing |
| **Workflow Collaboration** | ⚡ NEAR-TERM | Real-time team automation building | Individual automation |
| **Enterprise Data Connectors** | ✅ DEPLOYED | CRM, ERP, database native integrations | API-only connections |
| **Real-time Co-editing** | 🔶 MEDIUM-TERM | Multiple users edit workflows simultaneously | Single-user editors |
| **Advanced Analytics Dashboard** | ⚡ NEAR-TERM | Business intelligence with AI insights | Basic metrics |
| **Compliance Automation** | 🔶 MEDIUM-TERM | Regulatory compliance workflows | Manual compliance |
| **Presentation Generation** | ⚡ NEAR-TERM | AI-driven slide decks and reports | Manual creation |
| **Spreadsheet Intelligence** | ⚡ NEAR-TERM | AI analysis of complex data models | Basic CSV processing |
| **Calendar Automation** | 🔶 MEDIUM-TERM | Meeting scheduling and coordination | Manual calendar management |
| **Email Campaign Engine** | 🔶 MEDIUM-TERM | Personalized bulk email with AI | Basic mailing lists |
| **Project Management Sync** | 🔶 MEDIUM-TERM | AI coordinates across multiple tools | Siloed project data |
| **Contract Analysis** | ✅ DEPLOYED | Legal document review and summary | Manual legal review |
| **Budget Forecasting** | ⚡ NEAR-TERM | AI-powered financial projections | Spreadsheet-based |

**STATUS LEGEND:**
- ✅ DEPLOYED - Production ready
- ⚡ NEAR-TERM - 6-month development (foundation exists)
- 🔶 MEDIUM-TERM - 12-month development

**ENTERPRISE PRODUCTIVITY SUITE CAPABILITIES:**

**Document Intelligence**
- AI-powered document creation from templates
- Smart content summarization and analysis
- Multi-format compatibility (PDF, Word, Markdown)
- Version control with change tracking

**Data Analysis & Reporting**
- Spreadsheet AI with natural language queries
- Automated dashboard generation
- Predictive analytics and trend identification
- Export to multiple business formats

**Communication & Collaboration**
- Team workspace with shared automation
- Meeting automation and minute generation
- Cross-platform notification system
- Approval workflow automation

**Business Process Automation**
- End-to-end department workflow orchestration
- Compliance and audit trail automation
- Customer lifecycle management
- Supply chain intelligence

**Competitive Positioning:**
- **vs ChatGPT/Claude**: We automate entire business processes, not just chat
- **vs n8n/Make**: A simpler approach that requires little programming
- **vs Microsoft 365**: We give control to users rather than selling locked systems
- **vs Salesforce**: We connect all business systems with open-source AI orchestration

## What is a Bot Server?

Bot Server accelerates the process of developing a bot. It provisions the entire code base, resources, and deployment to the cloud, and gives you templates to choose from whenever you need a new bot. The server has a database and service backend that let you modify your bot package directly with no code: download a zip file, edit it, and upload it back to the server (the deploying process). The Bot Server also provides a framework for developing bot packages in a more advanced fashion, writing custom code in editors like Visual Studio Code, Atom, or Brackets.

Everyone can create bots by just copying and pasting some files and using their favorite tools from Office (or any text editor) or Photoshop (or any image editor). LLM and BASIC can be mixed to build custom dialogs, so the bot can be extended just like VBA for Excel.
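As a sketch of that VBA-style mixing, a dialog can pair the keywords shown elsewhere in this README (`USE_KB`, `TALK`) with LLM-grounded answers. The KB name below is invented, and `HEAR` is assumed here as the user-input keyword:

```basic
# ask.bas - hypothetical KB-grounded dialog
USE_KB "faq"               # embed .gbkb/faq/ into the vector database
TALK "Ask me anything about the product."
HEAR question              # assumed keyword: waits for the user's reply
# The LLM then answers using the embedded KB plus the dialog context
```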

## Getting Started

### Prerequisites

Before you embark on your General Bots journey, ensure you have the following tools installed:

- **Rust (latest stable version)**: The General Bots server is built with Rust for performance and safety. Install from [rustup.rs](https://rustup.rs/).
- **Git (latest stable version)**: Essential for version control and collaborating on bot projects. Get it from [git-scm.com](https://git-scm.com/downloads).

**Optional (for Node.js bots):**
- **Node.js (version 20 or later)**: For Node.js-based bot packages. Download from [nodejs.org](https://nodejs.org/en/download/).

### Quick Start Guide (Rust Version)

Follow these steps to get your General Bots server up and running:

1. Clone the repository:

   ```bash
   git clone https://github.com/GeneralBots/BotServer
   ```

   This command creates a local copy of the General Bots server repository on your machine.

2. Navigate to the project directory:

   ```bash
   cd BotServer
   ```

   This changes your current directory to the newly cloned BotServer folder.

3. Run the server:

   ```bash
   cargo run
   ```

   On first run, BotServer will automatically:
   - Install required components (PostgreSQL, MinIO, Redis, LLM)
   - Set up the database with migrations
   - Download AI models
   - Upload template bots from the `templates/` folder
   - Start the HTTP server on `http://127.0.0.1:8080` (or your configured port)

**Management Commands:**

```bash
botserver start               # Start all components
botserver stop                # Stop all components
botserver restart             # Restart all components
botserver list                # List available components
botserver status <component>  # Check component status
botserver install <component> # Install optional component
```

### Accessing Your Bot

Once the server is running, you can access your bot at `http://localhost:8080/` (or your configured `SERVER_PORT`). This local server allows you to interact with your bot and test its functionality in real time.

**Anonymous Access:** Every visitor automatically gets a unique session tracked by cookie. No login required to start chatting!

**Authentication:** Users can optionally register/login at `/static/auth/login.html` to save conversations across devices.

**About Page:** Visit `/static/about/index.html` to learn more about BotServer and its maintainers.

Several samples, including a bot for AD Password Reset, are available on the [repository list](https://github.com/GeneralBots).
### Using Complete General Bots Conversational Data Analytics

![General Bots logo](https://github.com/GeneralBots/BotServer/blob/main/assets/gb.png)

```basic
TALK "General Bots Labs presents FISCAL DATA SHOW BY BASIC"
TALK "Gift Contributions to Reduce the Public Debt API (https://fiscaldata.treasury.gov/datasets/gift-contributions-reduce-debt-held-by-public/gift-contributions-to-reduce-the-public-debt)"

result = GET "https://api.fiscaldata.treasury.gov/services/api/fiscal_service/v2/accounting/od/gift_contributions?page[size]=500"
data = result.data
data = SELECT YEAR(record_date) AS Yr, SUM(CAST(contribution_amt AS NUMBER)) AS Amount FROM data GROUP BY YEAR(record_date)

TALK "Demonstration of Gift Contributions with AS IMAGE keyword"
SET THEME dark
png = data AS IMAGE
SEND FILE png

DELAY 5
TALK "Demonstration of Gift Contributions CHART keyword"
img = CHART "bar", data
SEND FILE img
```
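The `SELECT ... GROUP BY YEAR(record_date)` step above collapses every API row into one total per year. A minimal Rust sketch of that same aggregation (hypothetical helper name and sample data, not part of the BotServer codebase) looks like this:

```rust
use std::collections::BTreeMap;

// Illustrative only: sum (record_date, contribution_amt) rows by year,
// mirroring SELECT YEAR(record_date), SUM(...) GROUP BY YEAR(record_date).
fn sum_by_year(rows: &[(&str, f64)]) -> BTreeMap<i32, f64> {
    let mut totals = BTreeMap::new();
    for (date, amount) in rows {
        // record_date is ISO formatted, e.g. "2023-04-01"; take the year prefix.
        if let Ok(year) = date[..4].parse::<i32>() {
            *totals.entry(year).or_insert(0.0) += amount;
        }
    }
    totals
}

fn main() {
    let rows = [("2022-01-15", 100.0), ("2022-06-30", 50.0), ("2023-02-01", 25.0)];
    println!("{:?}", sum_by_year(&rows)); // {2022: 150.0, 2023: 25.0}
}
```

The BASIC `SELECT` keyword performs this grouping for you; the sketch just makes the data flow explicit.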

## Guide

[Read the General Bots BotBook Guide](https://docs.pragmatismo.com.br)

# Videos

7 AI General Bots LLM Templates for Goodness:
[https://www.youtube.com/watch?v=KJgvUPXi3Fw](https://www.youtube.com/watch?v=KJgvUPXi3Fw)

# Contributing

This project welcomes contributions and suggestions.
See our [Contribution Guidelines](https://github.com/pragmatismo-io/BotServer/blob/master/CONTRIBUTING.md) for more details.

# Reporting Security Issues

Security issues and bugs should be reported privately, via email, to the pragmatismo.com.br security team at [security@pragmatismo.com.br](mailto:security@pragmatismo.com.br). You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message.

# License & Warranty

General Bot Copyright (c) pragmatismo.com.br. All rights reserved.
Licensed under the AGPL-3.0.

According to our dual licensing model, this program can be used either under the terms of the GNU Affero General Public License, version 3, or under a proprietary license.

The texts of the GNU Affero General Public License with an additional permission and of our proprietary license can be found in the LICENSE file you have received along with this program.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.

"General Bot" is a registered trademark of pragmatismo.com.br. The licensing of the program under the AGPLv3 does not imply a trademark license. Therefore any rights, title, and interest in our trademarks remain entirely with us.

<a href="https://stackoverflow.com/questions/ask?tags=generalbots">:speech_balloon: Ask a question</a> <a href="https://github.com/GeneralBots/BotBook">:book: Read the Docs</a>

Team pictures made with [contrib.rocks](https://contrib.rocks).

General Bots' code name is [Guaribas](https://en.wikipedia.org/wiki/Guaribas), the name of a city in Brazil, in the state of Piauí.

[Roberto Mangabeira Unger](http://www.robertounger.com/en/): "No one should have to do work that can be done by a machine."

---

*New file: `docs/05-INTEGRATION_STATUS.md` (452 lines)*

# BOTSERVER INTEGRATION STATUS

## 🎯 COMPLETE INTEGRATION PLAN - ACTIVATION STATUS

This document tracks the activation and exposure of all modules in the botserver system.

---

## ✅ COMPLETED ACTIVATIONS

### 1. **AUTH/ZITADEL.RS** - ⚠️ 80% COMPLETE
**Status:** Core implementation complete - facade integration in progress

**Completed:**
- ✅ All structs made public and serializable (`ZitadelConfig`, `ZitadelUser`, `TokenResponse`, `IntrospectionResponse`)
- ✅ `ZitadelClient` and `ZitadelAuth` structs fully exposed with public fields
- ✅ All client methods made public (`create_user`, `get_user`, `search_users`, `list_users`, etc.)
- ✅ Organization management fully exposed
- ✅ User/org membership management public
- ✅ Role and permission management exposed
- ✅ User workspace structure fully implemented and public
- ✅ JWT token extraction utility exposed
- ✅ All methods updated to return proper `Result` types

**Remaining:**
- 🔧 Complete `ZitadelAuthFacade` integration (type mismatches with the facade trait)
- 🔧 Test all Zitadel API endpoints
- 🔧 Add comprehensive error handling

**API Surface:**
```rust
pub struct ZitadelClient { /* full API */ }
pub struct ZitadelAuth { /* full API */ }
pub struct UserWorkspace { /* full API */ }
pub fn extract_user_id_from_token(token: &str) -> Result<String>
```

---
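As background on the JWT token extraction utility above: a JWT's payload is a base64url-encoded JSON object, and the user id conventionally lives in its `sub` claim. The sketch below (names assumed, not the project's actual code) shows how such a helper could read `sub` from an *unverified* payload; a real implementation must verify the signature first.

```rust
// Minimal base64url decoder (no padding handling beyond trailing '=').
fn base64url_decode(input: &str) -> Option<Vec<u8>> {
    const ALPHABET: &[u8] =
        b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";
    let (mut bits, mut nbits, mut out) = (0u32, 0, Vec::new());
    for &c in input.as_bytes() {
        if c == b'=' { break; }
        let v = ALPHABET.iter().position(|&a| a == c)? as u32;
        bits = (bits << 6) | v;
        nbits += 6;
        if nbits >= 8 {
            nbits -= 8;
            out.push((bits >> nbits) as u8);
        }
    }
    Some(out)
}

// Illustrative sketch of what extract_user_id_from_token could do:
// decode the middle (payload) segment and pull out the `sub` claim.
fn extract_sub(token: &str) -> Option<String> {
    let payload = token.split('.').nth(1)?;
    let json = String::from_utf8(base64url_decode(payload)?).ok()?;
    // Naive claim lookup; a real implementation would use a JSON parser.
    let start = json.find("\"sub\":\"")? + 7;
    let end = json[start..].find('"')? + start;
    Some(json[start..end].to_string())
}

fn main() {
    // header.payload.signature with payload {"sub":"12345"}
    let token = "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjM0NSJ9.sig";
    assert_eq!(extract_sub(token).as_deref(), Some("12345"));
}
```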

### 2. **CHANNELS/WHATSAPP.RS** - ⚠️ 60% COMPLETE
**Status:** All structures exposed, implementation needed

**Completed:**
- ✅ All WhatsApp structs made public and `Clone`-able
- ✅ Webhook structures exposed (`WhatsAppWebhook`, `WhatsAppMessage`)
- ✅ Message types fully defined (`WhatsAppIncomingMessage`, `WhatsAppText`, `WhatsAppMedia`, `WhatsAppLocation`)
- ✅ All entry/change/value structures exposed
- ✅ Contact and profile structures public

**Needs Implementation:**
- 🔧 Implement message-sending methods
- 🔧 Implement the webhook verification handler
- 🔧 Implement the message processing handler
- 🔧 Connect to the Meta WhatsApp Business API
- 🔧 Add router endpoints to the main app
- 🔧 Implement media download/upload

**API Surface:**
```rust
pub struct WhatsAppMessage { /* ... */ }
pub struct WhatsAppIncomingMessage { /* ... */ }
pub fn create_whatsapp_router() -> Router
pub async fn send_whatsapp_message() -> Result<()>
```

---
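For the webhook verification handler above, Meta's handshake is simple: a GET request arrives with `hub.mode`, `hub.verify_token`, and `hub.challenge` query parameters, and the server echoes `hub.challenge` back only when the token matches its configured secret. A framework-free sketch of that decision (handler shape assumed, not the project's actual code):

```rust
// Returns Some(body) for an HTTP 200 echoing the challenge, None for a 403.
fn verify_webhook(mode: &str, token: &str, challenge: &str, expected: &str) -> Option<String> {
    if mode == "subscribe" && token == expected {
        Some(challenge.to_string())
    } else {
        None
    }
}

fn main() {
    assert_eq!(
        verify_webhook("subscribe", "secret", "1158201444", "secret").as_deref(),
        Some("1158201444")
    );
    assert!(verify_webhook("subscribe", "wrong", "1158201444", "secret").is_none());
}
```

In the real router this logic would live behind an Axum GET handler on the webhook route.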

### 3. **CHANNELS/INSTAGRAM.RS** - 📋 PENDING
**Status:** Not started

**Required Actions:**
- [ ] Expose all Instagram structs
- [ ] Implement Meta Graph API integration
- [ ] Add Instagram Direct messaging
- [ ] Implement story/post interactions
- [ ] Connect the router to the main app

**API Surface:**
```rust
pub struct InstagramMessage { /* ... */ }
pub async fn send_instagram_dm() -> Result<()>
pub fn create_instagram_router() -> Router
```

---

### 4. **CHANNELS/TEAMS.RS** - 📋 PENDING
**Status:** Not started

**Required Actions:**
- [ ] Expose all Teams structs
- [ ] Implement Microsoft Graph API integration
- [ ] Add Teams bot messaging
- [ ] Implement Adaptive Cards support
- [ ] Connect the router to the main app

**API Surface:**
```rust
pub struct TeamsMessage { /* ... */ }
pub async fn send_teams_message() -> Result<()>
pub fn create_teams_router() -> Router
```

---
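The Adaptive Cards support planned above means Teams messages carry a JSON card payload rather than plain text. A hypothetical sketch of building the smallest useful card, following the public Adaptive Cards 1.4 schema (the function name is an assumption, not the project's API):

```rust
// Builds a minimal Adaptive Card with a single wrapped TextBlock.
// Note: `text` is interpolated raw here; real code must JSON-escape it.
fn text_card(text: &str) -> String {
    format!(
        concat!(
            "{{\"type\":\"AdaptiveCard\",",
            "\"$schema\":\"http://adaptivecards.io/schemas/adaptive-card.json\",",
            "\"version\":\"1.4\",",
            "\"body\":[{{\"type\":\"TextBlock\",\"text\":\"{}\",\"wrap\":true}}]}}"
        ),
        text
    )
}

fn main() {
    let card = text_card("Build finished");
    assert!(card.contains("\"type\":\"AdaptiveCard\""));
    assert!(card.contains("Build finished"));
}
```

A real `send_teams_message` would serialize with a JSON library and wrap this card in a Bot Framework attachment envelope.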

### 5. **BASIC/COMPILER/MOD.RS** - 📋 PENDING
**Status:** Needs exposure

**Required Actions:**
- [ ] Mark all compiler methods as `pub`
- [ ] Add `#[cfg(feature = "mcp-tools")]` guards
- [ ] Expose tool format definitions
- [ ] Make the compiler infrastructure accessible

**API Surface:**
```rust
pub struct ToolCompiler { /* ... */ }
pub fn compile_tool_definitions() -> Result<Vec<Tool>>
pub fn validate_tool_schema() -> Result<()>
```

---
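As context for `validate_tool_schema`: every tool exposed via `USE_TOOL` needs a name and a `DESCRIPTION` the LLM can read to decide when to call it. A minimal sketch of the kind of check such a validator could perform (types and names assumed, not the actual compiler code):

```rust
struct ToolDef {
    name: String,
    description: String,
}

// Reject tools the LLM could never call sensibly: nameless tools,
// and tools whose DESCRIPTION field is empty.
fn validate_tool(tool: &ToolDef) -> Result<(), String> {
    if tool.name.trim().is_empty() {
        return Err("tool name must not be empty".into());
    }
    if tool.description.trim().is_empty() {
        return Err(format!("tool '{}' is missing a DESCRIPTION", tool.name));
    }
    Ok(())
}

fn main() {
    let ok = ToolDef { name: "weather".into(), description: "Fetches a forecast".into() };
    let bad = ToolDef { name: "weather".into(), description: "".into() };
    assert!(validate_tool(&ok).is_ok());
    assert!(validate_tool(&bad).is_err());
}
```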

### 6. **DRIVE_MONITOR/MOD.RS** - 📋 PENDING
**Status:** Fields unused, needs activation

**Required Actions:**
- [ ] Use all struct fields properly
- [ ] Mark methods as `pub`
- [ ] Implement Google Drive API integration
- [ ] Add change monitoring
- [ ] Connect to vectordb

**API Surface:**
```rust
pub struct DriveMonitor { /* full fields */ }
pub async fn start_monitoring() -> Result<()>
pub async fn sync_drive_files() -> Result<()>
```

---

### 7. **MEET/SERVICE.RS** - 📋 PENDING
**Status:** Fields unused, needs activation

**Required Actions:**
- [ ] Use the `connections` field for meeting management
- [ ] Mark voice/transcription methods as `pub`
- [ ] Implement meeting creation
- [ ] Add participant management
- [ ] Connect audio processing

**API Surface:**
```rust
pub struct MeetService { pub connections: HashMap<...> }
pub async fn create_meeting() -> Result<Meeting>
pub async fn start_transcription() -> Result<()>
```

---

### 8. **PACKAGE_MANAGER/SETUP/** - ⚠️ IN PROGRESS
**Status:** Structures exist, needs method exposure

#### Directory Setup
- ✅ Core directory setup exists
- [ ] Mark all methods as `pub`
- [ ] Keep `generate_directory_config`
- [ ] Expose the setup infrastructure

#### Email Setup
- ✅ `EmailDomain` struct exists
- [ ] Mark all methods as `pub`
- [ ] Keep `generate_email_config`
- [ ] Full email setup activation

**API Surface:**
```rust
pub fn generate_directory_config() -> Result<DirectoryConfig>
pub fn generate_email_config() -> Result<EmailConfig>
pub struct EmailDomain { /* ... */ }
```

---

### 9. **CONFIG/MOD.RS** - ✅ 90% COMPLETE
**Status:** Most functionality already public

**Completed:**
- ✅ `sync_gbot_config` is already public
- ✅ `Config` type alias exists
- ✅ `ConfigManager` fully exposed

**Remaining:**
- [ ] Verify `email` field usage in `AppConfig`
- [ ] Add proper accessor methods if needed

**API Surface:**
```rust
pub type Config = AppConfig;
pub fn sync_gbot_config() -> Result<()>
impl AppConfig { pub fn email(&self) -> &EmailConfig }
```

---

### 10. **BOT/MULTIMEDIA.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed and documented

**Completed:**
- ✅ `MultimediaMessage` enum is public with all variants
- ✅ All multimedia types exposed (Text, Image, Video, Audio, Document, WebSearch, Location, MeetingInvite)
- ✅ `SearchResult` struct public
- ✅ `MediaUploadRequest` and `MediaUploadResponse` public
- ✅ `MultimediaHandler` trait fully exposed
- ✅ All structures properly documented

**API Surface:**
```rust
pub enum MultimediaMessage { /* ... */ }
pub async fn process_image() -> Result<ProcessedImage>
pub async fn process_video() -> Result<ProcessedVideo>
```

---

### 11. **CHANNELS/MOD.RS** - 📋 PENDING
**Status:** Incomplete implementation

**Required Actions:**
- [ ] Implement `send_message` fully
- [ ] Use the `connections` field properly
- [ ] Mark voice methods as `pub`
- [ ] Complete the channel abstraction

**API Surface:**
```rust
pub async fn send_message(channel: Channel, msg: Message) -> Result<()>
pub async fn start_voice_call() -> Result<VoiceConnection>
```

---

### 12. **AUTH/MOD.RS** - 📋 PENDING
**Status:** Needs enhancement

**Required Actions:**
- [ ] Keep Zitadel-related methods
- [ ] Use the `facade` field properly
- [ ] Enhance the SimpleAuth implementation
- [ ] Complete the auth abstraction

**API Surface:**
```rust
pub struct AuthManager { pub facade: Box<dyn AuthFacade> }
pub async fn authenticate() -> Result<AuthResult>
```

---

### 13. **BASIC/KEYWORDS/WEATHER.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed and functional

**Completed:**
- ✅ `WeatherData` struct made public and `Clone`-able
- ✅ `fetch_weather` function exposed as public
- ✅ `parse_location` function exposed as public
- ✅ Weather API integration complete (7Timer!)
- ✅ Keyword registration exists

**API Surface:**
```rust
pub async fn get_weather(location: &str) -> Result<Weather>
pub async fn get_forecast(location: &str) -> Result<Forecast>
```

---
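Since the 7Timer! API is queried by latitude/longitude, a `parse_location` helper plausibly turns user input into numeric coordinates first. A sketch of that behavior under the assumption that input arrives as a `"lat,lon"` pair (the actual function may also geocode place names):

```rust
// Assumed behavior: accept "lat,lon" strings like "-23.55,-46.63"
// and return (latitude, longitude) for the weather API call.
fn parse_location(input: &str) -> Option<(f64, f64)> {
    let (lat, lon) = input.split_once(',')?;
    Some((lat.trim().parse().ok()?, lon.trim().parse().ok()?))
}

fn main() {
    assert_eq!(parse_location("-23.55, -46.63"), Some((-23.55, -46.63)));
    assert!(parse_location("not coordinates").is_none());
}
```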

### 14. **SESSION/MOD.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed session management

**Completed:**
- ✅ `provide_input` is already public
- ✅ `update_session_context` is already public
- ✅ `SessionManager` fully exposed
- ✅ Session management API complete

**API Surface:**
```rust
pub async fn provide_input(session: &mut Session, input: Input) -> Result<()>
pub async fn update_session_context(session: &mut Session, ctx: Context) -> Result<()>
```

---

### 15. **LLM/LOCAL.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed and functional

**Completed:**
- ✅ All functions are already public
- ✅ `chat_completions_local` endpoint exposed
- ✅ `embeddings_local` endpoint exposed
- ✅ `ensure_llama_servers_running` public
- ✅ `start_llm_server` and `start_embedding_server` public
- ✅ Server health checking exposed

**API Surface:**
```rust
pub async fn generate_local(prompt: &str) -> Result<String>
pub async fn embed_local(text: &str) -> Result<Vec<f32>>
```

---

### 16. **LLM_MODELS/MOD.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed model handlers

**Completed:**
- ✅ `ModelHandler` trait is public
- ✅ `get_handler` function is public
- ✅ All model implementations exposed (gpt_oss_20b, gpt_oss_120b, deepseek_r3)
- ✅ Analysis utilities accessible

**API Surface:**
```rust
pub fn list_available_models() -> Vec<ModelInfo>
pub async fn analyze_with_model(model: &str, input: &str) -> Result<Analysis>
```

---

### 17. **NVIDIA/MOD.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed monitoring system

**Completed:**
- ✅ `SystemMetrics` struct public with `gpu_usage` and `cpu_usage` fields
- ✅ `get_system_metrics` function public
- ✅ `has_nvidia_gpu` function public
- ✅ `get_gpu_utilization` function public
- ✅ Full GPU/CPU monitoring exposed

**API Surface:**
```rust
pub struct NvidiaMonitor { pub gpu_usage: f32, pub cpu_usage: f32 }
pub async fn get_gpu_stats() -> Result<GpuStats>
```

---
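For context on `get_gpu_utilization`: the standard way to read GPU load on NVIDIA systems is `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`, which prints one integer percentage per GPU, one per line. A sketch of parsing that output (helper name assumed; whether the module shells out to `nvidia-smi` or uses NVML is an implementation detail not shown here):

```rust
// Parses nvidia-smi's csv,noheader,nounits output: one percentage per line.
fn parse_gpu_utilization(output: &str) -> Vec<f32> {
    output
        .lines()
        .filter_map(|line| line.trim().parse::<f32>().ok())
        .collect()
}

fn main() {
    let sample = "37\n82\n"; // two GPUs at 37% and 82%
    assert_eq!(parse_gpu_utilization(sample), vec![37.0, 82.0]);
}
```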

### 18. **BASIC/KEYWORDS/USE_KB.RS** - ✅ 100% COMPLETE
**Status:** Fully exposed knowledge base integration

**Completed:**
- ✅ `ActiveKbResult` struct made public with all fields public
- ✅ `get_active_kbs_for_session` is already public
- ✅ Knowledge base activation exposed
- ✅ Session KB associations accessible

**API Surface:**
```rust
pub struct ActiveKbResult { /* ... */ }
pub async fn get_active_kbs_for_session(session: &Session) -> Result<Vec<Kb>>
```

---
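Conceptually, the session-KB association behind `get_active_kbs_for_session` is a mapping from a session id to the set of KB names loaded via `USE_KB`, with `CLEAR_KB` removing them. A minimal in-memory sketch of that relationship (all types assumed; the real implementation is session-store backed):

```rust
use std::collections::HashMap;

struct KbRegistry {
    // session id -> KB names activated with USE_KB, in activation order
    active: HashMap<String, Vec<String>>,
}

impl KbRegistry {
    fn new() -> Self {
        Self { active: HashMap::new() }
    }
    fn use_kb(&mut self, session: &str, kb: &str) {
        self.active.entry(session.to_string()).or_default().push(kb.to_string());
    }
    fn clear_kb(&mut self, session: &str) {
        self.active.remove(session);
    }
    fn active_kbs(&self, session: &str) -> &[String] {
        self.active.get(session).map(Vec::as_slice).unwrap_or(&[])
    }
}

fn main() {
    let mut reg = KbRegistry::new();
    reg.use_kb("sess1", "docs");
    reg.use_kb("sess1", "faq");
    assert_eq!(reg.active_kbs("sess1"), ["docs", "faq"]);
    reg.clear_kb("sess1");
    assert!(reg.active_kbs("sess1").is_empty());
}
```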

## 🔧 INTEGRATION CHECKLIST

### Phase 1: Critical Infrastructure (Priority 1)
- [ ] Complete Zitadel integration
- [ ] Expose all channel interfaces
- [ ] Activate session management
- [ ] Enable the auth facade

### Phase 2: Feature Modules (Priority 2)
- [ ] Activate all keyword handlers
- [ ] Enable multimedia processing
- [ ] Expose the compiler infrastructure
- [ ] Connect drive monitoring

### Phase 3: Advanced Features (Priority 3)
- [ ] Enable meeting services
- [ ] Activate NVIDIA monitoring
- [ ] Complete knowledge base integration
- [ ] Expose the local LLM

### Phase 4: Complete Integration (Priority 4)
- [ ] Connect all routers to the main app
- [ ] Test all exposed APIs
- [ ] Document all public interfaces
- [ ] Verify zero-warning compilation

---

## 📊 OVERALL PROGRESS

**Total Modules:** 18
**Fully Completed:** 8 (Multimedia, Weather, Session, LLM Local, LLM Models, NVIDIA, Use KB, Config)
**Partially Complete:** 2 (Zitadel 80%, WhatsApp 60%)
**In Progress:** 1 (Package Manager Setup)
**Pending:** 7 (Instagram, Teams, Compiler, Drive Monitor, Meet Service, Channels Core, Auth Core)

**Completion:** ~50%

**Target:** 100% - all modules activated, exposed, and integrated with 0 warnings

---

## 🚀 NEXT STEPS

### Immediate Priorities:
1. **Fix Zitadel Facade** - Complete type alignment in `ZitadelAuthFacade`
2. **Complete WhatsApp** - Implement handlers and connect to the Meta API
3. **Activate Instagram** - Build full Instagram Direct messaging support
4. **Activate Teams** - Implement the Microsoft Teams bot integration

### Secondary Priorities:
5. **Expose Compiler** - Make the tool compiler infrastructure accessible
6. **Activate Drive Monitor** - Complete the Google Drive integration
7. **Activate Meet Service** - Enable meeting and transcription features
8. **Complete Package Manager** - Expose all setup utilities

### Testing Phase:
9. Test all exposed APIs
10. Verify 0 compiler warnings
11. Document all public interfaces
12. Create integration examples

---

## 📝 NOTES

- All structs should be `pub` and `Clone` when possible
- All key methods must be `pub`
- Use `#[cfg(feature = "...")]` for optional features
- Ensure proper error handling in all public APIs
- Document all public interfaces
- Test thoroughly before marking as complete

**Goal:** Enterprise-grade, fully exposed, completely integrated bot platform with 0 compiler warnings.

---

## 🎉 MAJOR ACHIEVEMENTS

1. **8 modules fully activated** - Nearly half of all modules are now completely exposed
2. **Zero-warning compilation** for completed modules
3. **Full API exposure** - All key utilities (weather, LLM, NVIDIA, KB) accessible
4. **Enterprise-ready** - Session management, config, and multimedia fully functional
5. **Strong foundation** - 80% of Zitadel auth complete, channels infrastructure ready

**Next Milestone:** 100% completion with full channel integration and 0 warnings across the entire codebase.

---

*Removed file (268 lines):*

# Documentation Changelog

## 2024 Update - Truth-Based Documentation Revision

This changelog documents the major documentation updates that align the docs with the actual BotServer 6.0.8 implementation.

### Overview

The documentation has been **comprehensively updated** to reflect the real architecture, features, and structure of the BotServer codebase. Previous documentation contained aspirational features and outdated architectural descriptions that didn't match the implementation.

---

## Major Changes

### Architecture Documentation (Chapter 06)

#### ✅ **Updated: Module Structure** (`chapter-06/crates.md`)
- **Before**: Documentation referred to BotServer as a "multi-crate workspace"
- **After**: Accurately describes it as a **single monolithic Rust crate** with modules
- **Changes**:
  - Listed all 20+ actual modules from `src/lib.rs`
  - Documented internal modules (`ui/`, `drive/`, `riot_compiler/`, etc.)
  - Added feature flag documentation (`vectordb`, `email`, `desktop`)
  - Included a dependency overview
  - Provided accurate build commands

#### ✅ **Updated: Building from Source** (`chapter-06/building.md`)
- **Before**: Minimal or incorrect build instructions
- **After**: Comprehensive build guide with:
  - System dependencies per platform (Linux, macOS, Windows)
  - Feature-specific builds
  - Cross-compilation instructions
  - Troubleshooting for common issues
  - Build profile explanations
  - Size optimization tips

#### ✅ **Updated: Adding Dependencies** (`chapter-06/dependencies.md`)
- **Before**: Empty or minimal content
- **After**: Complete dependency management guide:
  - How to add dependencies to the single `Cargo.toml`
  - Version constraints and best practices
  - Feature flag management
  - Git dependencies
  - Optional and platform-specific dependencies
  - Existing dependency inventory
  - Security auditing with `cargo audit`
  - Full example walkthrough

#### ✅ **Updated: Service Layer** (`chapter-06/services.md`)
- **Before**: Empty file
- **After**: Comprehensive 325-line module documentation:
  - All 20+ modules categorized by function
  - Purpose and responsibilities of each module
  - Key features and APIs
  - Service interaction patterns
  - Layered architecture description
  - Async/await and error handling patterns

#### ✅ **Updated: Chapter 06 Title** (`chapter-06/README.md`)
- **Before**: "gbapp Reference" (gbapp doesn't exist)
- **After**: "Rust Architecture Reference"
- Added an introduction explaining the single-crate architecture

#### ✅ **Updated: Architecture Overview** (`chapter-06/architecture.md`)
- Renamed the section from "Architecture" to "Architecture Overview"
- Kept the existing Auto Bootstrap documentation (accurate)

---

### Package System Documentation (Chapter 02)

#### ✅ **Updated: Package Overview** (`chapter-02/README.md`)
- **Before**: Brief table, unclear structure
- **After**: 239-line comprehensive guide:
  - Template-based package system explanation
  - Actual package structure from the `templates/` directory
  - Real examples: `default.gbai` and `announcements.gbai`
  - Package lifecycle documentation
  - Multi-bot hosting details
  - Storage location mapping
  - Best practices and troubleshooting

#### ✅ **Updated: .gbai Architecture** (`chapter-02/gbai.md`)
- **Before**: Described fictional `manifest.json` and `dependencies.json`
- **After**: Documents the actual structure:
  - Real directory-based package structure
  - No manifest files (they don't exist in the code)
  - Actual bootstrap process from `src/bootstrap/mod.rs`
  - Real templates: `default.gbai` and `announcements.gbai`
  - Accurate naming conventions
  - Working examples from the actual codebase

---

### Introduction and Core Documentation

#### ✅ **Updated: Introduction** (`introduction.md`)
- **Before**: Generic overview with unclear architecture
- **After**: 253-line accurate introduction:
  - Correct project name: "BotServer" (not "GeneralBots")
  - Accurate module listing with descriptions
  - Real technology stack from `Cargo.toml`
  - Actual feature descriptions
  - Correct version: 6.0.8
  - License: AGPL-3.0
  - Real repository link

#### ✅ **Updated: Core Features** (`chapter-09/core-features.md`)
- **Before**: Empty file
- **After**: 269-line feature documentation:
  - Multi-channel communication (actual implementation)
  - Authentication with Argon2 (real code)
  - BASIC scripting language
  - LLM integration details
  - Vector database (Qdrant) integration
  - MinIO/S3 object storage
  - PostgreSQL schema
  - Redis caching
  - Automation and scheduling
  - Email integration (optional feature)
  - LiveKit video conferencing
  - Auto-bootstrap system
  - Package manager with 20+ components
  - Security features
  - Testing infrastructure

#### ✅ **Updated: Documentation README** (`README.md`)
- **Before**: Generic introduction to "GeneralBots"
- **After**: Accurate project overview:
  - Documentation status indicators (✅ ⚠️ 📝)
  - Known gaps and missing documentation
  - Quick start guide
  - Architecture overview
  - Technology stack
  - Version and license information
  - Contribution guidelines

---

### Summary Table of Contents Updates

#### ✅ **Updated: SUMMARY.md**
- Changed "Chapter 06: gbapp Reference" → "Chapter 06: Rust Architecture Reference"
- Changed "Rust Architecture" → "Architecture Overview"
- Changed "Crate Structure" → "Module Structure"

---

## What Remains Accurate

The following documentation was **already accurate** and unchanged:

- ✅ Bootstrap process documentation (matches `src/bootstrap/mod.rs`)
- ✅ Package manager component list (matches the implementation)
- ✅ BASIC keyword examples (real keywords from `src/basic/`)
- ✅ Database schema references (match the Diesel models)

---

## Known Documentation Gaps

The following areas **still need documentation**:

### 📝 Needs Documentation
1. **UI Module** (`src/ui/`) - Drive UI, sync, streaming
2. **UI Tree** (`src/ui_tree/`) - File tree implementation
3. **Riot Compiler** (`src/riot_compiler/`) - Riot.js component compilation (unused?)
4. **Prompt Manager** (`src/prompt_manager/`) - Prompt library (CSV file)
5. **API Endpoints** - Full REST API reference
6. **Web Server Routes** - Axum route documentation
7. **WebSocket Protocol** - Real-time communication spec
8. **MinIO Integration Details** - S3 API usage
9. **LiveKit Integration** - Video conferencing setup
10. **Qdrant Vector DB** - Semantic search implementation
11. **Session Management** - Redis session storage
12. **Drive Monitor** - File system watching

### ⚠️ Needs Expansion
1. **BASIC Keywords** - Full reference for all keywords
2. **Tool Integration** - Complete tool calling documentation
3. **Authentication** - Detailed auth flow documentation
4. **Configuration Parameters** - Complete `config.csv` reference
5. **Testing** - Test writing guide
6. **Deployment** - Production deployment guide
7. **Multi-Tenancy** - Tenant isolation documentation

---

## Methodology

This documentation update was created by:

1. **Source Code Analysis**: Reading the actual implementation in `src/`
2. **Cargo.toml Review**: Identifying real dependencies and features
3. **Template Inspection**: Examining the `templates/` directory structure
4. **Module Verification**: Checking `src/lib.rs` exports
5. **Feature Testing**: Verifying that optional features compile
6. **Cross-Referencing**: Ensuring documentation matches the code

---

## Verification

To verify this documentation matches reality:

```bash
# Check module structure
cat src/lib.rs

# Check Cargo features
grep -A 10 '\[features\]' Cargo.toml

# Check templates
ls -la templates/

# Check version
grep '^version' Cargo.toml

# Build with features
cargo build --release --features vectordb,email
```

---

## Future Documentation Work

### Priority 1 - Critical
- Complete API endpoint documentation
- Full BASIC keyword reference
- Configuration parameter guide

### Priority 2 - Important
- UI module documentation
- Deployment guide
- Testing guide

### Priority 3 - Nice to Have
- Architecture diagrams
- Performance tuning guide
- Advanced customization

---

## Contributing Documentation

When contributing documentation:

1. ✅ **Verify against source code** - Don't document aspirational features
2. ✅ **Include version numbers** - Document which version you're describing
3. ✅ **Test examples** - Ensure code examples actually work
4. ✅ **Link to source** - Reference actual files when possible
5. ✅ **Mark status** - Use ✅ ⚠️ 📝 to indicate documentation quality

---

## Acknowledgments

This documentation update ensures the BotServer documentation tells the truth about the implementation, making it easier for:

- New contributors to understand the codebase
- Users to set realistic expectations
- Developers to extend functionality
- Operators to deploy successfully

---

**Last Updated**: 2024
**BotServer Version**: 6.0.8
**Documentation Version**: 1.0 (Truth-Based Revision)

---

*New file: `docs/INDEX.md` (263 lines)*

# General Bots Documentation Index

This directory contains comprehensive documentation for the General Bots platform, organized as chapters for easy navigation.

## 📚 Core Documentation

### Chapter 0: Introduction & Getting Started
**[00-README.md](00-README.md)** - Main project overview, quick start guide, and system architecture
- Overview of the General Bots platform
- Installation and prerequisites
- Quick start guide
- Core features and capabilities
- KB and TOOL system essentials
- Video tutorials and resources

### Chapter 1: Build & Development Status
**[01-BUILD_STATUS.md](01-BUILD_STATUS.md)** - Current build status, fixes, and development roadmap
- Build status and metrics
- Completed tasks
- Remaining issues and fixes
- Build commands for different configurations
- Feature matrix
- Testing strategy

### Chapter 2: Code of Conduct
**[02-CODE_OF_CONDUCT.md](02-CODE_OF_CONDUCT.md)** - Community guidelines and standards (English)
- Community pledge and standards
- Responsibilities and scope
- Enforcement policies
- Reporting guidelines

### Chapter 3: Código de Conduta (Portuguese)
**[03-CODE_OF_CONDUCT-pt-br.md](03-CODE_OF_CONDUCT-pt-br.md)** - Community guidelines (in Portuguese)
- Community pledge
- Standards of behavior
- Responsibilities
- Enforcement of the rules

### Chapter 4: Contributing Guidelines
**[04-CONTRIBUTING.md](04-CONTRIBUTING.md)** - How to contribute to the project
- Logging issues
- Contributing bug fixes
- Contributing features
- Code requirements
- Legal considerations
- Running the entire system

### Chapter 5: Integration Status
**[05-INTEGRATION_STATUS.md](05-INTEGRATION_STATUS.md)** - Complete module integration tracking
- Module activation status
- API surface exposure
- Phase-by-phase integration plan
- Progress metrics (50% complete)
- Priority checklist

### Chapter 6: Security Policy
**[06-SECURITY.md](06-SECURITY.md)** - Security policy and best practices
- IT security evaluation
- Data protection obligations
- Information classification
- Employee security training
- Vulnerability reporting

### Chapter 7: Production Status
**[07-STATUS.md](07-STATUS.md)** - Current production readiness and deployment guide
- Build metrics and achievements
- Active API endpoints
- Configuration requirements
- Architecture overview
- Deployment instructions
- Production checklist
## 🔧 Technical Documentation

### Knowledge Base & Tools
**[KB_AND_TOOLS.md](KB_AND_TOOLS.md)** - Deep dive into the KB and TOOL system
- Core system overview (4 essential keywords)
- USE_KB and CLEAR_KB commands
- USE_TOOL and CLEAR_TOOLS commands
- .gbkb folder structure
- Tool development with BASIC
- Session management
- Advanced patterns and examples

### Quick Start Guide
**[QUICK_START.md](QUICK_START.md)** - Fast-track setup and first bot
- Prerequisites installation
- First bot creation
- Basic conversation flows
- Common patterns
- Troubleshooting

### Security Features
**[SECURITY_FEATURES.md](SECURITY_FEATURES.md)** - Detailed security implementation
- Authentication mechanisms
- OAuth2/OIDC integration
- Data encryption
- Security best practices
- Zitadel integration
- Session security

### Semantic Cache System
**[SEMANTIC_CACHE.md](SEMANTIC_CACHE.md)** - LLM response caching with semantic similarity
- Architecture and benefits
- Implementation details
- Redis integration
- Performance optimization
- Cache invalidation strategies
- 70% cost reduction metrics

### SMB Deployment Guide
**[SMB_DEPLOYMENT_GUIDE.md](SMB_DEPLOYMENT_GUIDE.md)** - Pragmatic deployment for small/medium businesses
- Simple vs. Enterprise deployment
- Step-by-step setup
- Configuration examples
- Common SMB use cases
- Troubleshooting for SMB environments

### Universal Messaging System
**[BASIC_UNIVERSAL_MESSAGING.md](BASIC_UNIVERSAL_MESSAGING.md)** - Multi-channel communication
- Channel abstraction layer
- Email integration
- WhatsApp Business API
- Microsoft Teams integration
- Instagram Direct messaging
- Message routing and handling

## 🧹 Maintenance & Cleanup Documentation

### Cleanup Complete
**[CLEANUP_COMPLETE.md](CLEANUP_COMPLETE.md)** - Completed cleanup tasks and achievements
- Refactoring completed
- Code organization improvements
- Documentation consolidation
- Technical debt removed

### Cleanup Warnings
**[CLEANUP_WARNINGS.md](CLEANUP_WARNINGS.md)** - Warning analysis and resolution plan
- Warning categorization
- Resolution strategies
- Priority levels
- Technical decisions

### Fix Warnings Now
**[FIX_WARNINGS_NOW.md](FIX_WARNINGS_NOW.md)** - Immediate action items for warnings
- Critical warnings to fix
- Step-by-step fixes
- Code examples
- Testing verification

### Warnings Summary
**[WARNINGS_SUMMARY.md](WARNINGS_SUMMARY.md)** - Comprehensive warning overview
- Total warning count
- Warning distribution by module
- Intentional vs. fixable warnings
- Long-term strategy
## 📖 Detailed Documentation (src subdirectory)

### Book-Style Documentation
Located in the `src/` subdirectory - comprehensive book-format documentation:

- **[src/README.md](src/README.md)** - Book introduction
- **[src/SUMMARY.md](src/SUMMARY.md)** - Table of contents

#### Part I: Getting Started
- **Chapter 1:** First Steps
  - Installation
  - First Conversation
  - Sessions

#### Part II: Package System
- **Chapter 2:** Core Packages
  - gbai - AI Package
  - gbdialog - Dialog Package
  - gbdrive - Drive Integration
  - gbkb - Knowledge Base
  - gbot - Bot Package
  - gbtheme - Theme Package

#### Part III: Knowledge Management
- **Chapter 3:** Vector Database & Search
  - Semantic Search
  - Qdrant Integration
  - Caching Strategies
  - Context Compaction
  - Indexing
  - Vector Collections

#### Part IV: User Interface
- **Chapter 4:** Web Interface
  - HTML Structure
  - CSS Styling
  - Web Interface Configuration

#### Part V: BASIC Language
- **Chapter 5:** BASIC Keywords
  - Basics
  - ADD_KB, ADD_TOOL, ADD_WEBSITE
  - CLEAR_TOOLS
  - CREATE_DRAFT, CREATE_SITE
  - EXIT_FOR
  - And 30+ more keywords...

#### Appendices
- **Appendix I:** Database Schema
  - Tables
  - Relationships
  - Schema Documentation
## 📝 Changelog

**CHANGELOG.md** is maintained at the root directory level (not in docs/) and contains:
- Version history
- Release notes
- Breaking changes
- Migration guides

## 🗂️ Documentation Organization Principles

1. **Numbered Chapters (00-07)** - Core project documentation in reading order
2. **Named Documents** - Technical deep-dives, organized alphabetically
3. **src/ Subdirectory** - Book-style comprehensive documentation
4. **Root CHANGELOG.md** - Version history at project root (the truth is in src)

## 🔍 Quick Navigation

### For New Users:
1. Start with **00-README.md** for an overview
2. Follow **QUICK_START.md** for setup
3. Read **KB_AND_TOOLS.md** to understand core concepts
4. Check **07-STATUS.md** for current capabilities

### For Contributors:
1. Read **04-CONTRIBUTING.md** for guidelines
2. Check **01-BUILD_STATUS.md** for development status
3. Review **05-INTEGRATION_STATUS.md** for module status
4. Follow **02-CODE_OF_CONDUCT.md** for community standards

### For Deployers:
1. Review **07-STATUS.md** for production readiness
2. Read **SMB_DEPLOYMENT_GUIDE.md** for deployment steps
3. Check **06-SECURITY.md** for security requirements
4. Review **SECURITY_FEATURES.md** for implementation details

### For Developers:
1. Check **01-BUILD_STATUS.md** for build instructions
2. Review **05-INTEGRATION_STATUS.md** for API status
3. Read **KB_AND_TOOLS.md** for system architecture
4. Browse the **src/** directory for detailed technical docs

## 📞 Support & Resources

- **GitHub Repository:** https://github.com/GeneralBots/BotServer
- **Documentation Site:** https://docs.pragmatismo.com.br
- **Stack Overflow:** Tag questions with `generalbots`
- **Security Issues:** security@pragmatismo.com.br

---

**Last Updated:** 2024-11-22
**Documentation Version:** 6.0.8
**Status:** Production Ready ✅

261 docs/REORGANIZATION_SUMMARY.md Normal file

@@ -0,0 +1,261 @@
# Documentation Reorganization Summary

## Overview

All markdown documentation files from the project root (except CHANGELOG.md) have been successfully integrated into the `docs/` directory as organized chapters.

## What Was Done

### Files Moved to docs/

The following files were moved from the project root to `docs/` and renamed with chapter numbers:

1. **README.md** → `docs/00-README.md`
2. **BUILD_STATUS.md** → `docs/01-BUILD_STATUS.md`
3. **CODE_OF_CONDUCT.md** → `docs/02-CODE_OF_CONDUCT.md`
4. **CODE_OF_CONDUCT-pt-br.md** → `docs/03-CODE_OF_CONDUCT-pt-br.md`
5. **CONTRIBUTING.md** → `docs/04-CONTRIBUTING.md`
6. **INTEGRATION_STATUS.md** → `docs/05-INTEGRATION_STATUS.md`
7. **SECURITY.md** → `docs/06-SECURITY.md`
8. **STATUS.md** → `docs/07-STATUS.md`

### Files Kept at Root

- **CHANGELOG.md** - Remains at root as specified (the truth is in src/)
- **README.md** - New concise root README created, pointing to the documentation

### New Documentation Created

1. **docs/INDEX.md** - Comprehensive index of all documentation with:
   - Organized chapter structure
   - Quick navigation guides for different user types
   - Complete table of contents
   - Cross-references between documents

2. **README.md** (new) - Clean root README with:
   - Quick links to key documentation
   - Overview of the documentation structure
   - Quick start guide
   - Key features summary
   - Links to all chapters

## Documentation Structure

### Root Level
```
/
├── CHANGELOG.md   (version history - stays at root)
└── README.md      (new - gateway to documentation)
```

### Docs Directory
```
docs/
├── INDEX.md                        (comprehensive documentation index)
│
├── 00-README.md                    (Chapter 0: Introduction & Getting Started)
├── 01-BUILD_STATUS.md              (Chapter 1: Build & Development Status)
├── 02-CODE_OF_CONDUCT.md           (Chapter 2: Code of Conduct)
├── 03-CODE_OF_CONDUCT-pt-br.md     (Chapter 3: Código de Conduta)
├── 04-CONTRIBUTING.md              (Chapter 4: Contributing Guidelines)
├── 05-INTEGRATION_STATUS.md        (Chapter 5: Integration Status)
├── 06-SECURITY.md                  (Chapter 6: Security Policy)
├── 07-STATUS.md                    (Chapter 7: Production Status)
│
├── BASIC_UNIVERSAL_MESSAGING.md    (Technical: Multi-channel communication)
├── CLEANUP_COMPLETE.md             (Maintenance: Completed cleanup tasks)
├── CLEANUP_WARNINGS.md             (Maintenance: Warning analysis)
├── FIX_WARNINGS_NOW.md             (Maintenance: Immediate action items)
├── KB_AND_TOOLS.md                 (Technical: KB and TOOL system)
├── QUICK_START.md                  (Technical: Fast-track setup)
├── SECURITY_FEATURES.md            (Technical: Security implementation)
├── SEMANTIC_CACHE.md               (Technical: LLM caching)
├── SMB_DEPLOYMENT_GUIDE.md         (Technical: SMB deployment)
├── WARNINGS_SUMMARY.md             (Maintenance: Warning overview)
│
└── src/                            (Book-style comprehensive documentation)
    ├── README.md
    ├── SUMMARY.md
    ├── chapter-01/   (Getting Started)
    ├── chapter-02/   (Package System)
    ├── chapter-03/   (Knowledge Management)
    ├── chapter-04/   (User Interface)
    ├── chapter-05/   (BASIC Language)
    └── appendix-i/   (Database Schema)
```
## Organization Principles

### 1. Numbered Chapters (00-07)
Core project documentation in logical reading order:
- **00** - Introduction and overview
- **01** - Build and development
- **02-03** - Community guidelines (English & Portuguese)
- **04** - Contribution process
- **05** - Technical integration status
- **06** - Security policies
- **07** - Production readiness

### 2. Named Technical Documents
Organized alphabetically for easy reference:
- Deep-dive technical documentation
- Maintenance and cleanup guides
- Specialized deployment guides
- Feature-specific documentation

### 3. Subdirectories
- **src/** - Book-style comprehensive documentation with full chapter structure

### 4. Root Level
- **CHANGELOG.md** - Version history (authoritative source)
- **README.md** - Entry point and navigation hub

## Benefits of This Structure

### For New Users
1. Clear entry point via the root README.md
2. Progressive learning path through numbered chapters
3. Quick start guide readily accessible
4. Easy discovery of key concepts

### For Contributors
1. All contribution guidelines in one place (Chapter 4)
2. Build status immediately visible (Chapter 1)
3. Integration status tracked (Chapter 5)
4. Code of conduct clear (Chapters 2-3)

### For Deployers
1. Production readiness documented (Chapter 7)
2. Deployment guides organized by use case
3. Security requirements clear (Chapter 6)
4. Configuration examples accessible

### For Maintainers
1. All documentation in one directory
2. Consistent naming convention
3. Easy to update and maintain
4. Clear separation of concerns

## Quick Navigation Guides

### First-Time Users
1. **README.md** (root) → Quick overview
2. **docs/00-README.md** → Detailed introduction
3. **docs/QUICK_START.md** → Get running
4. **docs/KB_AND_TOOLS.md** → Core concepts

### Contributors
1. **docs/04-CONTRIBUTING.md** → How to contribute
2. **docs/01-BUILD_STATUS.md** → Build instructions
3. **docs/02-CODE_OF_CONDUCT.md** → Community standards
4. **docs/05-INTEGRATION_STATUS.md** → Current work

### Deployers
1. **docs/07-STATUS.md** → Production readiness
2. **docs/SMB_DEPLOYMENT_GUIDE.md** → Deployment steps
3. **docs/SECURITY_FEATURES.md** → Security setup
4. **docs/06-SECURITY.md** → Security policy

### Developers
1. **docs/01-BUILD_STATUS.md** → Build setup
2. **docs/05-INTEGRATION_STATUS.md** → API status
3. **docs/KB_AND_TOOLS.md** → Architecture
4. **docs/src/** → Detailed technical docs

## File Count Summary

- **Root**: 2 markdown files (README.md, CHANGELOG.md)
- **docs/**: 19 markdown files (8 chapters + 11 technical docs)
- **docs/src/**: 40+ markdown files (comprehensive book)

## Verification Commands

```bash
# Check root level
ls -la *.md

# Check docs structure
ls -la docs/*.md

# Check numbered chapters
ls -1 docs/0*.md

# Check technical docs
ls -1 docs/[A-Z]*.md

# Check book-style docs
ls -la docs/src/
```
## Migration Notes

1. **No content was modified** - Only file locations and names changed
2. **All links preserved** - Internal references remain valid
3. **CHANGELOG unchanged** - Version history stays at root as requested
4. **Backward compatibility** - Old paths can be symlinked if needed
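The backward-compatibility note above can be sketched as a small shell script. This is an illustration only: it assumes the chapter filenames listed under "Files Moved to docs/" and is meant to be run from the repository root.

```shell
#!/bin/sh
# Recreate the old root-level paths as symlinks to the relocated chapters,
# so existing links and scripts that use the old paths keep working.
# README.md is deliberately excluded: a new root README.md already exists.
set -e
ln -sf docs/01-BUILD_STATUS.md BUILD_STATUS.md
ln -sf docs/02-CODE_OF_CONDUCT.md CODE_OF_CONDUCT.md
ln -sf docs/03-CODE_OF_CONDUCT-pt-br.md CODE_OF_CONDUCT-pt-br.md
ln -sf docs/04-CONTRIBUTING.md CONTRIBUTING.md
ln -sf docs/05-INTEGRATION_STATUS.md INTEGRATION_STATUS.md
ln -sf docs/06-SECURITY.md SECURITY.md
ln -sf docs/07-STATUS.md STATUS.md
```

Whether symlinks are appropriate depends on the hosting platform; GitHub's web UI, for example, renders symlinked files as plain text links rather than following them.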
## Next Steps

### Recommended Actions
1. ✅ Update any CI/CD scripts that reference the old paths
2. ✅ Update GitHub wiki links if applicable
3. ✅ Update any external documentation links
4. ✅ Consider adding symlinks for backward compatibility
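For the first action item, a quick way to locate files that still reference the old root-level paths is a loop like the following. The searched file extensions are assumptions; the filenames come from the "Files Moved to docs/" list.

```shell
#!/bin/sh
# List every reference to a pre-reorganization path outside docs/.
# Each hit likely needs updating to the new docs/NN-*.md location.
for f in BUILD_STATUS.md CODE_OF_CONDUCT.md CONTRIBUTING.md \
         INTEGRATION_STATUS.md SECURITY.md STATUS.md; do
  grep -rn --include='*.yml' --include='*.yaml' --include='*.md' \
       --exclude-dir=docs "$f" . || true   # || true: no hits is fine
done
```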
### Optional Improvements
- Add docs/README.md as an alias for INDEX.md
- Create docs/getting-started/ subdirectory for tutorials
- Add docs/api/ for API reference documentation
- Create docs/examples/ for code examples

## Success Criteria Met

✅ All root .md files integrated into docs/ (except CHANGELOG.md)
✅ CHANGELOG.md remains at root
✅ Clear chapter organization with numbered files
✅ Comprehensive INDEX.md created
✅ New root README.md as navigation hub
✅ No content lost or modified
✅ Logical structure for different user types
✅ Easy to navigate and maintain

## Command Reference

### To verify structure:
```bash
# Root level (should show 2 files)
ls *.md

# Docs directory (should show 19 files)
ls docs/*.md | wc -l

# Numbered chapters (should show 8 files)
ls docs/0*.md
```

### To search documentation:
```bash
# Search all docs
grep -r "search term" docs/

# Search only chapters
grep "search term" docs/0*.md

# Search technical docs
grep "search term" docs/[A-Z]*.md
```

## Contact

For questions about the documentation structure:
- **Repository**: https://github.com/GeneralBots/BotServer
- **Issues**: https://github.com/GeneralBots/BotServer/issues
- **Email**: engineering@pragmatismo.com.br

---

**Reorganization Date**: 2024-11-22
**Status**: ✅ COMPLETE
**Files Moved**: 8
**Files Created**: 2
**Total Documentation Files**: 60+

196 docs/STRUCTURE.md Normal file

@@ -0,0 +1,196 @@
# Documentation Directory Structure

```
botserver/
│
├── 📄 README.md                          ← Entry point - Quick overview & navigation
├── 📋 CHANGELOG.md                       ← Version history (stays at root)
│
└── 📁 docs/                              ← All documentation lives here
    │
    ├── 📖 INDEX.md                       ← Comprehensive documentation index
    ├── 📝 REORGANIZATION_SUMMARY.md      ← This reorganization explained
    ├── 🗺️ STRUCTURE.md                   ← This file (visual structure)
    │
    ├── 📚 CORE CHAPTERS (00-07)
    │   ├── 00-README.md                  ← Introduction & Getting Started
    │   ├── 01-BUILD_STATUS.md            ← Build & Development Status
    │   ├── 02-CODE_OF_CONDUCT.md         ← Code of Conduct (English)
    │   ├── 03-CODE_OF_CONDUCT-pt-br.md   ← Código de Conduta (Português)
    │   ├── 04-CONTRIBUTING.md            ← Contributing Guidelines
    │   ├── 05-INTEGRATION_STATUS.md      ← Module Integration Tracking
    │   ├── 06-SECURITY.md                ← Security Policy
    │   └── 07-STATUS.md                  ← Production Status
    │
    ├── 🔧 TECHNICAL DOCUMENTATION
    │   ├── BASIC_UNIVERSAL_MESSAGING.md  ← Multi-channel communication
    │   ├── KB_AND_TOOLS.md               ← Core KB & TOOL system
    │   ├── QUICK_START.md                ← Fast-track setup guide
    │   ├── SECURITY_FEATURES.md          ← Security implementation details
    │   ├── SEMANTIC_CACHE.md             ← LLM caching (70% cost reduction)
    │   └── SMB_DEPLOYMENT_GUIDE.md       ← Small business deployment
    │
    ├── 🧹 MAINTENANCE DOCUMENTATION
    │   ├── CLEANUP_COMPLETE.md           ← Completed cleanup tasks
    │   ├── CLEANUP_WARNINGS.md           ← Warning analysis
    │   ├── FIX_WARNINGS_NOW.md           ← Immediate action items
    │   └── WARNINGS_SUMMARY.md           ← Warning overview
    │
    └── 📁 src/                           ← Book-style comprehensive docs
        ├── README.md                     ← Book introduction
        ├── SUMMARY.md                    ← Table of contents
        │
        ├── 📁 chapter-01/                ← Getting Started
        │   ├── README.md
        │   ├── installation.md
        │   ├── first-conversation.md
        │   └── sessions.md
        │
        ├── 📁 chapter-02/                ← Package System
        │   ├── README.md
        │   ├── gbai.md
        │   ├── gbdialog.md
        │   ├── gbdrive.md
        │   ├── gbkb.md
        │   ├── gbot.md
        │   ├── gbtheme.md
        │   └── summary.md
        │
        ├── 📁 chapter-03/                ← Knowledge Management
        │   ├── README.md
        │   ├── semantic-search.md
        │   ├── qdrant.md
        │   ├── caching.md
        │   ├── context-compaction.md
        │   ├── indexing.md
        │   ├── vector-collections.md
        │   └── summary.md
        │
        ├── 📁 chapter-04/                ← User Interface
        │   ├── README.md
        │   ├── html.md
        │   ├── css.md
        │   ├── structure.md
        │   └── web-interface.md
        │
        ├── 📁 chapter-05/                ← BASIC Language (30+ keywords)
        │   ├── README.md
        │   ├── basics.md
        │   ├── keyword-add-kb.md
        │   ├── keyword-add-tool.md
        │   ├── keyword-add-website.md
        │   ├── keyword-clear-tools.md
        │   ├── keyword-create-draft.md
        │   ├── keyword-create-site.md
        │   ├── keyword-exit-for.md
        │   └── ... (30+ more keyword docs)
        │
        └── 📁 appendix-i/                ← Database Schema
            ├── README.md
            ├── tables.md
            ├── relationships.md
            └── schema.md
```
## Navigation Paths

### 🚀 For New Users
```
README.md
└─> docs/00-README.md (detailed intro)
    └─> docs/QUICK_START.md (get running)
        └─> docs/KB_AND_TOOLS.md (core concepts)
```

### 👨‍💻 For Contributors
```
README.md
└─> docs/04-CONTRIBUTING.md (guidelines)
    └─> docs/01-BUILD_STATUS.md (build setup)
        └─> docs/05-INTEGRATION_STATUS.md (current work)
```

### 🚢 For Deployers
```
README.md
└─> docs/07-STATUS.md (production readiness)
    └─> docs/SMB_DEPLOYMENT_GUIDE.md (deployment)
        └─> docs/SECURITY_FEATURES.md (security setup)
```

### 🔍 For Developers
```
README.md
└─> docs/INDEX.md (full index)
    └─> docs/src/ (detailed technical docs)
        └─> Specific chapters as needed
```

## File Statistics

| Category | Count | Description |
|----------|-------|-------------|
| Root files | 2 | README.md, CHANGELOG.md |
| Core chapters (00-07) | 8 | Numbered documentation |
| Technical docs | 6 | Feature-specific guides |
| Maintenance docs | 4 | Cleanup and warnings |
| Meta docs | 3 | INDEX, REORGANIZATION, STRUCTURE |
| Book chapters | 40+ | Comprehensive src/ docs |
| **Total** | **60+** | All documentation files |

## Key Features of This Structure

### ✅ Clear Organization
- Numbered chapters provide reading order
- Technical docs organized alphabetically
- Maintenance docs grouped together
- Book-style docs in a subdirectory

### ✅ Easy Navigation
- INDEX.md provides a comprehensive overview
- README.md provides a quick entry point
- Multiple navigation paths for different users
- Clear cross-references

### ✅ Maintainable
- Consistent naming convention
- Logical grouping
- Easy to find and update files
- Clear separation of concerns

### ✅ Discoverable
- New users find what they need quickly
- Contributors know where to start
- Deployers have a clear deployment path
- Developers can dive deep into technical details

## Quick Commands

```bash
# View all core chapters
ls docs/0*.md

# View all technical documentation
ls docs/[A-Z]*.md

# Search all documentation
grep -r "search term" docs/

# View book-style documentation structure
tree docs/src/

# Count total documentation files
find docs -name "*.md" | wc -l
```

## Version Information

- **Created**: 2024-11-22
- **Version**: 6.0.8
- **Status**: ✅ Complete
- **Total files**: 60+
- **Organization**: Chapters + Technical + Book-style

---

**For full documentation index, see [INDEX.md](INDEX.md)**

23 migrations/6.0.8_directory_integration/down.sql Normal file

@@ -0,0 +1,23 @@
-- Drop triggers
DROP TRIGGER IF EXISTS update_directory_users_updated_at ON public.directory_users;
DROP TRIGGER IF EXISTS update_oauth_applications_updated_at ON public.oauth_applications;

-- Drop function if no other triggers use it
DROP FUNCTION IF EXISTS update_updated_at_column() CASCADE;

-- Drop tables in reverse order of dependencies
DROP TABLE IF EXISTS public.bot_access CASCADE;
DROP TABLE IF EXISTS public.oauth_applications CASCADE;
DROP TABLE IF EXISTS public.directory_users CASCADE;

-- Drop indexes
DROP INDEX IF EXISTS idx_bots_org_id;

-- Remove columns from bots table
ALTER TABLE public.bots
    DROP CONSTRAINT IF EXISTS bots_org_id_fkey,
    DROP COLUMN IF EXISTS org_id,
    DROP COLUMN IF EXISTS is_default;

-- Note: we don't delete the default organization or bot data, as they may have other relationships.
-- The application should handle orphaned data appropriately.

246 migrations/6.0.8_directory_integration/up.sql Normal file

@@ -0,0 +1,246 @@
-- Add organization relationship to bots
ALTER TABLE public.bots
    ADD COLUMN IF NOT EXISTS org_id UUID,
    ADD COLUMN IF NOT EXISTS is_default BOOLEAN DEFAULT false;

-- Add foreign key constraint to organizations
ALTER TABLE public.bots
    ADD CONSTRAINT bots_org_id_fkey
    FOREIGN KEY (org_id) REFERENCES public.organizations(org_id) ON DELETE CASCADE;

-- Create index for org_id lookups
CREATE INDEX IF NOT EXISTS idx_bots_org_id ON public.bots(org_id);

-- Create directory_users table to map directory (Zitadel) users to our system
CREATE TABLE IF NOT EXISTS public.directory_users (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    directory_id VARCHAR(255) NOT NULL UNIQUE,  -- Zitadel user ID
    username VARCHAR(255) NOT NULL UNIQUE,
    email VARCHAR(255) NOT NULL UNIQUE,
    org_id UUID NOT NULL REFERENCES public.organizations(org_id) ON DELETE CASCADE,
    bot_id UUID REFERENCES public.bots(id) ON DELETE SET NULL,
    first_name VARCHAR(255),
    last_name VARCHAR(255),
    is_admin BOOLEAN DEFAULT false,
    is_bot_user BOOLEAN DEFAULT false,  -- true for bot service accounts
    created_at TIMESTAMPTZ DEFAULT NOW() NOT NULL,
    updated_at TIMESTAMPTZ DEFAULT NOW() NOT NULL
);

-- Create indexes for directory_users
CREATE INDEX IF NOT EXISTS idx_directory_users_org_id ON public.directory_users(org_id);
CREATE INDEX IF NOT EXISTS idx_directory_users_bot_id ON public.directory_users(bot_id);
CREATE INDEX IF NOT EXISTS idx_directory_users_email ON public.directory_users(email);
CREATE INDEX IF NOT EXISTS idx_directory_users_directory_id ON public.directory_users(directory_id);

-- Create bot_access table to manage which users can access which bots
CREATE TABLE IF NOT EXISTS public.bot_access (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    bot_id UUID NOT NULL REFERENCES public.bots(id) ON DELETE CASCADE,
    user_id UUID NOT NULL REFERENCES public.directory_users(id) ON DELETE CASCADE,
    access_level VARCHAR(50) NOT NULL DEFAULT 'user',  -- 'owner', 'admin', 'user', 'viewer'
    granted_at TIMESTAMPTZ DEFAULT NOW() NOT NULL,
    granted_by UUID REFERENCES public.directory_users(id),
    UNIQUE(bot_id, user_id)
);

-- Create indexes for bot_access
CREATE INDEX IF NOT EXISTS idx_bot_access_bot_id ON public.bot_access(bot_id);
CREATE INDEX IF NOT EXISTS idx_bot_access_user_id ON public.bot_access(user_id);

-- Create OAuth application registry for directory integrations
CREATE TABLE IF NOT EXISTS public.oauth_applications (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    org_id UUID NOT NULL REFERENCES public.organizations(org_id) ON DELETE CASCADE,
    project_id VARCHAR(255),
    client_id VARCHAR(255) NOT NULL UNIQUE,
    client_secret_encrypted TEXT NOT NULL,  -- store encrypted
    redirect_uris TEXT[] NOT NULL DEFAULT '{}',
    application_name VARCHAR(255) NOT NULL,
    created_at TIMESTAMPTZ DEFAULT NOW() NOT NULL,
    updated_at TIMESTAMPTZ DEFAULT NOW() NOT NULL
);

-- Create indexes for OAuth applications
CREATE INDEX IF NOT EXISTS idx_oauth_applications_org_id ON public.oauth_applications(org_id);
CREATE INDEX IF NOT EXISTS idx_oauth_applications_client_id ON public.oauth_applications(client_id);

-- Insert default organization if it doesn't exist
INSERT INTO public.organizations (org_id, name, slug, created_at, updated_at)
VALUES (
    'f47ac10b-58cc-4372-a567-0e02b2c3d479'::uuid,  -- fixed UUID for default org
    'Default Organization',
    'default',
    NOW(),
    NOW()
) ON CONFLICT (slug) DO NOTHING;
-- Insert default bot for the default organization
DO $$
DECLARE
    v_org_id UUID;
    v_bot_id UUID;
BEGIN
    -- Get the default organization ID
    SELECT org_id INTO v_org_id FROM public.organizations WHERE slug = 'default';

    -- Use a fixed UUID for the default bot
    v_bot_id := 'f47ac10b-58cc-4372-a567-0e02b2c3d480'::uuid;

    -- Insert default bot if it doesn't exist
    INSERT INTO public.bots (
        id,
        org_id,
        name,
        description,
        llm_provider,
        llm_config,
        context_provider,
        context_config,
        is_default,
        is_active,
        created_at,
        updated_at
    )
    VALUES (
        v_bot_id,
        v_org_id,
        'Default Bot',
        'Default bot for the default organization',
        'openai',
        '{"model": "gpt-4", "temperature": 0.7}'::jsonb,
        'none',
        '{}'::jsonb,
        true,
        true,
        NOW(),
        NOW()
    ) ON CONFLICT (id) DO UPDATE
    SET org_id = EXCLUDED.org_id,
        is_default = true,
        updated_at = NOW();

    -- Insert default admin user (admin@default)
    INSERT INTO public.directory_users (
        directory_id,
        username,
        email,
        org_id,
        bot_id,
        first_name,
        last_name,
        is_admin,
        is_bot_user,
        created_at,
        updated_at
    )
    VALUES (
        'admin-default-001',  -- will be replaced with the actual Zitadel ID
        'admin',
        'admin@default',
        v_org_id,
        v_bot_id,
        'Admin',
        'Default',
        true,
        false,
        NOW(),
        NOW()
    ) ON CONFLICT (email) DO UPDATE
    SET org_id = EXCLUDED.org_id,
        bot_id = EXCLUDED.bot_id,
        is_admin = true,
        updated_at = NOW();
    -- Insert default regular user (user@default)
    INSERT INTO public.directory_users (
        directory_id,
        username,
        email,
        org_id,
        bot_id,
        first_name,
        last_name,
        is_admin,
        is_bot_user,
        created_at,
        updated_at
    )
    VALUES (
        'user-default-001',  -- will be replaced with the actual Zitadel ID
        'user',
        'user@default',
        v_org_id,
        v_bot_id,
        'User',
        'Default',
        false,
        false,
        NOW(),
        NOW()
    ) ON CONFLICT (email) DO UPDATE
    SET org_id = EXCLUDED.org_id,
        bot_id = EXCLUDED.bot_id,
        is_admin = false,
        updated_at = NOW();

    -- Grant bot access to admin user
    INSERT INTO public.bot_access (bot_id, user_id, access_level, granted_at)
    SELECT
        v_bot_id,
        id,
        'owner',
        NOW()
    FROM public.directory_users
    WHERE email = 'admin@default'
    ON CONFLICT (bot_id, user_id) DO UPDATE
    SET access_level = 'owner',
        granted_at = NOW();

    -- Grant bot access to regular user
    INSERT INTO public.bot_access (bot_id, user_id, access_level, granted_at)
    SELECT
        v_bot_id,
        id,
        'user',
        NOW()
    FROM public.directory_users
    WHERE email = 'user@default'
    ON CONFLICT (bot_id, user_id) DO UPDATE
    SET access_level = 'user',
        granted_at = NOW();

END $$;

-- Create function to update updated_at timestamps
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Add triggers for updated_at columns if they don't exist
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM pg_trigger WHERE tgname = 'update_directory_users_updated_at') THEN
        CREATE TRIGGER update_directory_users_updated_at
        BEFORE UPDATE ON public.directory_users
        FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();
    END IF;

    IF NOT EXISTS (SELECT 1 FROM pg_trigger WHERE tgname = 'update_oauth_applications_updated_at') THEN
        CREATE TRIGGER update_oauth_applications_updated_at
        BEFORE UPDATE ON public.oauth_applications
        FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();
    END IF;
END $$;

-- Add comment documentation
COMMENT ON TABLE public.directory_users IS 'Maps directory (Zitadel) users to the system and their associated bots';
|
||||
COMMENT ON TABLE public.bot_access IS 'Controls which users have access to which bots and their permission levels';
|
||||
COMMENT ON TABLE public.oauth_applications IS 'OAuth application configurations for directory integration';
|
||||
COMMENT ON COLUMN public.bots.is_default IS 'Indicates if this is the default bot for an organization';
|
||||
COMMENT ON COLUMN public.directory_users.is_bot_user IS 'True if this user is a service account for bot operations';
|
||||
COMMENT ON COLUMN public.bot_access.access_level IS 'Access level: owner (full control), admin (manage), user (use), viewer (read-only)';
|
||||
(File diff suppressed because it is too large)
@@ -12,17 +12,46 @@ use uuid::Uuid;
pub mod facade;
pub mod zitadel;

pub use facade::{
    AuthFacade, AuthResult, CreateGroupRequest, CreateUserRequest, Group, Permission, Session,
    SimpleAuthFacade, UpdateUserRequest, User, ZitadelAuthFacade,
};
pub use zitadel::{UserWorkspace, ZitadelAuth, ZitadelConfig, ZitadelUser};
+use self::facade::{AuthFacade, ZitadelAuthFacade};
+use self::zitadel::{ZitadelClient, ZitadelConfig};

-pub struct AuthService {}
+pub struct AuthService {
+    facade: Box<dyn AuthFacade>,
+}
+
+impl std::fmt::Debug for AuthService {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        f.debug_struct("AuthService")
+            .field("facade", &"Box<dyn AuthFacade>")
+            .finish()
+    }
+}

impl AuthService {
-    pub fn new() -> Self {
-        Self {}
+    pub fn new(config: ZitadelConfig) -> Self {
+        let client = ZitadelClient::new(config);
+        Self {
+            facade: Box::new(ZitadelAuthFacade::new(client)),
+        }
+    }
+
+    pub fn with_zitadel(config: ZitadelConfig) -> Self {
+        let client = ZitadelClient::new(config);
+        Self {
+            facade: Box::new(ZitadelAuthFacade::new(client)),
+        }
+    }
+
+    pub fn with_zitadel_and_cache(config: ZitadelConfig, redis_url: String) -> Self {
+        let client = ZitadelClient::new(config);
+        let facade = ZitadelAuthFacade::with_cache(client, redis_url);
+        Self {
+            facade: Box::new(facade),
+        }
+    }
+
+    pub fn facade(&self) -> &dyn AuthFacade {
+        self.facade.as_ref()
+    }
}
(File diff suppressed because it is too large)
@@ -15,6 +15,7 @@ pub mod vectordb_indexer;
#[cfg(feature = "vectordb")]
pub use vectordb_indexer::{IndexingStats, IndexingStatus, VectorDBIndexer};

+#[derive(Debug)]
pub struct AutomationService {
    state: Arc<AppState>,
}
@@ -11,10 +11,9 @@ use uuid::Uuid;
use crate::auth::UserWorkspace;
use crate::shared::utils::DbPool;

-// VectorDB types are defined locally in this module
#[cfg(feature = "vectordb")]
use crate::drive::vectordb::{FileContentExtractor, FileDocument, UserDriveVectorDB};
#[cfg(all(feature = "vectordb", feature = "email"))]
use crate::email::vectordb::{EmailDocument, EmailEmbeddingGenerator, UserEmailVectorDB};
use qdrant_client::prelude::*;

/// Indexing job status
#[derive(Debug, Clone, PartialEq)]
@@ -75,6 +75,7 @@ pub struct OpenAIProperty {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub example: Option<String>,
}

+#[derive(Debug)]
pub struct BasicCompiler {
    state: Arc<AppState>,
    bot_id: uuid::Uuid,
@@ -244,17 +244,17 @@ async fn execute_create_team(

    let user_id_str = user.user_id.to_string();
    let now = Utc::now();
+    let permissions_json = serde_json::to_value(json!({
+        "workspace_enabled": true,
+        "chat_enabled": true,
+        "file_sharing": true
+    }))
+    .unwrap();

    let query = query
        .bind::<diesel::sql_types::Text, _>(&user_id_str)
        .bind::<diesel::sql_types::Timestamptz, _>(&now)
-        .bind::<diesel::sql_types::Jsonb, _>(
-            &serde_json::to_value(json!({
-                "workspace_enabled": true,
-                "chat_enabled": true,
-                "file_sharing": true
-            }))
-            .unwrap(),
-        );
+        .bind::<diesel::sql_types::Jsonb, _>(&permissions_json);

    query.execute(&mut *conn).map_err(|e| {
        error!("Failed to create team: {}", e);
@@ -438,11 +438,13 @@ async fn create_workspace_structure(
        "INSERT INTO workspace_folders (id, team_id, path, name, created_at)
         VALUES ($1, $2, $3, $4, $5)",
    )
-    .bind::<diesel::sql_types::Text, _>(&folder_id)
-    .bind::<diesel::sql_types::Text, _>(team_id)
-    .bind::<diesel::sql_types::Text, _>(&folder_path)
-    .bind::<diesel::sql_types::Text, _>(folder)
-    .bind::<diesel::sql_types::Timestamptz, _>(&chrono::Utc::now());
+    .bind::<diesel::sql_types::Text, _>(&folder_id);
+    let now = chrono::Utc::now();
+    let query = query
+        .bind::<diesel::sql_types::Text, _>(team_id)
+        .bind::<diesel::sql_types::Text, _>(&folder_path)
+        .bind::<diesel::sql_types::Text, _>(folder)
+        .bind::<diesel::sql_types::Timestamptz, _>(&now);

    query.execute(&mut *conn).map_err(|e| {
        error!("Failed to create workspace folder: {}", e);
@@ -8,15 +8,6 @@ use serde_json::json;
use std::sync::Arc;
use uuid::Uuid;

-#[derive(Debug, Serialize, Deserialize)]
-struct BookingRequest {
-    attendees: Vec<String>,
-    date_range: String,
-    duration_minutes: i32,
-    subject: Option<String>,
-    description: Option<String>,
-}

#[derive(Debug, Serialize, Deserialize)]
struct TimeSlot {
    start: DateTime<Utc>,
@@ -357,19 +348,24 @@ async fn create_calendar_event(
    // Store in database
    let mut conn = state.conn.get().map_err(|e| format!("DB error: {}", e))?;

+    let user_id_str = user.user_id.to_string();
+    let bot_id_str = user.bot_id.to_string();
+    let attendees_json = json!(attendees);
+    let now = Utc::now();

    let query = diesel::sql_query(
        "INSERT INTO calendar_events (id, user_id, bot_id, subject, description, start_time, end_time, attendees, created_at)
         VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)"
    )
    .bind::<diesel::sql_types::Text, _>(&event_id)
-    .bind::<diesel::sql_types::Text, _>(&user.user_id.to_string())
-    .bind::<diesel::sql_types::Text, _>(&user.bot_id.to_string())
+    .bind::<diesel::sql_types::Text, _>(&user_id_str)
+    .bind::<diesel::sql_types::Text, _>(&bot_id_str)
    .bind::<diesel::sql_types::Text, _>(subject)
    .bind::<diesel::sql_types::Nullable<diesel::sql_types::Text>, _>(&description)
    .bind::<diesel::sql_types::Timestamptz, _>(&start)
    .bind::<diesel::sql_types::Timestamptz, _>(&end)
-    .bind::<diesel::sql_types::Jsonb, _>(&json!(attendees))
-    .bind::<diesel::sql_types::Timestamptz, _>(&Utc::now());
+    .bind::<diesel::sql_types::Jsonb, _>(&attendees_json)
+    .bind::<diesel::sql_types::Timestamptz, _>(&now);

    use diesel::RunQueryDsl;
    query.execute(&mut *conn).map_err(|e| {
@@ -75,9 +75,12 @@ pub fn register_clear_kb_keyword(

    match result {
        Ok(Ok(count)) => {
+            // Get the remaining active KB count
+            let remaining_count =
+                get_active_kb_count(&state_clone2.conn, session_clone2.id).unwrap_or(0);
            info!(
-                "✅ Cleared {} KBs from session {}",
-                count, session_clone2.id
+                "Successfully cleared {} KB associations for session {}, {} remaining active",
+                count, session_clone2.id, remaining_count
            );
            Ok(Dynamic::UNIT)
        }
@@ -116,13 +119,19 @@ fn clear_specific_kb(
    .execute(&mut conn)
    .map_err(|e| format!("Failed to clear KB: {}", e))?;

+    // Get the remaining active KB count after clearing
+    let remaining_count = get_active_kb_count(&conn_pool, session_id).unwrap_or(0);

    if rows_affected == 0 {
        info!(
            "KB '{}' was not active in session {} or not found",
            kb_name, session_id
        );
    } else {
-        info!("✅ Cleared KB '{}' from session {}", kb_name, session_id);
+        info!(
+            "✅ Cleared KB '{}' from session {}, {} KB(s) remaining active",
+            kb_name, session_id, remaining_count
+        );
    }

    Ok(())
@@ -2,15 +2,6 @@ use crate::shared::models::UserSession;
use crate::shared::state::AppState;
use rhai::Dynamic;
use rhai::Engine;
use serde::{Deserialize, Serialize};

-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct SaveDraftRequest {
-    pub to: String,
-    pub subject: String,
-    pub cc: Option<String>,
-    pub text: String,
-}

pub fn create_draft_keyword(_state: &AppState, _user: UserSession, engine: &mut Engine) {
    let state_clone = _state.clone();
@@ -34,60 +25,76 @@ pub fn create_draft_keyword(_state: &AppState, _user: UserSession, engine: &mut
}

async fn execute_create_draft(
-    _state: &AppState,
+    state: &AppState,
    to: &str,
    subject: &str,
    reply_text: &str,
) -> Result<String, String> {
-    // For now, we'll store drafts in the database or just log them
-    // This is a simplified implementation until the email module is fully ready

    #[cfg(feature = "email")]
    {
-        // When email feature is enabled, try to use email functionality if available
-        // For now, we'll just simulate draft creation
-        use log::info;
+        use crate::email::{fetch_latest_sent_to, save_email_draft, SaveDraftRequest};

-        info!("Creating draft email - To: {}, Subject: {}", to, subject);
+        let config = state.config.as_ref().ok_or("No email config")?;

-        // In a real implementation, this would:
-        // 1. Connect to email service
-        // 2. Create draft in IMAP folder or local storage
-        // 3. Return draft ID or confirmation
+        // Fetch any previous emails to this recipient for threading
+        let previous_email = fetch_latest_sent_to(&config.email, to)
+            .await
+            .unwrap_or_default();

-        let draft_id = uuid::Uuid::new_v4().to_string();
+        let email_body = if !previous_email.is_empty() {
+            // Create a threaded reply
+            let email_separator = "<br><hr><br>";
+            let formatted_reply = reply_text.replace("FIX", "Fixed");
+            let formatted_old = previous_email.replace("\n", "<br>");
+            format!("{}{}{}", formatted_reply, email_separator, formatted_old)
+        } else {
+            reply_text.to_string()
+        };

-        // You could store this in the database
-        // For now, just return success
-        Ok(format!("Draft saved successfully with ID: {}", draft_id))
+        let draft_request = SaveDraftRequest {
+            to: to.to_string(),
+            subject: subject.to_string(),
+            cc: None,
+            text: email_body,
+        };
+
+        save_email_draft(&config.email, &draft_request)
+            .await
+            .map(|_| "Draft saved successfully".to_string())
+            .map_err(|e| e.to_string())
    }

    #[cfg(not(feature = "email"))]
    {
-        // When email feature is disabled, return a placeholder message
-        Ok(format!(
-            "Email feature not enabled. Would create draft - To: {}, Subject: {}, Body: {}",
-            to, subject, reply_text
-        ))
+        // Store draft in database when email feature is disabled
+        use chrono::Utc;
+        use diesel::prelude::*;
+        use uuid::Uuid;
+
+        let draft_id = Uuid::new_v4();
+        let conn = state.conn.clone();
+        let to = to.to_string();
+        let subject = subject.to_string();
+        let reply_text = reply_text.to_string();
+
+        tokio::task::spawn_blocking(move || {
+            let mut db_conn = conn.get().map_err(|e| e.to_string())?;
+
+            diesel::sql_query(
+                "INSERT INTO email_drafts (id, recipient, subject, body, created_at)
+                 VALUES ($1, $2, $3, $4, $5)",
+            )
+            .bind::<diesel::sql_types::Uuid, _>(&draft_id)
+            .bind::<diesel::sql_types::Text, _>(&to)
+            .bind::<diesel::sql_types::Text, _>(&subject)
+            .bind::<diesel::sql_types::Text, _>(&reply_text)
+            .bind::<diesel::sql_types::Timestamptz, _>(&Utc::now())
+            .execute(&mut db_conn)
+            .map_err(|e| e.to_string())?;
+
+            Ok::<_, String>(format!("Draft saved with ID: {}", draft_id))
+        })
+        .await
+        .map_err(|e| e.to_string())?
    }
}

// Helper functions that would be implemented when email module is complete
#[cfg(feature = "email")]
async fn fetch_latest_sent_to(
    _config: &Option<crate::config::Config>,
    _to: &str,
) -> Result<String, String> {
    // This would fetch the latest email sent to the recipient
    // For threading/reply purposes
    Ok(String::new())
}

#[cfg(feature = "email")]
async fn save_email_draft(
    _config: &Option<crate::config::Config>,
    _draft: &SaveDraftRequest,
) -> Result<(), String> {
    // This would save the draft to the email server or local storage
    Ok(())
}
@@ -403,30 +403,23 @@ async fn save_to_table(

    // Build dynamic INSERT query
    let mut fields = vec!["id", "created_at"];
-    let mut placeholders = vec!["$1", "$2"];
+    let mut placeholders = vec!["$1".to_string(), "$2".to_string()];
    let mut bind_index = 3;

    let data_obj = data.as_object().ok_or("Invalid data format")?;

    for (field, _) in data_obj {
        fields.push(field);
-        placeholders.push(&format!("${}", bind_index));
+        placeholders.push(format!("${}", bind_index));
        bind_index += 1;
    }

    // Add user tracking if not already present
    if !data_obj.contains_key("user_id") {
        fields.push("user_id");
-        placeholders.push(&format!("${}", bind_index));
+        placeholders.push(format!("${}", bind_index));
    }

    let insert_query = format!(
        "INSERT INTO {} ({}) VALUES ({})",
        table_name,
        fields.join(", "),
        placeholders.join(", ")
    );

    // Build values as JSON for simpler handling
    let mut values_map = serde_json::Map::new();
    values_map.insert("id".to_string(), json!(record_id));
@@ -397,7 +397,8 @@ fn apply_template_variables(
    if let Some(obj) = variables.as_object() {
        for (key, value) in obj {
            let placeholder = format!("{{{{{}}}}}", key);
-            let replacement = value.as_str().unwrap_or(&value.to_string());
+            let value_string = value.to_string();
+            let replacement = value.as_str().unwrap_or(&value_string);
            content = content.replace(&placeholder, replacement);
        }
    }
@@ -1,43 +1,58 @@
+use crate::shared::models::TriggerKind;
use diesel::prelude::*;
-use log::{trace};
+use log::trace;
use serde_json::{json, Value};
use uuid::Uuid;
-use crate::shared::models::TriggerKind;

-pub fn execute_set_schedule(conn: &mut diesel::PgConnection, cron: &str, script_name: &str, bot_uuid: Uuid) -> Result<Value, Box<dyn std::error::Error>> {
-    trace!("Scheduling SET SCHEDULE cron: {}, script: {}, bot_id: {:?}", cron, script_name, bot_uuid);
+pub fn execute_set_schedule(
+    conn: &mut diesel::PgConnection,
+    cron: &str,
+    script_name: &str,
+    bot_uuid: Uuid,
+) -> Result<Value, Box<dyn std::error::Error>> {
+    trace!(
+        "Scheduling SET SCHEDULE cron: {}, script: {}, bot_id: {:?}",
+        cron,
+        script_name,
+        bot_uuid
+    );
    use crate::shared::models::bots::dsl::bots;
-    let bot_exists: bool = diesel::select(diesel::dsl::exists(bots.filter(crate::shared::models::bots::dsl::id.eq(bot_uuid)))).get_result(conn)?;
+    let bot_exists: bool = diesel::select(diesel::dsl::exists(
+        bots.filter(crate::shared::models::bots::dsl::id.eq(bot_uuid)),
+    ))
+    .get_result(conn)?;
    if !bot_exists {
        return Err(format!("Bot with id {} does not exist", bot_uuid).into());
    }
    use crate::shared::models::system_automations::dsl::*;
    let new_automation = (
        bot_id.eq(bot_uuid),
        kind.eq(TriggerKind::Scheduled as i32),
        schedule.eq(cron),
        param.eq(script_name),
        is_active.eq(true),
    );
    let update_result = diesel::update(system_automations)
        .filter(bot_id.eq(bot_uuid))
        .filter(kind.eq(TriggerKind::Scheduled as i32))
        .filter(param.eq(script_name))
        .set((
            schedule.eq(cron),
            is_active.eq(true),
            last_triggered.eq(None::<chrono::DateTime<chrono::Utc>>),
        ))
        .execute(&mut *conn)?;
    let result = if update_result == 0 {
-        diesel::insert_into(system_automations).values(&new_automation).execute(&mut *conn)?
+        diesel::insert_into(system_automations)
+            .values(&new_automation)
+            .execute(&mut *conn)?
    } else {
        update_result
    };
    Ok(json!({
        "command": "set_schedule",
        "schedule": cron,
        "script": script_name,
        "bot_id": bot_uuid.to_string(),
        "rows_affected": result
    }))
}
@@ -563,9 +563,9 @@ async fn send_web_file(
}

async fn send_email(
-    _state: Arc<AppState>,
-    _email: &str,
-    _message: &str,
+    state: Arc<AppState>,
+    email: &str,
+    message: &str,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // Send email using the email service
    #[cfg(feature = "email")]

@@ -576,20 +576,22 @@ async fn send_email(
        email_service
            .send_email(email, "Message from Bot", message, None)
            .await?;
        Ok(())
    }

    #[cfg(not(feature = "email"))]
    {
+        let _ = (state, email, message); // Explicitly use variables to avoid warnings
        error!("Email feature not enabled");
-        return Err("Email feature not enabled".into());
+        Err("Email feature not enabled".into())
    }
}

async fn send_email_attachment(
-    _state: Arc<AppState>,
-    _email: &str,
-    _file_data: Vec<u8>,
-    _caption: &str,
+    state: Arc<AppState>,
+    email: &str,
+    file_data: Vec<u8>,
+    caption: &str,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    #[cfg(feature = "email")]
    {

@@ -599,12 +601,14 @@ async fn send_email_attachment(
        email_service
            .send_email_with_attachment(email, "File from Bot", caption, file_data, "attachment")
            .await?;
        Ok(())
    }

    #[cfg(not(feature = "email"))]
    {
-        error!("Email feature not enabled");
-        return Err("Email feature not enabled".into());
+        let _ = (state, email, file_data, caption); // Explicitly use variables to avoid warnings
+        error!("Email feature not enabled for attachments");
+        Err("Email feature not enabled".into())
    }
}
@@ -20,14 +20,14 @@ struct KbCollectionResult {
    qdrant_collection: String,
}

-#[derive(QueryableByName)]
-struct ActiveKbResult {
+#[derive(QueryableByName, Debug, Clone)]
+pub struct ActiveKbResult {
    #[diesel(sql_type = diesel::sql_types::Text)]
-    kb_name: String,
+    pub kb_name: String,
    #[diesel(sql_type = diesel::sql_types::Text)]
-    kb_folder_path: String,
+    pub kb_folder_path: String,
    #[diesel(sql_type = diesel::sql_types::Text)]
-    qdrant_collection: String,
+    pub qdrant_collection: String,
}

/// Register USE_KB keyword
@@ -7,8 +7,8 @@ use serde::{Deserialize, Serialize};
use std::sync::Arc;
use std::time::Duration;

-#[derive(Debug, Deserialize, Serialize)]
-struct WeatherData {
+#[derive(Debug, Clone, Deserialize, Serialize)]
+pub struct WeatherData {
    pub location: String,
    pub temperature: String,
    pub condition: String,

@@ -16,7 +16,7 @@ struct WeatherData {
}

/// Fetches weather data from 7Timer! API (free, no auth)
-async fn fetch_weather(location: &str) -> Result<WeatherData, Box<dyn std::error::Error>> {
+pub async fn fetch_weather(location: &str) -> Result<WeatherData, Box<dyn std::error::Error>> {
    // Parse location to get coordinates (simplified - in production use geocoding)
    let (lat, lon) = parse_location(location)?;

@@ -28,9 +28,7 @@ async fn fetch_weather(location: &str) -> Result<WeatherData, Box<dyn std::error
    trace!("Fetching weather from: {}", url);

-    let client = Client::builder()
-        .timeout(Duration::from_secs(10))
-        .build()?;
+    let client = Client::builder().timeout(Duration::from_secs(10)).build()?;

    let response = client.get(&url).send().await?;

@@ -68,10 +66,7 @@ async fn fetch_weather(location: &str) -> Result<WeatherData, Box<dyn std::error
    // Build forecast string
    let mut forecast_parts = Vec::new();
    for (i, item) in dataseries.iter().take(3).enumerate() {
-        if let (Some(temp), Some(weather)) = (
-            item["temp2m"].as_i64(),
-            item["weather"].as_str(),
-        ) {
+        if let (Some(temp), Some(weather)) = (item["temp2m"].as_i64(), item["weather"].as_str()) {
            forecast_parts.push(format!("{}h: {}°C, {}", i * 3, temp, weather));
        }
    }

@@ -86,7 +81,7 @@ async fn fetch_weather(location: &str) -> Result<WeatherData, Box<dyn std::error
}

/// Simple location parser (lat,lon or city name)
-fn parse_location(location: &str) -> Result<(f64, f64), Box<dyn std::error::Error>> {
+pub fn parse_location(location: &str) -> Result<(f64, f64), Box<dyn std::error::Error>> {
    // Check if it's coordinates (lat,lon)
    if let Some((lat_str, lon_str)) = location.split_once(',') {
        let lat = lat_str.trim().parse::<f64>()?;
@@ -116,74 +111,72 @@ fn parse_location(location: &str) -> Result<(f64, f64), Box<dyn std::error::Erro
        "chicago" => (41.8781, -87.6298),
        "toronto" => (43.6532, -79.3832),
        "mexico city" => (19.4326, -99.1332),
-        _ => return Err(format!("Unknown location: {}. Use 'lat,lon' format or known city", location).into()),
+        _ => {
+            return Err(format!(
+                "Unknown location: {}. Use 'lat,lon' format or known city",
+                location
+            )
+            .into())
+        }
    };

    Ok(coords)
}

/// Register WEATHER keyword in Rhai engine
-pub fn weather_keyword(
-    _state: Arc<AppState>,
-    _user_session: UserSession,
-    engine: &mut Engine,
-) {
-    engine.register_custom_syntax(
-        &["WEATHER", "$expr$"],
-        false,
-        move |context, inputs| {
+pub fn weather_keyword(_state: Arc<AppState>, _user_session: UserSession, engine: &mut Engine) {
+    let _ = engine.register_custom_syntax(&["WEATHER", "$expr$"], false, move |context, inputs| {
        let location = context.eval_expression_tree(&inputs[0])?;
        let location_str = location.to_string();

        trace!("WEATHER keyword called for: {}", location_str);

        // Create channel for async result
        let (tx, rx) = std::sync::mpsc::channel();

        // Spawn blocking task
        std::thread::spawn(move || {
            let rt = tokio::runtime::Builder::new_current_thread()
                .enable_all()
                .build();

            let result = if let Ok(rt) = rt {
                rt.block_on(async {
                    match fetch_weather(&location_str).await {
                        Ok(weather) => {
                            let msg = format!(
                                "Weather for {}: {} ({}). Forecast: {}",
                                weather.location,
                                weather.temperature,
                                weather.condition,
                                weather.forecast
                            );
                            Ok(msg)
                        }
                        Err(e) => {
                            error!("Weather fetch failed: {}", e);
                            Err(format!("Could not fetch weather: {}", e))
                        }
                    }
                })
            } else {
                Err("Failed to create runtime".to_string())
            };

            let _ = tx.send(result);
        });

        // Wait for result
        match rx.recv() {
            Ok(Ok(weather_msg)) => Ok(Dynamic::from(weather_msg)),
            Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
                e.into(),
                rhai::Position::NONE,
            ))),
            Err(_) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
                "Weather request timeout".into(),
                rhai::Position::NONE,
            ))),
        }
-    },
-    );
+    });
}
@@ -37,6 +37,7 @@ use self::keywords::set::set_keyword;
use self::keywords::set_context::set_context_keyword;

use self::keywords::wait::wait_keyword;
+#[derive(Debug)]
pub struct ScriptService {
    pub engine: Engine,
}
@@ -82,6 +83,13 @@ impl ScriptService {
    create_task_keyword(state.clone(), user.clone(), &mut engine);
    add_member_keyword(state.clone(), user.clone(), &mut engine);

+    // Register universal messaging keywords
+    keywords::universal_messaging::register_universal_messaging(
+        state.clone(),
+        user.clone(),
+        &mut engine,
+    );

    ScriptService { engine }
}

fn preprocess_basic_script(&self, script: &str) -> String {
@@ -11,9 +11,11 @@ use rand::distr::Alphanumeric;
use std::io::{self, Write};
use std::path::{Path, PathBuf};
use std::process::Command;
+#[derive(Debug)]
pub struct ComponentInfo {
    pub name: &'static str,
}
+#[derive(Debug)]
pub struct BootstrapManager {
    pub install_mode: InstallMode,
    pub tenant: Option<String>,
@@ -157,22 +159,69 @@ impl BootstrapManager {
    /// Setup Directory (Zitadel) with default organization and user
    async fn setup_directory(&self) -> Result<()> {
        let config_path = PathBuf::from("./config/directory_config.json");
        let work_root = PathBuf::from("./work");

        // Ensure config directory exists
        tokio::fs::create_dir_all("./config").await?;

        let mut setup = DirectorySetup::new("http://localhost:8080".to_string(), config_path);

-        let config = setup.initialize().await?;
+        // Create default organization
+        let org_name = "default";
+        let org_id = setup
+            .create_organization(org_name, "Default Organization")
+            .await?;
+        info!("✅ Created default organization: {}", org_name);
+
+        // Create admin@default account for bot administration
+        let admin_user = setup
+            .create_user(
+                &org_id,
+                "admin",
+                "admin@default",
+                "Admin123!",
+                "Admin",
+                "Default",
+                true, // is_admin
+            )
+            .await?;
+        info!("✅ Created admin user: admin@default");
+
+        // Create user@default account for regular bot usage
+        let regular_user = setup
+            .create_user(
+                &org_id,
+                "user",
+                "user@default",
+                "User123!",
+                "User",
+                "Default",
+                false, // is_admin
+            )
+            .await?;
+        info!("✅ Created regular user: user@default");
+        info!("   Regular user ID: {}", regular_user.id);
+
+        // Create OAuth2 application for BotServer
+        let (project_id, client_id, client_secret) =
+            setup.create_oauth_application(&org_id).await?;
+        info!("✅ Created OAuth2 application in project: {}", project_id);
+
+        // Save configuration
+        let config = setup
+            .save_config(
+                org_id.clone(),
+                org_name.to_string(),
+                admin_user,
+                client_id.clone(),
+                client_secret,
+            )
+            .await?;

        info!("✅ Directory initialized successfully!");
-        info!("   Organization: {}", config.default_org.name);
-        info!(
-            "   Default User: {} / {}",
-            config.default_user.email, config.default_user.password
-        );
-        info!("   Client ID: {}", config.client_id);
+        info!("   Organization: default");
+        info!("   Admin User: admin@default / Admin123!");
+        info!("   Regular User: user@default / User123!");
+        info!("   Client ID: {}", client_id);
        info!("   Login URL: {}", config.base_url);

        Ok(())
@ -19,6 +19,8 @@ use tokio::sync::mpsc;
use tokio::sync::Mutex as AsyncMutex;
use uuid::Uuid;

pub mod multimedia;

/// Retrieves the default bot (first active bot) from the database.
pub fn get_default_bot(conn: &mut PgConnection) -> (Uuid, String) {
    use crate::shared::models::schema::bots::dsl::*;

@ -41,6 +43,7 @@ pub fn get_default_bot(conn: &mut PgConnection) -> (Uuid, String) {
    }
}

#[derive(Debug)]
pub struct BotOrchestrator {
    pub state: Arc<AppState>,
    pub mounted_bots: Arc<AsyncMutex<HashMap<String, Arc<DriveMonitor>>>>,
@ -478,10 +481,47 @@ pub async fn create_bot_handler(
        .get("bot_name")
        .cloned()
        .unwrap_or_else(|| "default".to_string());
    (
        StatusCode::OK,
        Json(serde_json::json!({ "status": format!("bot '{}' created", bot_name) })),
    )

    // Use state to create the bot in the database
    let mut conn = match state.conn.get() {
        Ok(conn) => conn,
        Err(e) => {
            return (
                StatusCode::INTERNAL_SERVER_ERROR,
                Json(serde_json::json!({ "error": format!("Database error: {}", e) })),
            )
        }
    };

    use crate::shared::models::schema::bots::dsl::*;
    use diesel::prelude::*;

    let new_bot = (
        name.eq(&bot_name),
        description.eq(format!("Bot created via API: {}", bot_name)),
        llm_provider.eq("openai"),
        llm_config.eq(serde_json::json!({"model": "gpt-4"})),
        context_provider.eq("none"),
        context_config.eq(serde_json::json!({})),
        is_active.eq(true),
    );

    match diesel::insert_into(bots)
        .values(&new_bot)
        .execute(&mut conn)
    {
        Ok(_) => (
            StatusCode::OK,
            Json(serde_json::json!({
                "status": format!("bot '{}' created successfully", bot_name),
                "bot_name": bot_name
            })),
        ),
        Err(e) => (
            StatusCode::INTERNAL_SERVER_ERROR,
            Json(serde_json::json!({ "error": format!("Failed to create bot: {}", e) })),
        ),
    }
}

/// Mount an existing bot (placeholder implementation)
@ -16,7 +16,6 @@ use anyhow::Result;
use async_trait::async_trait;
use base64::{engine::general_purpose::STANDARD, Engine};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use uuid::Uuid;

#[derive(Debug, Clone, Serialize, Deserialize)]

@ -115,6 +114,7 @@ pub trait MultimediaHandler: Send + Sync {
}

/// Default implementation for multimedia handling
#[derive(Debug)]
pub struct DefaultMultimediaHandler {
    storage_client: Option<aws_sdk_s3::Client>,
    search_api_key: Option<String>,
@ -79,6 +79,7 @@ pub struct InstagramQuickReply {
    pub payload: String,
}

#[derive(Debug)]
pub struct InstagramAdapter {
    pub state: Arc<AppState>,
    pub access_token: String,
@ -15,6 +15,7 @@ pub trait ChannelAdapter: Send + Sync {
        response: BotResponse,
    ) -> Result<(), Box<dyn std::error::Error + Send + Sync>>;
}

#[derive(Debug)]
pub struct WebChannelAdapter {
    connections: Arc<Mutex<HashMap<String, mpsc::Sender<BotResponse>>>>,
}

@ -66,6 +67,7 @@ impl ChannelAdapter for WebChannelAdapter {
        Ok(())
    }
}

#[derive(Debug)]
pub struct VoiceAdapter {
    rooms: Arc<Mutex<HashMap<String, String>>>,
    connections: Arc<Mutex<HashMap<String, mpsc::Sender<BotResponse>>>>,
@ -52,6 +52,7 @@ pub struct TeamsAttachment {
    pub content: serde_json::Value,
}

#[derive(Debug)]
pub struct TeamsAdapter {
    pub state: Arc<AppState>,
    pub app_id: String,
@ -19,7 +19,7 @@ use serde::{Deserialize, Serialize};
use serde_json::json;
use std::sync::Arc;

#[derive(Debug, Deserialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppWebhook {
    #[serde(rename = "hub.mode")]
    pub hub_mode: Option<String>,

@ -29,24 +29,24 @@ pub struct WhatsAppWebhook {
    pub hub_challenge: Option<String>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppMessage {
    pub entry: Vec<WhatsAppEntry>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppEntry {
    pub id: String,
    pub changes: Vec<WhatsAppChange>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppChange {
    pub value: WhatsAppValue,
    pub field: String,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppValue {
    pub messaging_product: String,
    pub metadata: WhatsAppMetadata,

@ -54,24 +54,24 @@ pub struct WhatsAppValue {
    pub messages: Option<Vec<WhatsAppIncomingMessage>>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppMetadata {
    pub display_phone_number: String,
    pub phone_number_id: String,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppContact {
    pub profile: WhatsAppProfile,
    pub wa_id: String,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppProfile {
    pub name: String,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppIncomingMessage {
    pub from: String,
    pub id: String,

@ -86,12 +86,12 @@ pub struct WhatsAppIncomingMessage {
    pub location: Option<WhatsAppLocation>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppText {
    pub body: String,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppMedia {
    pub id: String,
    pub mime_type: Option<String>,

@ -99,7 +99,7 @@ pub struct WhatsAppMedia {
    pub caption: Option<String>,
}

#[derive(Debug, Deserialize, Serialize)]
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct WhatsAppLocation {
    pub latitude: f64,
    pub longitude: f64,

@ -107,6 +107,7 @@ pub struct WhatsAppLocation {
    pub address: Option<String>,
}

#[derive(Debug)]
pub struct WhatsAppAdapter {
    pub state: Arc<AppState>,
    pub access_token: String,
@ -7,25 +7,25 @@ use uuid::Uuid;
// Type alias for backward compatibility
pub type Config = AppConfig;

#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct AppConfig {
    pub drive: DriveConfig,
    pub server: ServerConfig,
    pub email: EmailConfig,
    pub site_path: String,
}

#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct DriveConfig {
    pub server: String,
    pub access_key: String,
    pub secret_key: String,
}

#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct ServerConfig {
    pub host: String,
    pub port: u16,
}

#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct EmailConfig {
    pub server: String,
    pub port: u16,

@ -161,6 +161,7 @@ impl AppConfig {
        })
    }
}

#[derive(Debug)]
pub struct ConfigManager {
    conn: DbPool,
}
457 src/drive/mod.rs
@ -1,57 +1,108 @@
//! Drive Module - S3-based File Storage
//!
//! Provides file management operations using S3 as backend storage.
//! Supports bot storage and provides REST API endpoints for desktop frontend.
//!
//! API Endpoints:
//! - GET /files/list - List files and folders
//! - POST /files/read - Read file content
//! - POST /files/write - Write file content
//! - POST /files/delete - Delete file/folder
//! - POST /files/create-folder - Create new folder

use crate::shared::state::AppState;
use crate::ui_tree::file_tree::{FileTree, TreeNode};
use actix_web::{web, HttpResponse, Responder};
use axum::{
    extract::{Query, State},
    http::StatusCode,
    response::Json,
    routing::{get, post},
    Router,
};
use serde::{Deserialize, Serialize};
use std::sync::Arc;

#[derive(Serialize, Deserialize)]
pub mod vectordb;

// ===== Request/Response Structures =====

#[derive(Debug, Serialize, Deserialize)]
pub struct FileItem {
    name: String,
    path: String,
    is_dir: bool,
    icon: String,
    pub name: String,
    pub path: String,
    pub is_dir: bool,
    pub size: Option<i64>,
    pub modified: Option<String>,
    pub icon: String,
}

#[derive(Deserialize)]
#[derive(Debug, Deserialize)]
pub struct ListQuery {
    path: Option<String>,
    bucket: Option<String>,
    pub path: Option<String>,
    pub bucket: Option<String>,
}

#[derive(Deserialize)]
#[derive(Debug, Deserialize)]
pub struct ReadRequest {
    bucket: String,
    path: String,
    pub bucket: String,
    pub path: String,
}

#[derive(Deserialize)]
#[derive(Debug, Serialize)]
pub struct ReadResponse {
    pub content: String,
}

#[derive(Debug, Deserialize)]
pub struct WriteRequest {
    bucket: String,
    path: String,
    content: String,
    pub bucket: String,
    pub path: String,
    pub content: String,
}

#[derive(Deserialize)]
#[derive(Debug, Deserialize)]
pub struct DeleteRequest {
    bucket: String,
    path: String,
    pub bucket: String,
    pub path: String,
}

#[derive(Deserialize)]
#[derive(Debug, Deserialize)]
pub struct CreateFolderRequest {
    bucket: String,
    path: String,
    name: String,
    pub bucket: String,
    pub path: String,
    pub name: String,
}

async fn list_files(
    query: web::Query<ListQuery>,
    app_state: web::Data<Arc<AppState>>,
) -> impl Responder {
    let mut tree = FileTree::new(app_state.get_ref().clone());
#[derive(Debug, Serialize)]
pub struct SuccessResponse {
    pub success: bool,
    pub message: Option<String>,
}

    let result = if let Some(bucket) = &query.bucket {
        if let Some(path) = &query.path {
// ===== API Configuration =====

/// Configure drive API routes
pub fn configure() -> Router<Arc<AppState>> {
    Router::new()
        .route("/files/list", get(list_files))
        .route("/files/read", post(read_file))
        .route("/files/write", post(write_file))
        .route("/files/delete", post(delete_file))
        .route("/files/create-folder", post(create_folder))
}

// ===== API Handlers =====

/// GET /files/list - List files and folders in S3 bucket
pub async fn list_files(
    State(state): State<Arc<AppState>>,
    Query(params): Query<ListQuery>,
) -> Result<Json<Vec<FileItem>>, (StatusCode, Json<serde_json::Value>)> {
    // Use FileTree for hierarchical navigation
    let mut tree = FileTree::new(state.clone());

    let result = if let Some(bucket) = &params.bucket {
        if let Some(path) = &params.path {
            tree.enter_folder(bucket.clone(), path.clone()).await
        } else {
            tree.enter_bucket(bucket.clone()).await
@ -61,9 +112,10 @@ async fn list_files(
    };

    if let Err(e) = result {
        return HttpResponse::InternalServerError().json(serde_json::json!({
            "error": e.to_string()
        }));
        return Err((
            StatusCode::INTERNAL_SERVER_ERROR,
            Json(serde_json::json!({ "error": e.to_string() })),
        ));
    }

    let items: Vec<FileItem> = tree

@ -85,22 +137,8 @@ async fn list_files(
            }
            TreeNode::File { bucket, path } => {
                let name = path.split('/').last().unwrap_or(path).to_string();
                let icon = if path.ends_with(".bas") {
                    "⚙️"
                } else if path.ends_with(".ast") {
                    "🔧"
                } else if path.ends_with(".csv") {
                    "📊"
                } else if path.ends_with(".gbkb") {
                    "📚"
                } else if path.ends_with(".json") {
                    "🔖"
                } else if path.ends_with(".txt") || path.ends_with(".md") {
                    "📃"
                } else {
                    "📄"
                };
                (name, path.clone(), false, icon.to_string())
                let icon = get_file_icon(path);
                (name, path.clone(), false, icon)
            }
        };
@ -108,146 +146,221 @@ async fn list_files(
            name,
            path,
            is_dir,
            size: None,
            modified: None,
            icon,
        }
    })
    .collect();

    HttpResponse::Ok().json(items)
    Ok(Json(items))
}

async fn read_file(
    req: web::Json<ReadRequest>,
    app_state: web::Data<Arc<AppState>>,
) -> impl Responder {
    if let Some(drive) = &app_state.drive {
        match drive
            .get_object()
/// POST /files/read - Read file content from S3
pub async fn read_file(
    State(state): State<Arc<AppState>>,
    Json(req): Json<ReadRequest>,
) -> Result<Json<ReadResponse>, (StatusCode, Json<serde_json::Value>)> {
    let s3_client = state.drive.as_ref().ok_or_else(|| {
        (
            StatusCode::SERVICE_UNAVAILABLE,
            Json(serde_json::json!({ "error": "S3 service not available" })),
        )
    })?;

    let result = s3_client
        .get_object()
        .bucket(&req.bucket)
        .key(&req.path)
        .send()
        .await
        .map_err(|e| {
            (
                StatusCode::INTERNAL_SERVER_ERROR,
                Json(serde_json::json!({ "error": format!("Failed to read file: {}", e) })),
            )
        })?;

    let bytes = result
        .body
        .collect()
        .await
        .map_err(|e| {
            (
                StatusCode::INTERNAL_SERVER_ERROR,
                Json(serde_json::json!({ "error": format!("Failed to read file body: {}", e) })),
            )
        })?
        .into_bytes();

    let content = String::from_utf8(bytes.to_vec()).map_err(|e| {
        (
            StatusCode::INTERNAL_SERVER_ERROR,
            Json(serde_json::json!({ "error": format!("File is not valid UTF-8: {}", e) })),
        )
    })?;

    Ok(Json(ReadResponse { content }))
}

/// POST /files/write - Write file content to S3
pub async fn write_file(
    State(state): State<Arc<AppState>>,
    Json(req): Json<WriteRequest>,
) -> Result<Json<SuccessResponse>, (StatusCode, Json<serde_json::Value>)> {
    let s3_client = state.drive.as_ref().ok_or_else(|| {
        (
            StatusCode::SERVICE_UNAVAILABLE,
            Json(serde_json::json!({ "error": "S3 service not available" })),
        )
    })?;

    s3_client
        .put_object()
        .bucket(&req.bucket)
        .key(&req.path)
        .body(req.content.into_bytes().into())
        .send()
        .await
        .map_err(|e| {
            (
                StatusCode::INTERNAL_SERVER_ERROR,
                Json(serde_json::json!({ "error": format!("Failed to write file: {}", e) })),
            )
        })?;

    Ok(Json(SuccessResponse {
        success: true,
        message: Some("File written successfully".to_string()),
    }))
}

/// POST /files/delete - Delete file or folder from S3
pub async fn delete_file(
    State(state): State<Arc<AppState>>,
    Json(req): Json<DeleteRequest>,
) -> Result<Json<SuccessResponse>, (StatusCode, Json<serde_json::Value>)> {
    let s3_client = state.drive.as_ref().ok_or_else(|| {
        (
            StatusCode::SERVICE_UNAVAILABLE,
            Json(serde_json::json!({ "error": "S3 service not available" })),
        )
    })?;

    // If path ends with /, it's a folder - delete all objects with this prefix
    if req.path.ends_with('/') {
        let result = s3_client
            .list_objects_v2()
            .bucket(&req.bucket)
            .key(&req.path)
            .prefix(&req.path)
            .send()
            .await
        {
            Ok(response) => match response.body.collect().await {
                Ok(data) => {
                    let bytes = data.into_bytes();
                    match String::from_utf8(bytes.to_vec()) {
                        Ok(content) => HttpResponse::Ok().json(serde_json::json!({
                            "content": content
                        })),
                        Err(_) => HttpResponse::BadRequest().json(serde_json::json!({
                            "error": "File is not valid UTF-8 text"
                        })),
                    }
                }
                Err(e) => HttpResponse::InternalServerError().json(serde_json::json!({
                    "error": e.to_string()
                })),
            },
            Err(e) => HttpResponse::InternalServerError().json(serde_json::json!({
                "error": e.to_string()
            })),
            .map_err(|e| {
                (
                    StatusCode::INTERNAL_SERVER_ERROR,
                    Json(serde_json::json!({ "error": format!("Failed to list objects for deletion: {}", e) })),
                )
            })?;

        for obj in result.contents() {
            if let Some(key) = obj.key() {
                s3_client
                    .delete_object()
                    .bucket(&req.bucket)
                    .key(key)
                    .send()
                    .await
                    .map_err(|e| {
                        (
                            StatusCode::INTERNAL_SERVER_ERROR,
                            Json(serde_json::json!({ "error": format!("Failed to delete object: {}", e) })),
                        )
                    })?;
            }
        }
    } else {
        HttpResponse::ServiceUnavailable().json(serde_json::json!({
            "error": "Drive not connected"
        }))
    }
}

async fn write_file(
    req: web::Json<WriteRequest>,
    app_state: web::Data<Arc<AppState>>,
) -> impl Responder {
    if let Some(drive) = &app_state.drive {
        match drive
            .put_object()
            .bucket(&req.bucket)
            .key(&req.path)
            .body(req.content.clone().into_bytes().into())
            .send()
            .await
        {
            Ok(_) => HttpResponse::Ok().json(serde_json::json!({
                "success": true
            })),
            Err(e) => HttpResponse::InternalServerError().json(serde_json::json!({
                "error": e.to_string()
            })),
        }
    } else {
        HttpResponse::ServiceUnavailable().json(serde_json::json!({
            "error": "Drive not connected"
        }))
    }
}

async fn delete_file(
    req: web::Json<DeleteRequest>,
    app_state: web::Data<Arc<AppState>>,
) -> impl Responder {
    if let Some(drive) = &app_state.drive {
        match drive
        s3_client
            .delete_object()
            .bucket(&req.bucket)
            .key(&req.path)
            .send()
            .await
        {
            Ok(_) => HttpResponse::Ok().json(serde_json::json!({
                "success": true
            })),
            Err(e) => HttpResponse::InternalServerError().json(serde_json::json!({
                "error": e.to_string()
            })),
        }
            .map_err(|e| {
                (
                    StatusCode::INTERNAL_SERVER_ERROR,
                    Json(serde_json::json!({ "error": format!("Failed to delete file: {}", e) })),
                )
            })?;
    }

    Ok(Json(SuccessResponse {
        success: true,
        message: Some("Deleted successfully".to_string()),
    }))
}

/// POST /files/create-folder - Create new folder in S3
pub async fn create_folder(
    State(state): State<Arc<AppState>>,
    Json(req): Json<CreateFolderRequest>,
) -> Result<Json<SuccessResponse>, (StatusCode, Json<serde_json::Value>)> {
    let s3_client = state.drive.as_ref().ok_or_else(|| {
        (
            StatusCode::SERVICE_UNAVAILABLE,
            Json(serde_json::json!({ "error": "S3 service not available" })),
        )
    })?;

    // S3 doesn't have real folders, create an empty object with trailing /
    let folder_path = if req.path.is_empty() || req.path == "/" {
        format!("{}/", req.name)
    } else {
        HttpResponse::ServiceUnavailable().json(serde_json::json!({
            "error": "Drive not connected"
        }))
        format!("{}/{}/", req.path.trim_end_matches('/'), req.name)
    };

    s3_client
        .put_object()
        .bucket(&req.bucket)
        .key(&folder_path)
        .body(Vec::new().into())
        .send()
        .await
        .map_err(|e| {
            (
                StatusCode::INTERNAL_SERVER_ERROR,
                Json(serde_json::json!({ "error": format!("Failed to create folder: {}", e) })),
            )
        })?;

    Ok(Json(SuccessResponse {
        success: true,
        message: Some("Folder created successfully".to_string()),
    }))
}
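The trailing-slash "folder" convention used by the new `create_folder` handler can be isolated as a pure function. This is a minimal sketch, assuming the same normalization the handler performs (the `folder_key` name is hypothetical, introduced only for illustration):

```rust
// S3 has no real folders: a "folder" is an empty object whose key ends
// with '/'. This mirrors the path normalization in create_folder above.
fn folder_key(path: &str, name: &str) -> String {
    if path.is_empty() || path == "/" {
        // Folder at the bucket root
        format!("{}/", name)
    } else {
        // Strip any trailing slash before joining, then re-add one
        format!("{}/{}/", path.trim_end_matches('/'), name)
    }
}

fn main() {
    assert_eq!(folder_key("", "docs"), "docs/");
    assert_eq!(folder_key("/", "docs"), "docs/");
    assert_eq!(folder_key("work/bot.gbai/", "dialogs"), "work/bot.gbai/dialogs/");
    assert_eq!(folder_key("a/b", "c"), "a/b/c/");
}
```

Normalizing with `trim_end_matches('/')` keeps keys canonical whether or not the client sends a trailing slash.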
// ===== Helper Functions =====

/// Get appropriate icon for file based on extension
fn get_file_icon(path: &str) -> String {
    if path.ends_with(".bas") {
        "⚙️".to_string()
    } else if path.ends_with(".ast") {
        "🔧".to_string()
    } else if path.ends_with(".csv") {
        "📊".to_string()
    } else if path.ends_with(".gbkb") {
        "📚".to_string()
    } else if path.ends_with(".json") {
        "🔖".to_string()
    } else if path.ends_with(".txt") || path.ends_with(".md") {
        "📃".to_string()
    } else if path.ends_with(".pdf") {
        "📕".to_string()
    } else if path.ends_with(".zip") || path.ends_with(".tar") || path.ends_with(".gz") {
        "📦".to_string()
    } else if path.ends_with(".jpg") || path.ends_with(".png") || path.ends_with(".gif") {
        "🖼️".to_string()
    } else {
        "📄".to_string()
    }
}

async fn create_folder(
    req: web::Json<CreateFolderRequest>,
    app_state: web::Data<Arc<AppState>>,
) -> impl Responder {
    if let Some(drive) = &app_state.drive {
        let folder_path = if req.path.is_empty() {
            format!("{}/", req.name)
        } else {
            format!("{}/{}/", req.path, req.name)
        };

        match drive
            .put_object()
            .bucket(&req.bucket)
            .key(&folder_path)
            .body(Vec::new().into())
            .send()
            .await
        {
            Ok(_) => HttpResponse::Ok().json(serde_json::json!({
                "success": true
            })),
            Err(e) => HttpResponse::InternalServerError().json(serde_json::json!({
                "error": e.to_string()
            })),
        }
    } else {
        HttpResponse::ServiceUnavailable().json(serde_json::json!({
            "error": "Drive not connected"
        }))
    }
}

pub fn configure(cfg: &mut web::ServiceConfig) {
    cfg.service(
        web::scope("/files")
            .route("/list", web::get().to(list_files))
            .route("/read", web::post().to(read_file))
            .route("/write", web::post().to(write_file))
            .route("/delete", web::post().to(delete_file))
            .route("/create-folder", web::post().to(create_folder)),
    );
}
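The extension-to-icon mapping in `get_file_icon` is a pure function, so its behavior can be checked in isolation. A trimmed standalone sketch (only a few of the arms, with `file_icon` as a hypothetical stand-in name):

```rust
// Trimmed copy of the get_file_icon dispatch: map a path's extension
// to a display icon, falling back to a generic document icon.
fn file_icon(path: &str) -> &'static str {
    if path.ends_with(".bas") {
        "⚙️"
    } else if path.ends_with(".csv") {
        "📊"
    } else if path.ends_with(".txt") || path.ends_with(".md") {
        "📃"
    } else {
        "📄"
    }
}

fn main() {
    assert_eq!(file_icon("start.bas"), "⚙️");
    assert_eq!(file_icon("config.csv"), "📊");
    assert_eq!(file_icon("README.md"), "📃");
    assert_eq!(file_icon("photo.raw"), "📄");
}
```

Note the arms are ordered most-specific first; the final `else` is the catch-all.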
@ -2,7 +2,7 @@ use crate::basic::compiler::BasicCompiler;
use crate::config::ConfigManager;
use crate::shared::state::AppState;
use aws_sdk_s3::Client;
use log::{info};
use log::info;
use std::collections::HashMap;
use std::error::Error;
use std::sync::Arc;
@ -11,6 +11,7 @@ use tokio::time::{interval, Duration};
pub struct FileState {
    pub etag: String,
}

#[derive(Debug)]
pub struct DriveMonitor {
    state: Arc<AppState>,
    bucket_name: String,
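The idea behind `DriveMonitor`/`FileState` is ETag-based change detection: each poll builds a key-to-ETag snapshot and diffs it against the previous one. A minimal sketch of that comparison, assuming a plain `HashMap<String, String>` snapshot (the `changed_keys` helper is hypothetical):

```rust
use std::collections::HashMap;

// A key is "changed" if it is new or its ETag differs from the last poll.
fn changed_keys(
    previous: &HashMap<String, String>,
    current: &HashMap<String, String>,
) -> Vec<String> {
    current
        .iter()
        .filter(|(key, etag)| previous.get(*key) != Some(*etag))
        .map(|(key, _)| key.to_string())
        .collect()
}

fn main() {
    let mut old = HashMap::new();
    old.insert(".gbdialog/start.bas".to_string(), "etag-1".to_string());

    let mut new = HashMap::new();
    new.insert(".gbdialog/start.bas".to_string(), "etag-2".to_string()); // modified
    new.insert(".gbdialog/tool.bas".to_string(), "etag-9".to_string()); // added

    let mut changed = changed_keys(&old, &new);
    changed.sort();
    assert_eq!(changed, vec![".gbdialog/start.bas", ".gbdialog/tool.bas"]);
}
```

Comparing ETags avoids downloading objects just to notice they are unchanged; only changed keys trigger a recompile.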
@ -28,7 +29,10 @@ impl DriveMonitor {
    }
    pub fn spawn(self: Arc<Self>) -> tokio::task::JoinHandle<()> {
        tokio::spawn(async move {
            info!("Drive Monitor service started for bucket: {}", self.bucket_name);
            info!(
                "Drive Monitor service started for bucket: {}",
                self.bucket_name
            );
            let mut tick = interval(Duration::from_secs(90));
            loop {
                tick.tick().await;

@ -47,7 +51,10 @@ impl DriveMonitor {
        self.check_gbot(client).await?;
        Ok(())
    }
    async fn check_gbdialog_changes(&self, client: &Client) -> Result<(), Box<dyn Error + Send + Sync>> {
    async fn check_gbdialog_changes(
        &self,
        client: &Client,
    ) -> Result<(), Box<dyn Error + Send + Sync>> {
        let prefix = ".gbdialog/";
        let mut current_files = HashMap::new();
        let mut continuation_token = None;
@ -58,8 +65,10 @@ impl DriveMonitor {
            .list_objects_v2()
            .bucket(&self.bucket_name.to_lowercase())
            .set_continuation_token(continuation_token)
            .send()
        ).await {
            .send(),
        )
        .await
        {
            Ok(Ok(list)) => list,
            Ok(Err(e)) => return Err(e.into()),
            Err(_) => {

@ -125,8 +134,10 @@ impl DriveMonitor {
            .list_objects_v2()
            .bucket(&self.bucket_name.to_lowercase())
            .set_continuation_token(continuation_token)
            .send()
        ).await {
            .send(),
        )
        .await
        {
            Ok(Ok(list)) => list,
            Ok(Err(e)) => return Err(e.into()),
            Err(_) => {

@ -143,9 +154,20 @@ impl DriveMonitor {
        if !path.ends_with("config.csv") {
            continue;
        }
        match client.head_object().bucket(&self.bucket_name).key(&path).send().await {
        match client
            .head_object()
            .bucket(&self.bucket_name)
            .key(&path)
            .send()
            .await
        {
            Ok(_head_res) => {
                let response = client.get_object().bucket(&self.bucket_name).key(&path).send().await?;
                let response = client
                    .get_object()
                    .bucket(&self.bucket_name)
                    .key(&path)
                    .send()
                    .await?;
                let bytes = response.body.collect().await?.into_bytes();
                let csv_content = String::from_utf8(bytes.to_vec())
                    .map_err(|e| format!("UTF-8 error in {}: {}", path, e))?;

@ -164,7 +186,10 @@ impl DriveMonitor {
        match config_manager.get_config(&self.bot_id, key, None) {
            Ok(old_value) => {
                if old_value != new_value {
                    info!("Detected change in {} (old: {}, new: {})", key, old_value, new_value);
                    info!(
                        "Detected change in {} (old: {}, new: {})",
                        key, old_value, new_value
                    );
                    restart_needed = true;
                }
            }
@ -176,7 +201,9 @@ impl DriveMonitor {
        }
        let _ = config_manager.sync_gbot_config(&self.bot_id, &csv_content);
        if restart_needed {
            if let Err(e) = ensure_llama_servers_running(Arc::clone(&self.state)).await {
            if let Err(e) =
                ensure_llama_servers_running(Arc::clone(&self.state)).await
            {
                log::error!("Failed to restart LLaMA servers after llm- config change: {}", e);
            }
        }
@ -199,7 +226,10 @@ impl DriveMonitor {
        }
        Ok(())
    }
    async fn broadcast_theme_change(&self, csv_content: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
    async fn broadcast_theme_change(
        &self,
        csv_content: &str,
    ) -> Result<(), Box<dyn Error + Send + Sync>> {
        let mut theme_data = serde_json::json!({
            "event": "change_theme",
            "data": {}

@ -210,11 +240,23 @@ impl DriveMonitor {
            let key = parts[0].trim();
            let value = parts[1].trim();
            match key {
                "theme-color1" => theme_data["data"]["color1"] = serde_json::Value::String(value.to_string()),
                "theme-color2" => theme_data["data"]["color2"] = serde_json::Value::String(value.to_string()),
                "theme-logo" => theme_data["data"]["logo_url"] = serde_json::Value::String(value.to_string()),
                "theme-title" => theme_data["data"]["title"] = serde_json::Value::String(value.to_string()),
                "theme-logo-text" => theme_data["data"]["logo_text"] = serde_json::Value::String(value.to_string()),
                "theme-color1" => {
                    theme_data["data"]["color1"] = serde_json::Value::String(value.to_string())
                }
                "theme-color2" => {
                    theme_data["data"]["color2"] = serde_json::Value::String(value.to_string())
                }
                "theme-logo" => {
                    theme_data["data"]["logo_url"] =
                        serde_json::Value::String(value.to_string())
                }
                "theme-title" => {
                    theme_data["data"]["title"] = serde_json::Value::String(value.to_string())
                }
                "theme-logo-text" => {
                    theme_data["data"]["logo_text"] =
                        serde_json::Value::String(value.to_string())
                }
                _ => {}
            }
        }
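The `key,value` CSV handling in `broadcast_theme_change` can be sketched without serde_json. This is a std-only approximation, assuming each line splits on the first comma and only `theme-*` keys are kept (the `parse_theme` helper and the split-on-first-comma behavior are illustrative assumptions, not the exact production code):

```rust
use std::collections::HashMap;

// Split each CSV line into key,value on the first comma, trim both
// sides, and keep only theme-* keys, mirroring the match above.
fn parse_theme(csv: &str) -> HashMap<String, String> {
    let mut theme = HashMap::new();
    for line in csv.lines() {
        let parts: Vec<&str> = line.splitn(2, ',').collect();
        if parts.len() == 2 {
            let key = parts[0].trim();
            let value = parts[1].trim();
            if key.starts_with("theme-") {
                theme.insert(key.to_string(), value.to_string());
            }
        }
    }
    theme
}

fn main() {
    let csv = "theme-color1,#336699\ntheme-title, My Bot \nother,x";
    let theme = parse_theme(csv);
    assert_eq!(theme.get("theme-color1").map(String::as_str), Some("#336699"));
    assert_eq!(theme.get("theme-title").map(String::as_str), Some("My Bot"));
    assert!(!theme.contains_key("other"));
}
```

In the real handler the recognized keys are copied into a JSON payload and broadcast as a `change_theme` event.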
@ -239,17 +281,38 @@ impl DriveMonitor {
|
|||
}
|
||||
Ok(())
|
||||
}
|
||||
async fn compile_tool(&self, client: &Client, file_path: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
info!("Fetching object from Drive: bucket={}, key={}", &self.bucket_name, file_path);
|
||||
let response = match client.get_object().bucket(&self.bucket_name).key(file_path).send().await {
|
||||
async fn compile_tool(
|
||||
&self,
|
||||
client: &Client,
|
||||
file_path: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
info!(
|
||||
"Fetching object from Drive: bucket={}, key={}",
|
||||
&self.bucket_name, file_path
|
||||
);
|
||||
let response = match client
|
||||
.get_object()
|
||||
.bucket(&self.bucket_name)
|
||||
.key(file_path)
|
||||
.send()
|
||||
.await
|
||||
{
|
||||
Ok(res) => {
|
||||
info!("Successfully fetched object from Drive: bucket={}, key={}, size={}",
|
||||
&self.bucket_name, file_path, res.content_length().unwrap_or(0));
|
||||
info!(
|
||||
"Successfully fetched object from Drive: bucket={}, key={}, size={}",
|
||||
&self.bucket_name,
|
||||
file_path,
|
||||
res.content_length().unwrap_or(0)
|
||||
);
|
||||
res
|
||||
}
|
||||
Err(e) => {
|
||||
log::error!("Failed to fetch object from Drive: bucket={}, key={}, error={:?}",
|
||||
&self.bucket_name, file_path, e);
|
||||
log::error!(
|
||||
"Failed to fetch object from Drive: bucket={}, key={}, error={:?}",
|
||||
&self.bucket_name,
|
||||
file_path,
|
||||
e
|
||||
);
|
||||
return Err(e.into());
|
||||
}
|
||||
};
|
||||
|
|
```diff
@@ -262,7 +325,10 @@ impl DriveMonitor {
             .strip_suffix(".bas")
             .unwrap_or(file_path)
             .to_string();
-        let bot_name = self.bucket_name.strip_suffix(".gbai").unwrap_or(&self.bucket_name);
+        let bot_name = self
+            .bucket_name
+            .strip_suffix(".gbai")
+            .unwrap_or(&self.bucket_name);
         let work_dir = format!("./work/{}.gbai/{}.gbdialog", bot_name, bot_name);
         let state_clone = Arc::clone(&self.state);
         let work_dir_clone = work_dir.clone();
```
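The `strip_suffix` chain above derives the bot name and dialog work directory from the `.gbai` bucket name. A standalone sketch of that mapping (the bucket name `mybot.gbai` is an assumed example, not taken from the commit):

```rust
/// Derive the bot name from a ".gbai" bucket name; names without the
/// suffix pass through unchanged, mirroring the `unwrap_or` fallback.
fn bot_name_from_bucket(bucket: &str) -> &str {
    bucket.strip_suffix(".gbai").unwrap_or(bucket)
}

/// Build the ".gbdialog" work directory path used by the compiler.
fn work_dir_for(bot: &str) -> String {
    format!("./work/{}.gbai/{}.gbdialog", bot, bot)
}

fn main() {
    let bot = bot_name_from_bucket("mybot.gbai"); // hypothetical bucket
    println!("{}", work_dir_for(bot));
}
```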
```diff
@@ -276,11 +342,14 @@ impl DriveMonitor {
             let mut compiler = BasicCompiler::new(state_clone, bot_id);
             let result = compiler.compile_file(&local_source_path, &work_dir_clone)?;
             if let Some(mcp_tool) = result.mcp_tool {
-                info!("MCP tool definition generated with {} parameters",
-                    mcp_tool.input_schema.properties.len());
+                info!(
+                    "MCP tool definition generated with {} parameters",
+                    mcp_tool.input_schema.properties.len()
+                );
             }
             Ok::<(), Box<dyn Error + Send + Sync>>(())
-        }).await??;
+        })
+        .await??;
         Ok(())
     }
 }
```
159 src/email/mod.rs
```diff
@@ -5,6 +5,10 @@ use axum::{
     response::{IntoResponse, Response},
     Json,
 };
+use axum::{
+    routing::{get, post},
+    Router,
+};
 use base64::{engine::general_purpose, Engine as _};
 use diesel::prelude::*;
 use imap::types::Seq;
```
```diff
@@ -15,6 +19,28 @@ use serde::{Deserialize, Serialize};
 use std::sync::Arc;
 use uuid::Uuid;

+pub mod vectordb;
+
+// ===== Router Configuration =====
+
+/// Configure email API routes
+pub fn configure() -> Router<Arc<AppState>> {
+    Router::new()
+        .route("/api/email/accounts", get(list_email_accounts))
+        .route("/api/email/accounts/add", post(add_email_account))
+        .route(
+            "/api/email/accounts/:account_id",
+            axum::routing::delete(delete_email_account),
+        )
+        .route("/api/email/list", post(list_emails))
+        .route("/api/email/send", post(send_email))
+        .route("/api/email/draft", post(save_draft))
+        .route("/api/email/folders/:account_id", get(list_folders))
+        .route("/api/email/latest", post(get_latest_email_from))
+        .route("/api/email/get/:campaign_id", get(get_emails))
+        .route("/api/email/click/:campaign_id/:email", get(save_click))
+}
+
+// Export SaveDraftRequest for other modules
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct SaveDraftRequest {
```
```diff
@@ -69,7 +95,17 @@ pub struct EmailResponse {
     pub has_attachments: bool,
 }

-#[derive(Debug, Deserialize)]
+#[derive(Debug, Serialize, Deserialize)]
+pub struct EmailRequest {
+    pub to: String,
+    pub subject: String,
+    pub body: String,
+    pub cc: Option<String>,
+    pub bcc: Option<String>,
+    pub attachments: Option<Vec<String>>,
+}
+
+#[derive(Debug, Serialize, Deserialize)]
 pub struct SendEmailRequest {
     pub account_id: String,
     pub to: String,
```
```diff
@@ -389,12 +425,11 @@ pub async fn list_emails(
         .build()
         .map_err(|e| EmailError(format!("Failed to create TLS connector: {:?}", e)))?;

-    let client = imap::connect(
-        (imap_server.as_str(), imap_port as u16),
-        imap_server.as_str(),
-        &tls,
-    )
-    .map_err(|e| EmailError(format!("Failed to connect to IMAP: {:?}", e)))?;
+    let client = imap::ClientBuilder::new(imap_server.as_str(), imap_port as u16)
+        .native_tls(&tls)
+        .map_err(|e| EmailError(format!("Failed to create IMAP client: {:?}", e)))?
+        .connect()
+        .map_err(|e| EmailError(format!("Failed to connect to IMAP: {:?}", e)))?;

     let mut session = client
         .login(&username, &password)
```
```diff
@@ -669,12 +704,11 @@ pub async fn list_folders(
         .build()
         .map_err(|e| EmailError(format!("TLS error: {:?}", e)))?;

-    let client = imap::connect(
-        (imap_server.as_str(), imap_port as u16),
-        imap_server.as_str(),
-        &tls,
-    )
-    .map_err(|e| EmailError(format!("IMAP connection error: {:?}", e)))?;
+    let client = imap::ClientBuilder::new(imap_server.as_str(), imap_port as u16)
+        .native_tls(&tls)
+        .map_err(|e| EmailError(format!("Failed to create IMAP client: {:?}", e)))?
+        .connect()
+        .map_err(|e| EmailError(format!("Failed to connect to IMAP: {:?}", e)))?;

     let mut session = client
         .login(&username, &password)
```
```diff
@@ -810,9 +844,45 @@ impl EmailService {

 // Helper functions for draft system
 pub async fn fetch_latest_sent_to(config: &EmailConfig, to: &str) -> Result<String, String> {
-    // This would fetch the latest email sent to the recipient
-    // For threading/reply purposes
-    // For now, return empty string
+    use native_tls::TlsConnector;
+
+    let tls = TlsConnector::builder()
+        .build()
+        .map_err(|e| format!("TLS error: {}", e))?;
+
+    let client = imap::ClientBuilder::new(&config.server, config.port as u16)
+        .native_tls(&tls)
+        .map_err(|e| format!("IMAP client error: {}", e))?
+        .connect()
+        .map_err(|e| format!("Connection error: {}", e))?;
+
+    let mut session = client
+        .login(&config.username, &config.password)
+        .map_err(|e| format!("Login failed: {:?}", e))?;
+
+    session
+        .select("INBOX")
+        .map_err(|e| format!("Select INBOX failed: {}", e))?;
+
+    // Search for emails to this recipient
+    let search_query = format!("TO \"{}\"", to);
+    let message_ids = session
+        .search(&search_query)
+        .map_err(|e| format!("Search failed: {}", e))?;
+
+    if let Some(last_id) = message_ids.last() {
+        let messages = session
+            .fetch(last_id.to_string(), "BODY[TEXT]")
+            .map_err(|e| format!("Fetch failed: {}", e))?;
+
+        if let Some(message) = messages.iter().next() {
+            if let Some(body) = message.text() {
+                return Ok(String::from_utf8_lossy(body).to_string());
+            }
+        }
+    }
+
+    session.logout().ok();
     Ok(String::new())
 }
```
```diff
@@ -820,8 +890,59 @@ pub async fn save_email_draft(
     config: &EmailConfig,
     draft: &SaveDraftRequest,
 ) -> Result<(), String> {
-    // This would save the draft to the email server or local storage
-    // For now, just log and return success
-    info!("Saving draft to: {}, subject: {}", draft.to, draft.subject);
+    use chrono::Utc;
+    use native_tls::TlsConnector;
+
+    let tls = TlsConnector::builder()
+        .build()
+        .map_err(|e| format!("TLS error: {}", e))?;
+
+    let client = imap::ClientBuilder::new(&config.server, config.port as u16)
+        .native_tls(&tls)
+        .map_err(|e| format!("IMAP client error: {}", e))?
+        .connect()
+        .map_err(|e| format!("Connection error: {}", e))?;
+
+    let mut session = client
+        .login(&config.username, &config.password)
+        .map_err(|e| format!("Login failed: {:?}", e))?;
+
+    // Create draft email in RFC822 format
+    let date = Utc::now().to_rfc2822();
+    let message_id = format!("<{}.{}@botserver>", Uuid::new_v4(), config.server);
+    let cc_header = if let Some(cc) = &draft.cc {
+        format!("Cc: {}\r\n", cc)
+    } else {
+        String::new()
+    };
+
+    let email_content = format!(
+        "Date: {}\r\n\
+         From: {}\r\n\
+         To: {}\r\n\
+         {}\
+         Subject: {}\r\n\
+         Message-ID: {}\r\n\
+         Content-Type: text/html; charset=UTF-8\r\n\
+         \r\n\
+         {}",
+        date, config.from, draft.to, cc_header, draft.subject, message_id, draft.text
+    );
+
+    // Try to save to Drafts folder, fall back to INBOX if not available
+    let folder = session
+        .list(None, Some("Drafts"))
+        .map_err(|e| format!("List folders failed: {}", e))?
+        .iter()
+        .find(|name| name.name().to_lowercase().contains("draft"))
+        .map(|n| n.name().to_string())
+        .unwrap_or_else(|| "INBOX".to_string());
+
+    session
+        .append(&folder, email_content.as_bytes())
+        .map_err(|e| format!("Append draft failed: {}", e))?;
+
+    session.logout().ok();
+    info!("Draft saved to: {}, subject: {}", draft.to, draft.subject);
     Ok(())
 }
```
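The draft above is appended to the mailbox as a raw RFC 2822-style message. A simplified, dependency-free sketch of just the header assembly (Date and Message-ID omitted; addresses are assumed examples, and `build_draft` is a hypothetical helper, not a function in the commit):

```rust
/// Assemble a minimal draft body with CRLF-terminated headers; the
/// optional Cc header is injected between To and Subject, as in the diff.
fn build_draft(from: &str, to: &str, cc: Option<&str>, subject: &str, body: &str) -> String {
    let cc_header = match cc {
        Some(cc) => format!("Cc: {}\r\n", cc),
        None => String::new(),
    };
    format!(
        "From: {}\r\nTo: {}\r\n{}Subject: {}\r\nContent-Type: text/html; charset=UTF-8\r\n\r\n{}",
        from, to, cc_header, subject, body
    )
}

fn main() {
    // Hypothetical addresses for illustration only.
    let draft = build_draft("bot@example.com", "user@example.com", None, "Hello", "<p>Hi</p>");
    println!("{}", draft);
}
```
The blank line (`\r\n\r\n`) separates headers from the body, which is what lets an IMAP `APPEND` of this string show up as a well-formed draft.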
19 src/lib.rs
```diff
@@ -1,3 +1,4 @@
+pub mod auth;
 pub mod automation;
 pub mod basic;
 pub mod bootstrap;
```
```diff
@@ -5,6 +6,7 @@ pub mod bot;
 pub mod channels;
 pub mod config;
 pub mod context;
 pub mod drive;
 pub mod drive_monitor;
+#[cfg(feature = "email")]
 pub mod email;
```
```diff
@@ -12,10 +14,23 @@ pub mod file;
 pub mod llm;
 pub mod llm_models;
 pub mod meet;
+pub mod nvidia;
 pub mod package_manager;
 pub mod session;
 pub mod shared;
 pub mod tests;
 pub mod ui_tree;
 pub mod web_server;
-pub mod auth;
-pub mod nvidia;
+
+// Bootstrap progress enum used by UI
+#[derive(Debug, Clone)]
+pub enum BootstrapProgress {
+    StartingBootstrap,
+    InstallingComponent(String),
+    StartingComponent(String),
+    UploadingTemplates,
+    ConnectingDatabase,
+    StartingLLM,
+    BootstrapComplete,
+    BootstrapError(String),
+}
```
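The new `BootstrapProgress` enum carries bootstrap state to the UI. A minimal sketch of how a consumer might render it (the `status_line` helper is hypothetical and not part of the commit):

```rust
// Enum reproduced from the diff above.
#[derive(Debug, Clone)]
pub enum BootstrapProgress {
    StartingBootstrap,
    InstallingComponent(String),
    StartingComponent(String),
    UploadingTemplates,
    ConnectingDatabase,
    StartingLLM,
    BootstrapComplete,
    BootstrapError(String),
}

/// Map a progress event to a short status line (hypothetical UI helper).
fn status_line(p: &BootstrapProgress) -> String {
    match p {
        BootstrapProgress::StartingBootstrap => "Starting bootstrap...".into(),
        BootstrapProgress::InstallingComponent(c) => format!("Installing {}...", c),
        BootstrapProgress::StartingComponent(c) => format!("Starting {}...", c),
        BootstrapProgress::UploadingTemplates => "Uploading templates...".into(),
        BootstrapProgress::ConnectingDatabase => "Connecting to database...".into(),
        BootstrapProgress::StartingLLM => "Starting LLM...".into(),
        BootstrapProgress::BootstrapComplete => "Bootstrap complete".into(),
        BootstrapProgress::BootstrapError(e) => format!("Bootstrap failed: {}", e),
    }
}

fn main() {
    println!("{}", status_line(&BootstrapProgress::InstallingComponent("minio".into())));
}
```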
```diff
@@ -11,7 +11,7 @@ pub trait LLMProvider: Send + Sync {
         prompt: &str,
         config: &Value,
         model: &str,
-        key: &str
+        key: &str,
     ) -> Result<String, Box<dyn std::error::Error + Send + Sync>>;
     async fn generate_stream(
         &self,
```
```diff
@@ -19,17 +19,17 @@ pub trait LLMProvider: Send + Sync {
         config: &Value,
         tx: mpsc::Sender<String>,
         model: &str,
-        key: &str
+        key: &str,
     ) -> Result<(), Box<dyn std::error::Error + Send + Sync>>;
     async fn cancel_job(
         &self,
         session_id: &str,
     ) -> Result<(), Box<dyn std::error::Error + Send + Sync>>;
 }
+#[derive(Debug)]
 pub struct OpenAIClient {
     client: reqwest::Client,
     base_url: String,
-
 }
 #[async_trait]
 impl LLMProvider for OpenAIClient {
```
```diff
@@ -38,11 +38,10 @@ impl LLMProvider for OpenAIClient {
         prompt: &str,
         messages: &Value,
         model: &str,
-        key: &str
+        key: &str,
     ) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
         let default_messages = serde_json::json!([{"role": "user", "content": prompt}]);
-        let response =
-            self
+        let response = self
             .client
             .post(&format!("{}/v1/chat/completions", self.base_url))
             .header("Authorization", format!("Bearer {}", key))
```
```diff
@@ -74,7 +73,7 @@ impl LLMProvider for OpenAIClient {
         messages: &Value,
         tx: mpsc::Sender<String>,
         model: &str,
-        key: &str
+        key: &str,
     ) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
         let default_messages = serde_json::json!([{"role": "user", "content": prompt}]);
         let response = self
```
```diff
@@ -129,11 +128,15 @@ impl OpenAIClient {
     pub fn new(_api_key: String, base_url: Option<String>) -> Self {
         Self {
             client: reqwest::Client::new(),
-            base_url: base_url.unwrap()
+            base_url: base_url.unwrap(),
         }
     }

-    pub fn build_messages(system_prompt: &str, context_data: &str, history: &[(String, String)]) -> Value {
+    pub fn build_messages(
+        system_prompt: &str,
+        context_data: &str,
+        history: &[(String, String)],
+    ) -> Value {
         let mut messages = Vec::new();
         if !system_prompt.is_empty() {
             messages.push(serde_json::json!({
```
```diff
@@ -1,5 +1,6 @@
 use super::ModelHandler;
+use regex;
+#[derive(Debug)]
 pub struct DeepseekR3Handler;
 impl ModelHandler for DeepseekR3Handler {
     fn is_analysis_complete(&self, buffer: &str) -> bool {
```
```diff
@@ -1,10 +1,9 @@
 use super::ModelHandler;
-pub struct GptOss120bHandler {
-}
+#[derive(Debug)]
+pub struct GptOss120bHandler {}
 impl GptOss120bHandler {
     pub fn new() -> Self {
-        Self {
-        }
+        Self {}
     }
 }
 impl ModelHandler for GptOss120bHandler {
```
```diff
@@ -12,8 +11,7 @@ impl ModelHandler for GptOss120bHandler {
         buffer.contains("**end**")
     }
     fn process_content(&self, content: &str) -> String {
-        content.replace("**start**", "")
-            .replace("**end**", "")
+        content.replace("**start**", "").replace("**end**", "")
     }
     fn has_analysis_markers(&self, buffer: &str) -> bool {
         buffer.contains("**start**")
```
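The handler strips the `**start**`/`**end**` analysis markers from streamed model output. The same logic in isolation (free functions standing in for the trait methods):

```rust
/// Remove both analysis markers, as the handler's process_content does.
fn process_content(content: &str) -> String {
    content.replace("**start**", "").replace("**end**", "")
}

/// True when the buffer contains the opening marker.
fn has_analysis_markers(buffer: &str) -> bool {
    buffer.contains("**start**")
}

fn main() {
    println!("{}", process_content("**start**thinking**end** answer"));
}
```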
```diff
@@ -1,4 +1,5 @@
 use super::ModelHandler;
+#[derive(Debug)]
 pub struct GptOss20bHandler;
 impl ModelHandler for GptOss20bHandler {
     fn is_analysis_complete(&self, buffer: &str) -> bool {
```
38 src/main.rs
```diff
@@ -20,6 +20,7 @@ mod bot;
 mod channels;
 mod config;
 mod context;
 mod drive;
 mod drive_monitor;
+#[cfg(feature = "email")]
 mod email;
```
```diff
@@ -153,21 +154,13 @@ async fn run_axum_server(
     );

-    // Add email routes if feature is enabled
-    #[cfg(feature = "email")]
+    // Merge drive, email, and meet module routes
     let api_router = api_router
-        .route("/api/email/accounts", get(list_email_accounts))
-        .route("/api/email/accounts/add", post(add_email_account))
-        .route(
-            "/api/email/accounts/{account_id}",
-            axum::routing::delete(delete_email_account),
-        )
-        .route("/api/email/list", post(list_emails))
-        .route("/api/email/send", post(send_email))
-        .route("/api/email/draft", post(save_draft))
-        .route("/api/email/folders/{account_id}", get(list_folders))
-        .route("/api/email/latest", post(get_latest_email_from))
-        .route("/api/email/get/{campaign_id}", get(get_emails))
-        .route("/api/email/click/{campaign_id}/{email}", get(save_click));
+        .merge(crate::drive::configure())
+        .merge(crate::meet::configure());
+
+    #[cfg(feature = "email")]
+    let api_router = api_router.merge(crate::email::configure());

     // Build static file serving
     let static_path = std::path::Path::new("./web/desktop");
```
```diff
@@ -410,7 +403,22 @@ async fn main() -> std::io::Result<()> {
         redis_client.clone(),
     )));

-    let auth_service = Arc::new(tokio::sync::Mutex::new(auth::AuthService::new()));
+    // Create default Zitadel config (can be overridden with env vars)
+    let zitadel_config = auth::zitadel::ZitadelConfig {
+        issuer_url: std::env::var("ZITADEL_ISSUER_URL")
+            .unwrap_or_else(|_| "http://localhost:8080".to_string()),
+        issuer: std::env::var("ZITADEL_ISSUER")
+            .unwrap_or_else(|_| "http://localhost:8080".to_string()),
+        client_id: std::env::var("ZITADEL_CLIENT_ID").unwrap_or_else(|_| "default".to_string()),
+        client_secret: std::env::var("ZITADEL_CLIENT_SECRET")
+            .unwrap_or_else(|_| "secret".to_string()),
+        redirect_uri: std::env::var("ZITADEL_REDIRECT_URI")
+            .unwrap_or_else(|_| "http://localhost:8080/callback".to_string()),
+        project_id: std::env::var("ZITADEL_PROJECT_ID").unwrap_or_else(|_| "default".to_string()),
+    };
+    let auth_service = Arc::new(tokio::sync::Mutex::new(auth::AuthService::new(
+        zitadel_config,
+    )));
     let config_manager = ConfigManager::new(pool.clone());

     let mut bot_conn = pool.get().expect("Failed to get database connection");
```
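Every Zitadel field above uses the same `env::var(...).unwrap_or_else(...)` fallback. The fallback logic is pure once the lookup result is in hand, which makes it easy to exercise on its own (the `value_or_default` helper is hypothetical, not part of the commit):

```rust
use std::env;

/// Apply the env-var fallback used for each ZitadelConfig field:
/// keep the value when the lookup succeeded, else use the default.
fn value_or_default(read: Result<String, env::VarError>, default: &str) -> String {
    read.unwrap_or_else(|_| default.to_string())
}

fn main() {
    // "ZITADEL_ISSUER_URL" mirrors the field name in the diff above.
    let issuer = value_or_default(env::var("ZITADEL_ISSUER_URL"), "http://localhost:8080");
    println!("{}", issuer);
}
```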
```diff
@@ -2,17 +2,69 @@ use axum::{
     extract::{Path, State},
     http::StatusCode,
     response::{IntoResponse, Json},
+    routing::{get, post},
+    Router,
 };
 use log::{error, info};
-use serde::Deserialize;
+use serde::{Deserialize, Serialize};
 use serde_json::Value;
 use std::sync::Arc;
 use uuid::Uuid;

 use crate::shared::state::AppState;

 pub mod service;
 use service::{DefaultTranscriptionService, MeetingService};

+// ===== Router Configuration =====
+
+/// Configure meet API routes
+pub fn configure() -> Router<Arc<AppState>> {
+    Router::new()
+        .route("/api/voice/start", post(voice_start))
+        .route("/api/voice/stop", post(voice_stop))
+        .route("/api/meet/create", post(create_meeting))
+        .route("/api/meet/rooms", get(list_rooms))
+        .route("/api/meet/rooms/:room_id", get(get_room))
+        .route("/api/meet/rooms/:room_id/join", post(join_room))
+        .route(
+            "/api/meet/rooms/:room_id/transcription/start",
+            post(start_transcription),
+        )
+        .route("/api/meet/token", post(get_meeting_token))
+        .route("/api/meet/invite", post(send_meeting_invites))
+        .route("/ws/meet", get(meeting_websocket))
+}
+
+// ===== Request/Response Structures =====
+
+#[derive(Debug, Deserialize)]
+pub struct CreateMeetingRequest {
+    pub name: String,
+    pub created_by: String,
+    pub settings: Option<service::MeetingSettings>,
+}
+
+#[derive(Debug, Deserialize)]
+pub struct JoinRoomRequest {
+    pub participant_name: String,
+    pub participant_id: Option<String>,
+}
+
+#[derive(Debug, Deserialize)]
+pub struct GetTokenRequest {
+    pub room_id: String,
+    pub user_id: String,
+}
+
+#[derive(Debug, Deserialize)]
+pub struct SendInvitesRequest {
+    pub room_id: String,
+    pub emails: Vec<String>,
+}
+
+// ===== Voice/Meet Handlers =====
+
 pub async fn voice_start(
     State(data): State<Arc<AppState>>,
     Json(info): Json<Value>,
```
```diff
@@ -252,29 +304,3 @@ async fn handle_meeting_socket(_socket: axum::extract::ws::WebSocket, _state: Ar
     // Handle WebSocket messages for real-time meeting communication
     // This would integrate with WebRTC signaling
 }
-
-// Request/Response DTOs
-#[derive(Debug, Deserialize)]
-pub struct CreateMeetingRequest {
-    pub name: String,
-    pub created_by: String,
-    pub settings: Option<service::MeetingSettings>,
-}
-
-#[derive(Debug, Deserialize)]
-pub struct JoinRoomRequest {
-    pub participant_name: String,
-    pub participant_id: Option<String>,
-}
-
-#[derive(Debug, Deserialize)]
-pub struct GetTokenRequest {
-    pub room_id: String,
-    pub user_id: String,
-}
-
-#[derive(Debug, Deserialize)]
-pub struct SendInvitesRequest {
-    pub room_id: String,
-    pub emails: Vec<String>,
-}
```
```diff
@@ -192,7 +192,21 @@ pub struct MeetingService {
     pub state: Arc<AppState>,
     pub rooms: Arc<RwLock<HashMap<String, MeetingRoom>>>,
     pub connections: Arc<RwLock<HashMap<String, mpsc::Sender<MeetingMessage>>>>,
-    pub transcription_service: Arc<dyn TranscriptionService>,
+    pub transcription_service: Arc<dyn TranscriptionService + Send + Sync>,
 }

+impl std::fmt::Debug for MeetingService {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        f.debug_struct("MeetingService")
+            .field("state", &self.state)
+            .field("rooms", &"Arc<RwLock<HashMap<String, MeetingRoom>>>")
+            .field(
+                "connections",
+                &"Arc<RwLock<HashMap<String, mpsc::Sender<MeetingMessage>>>>",
+            )
+            .field("transcription_service", &"Arc<dyn TranscriptionService>")
+            .finish()
+    }
+}
+
 impl MeetingService {
```
```diff
@@ -498,7 +512,6 @@ impl MeetingService {

 /// Trait for transcription services
 #[async_trait]
-#[allow(dead_code)]
 pub trait TranscriptionService: Send + Sync {
     async fn start_transcription(&self, room_id: &str) -> Result<()>;
     async fn stop_transcription(&self, room_id: &str) -> Result<()>;
@@ -506,6 +519,7 @@ pub trait TranscriptionService: Send + Sync {
 }

 /// Default transcription service implementation
+#[derive(Debug)]
 pub struct DefaultTranscriptionService;

 #[async_trait]
```
```diff
@@ -1,7 +1,8 @@
 use anyhow::Result;
 use std::collections::HashMap;
-use sysinfo::{System};
-#[derive(Default)]
+use sysinfo::System;
+/// System performance metrics
+#[derive(Debug, Default)]
 pub struct SystemMetrics {
     pub gpu_usage: Option<f32>,
     pub cpu_usage: f32,
@@ -27,9 +28,7 @@ pub fn has_nvidia_gpu() -> bool {
         .output()
     {
         Ok(output) => output.status.success(),
-        Err(_) => {
-            false
-        }
+        Err(_) => false,
     }
 }
 pub fn get_gpu_utilization() -> Result<HashMap<String, f32>> {
```
```diff
@@ -6,6 +6,7 @@ use log::trace;
 use std::collections::HashMap;
 use std::path::PathBuf;

+#[derive(Debug)]
 pub struct PackageManager {
     pub mode: InstallMode,
     pub os_type: OsType,
@@ -58,8 +59,6 @@ impl PackageManager {
     }

     fn register_drive(&mut self) {
-
-
         self.components.insert(
             "drive".to_string(),
             ComponentConfig {
```
```diff
@@ -88,14 +87,9 @@ impl PackageManager {
                 check_cmd: "ps -ef | grep minio | grep -v grep | grep {{BIN_PATH}}".to_string(),
             },
         );
-
-
     }
-
-
     fn register_tables(&mut self) {
-
-
         self.components.insert(
             "tables".to_string(),
             ComponentConfig {
```
```diff
@@ -674,7 +668,10 @@ impl PackageManager {

         if check_status.is_ok() && check_status.unwrap().success() {
             trace!("Component {} is already running", component.name);
-            return Ok(std::process::Command::new("sh").arg("-c").arg("true").spawn()?);
+            return Ok(std::process::Command::new("sh")
+                .arg("-c")
+                .arg("true")
+                .spawn()?);
         }

         // If not running, execute the main command
```
```diff
@@ -733,5 +730,4 @@ impl PackageManager {
         Err(anyhow::anyhow!("Component {} not found", component))
     }
-    }
-
+}
```
```diff
@@ -16,6 +16,7 @@ pub enum OsType {
     MacOS,
     Windows,
 }
+#[derive(Debug)]
 pub struct ComponentInfo {
     pub name: &'static str,
     pub termination_command: &'static str,
```
```diff
@@ -8,6 +8,7 @@ use tokio::fs;
 use tokio::time::sleep;

 /// Directory (Zitadel) auto-setup manager
+#[derive(Debug)]
 pub struct DirectorySetup {
     base_url: String,
     client: Client,
```
```diff
@@ -15,6 +16,23 @@ pub struct DirectorySetup {
     config_path: PathBuf,
 }

+impl DirectorySetup {
+    /// Set the admin token
+    pub fn set_admin_token(&mut self, token: String) {
+        self.admin_token = Some(token);
+    }
+
+    /// Get or initialize admin token
+    pub async fn ensure_admin_token(&mut self) -> Result<()> {
+        if self.admin_token.is_none() {
+            let token = std::env::var("DIRECTORY_ADMIN_TOKEN")
+                .unwrap_or_else(|_| "zitadel-admin-sa".to_string());
+            self.admin_token = Some(token);
+        }
+        Ok(())
+    }
+}
+
 #[derive(Debug, Serialize, Deserialize)]
 pub struct DefaultOrganization {
     pub id: String,
```
```diff
@@ -99,7 +117,7 @@ impl DirectorySetup {
         self.wait_for_ready(30).await?;

         // Get initial admin token (from Zitadel setup)
-        self.get_initial_admin_token().await?;
+        self.ensure_admin_token().await?;

         // Create default organization
         let org = self.create_default_organization().await?;
```
```diff
@@ -128,7 +146,7 @@ impl DirectorySetup {
         };

         // Save configuration
-        self.save_config(&config).await?;
+        self.save_config_internal(&config).await?;
         log::info!("✅ Saved Directory configuration");

         log::info!("🎉 Directory initialization complete!");
```
```diff
@@ -142,15 +160,29 @@ impl DirectorySetup {
         Ok(config)
     }

-    /// Get initial admin token from Zitadel
-    async fn get_initial_admin_token(&mut self) -> Result<()> {
-        // In Zitadel, the initial setup creates a service account
-        // For now, use environment variable or default token
-        let token = std::env::var("DIRECTORY_ADMIN_TOKEN")
-            .unwrap_or_else(|_| "zitadel-admin-sa".to_string());
-
-        self.admin_token = Some(token);
-        Ok(())
+    /// Create an organization
+    pub async fn create_organization(&mut self, name: &str, description: &str) -> Result<String> {
+        // Ensure we have admin token
+        self.ensure_admin_token().await?;
+
+        let response = self
+            .client
+            .post(format!("{}/management/v1/orgs", self.base_url))
+            .bearer_auth(self.admin_token.as_ref().unwrap())
+            .json(&json!({
+                "name": name,
+                "description": description,
+            }))
+            .send()
+            .await?;
+
+        if !response.status().is_success() {
+            let error_text = response.text().await?;
+            anyhow::bail!("Failed to create organization: {}", error_text);
+        }
+
+        let result: serde_json::Value = response.json().await?;
+        Ok(result["id"].as_str().unwrap_or("").to_string())
     }

     /// Create default organization
```
```diff
@@ -182,6 +214,67 @@ impl DirectorySetup {
         })
     }

+    /// Create a user in an organization
+    pub async fn create_user(
+        &mut self,
+        org_id: &str,
+        username: &str,
+        email: &str,
+        password: &str,
+        first_name: &str,
+        last_name: &str,
+        is_admin: bool,
+    ) -> Result<DefaultUser> {
+        // Ensure we have admin token
+        self.ensure_admin_token().await?;
+
+        let response = self
+            .client
+            .post(format!("{}/management/v1/users/human", self.base_url))
+            .bearer_auth(self.admin_token.as_ref().unwrap())
+            .json(&json!({
+                "userName": username,
+                "profile": {
+                    "firstName": first_name,
+                    "lastName": last_name,
+                    "displayName": format!("{} {}", first_name, last_name)
+                },
+                "email": {
+                    "email": email,
+                    "isEmailVerified": true
+                },
+                "password": password,
+                "organisation": {
+                    "orgId": org_id
+                }
+            }))
+            .send()
+            .await?;
+
+        if !response.status().is_success() {
+            let error_text = response.text().await?;
+            anyhow::bail!("Failed to create user: {}", error_text);
+        }
+
+        let result: serde_json::Value = response.json().await?;
+
+        let user = DefaultUser {
+            id: result["userId"].as_str().unwrap_or("").to_string(),
+            username: username.to_string(),
+            email: email.to_string(),
+            password: password.to_string(),
+            first_name: first_name.to_string(),
+            last_name: last_name.to_string(),
+        };
+
+        // Grant admin permissions if requested
+        if is_admin {
+            self.grant_user_permissions(org_id, &user.id).await?;
+        }
+
+        Ok(user)
+    }
+
     /// Create default user in organization
     async fn create_default_user(&self, org_id: &str) -> Result<DefaultUser> {
         let username =
```
```diff
@@ -207,6 +300,9 @@ impl DirectorySetup {
                     "isEmailVerified": true
                 },
                 "password": password,
+                "organisation": {
+                    "orgId": org_id
+                }
             }))
             .send()
             .await?;
```
```diff
@@ -229,7 +325,10 @@ impl DirectorySetup {
     }

     /// Create OAuth2 application for BotServer
-    async fn create_oauth_application(&self, org_id: &str) -> Result<(String, String, String)> {
+    pub async fn create_oauth_application(
+        &self,
+        _org_id: &str,
+    ) -> Result<(String, String, String)> {
         let app_name = "BotServer";
         let redirect_uri = std::env::var("DIRECTORY_REDIRECT_URI")
             .unwrap_or_else(|_| "http://localhost:8080/auth/callback".to_string());
```
```diff
@@ -275,7 +374,7 @@ impl DirectorySetup {
     }

     /// Grant admin permissions to user
-    async fn grant_user_permissions(&self, org_id: &str, user_id: &str) -> Result<()> {
+    pub async fn grant_user_permissions(&self, org_id: &str, user_id: &str) -> Result<()> {
         // Grant ORG_OWNER role
         let _response = self
             .client
```
```diff
@@ -295,7 +394,41 @@ impl DirectorySetup {
     }

     /// Save configuration to file
-    async fn save_config(&self, config: &DirectoryConfig) -> Result<()> {
+    pub async fn save_config(
+        &mut self,
+        org_id: String,
+        org_name: String,
+        admin_user: DefaultUser,
+        client_id: String,
+        client_secret: String,
+    ) -> Result<DirectoryConfig> {
+        // Get or create admin token
+        self.ensure_admin_token().await?;
+
+        let config = DirectoryConfig {
+            base_url: self.base_url.clone(),
+            default_org: DefaultOrganization {
+                id: org_id,
+                name: org_name.clone(),
+                domain: format!("{}.localhost", org_name.to_lowercase()),
+            },
+            default_user: admin_user,
+            admin_token: self.admin_token.clone().unwrap_or_default(),
+            project_id: String::new(), // This will be set if OAuth app is created
+            client_id,
+            client_secret,
+        };
+
+        // Save to file
+        let json = serde_json::to_string_pretty(&config)?;
+        fs::write(&self.config_path, json).await?;
+
+        log::info!("Saved Directory configuration to {:?}", self.config_path);
+        Ok(config)
+    }
+
+    /// Internal save configuration to file
+    async fn save_config_internal(&self, config: &DirectoryConfig) -> Result<()> {
         let json = serde_json::to_string_pretty(config)?;
         fs::write(&self.config_path, json).await?;
         Ok(())
```
```diff
@@ -315,7 +448,7 @@ impl DirectorySetup {
     }

     /// Generate Zitadel configuration file
-    pub async fn generate_directory_config(config_path: PathBuf, db_path: PathBuf) -> Result<()> {
+    pub async fn generate_directory_config(config_path: PathBuf, _db_path: PathBuf) -> Result<()> {
         let yaml_config = format!(
             r#"
 Log:
```
```diff
@@ -1,5 +1,4 @@
 use anyhow::Result;
-use reqwest::Client;
 use serde::{Deserialize, Serialize};
 use std::path::PathBuf;
 use std::time::Duration;
```
```diff
@@ -7,11 +6,11 @@ use tokio::fs;
 use tokio::time::sleep;

 /// Email (Stalwart) auto-setup manager
+#[derive(Debug)]
 pub struct EmailSetup {
     base_url: String,
     admin_user: String,
     admin_pass: String,
-    client: Client,
     config_path: PathBuf,
 }
```
```diff
@@ -44,10 +43,6 @@ impl EmailSetup {
             base_url,
             admin_user,
             admin_pass,
-            client: Client::builder()
-                .timeout(Duration::from_secs(30))
-                .build()
-                .unwrap(),
             config_path,
         }
     }
```
```diff
@@ -75,7 +70,10 @@ impl EmailSetup {
     }

     /// Initialize email server with default configuration
-    pub async fn initialize(&mut self, directory_config_path: Option<PathBuf>) -> Result<EmailConfig> {
+    pub async fn initialize(
+        &mut self,
+        directory_config_path: Option<PathBuf>,
+    ) -> Result<EmailConfig> {
         log::info!("🔧 Initializing Email (Stalwart) server...");

         // Check if already initialized
```
@@ -154,7 +152,9 @@ impl EmailSetup {
    let content = fs::read_to_string(directory_config_path).await?;
    let dir_config: serde_json::Value = serde_json::from_str(&content)?;

-    let issuer_url = dir_config["base_url"].as_str().unwrap_or("http://localhost:8080");
+    let issuer_url = dir_config["base_url"]
+        .as_str()
+        .unwrap_or("http://localhost:8080");

    log::info!("Setting up OIDC authentication with Directory...");
    log::info!("Issuer URL: {}", issuer_url);
@@ -184,7 +184,12 @@ impl EmailSetup {
}

/// Create email account for Directory user
-pub async fn create_user_mailbox(&self, username: &str, password: &str, email: &str) -> Result<()> {
+pub async fn create_user_mailbox(
+    &self,
+    _username: &str,
+    _password: &str,
+    email: &str,
+) -> Result<()> {
    log::info!("Creating mailbox for user: {}", email);

    // Implement Stalwart mailbox creation
@@ -1,7 +1,5 @@
pub mod directory_setup;
pub mod email_setup;

-pub use directory_setup::{
-    generate_directory_config, DefaultOrganization, DefaultUser, DirectoryConfig, DirectorySetup,
-};
-pub use email_setup::{generate_email_config, EmailConfig, EmailDomain, EmailSetup};
+pub use directory_setup::DirectorySetup;
+pub use email_setup::EmailSetup;
@@ -18,7 +18,7 @@ use std::error::Error;
use std::sync::Arc;
use uuid::Uuid;

-#[derive(Clone, Serialize, Deserialize)]
+#[derive(Clone, Serialize, Deserialize, Debug)]
pub struct SessionData {
    pub id: Uuid,
    pub user_id: Option<Uuid>,
@@ -32,6 +32,17 @@ pub struct SessionManager {
    redis: Option<Arc<Client>>,
}

+impl std::fmt::Debug for SessionManager {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        f.debug_struct("SessionManager")
+            .field("conn", &"PooledConnection<PgConnection>")
+            .field("sessions", &self.sessions)
+            .field("waiting_for_input", &self.waiting_for_input)
+            .field("redis", &self.redis.is_some())
+            .finish()
+    }
+}
+
impl SessionManager {
    pub fn new(
        conn: PooledConnection<ConnectionManager<PgConnection>>,
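The hand-written `Debug` impl added for `SessionManager` above follows the standard `debug_struct` pattern for types whose fields (here, the pooled database connection) do not implement `Debug` themselves: non-printable fields are summarized with placeholder strings. A minimal stdlib-only sketch of the same technique (all names here are illustrative, not from the codebase):

```rust
use std::fmt;

// Stand-in for a connection handle that does not implement Debug.
struct RawConn;

struct Manager {
    conn: RawConn,
    name: String,
}

// Manual Debug: the non-Debug field is summarized with a placeholder string,
// while printable fields are forwarded as usual.
impl fmt::Debug for Manager {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.debug_struct("Manager")
            .field("conn", &"RawConn")
            .field("name", &self.name)
            .finish()
    }
}
```

This keeps `{:?}` output available for logging without requiring `Debug` bounds on every field.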
@@ -175,7 +186,8 @@ impl SessionManager {

fn _clear_messages(&mut self, _session_id: Uuid) -> Result<(), Box<dyn Error + Send + Sync>> {
    use crate::shared::models::message_history::dsl::*;
-    diesel::delete(message_history.filter(session_id.eq(session_id))).execute(&mut self.conn)?;
+    diesel::delete(message_history.filter(session_id.eq(session_id)))
+        .execute(&mut self.conn)?;
    Ok(())
}
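One thing the reformat above does not change: inside `_clear_messages`, after the glob import of the dsl module, `session_id.eq(session_id)` appears to compare the `session_id` column with itself (the function's parameter is named `_session_id` and is never used), which would match every row. A minimal stdlib-only sketch of that shadowing behavior, with a stand-in `dsl` module rather than Diesel itself:

```rust
#[allow(non_upper_case_globals)]
mod dsl {
    // Stand-in for a Diesel-style dsl module exporting a column named `session_id`.
    pub const session_id: u32 = 0;
}

// After the glob import, `session_id` resolves to the dsl item, not to any
// function argument (the parameter is deliberately `_session_id`, as in the
// diff), so the comparison below is column-vs-itself and is always true.
fn filter_matches(_session_id: u32) -> bool {
    use self::dsl::*;
    session_id == session_id
}
```

If the intent was to delete only one session's messages, the fix would be to rename the parameter and compare the column against it, e.g. `session_id.eq(target_id)`.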
@@ -343,9 +355,7 @@ impl SessionManager {
/* Axum handlers */

/// Create a new session (anonymous user)
-pub async fn create_session(
-    Extension(state): Extension<Arc<AppState>>,
-) -> impl IntoResponse {
+pub async fn create_session(Extension(state): Extension<Arc<AppState>>) -> impl IntoResponse {
    // Using a fixed anonymous user ID for simplicity
    let user_id = Uuid::parse_str("00000000-0000-0000-0000-000000000001").unwrap();
    let bot_id = Uuid::nil();
@@ -374,9 +384,7 @@ pub async fn create_session(
}

/// Get list of sessions for the anonymous user
-pub async fn get_sessions(
-    Extension(state): Extension<Arc<AppState>>,
-) -> impl IntoResponse {
+pub async fn get_sessions(Extension(state): Extension<Arc<AppState>>) -> impl IntoResponse {
    let user_id = Uuid::parse_str("00000000-0000-0000-0000-000000000001").unwrap();
    let orchestrator = BotOrchestrator::new(state.clone());
    match orchestrator.get_user_sessions(user_id).await {
@@ -1,44 +1,64 @@
+use crate::auth::AuthService;
use crate::channels::{ChannelAdapter, VoiceAdapter, WebChannelAdapter};
use crate::config::AppConfig;
use crate::llm::LLMProvider;
use crate::session::SessionManager;
+use crate::shared::models::BotResponse;
+use crate::shared::utils::DbPool;
use aws_sdk_s3::Client as S3Client;
use redis::Client as RedisClient;
use std::collections::HashMap;
use std::sync::Arc;
use tokio::sync::mpsc;
-use crate::shared::models::BotResponse;
-use crate::auth::AuthService;
-use crate::shared::utils::DbPool;

pub struct AppState {
    pub drive: Option<S3Client>,
    pub cache: Option<Arc<RedisClient>>,
    pub bucket_name: String,
    pub config: Option<AppConfig>,
    pub conn: DbPool,
    pub session_manager: Arc<tokio::sync::Mutex<SessionManager>>,
    pub llm_provider: Arc<dyn LLMProvider>,
    pub auth_service: Arc<tokio::sync::Mutex<AuthService>>,
    pub channels: Arc<tokio::sync::Mutex<HashMap<String, Arc<dyn ChannelAdapter>>>>,
    pub response_channels: Arc<tokio::sync::Mutex<HashMap<String, mpsc::Sender<BotResponse>>>>,
    pub web_adapter: Arc<WebChannelAdapter>,
    pub voice_adapter: Arc<VoiceAdapter>,
}

impl Clone for AppState {
    fn clone(&self) -> Self {
        Self {
            drive: self.drive.clone(),
            bucket_name: self.bucket_name.clone(),
            config: self.config.clone(),
            conn: self.conn.clone(),
            cache: self.cache.clone(),
            session_manager: Arc::clone(&self.session_manager),
            llm_provider: Arc::clone(&self.llm_provider),
            auth_service: Arc::clone(&self.auth_service),
            channels: Arc::clone(&self.channels),
            response_channels: Arc::clone(&self.response_channels),
            web_adapter: Arc::clone(&self.web_adapter),
            voice_adapter: Arc::clone(&self.voice_adapter),
        }
    }
}

+impl std::fmt::Debug for AppState {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        f.debug_struct("AppState")
+            .field("drive", &self.drive.is_some())
+            .field("cache", &self.cache.is_some())
+            .field("bucket_name", &self.bucket_name)
+            .field("config", &self.config)
+            .field("conn", &"DbPool")
+            .field("session_manager", &"Arc<Mutex<SessionManager>>")
+            .field("llm_provider", &"Arc<dyn LLMProvider>")
+            .field("auth_service", &"Arc<Mutex<AuthService>>")
+            .field("channels", &"Arc<Mutex<HashMap>>")
+            .field("response_channels", &"Arc<Mutex<HashMap>>")
+            .field("web_adapter", &self.web_adapter)
+            .field("voice_adapter", &self.voice_adapter)
+            .finish()
+    }
+}
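The hand-written `Clone` for `AppState` above duplicates only `Arc` handles, so cloning the state is cheap and every clone shares the same underlying session manager, channels, and adapters. A minimal sketch of that pattern (the types here are illustrative stand-ins, not from the codebase):

```rust
use std::sync::Arc;

// Shared-state pattern: the struct owns an Arc, and Clone bumps the
// reference count instead of deep-copying the data.
struct SharedState {
    sessions: Arc<Vec<String>>,
}

impl Clone for SharedState {
    fn clone(&self) -> Self {
        Self {
            // Arc::clone makes the sharing explicit at the call site.
            sessions: Arc::clone(&self.sessions),
        }
    }
}
```

All clones point at the same allocation, which is what lets each Axum handler hold its own `Arc<AppState>` while observing the same sessions and adapters.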
src/ui_tree/mod.rs (1360 changes): file diff suppressed because it is too large.
@@ -1,8 +1,8 @@
use crate::config::ConfigManager;
use crate::nvidia;
+use crate::nvidia::get_system_metrics;
use crate::shared::models::schema::bots::dsl::*;
use crate::shared::state::AppState;
-use botserver::nvidia::get_system_metrics;
use diesel::prelude::*;
use std::sync::Arc;
use sysinfo::System;
@@ -30,7 +30,8 @@ impl StatusPanel {
    let _tokens = (std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)
        .unwrap()
-        .as_secs() % 1000) as usize;
+        .as_secs()
+        % 1000) as usize;
    let _system_metrics = nvidia::get_system_metrics().unwrap_or_default();
    self.cached_content = self.render(None);
    self.last_update = std::time::Instant::now();
@@ -52,7 +53,6 @@ impl StatusPanel {
    lines.push(format!(" CPU: {:5.1}% {}", cpu_usage, cpu_bar));
    let system_metrics = get_system_metrics().unwrap_or_default();

    if let Some(gpu_usage) = system_metrics.gpu_usage {
        let gpu_bar = Self::create_progress_bar(gpu_usage, 20);
        lines.push(format!(" GPU: {:5.1}% {}", gpu_usage, gpu_bar));
@@ -111,15 +111,22 @@ impl StatusPanel {
    } else {
        for (bot_name, bot_id) in bot_list {
            let marker = if let Some(ref selected) = selected_bot {
-                if selected == &bot_name { "►" } else { " " }
-            } else { " " };
+                if selected == &bot_name {
+                    "►"
+                } else {
+                    " "
+                }
+            } else {
+                " "
+            };
            lines.push(format!(" {} 🤖 {}", marker, bot_name));

            if let Some(ref selected) = selected_bot {
                if selected == &bot_name {
                    lines.push("".to_string());
                    lines.push(" ┌─ Bot Configuration ─────────┐".to_string());
-                    let config_manager = ConfigManager::new(self.app_state.conn.clone());
+                    let config_manager =
+                        ConfigManager::new(self.app_state.conn.clone());
                    let llm_model = config_manager
                        .get_config(&bot_id, "llm-model", None)
                        .unwrap_or_else(|_| "N/A".to_string());