b6d3e0a2d5
Update llama.cpp to b7345 with platform-specific builds and checksums
- Update 3rdparty.toml: llama.cpp b4547 -> b7345 with SHA256 checksums
- Add config/llm_releases.json with complete checksums for all 24 release assets (see the sketch after this list)
- Fix Windows binary naming in installer.rs (win-cpu-x64, win-cpu-arm64)
- Add Vulkan detection for Windows
- Add platform-specific variants: CUDA 12/13, Vulkan, HIP, SYCL, OpenCL
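A minimal Rust sketch of how the installer might map a platform/accelerator pair to a release asset name and look up its checksum in a parsed config/llm_releases.json. The `ReleaseAsset` fields, the `Accel` enum, and every name pattern except `win-cpu-x64`/`win-cpu-arm64` (which the commit names) are illustrative assumptions, not the actual installer.rs code.

```rust
// Minimal sketch: pick a platform-specific llama.cpp asset and find its
// checksum entry from config/llm_releases.json. Field names, the Accel enum,
// and all asset-name patterns except win-cpu-x64/win-cpu-arm64 are assumed.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct ReleaseAsset {
    name: String,   // e.g. "llama-b7345-bin-win-cpu-x64.zip" (assumed pattern)
    url: String,
    sha256: String, // hex digest recorded in config/llm_releases.json
}

#[derive(Clone, Copy)]
enum Accel { Cpu, Cuda12, Cuda13, Vulkan, Hip, Sycl, OpenCl }

/// Build the asset suffix for the current platform. The Windows CPU names
/// follow the commit (win-cpu-x64, win-cpu-arm64); the rest are guesses.
fn asset_suffix(os: &str, arch: &str, accel: Accel) -> String {
    let accel = match accel {
        Accel::Cpu => "cpu",
        Accel::Cuda12 => "cuda-12",
        Accel::Cuda13 => "cuda-13",
        Accel::Vulkan => "vulkan",
        Accel::Hip => "hip",
        Accel::Sycl => "sycl",
        Accel::OpenCl => "opencl",
    };
    match os {
        "windows" => format!("win-{accel}-{arch}"),
        "macos" => format!("macos-{arch}"),
        _ => format!("ubuntu-{accel}-{arch}"),
    }
}

/// Find the matching asset (and therefore its expected SHA256) in the manifest.
fn find_asset<'a>(assets: &'a [ReleaseAsset], suffix: &str) -> Option<&'a ReleaseAsset> {
    assets.iter().find(|a| a.name.contains(suffix))
}
```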
2025-12-10 12:54:52 -03:00
f3e38d8d8b
feat(console): Show UI immediately with live system logs
- Add state_channel field to XtreeUI for receiving AppState updates
- Add set_state_channel() method to enable async state communication
- Poll for AppState in event loop to initialize panels when ready
- UI now shows a loading state instantly; logs stream in real time (see the sketch after this list)
- Transitions to full interactive mode when AppState is received
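A minimal sketch of the "show the UI first, hydrate later" flow described above, assuming a plain std::sync::mpsc channel. The `state_channel` field and `set_state_channel()` method are named in the commit, but the `AppState` contents and the event-loop shape are assumptions (the real console may use an async channel).

```rust
// Sketch: the UI keeps rendering (loading screen + live logs) while polling a
// channel for the fully initialized AppState. Types and loop shape are assumed.
use std::sync::mpsc::{Receiver, TryRecvError};

struct AppState { /* fully initialized application data (assumed) */ }

struct XtreeUI {
    state_channel: Option<Receiver<AppState>>,
    state: Option<AppState>,
}

impl XtreeUI {
    /// Wire up the channel the async init task will send AppState through.
    fn set_state_channel(&mut self, rx: Receiver<AppState>) {
        self.state_channel = Some(rx);
    }

    /// Called once per frame/tick: drain the channel without blocking so the
    /// loading screen and streaming logs keep rendering until state arrives.
    fn poll_state(&mut self) {
        if self.state.is_some() {
            return; // already initialized: panels are live
        }
        if let Some(rx) = &self.state_channel {
            match rx.try_recv() {
                Ok(state) => self.state = Some(state), // full interactive mode
                Err(TryRecvError::Empty) => {}         // still loading; keep drawing logs
                Err(TryRecvError::Disconnected) => {}  // sender dropped; stay in loading view
            }
        }
    }
}
```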
2025-12-08 23:35:33 -03:00
89e92a4739
feat: add offline installer cache and health endpoints
- Add /health and /api/health endpoints for botui connectivity
- Create 3rdparty.toml with all download URLs for offline bundles
- Add botserver-installers/ cache directory for downloaded files
- Implement DownloadCache module (see the sketch after this list) with:
  - Automatic cache lookup before downloading
  - Support for pre-populated offline bundles
  - SHA256 checksum verification (optional)
  - Cache management utilities (list, clear, size)
- Update download_and_install to use cache system
- Data files (models) also cached for reuse
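A minimal sketch of the cache-before-download flow, assuming the sha2 crate and an injected fetch closure. `DownloadCache` and the `botserver-installers/` directory come from the commit, but the method signatures here are illustrative, not the real API.

```rust
// Sketch: look in botserver-installers/ first, download only on a miss, and
// optionally verify the SHA256 digest. Signatures and error handling assumed.
use std::fs;
use std::path::{Path, PathBuf};

use sha2::{Digest, Sha256};

struct DownloadCache {
    dir: PathBuf, // e.g. "botserver-installers/"
}

impl DownloadCache {
    fn new(dir: impl Into<PathBuf>) -> std::io::Result<Self> {
        let dir = dir.into();
        fs::create_dir_all(&dir)?;
        Ok(Self { dir })
    }

    /// Return the cached file if present (and valid), otherwise download it.
    fn get_or_download(
        &self,
        url: &str,
        file_name: &str,
        expected_sha256: Option<&str>,
        fetch: impl Fn(&str) -> std::io::Result<Vec<u8>>, // injected downloader (assumption)
    ) -> std::io::Result<PathBuf> {
        let path = self.dir.join(file_name);
        if path.exists() && checksum_ok(&path, expected_sha256)? {
            return Ok(path); // cache hit: no network needed
        }
        let bytes = fetch(url)?;
        fs::write(&path, &bytes)?;
        if !checksum_ok(&path, expected_sha256)? {
            fs::remove_file(&path)?;
            return Err(std::io::Error::new(
                std::io::ErrorKind::InvalidData,
                "SHA256 mismatch after download",
            ));
        }
        Ok(path)
    }
}

/// Verification is optional: with no expected digest, any existing file passes.
fn checksum_ok(path: &Path, expected: Option<&str>) -> std::io::Result<bool> {
    let Some(expected) = expected else { return Ok(true) };
    let bytes = fs::read(path)?;
    let digest = Sha256::digest(&bytes);
    let hex: String = digest.iter().map(|b| format!("{b:02x}")).collect();
    Ok(hex.eq_ignore_ascii_case(expected))
}
```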
Cache behavior:
- First run: downloads to botserver-installers/
- Subsequent runs: uses cached files
- Delete botserver-stack/ without losing downloads
- Pre-populate cache for fully offline installation (usage sketch below)
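A hypothetical usage of the `DownloadCache` sketch above, showing how pre-populating `botserver-installers/` keeps a later install fully offline. The paths, URL, and failing fetch closure are placeholders, not real release data.

```rust
// Usage sketch: with the asset already in the cache directory, the injected
// fetch closure is never called, so the install needs no network at all.
fn main() -> std::io::Result<()> {
    let cache = DownloadCache::new("botserver-installers")?;

    // Offline bundle preparation: copy the asset into the cache ahead of time
    // (e.g. from another machine) instead of letting the first run download it.
    std::fs::copy(
        "bundle/llama-example-asset.zip",                  // placeholder source
        "botserver-installers/llama-example-asset.zip",
    )?;

    let path = cache.get_or_download(
        "https://example.invalid/llama-example-asset.zip", // placeholder URL
        "llama-example-asset.zip",
        None, // or Some("<sha256 from config/llm_releases.json>") to verify
        |_url: &str| -> std::io::Result<Vec<u8>> {
            Err(std::io::Error::new(std::io::ErrorKind::Other, "offline"))
        },
    )?;
    println!("installed from {}", path.display());
    Ok(())
}
```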
2025-12-08 14:08:49 -03:00