botserver/src/llm
Rodrigo Rodriguez (Pragmatismo) 3626719d7f chore: point LLM client and default configs to local endpoint
Updated the 6.0.4 migration to use `http://localhost:8081/v1` for the default OpenAI model configurations (gpt-4 and gpt-3.5-turbo) and for the local embed service. Adjusted `OpenAIClient` to default to the same localhost base URL instead of the production OpenAI API.

Reorganized imports and module ordering in `src/main.rs` (moved `mod llm`, `mod nvidia`, and the `BotOrchestrator` import), cleaned up formatting, and removed unused imports. These changes streamline development by directing LLM calls to a local server and improve code readability.
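The client-side change described above could be sketched as follows. This is a minimal illustration, not the repository's actual code: the `OpenAIClient` name comes from the commit message, but the field layout, the `OPENAI_BASE_URL` environment-variable fallback, and the helper method are assumptions.

```rust
// Hypothetical sketch of defaulting OpenAIClient to a local endpoint,
// as described in the commit. Field names and the env-var override are
// assumptions for illustration only.
const DEFAULT_BASE_URL: &str = "http://localhost:8081/v1";

#[allow(dead_code)]
struct OpenAIClient {
    base_url: String,
    api_key: String,
}

impl OpenAIClient {
    /// Uses OPENAI_BASE_URL if set, otherwise falls back to the
    /// local development endpoint instead of the production API.
    fn new(api_key: impl Into<String>) -> Self {
        let base_url = std::env::var("OPENAI_BASE_URL")
            .unwrap_or_else(|_| DEFAULT_BASE_URL.to_string());
        Self {
            base_url,
            api_key: api_key.into(),
        }
    }

    /// Builds the chat-completions URL from the configured base.
    fn chat_completions_url(&self) -> String {
        format!("{}/chat/completions", self.base_url)
    }
}

fn main() {
    let client = OpenAIClient::new("test-key");
    println!("{}", client.chat_completions_url());
}
```

Keeping the base URL as a single constant means the migration's model configurations and the client default can point at the same local server (e.g. a llama.cpp or vLLM instance exposing an OpenAI-compatible `/v1` API) without touching call sites.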
2025-11-07 16:40:19 -03:00
local.rs refactor(bot): disable history retrieval and simplify LLM args 2025-11-07 16:35:48 -03:00
mod.rs chore: point LLM client and default configs to local endpoint 2025-11-07 16:40:19 -03:00