botserver/templates
Rodrigo Rodriguez (Pragmatismo) f626793d77 Refactor LLM flow, add prompts, fix UI streaming
- Extract LLM generation into `execute_llm_generation` and simplify
  keyword handling (sketched after the commit details below).
- Prepend the system prompt and session context to LLM prompts in
  `BotOrchestrator` (see the prompt-assembly sketch below).
- Parse incoming WebSocket messages as JSON and use the `content` field
  (see the parsing sketch below).
- Add an async `get_session_context` and stop injecting Redis context
  into conversation history (see the session-context sketch below).
- Change the default LLM URL to `http://48.217.66.81:8080` throughout
  the project (see the configuration sketch below).
- Use the existing DB pool instead of creating a separate custom
  connection (see the pool-reuse sketch below).
- Update `start.bas` to call the LLM and set a new context string.
- Refactor web client message handling: separate event processing,
  improve streaming logic, reset streaming state when thinking ends, and
  remove unused test functions.
2025-10-15 01:14:37 -03:00
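A minimal Rust sketch of what an extracted `execute_llm_generation` helper could look like, assuming a `reqwest`-based client and an OpenAI-style chat endpoint; the endpoint path, payload shape, and response parsing here are illustrative assumptions, not the botserver's actual code.

```rust
// Illustrative sketch only: a single-shot LLM call pulled out into its own helper.
// The request/response shape assumes an OpenAI-compatible chat API.
use serde_json::json;

pub async fn execute_llm_generation(
    base_url: &str,
    prompt: &str,
) -> Result<String, reqwest::Error> {
    let client = reqwest::Client::new();
    let body = json!({
        "messages": [{ "role": "user", "content": prompt }],
        "stream": false
    });
    let resp: serde_json::Value = client
        .post(format!("{base_url}/v1/chat/completions"))
        .json(&body)
        .send()
        .await?
        .json()
        .await?;
    // Take the first choice's text; fall back to an empty string if missing.
    Ok(resp["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .to_string())
}
```

A caller would pass the configured base URL (see the configuration sketch further down) and await the returned text.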
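A prompt-assembly sketch of the layout implied by the commit: system prompt first, then the session context, then the user message. The struct and field names are placeholders; the commit does not show `BotOrchestrator`'s actual shape.

```rust
// Placeholder struct: only the fields needed to illustrate prompt ordering.
struct BotOrchestrator {
    system_prompt: String,
}

impl BotOrchestrator {
    /// Build the final prompt: system prompt, optional session context, user message.
    fn build_prompt(&self, session_context: Option<&str>, user_message: &str) -> String {
        let mut prompt = String::new();
        prompt.push_str(&self.system_prompt);
        prompt.push_str("\n\n");
        if let Some(ctx) = session_context {
            prompt.push_str(ctx);
            prompt.push_str("\n\n");
        }
        prompt.push_str(user_message);
        prompt
    }
}
```

Keeping the assembly in one place makes it easy to adjust the ordering or separators later without touching the call sites.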
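A parsing sketch for the WebSocket change: treat each incoming text frame as JSON and read its `content` field, falling back to the raw text for non-JSON frames. The exact message schema is an assumption based on the bullet above.

```rust
use serde_json::Value;

/// Extract the user text from an incoming WebSocket frame.
/// JSON frames use their "content" field; anything else is passed through as-is.
fn extract_content(raw: &str) -> String {
    match serde_json::from_str::<Value>(raw) {
        Ok(msg) => msg["content"].as_str().unwrap_or(raw).to_string(),
        Err(_) => raw.to_string(),
    }
}
```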
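A session-context sketch: an async `get_session_context` lookup whose result is prepended to the prompt (as in the prompt-assembly sketch) rather than injected into conversation history. The in-memory store is a stand-in for illustration, not the real storage layer.

```rust
use std::collections::HashMap;
use tokio::sync::RwLock;

/// Stand-in session store keyed by session id.
pub struct SessionStore {
    contexts: RwLock<HashMap<String, String>>,
}

impl SessionStore {
    /// Return the stored context for a session, if any.
    pub async fn get_session_context(&self, session_id: &str) -> Option<String> {
        self.contexts.read().await.get(session_id).cloned()
    }
}
```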
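A configuration sketch for the default-URL change: fall back to the address named in the commit when no override is set. The `LLM_URL` environment variable is a hypothetical name used only for illustration.

```rust
/// Resolve the LLM endpoint, defaulting to the address from the commit message.
fn llm_base_url() -> String {
    std::env::var("LLM_URL")
        .unwrap_or_else(|_| "http://48.217.66.81:8080".to_string())
}

fn main() {
    println!("LLM endpoint: {}", llm_base_url());
}
```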
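A pool-reuse sketch: borrow the shared pool instead of opening a one-off connection. The `sqlx` Postgres pool and the `sessions` table are assumptions and may not match the botserver's actual stack.

```rust
use sqlx::PgPool;

/// Example query that borrows the existing pool; no separate connection is created.
async fn count_sessions(pool: &PgPool) -> Result<i64, sqlx::Error> {
    let (count,): (i64,) = sqlx::query_as("SELECT COUNT(*) FROM sessions")
        .fetch_one(pool)
        .await?;
    Ok(count)
}
```

Passing `&PgPool` around avoids per-request connection setup and keeps connection limits enforced in one place.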
annoucements.gbai/annoucements.gbdialog Refactor LLM flow, add prompts, fix UI streaming 2025-10-15 01:14:37 -03:00