- Extract LLM generation into `execute_llm_generation` and simplify keyword handling.
- Prepend the system prompt and session context to LLM prompts in `BotOrchestrator`.
- Parse incoming WebSocket messages as JSON and use the `content` field.
- Add an async `get_session_context` and stop injecting Redis context into the conversation history.
- Change the default LLM URL to `http://48.217.66.81:8080` throughout the project.
- Use the existing DB pool instead of creating a separate custom connection.
- Update `start.bas` to call the LLM and set a new context string.
- Refactor web client message handling: separate event processing, improve streaming logic, reset streaming state when thinking ends, and remove unused test functions.
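The WebSocket change could look roughly like the sketch below: parse each frame as JSON and read its `content` field, falling back to the raw text for non-JSON frames. This is a minimal illustration, not the project's actual handler; the function name and fallback behavior are assumptions.

```typescript
// Sketch: extract the `content` field from an incoming WebSocket frame.
// Non-JSON frames (or JSON without a string `content`) fall back to the
// raw text so plain-text messages keep working.
function extractContent(raw: string): string {
  try {
    const parsed = JSON.parse(raw);
    if (typeof parsed?.content === "string") {
      return parsed.content;
    }
  } catch {
    // Not valid JSON: treat the frame as plain text.
  }
  return raw;
}
```

A handler would then call `extractContent(event.data)` inside its `message` listener instead of using the raw frame directly.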
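The prompt-assembly change in `BotOrchestrator` might be sketched as follows. The helper name `buildPrompt`, its parameters, and the blank-line separator are hypothetical, chosen only to illustrate prepending the system prompt and session context ahead of the user's prompt.

```typescript
// Hypothetical helper: prepend the system prompt and session context to
// the user prompt before sending it to the LLM. Empty or whitespace-only
// segments are skipped so the assembled prompt has no stray gaps.
function buildPrompt(
  systemPrompt: string,
  sessionContext: string,
  userPrompt: string,
): string {
  return [systemPrompt, sessionContext, userPrompt]
    .filter((part) => part.trim().length > 0)
    .join("\n\n");
}
```

Skipping empty segments matters when a session has no stored context yet, so the first turn does not send a prompt with leading blank sections.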