Added the `once_cell` and `scopeguard` dependencies to implement a thread-safe compaction lock mechanism. Modified `compact_prompt_for_bot` to:
- Prevent concurrent compaction for the same bot using a global lock
- Add proper tracing and error handling
- Improve summarization with content filtering
- Clean up locks automatically using scopeguard
- Remove the redundant threshold check and compact the entire history instead
The changes ensure thread safety during prompt compaction and provide better observability through tracing.
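A minimal sketch of how this per-bot lock pattern can look; the `COMPACTION_LOCKS` name, the `bot_id` parameter, and the log messages are illustrative assumptions, not the actual implementation:

```rust
use std::collections::HashSet;
use std::sync::Mutex;

use once_cell::sync::Lazy;
use scopeguard::defer;

// Global set of bot IDs whose history is currently being compacted (assumed shape).
static COMPACTION_LOCKS: Lazy<Mutex<HashSet<String>>> =
    Lazy::new(|| Mutex::new(HashSet::new()));

fn compact_prompt_for_bot(bot_id: &str) {
    // Acquire the per-bot lock in a narrow scope; bail out if a compaction is already running.
    {
        let mut locks = COMPACTION_LOCKS.lock().unwrap();
        if !locks.insert(bot_id.to_string()) {
            tracing::warn!(bot_id, "compaction already in progress, skipping");
            return;
        }
    }
    // scopeguard releases the lock even on early return or panic.
    defer! {
        COMPACTION_LOCKS.lock().unwrap().remove(bot_id);
    }

    tracing::info!(bot_id, "compacting full prompt history");
    // ... filter, summarize, and rewrite the conversation history here ...
}
```

Keeping the mutex guard inside its own block is what minimizes the critical section: the global lock is held only while registering or removing the bot ID, never during the compaction work itself.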
- Added an initial 30s delay to the compact prompt scheduler
- Implemented async LLM summarization for conversation history
- Reduced lock contention by minimizing critical sections
- Added a fallback to the original text if summarization fails
- Updated the README with guidance for failed requirements
- Added a new `summarize` method to the `LLMProvider` trait (see the sketch below)
- Improved the session manager query with proper DSL usage
The changes optimize the prompt compaction process by:
1. Reducing lock contention through better resource management
2. Adding LLM-based summarization for better conversation compression
3. Making the system more resilient with proper error handling
4. Improving documentation for development practices
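A hedged sketch of the `summarize` addition and the fallback path; the `async_trait` usage, the `anyhow::Result` error type, and the `summarize_or_original` helper are assumptions for illustration only:

```rust
use async_trait::async_trait;

#[async_trait]
pub trait LLMProvider: Send + Sync {
    // ... existing completion methods elided ...

    /// Condense a block of conversation history into a shorter summary.
    async fn summarize(&self, text: &str) -> anyhow::Result<String>;
}

// If the LLM call fails, keep the original text so compaction never loses data.
pub async fn summarize_or_original(provider: &dyn LLMProvider, text: &str) -> String {
    match provider.summarize(text).await {
        Ok(summary) => summary,
        Err(err) => {
            tracing::warn!(error = %err, "summarization failed, falling back to original text");
            text.to_string()
        }
    }
}
```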
Added a new `compact_prompt` module and its scheduler initialization in `AutomationService`.
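A rough sketch of the scheduler wiring, assuming a Tokio runtime; the function name, the 60s cadence, and the bot-iteration step are illustrative, not taken from the actual code:

```rust
use std::time::Duration;
use tokio::time;

// Spawned from AutomationService during startup (hypothetical wiring).
pub fn start_compact_prompt_scheduler() {
    tokio::spawn(async {
        // Initial 30s delay so compaction does not compete with service startup.
        time::sleep(Duration::from_secs(30)).await;

        let mut ticker = time::interval(Duration::from_secs(60)); // assumed cadence
        loop {
            ticker.tick().await;
            // ... look up active bots and run compact_prompt_for_bot for each ...
        }
    });
}
```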
Refactored code for better readability:
- Improved import organization
- Fixed indentation in schedule checking logic
- Enhanced error handling with more descriptive messages
- Formatted long lines for better readability
- Added comments for clarity
These changes preserve existing functionality while making the code easier to maintain.