Refactor: Split large files into modular subdirectories
Some checks failed
BotServer CI / build (push) Failing after 1m34s
Some checks failed
BotServer CI / build (push) Failing after 1m34s
Split 20+ files over 1000 lines into focused subdirectories for better maintainability and code organization. All changes maintain backward compatibility through re-export wrappers. Major splits: - attendance/llm_assist.rs (2074→7 modules) - basic/keywords/face_api.rs → face_api/ (7 modules) - basic/keywords/file_operations.rs → file_ops/ (8 modules) - basic/keywords/hear_talk.rs → hearing/ (6 modules) - channels/wechat.rs → wechat/ (10 modules) - channels/youtube.rs → youtube/ (5 modules) - contacts/mod.rs → contacts_api/ (6 modules) - core/bootstrap/mod.rs → bootstrap/ (5 modules) - core/shared/admin.rs → admin_*.rs (5 modules) - designer/canvas.rs → canvas_api/ (6 modules) - designer/mod.rs → designer_api/ (6 modules) - docs/handlers.rs → handlers_api/ (11 modules) - drive/mod.rs → drive_handlers.rs, drive_types.rs - learn/mod.rs → types.rs - main.rs → main_module/ (7 modules) - meet/webinar.rs → webinar_api/ (8 modules) - paper/mod.rs → (10 modules) - security/auth.rs → auth_api/ (7 modules) - security/passkey.rs → (4 modules) - sources/mod.rs → sources_api/ (5 modules) - tasks/mod.rs → task_api/ (5 modules) Stats: 38,040 deletions, 1,315 additions across 318 files Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
This commit is contained in:
parent
3566a8c87f
commit
5ea171d126
460 changed files with 37738 additions and 37477 deletions
|
|
@ -3,7 +3,7 @@
|
|||
|
||||
# Server
|
||||
HOST=0.0.0.0
|
||||
PORT=8088
|
||||
PORT=9000
|
||||
RUST_LOG=info
|
||||
|
||||
# Database (SQLite for embedded, no PostgreSQL needed)
|
||||
|
|
|
|||
|
|
@ -1,20 +0,0 @@
|
|||
{
|
||||
"base_url": "http://localhost:8300",
|
||||
"default_org": {
|
||||
"id": "359454828524470274",
|
||||
"name": "default",
|
||||
"domain": "default.localhost"
|
||||
},
|
||||
"default_user": {
|
||||
"id": "admin",
|
||||
"username": "admin",
|
||||
"email": "admin@localhost",
|
||||
"password": "",
|
||||
"first_name": "Admin",
|
||||
"last_name": "User"
|
||||
},
|
||||
"admin_token": "vuZlSrNRdCEm0qY6jBj4KrUT5QFepGbtu9Zn_JDXby4HaTXejQKhRgYmSie3T_qLOmcuDZw",
|
||||
"project_id": "",
|
||||
"client_id": "359454829094961154",
|
||||
"client_secret": "OVzcDUzhBqcWDWmoakDbZ8HKAiy7RHcCBeD71dvhdFmcVpQc3Rq3pvr1CpX2zmIe"
|
||||
}
|
||||
214
deploy/README.md
214
deploy/README.md
|
|
@ -1,214 +0,0 @@
|
|||
# Deployment Guide
|
||||
|
||||
## Overview
|
||||
|
||||
This directory contains deployment configurations and scripts for General Bots in production environments.
|
||||
|
||||
## Deployment Methods
|
||||
|
||||
### 1. Traditional Server Deployment
|
||||
|
||||
#### Prerequisites
|
||||
- Server with Linux (Ubuntu 20.04+ recommended)
|
||||
- Rust 1.70+ toolchain
|
||||
- PostgreSQL, Redis, Qdrant installed or managed by botserver
|
||||
- At least 4GB RAM, 2 CPU cores
|
||||
|
||||
#### Steps
|
||||
|
||||
1. **Build Release Binaries:**
|
||||
```bash
|
||||
cargo build --release -p botserver -p botui
|
||||
```
|
||||
|
||||
2. **Deploy to Production:**
|
||||
```bash
|
||||
# Copy binaries
|
||||
sudo cp target/release/botserver /opt/gbo/bin/
|
||||
sudo cp target/release/botui /opt/gbo/bin/
|
||||
|
||||
# Deploy UI files
|
||||
./botserver/deploy/deploy-ui.sh /opt/gbo
|
||||
|
||||
# Set permissions
|
||||
sudo chmod +x /opt/gbo/bin/botserver
|
||||
sudo chmod +x /opt/gbo/bin/botui
|
||||
```
|
||||
|
||||
3. **Configure Environment:**
|
||||
```bash
|
||||
# Copy and edit environment file
|
||||
cp botserver/.env.example /opt/gbo/.env
|
||||
nano /opt/gbo/.env
|
||||
```
|
||||
|
||||
4. **Start Services:**
|
||||
```bash
|
||||
# Using systemd (recommended)
|
||||
sudo systemctl start botserver
|
||||
sudo systemctl start botui
|
||||
|
||||
# Or manually
|
||||
/opt/gbo/bin/botserver --noconsole
|
||||
/opt/gbo/bin/botui
|
||||
```
|
||||
|
||||
### 2. Kubernetes Deployment
|
||||
|
||||
#### Prerequisites
|
||||
- Kubernetes cluster 1.24+
|
||||
- kubectl configured
|
||||
- Persistent volumes provisioned
|
||||
|
||||
#### Steps
|
||||
|
||||
1. **Create Namespace:**
|
||||
```bash
|
||||
kubectl create namespace generalbots
|
||||
```
|
||||
|
||||
2. **Deploy UI Files:**
|
||||
```bash
|
||||
# Create ConfigMap with UI files
|
||||
kubectl create configmap botui-files \
|
||||
--from-file=botui/ui/suite/ \
|
||||
-n generalbots
|
||||
```
|
||||
|
||||
3. **Apply Deployment:**
|
||||
```bash
|
||||
kubectl apply -f botserver/deploy/kubernetes/deployment.yaml
|
||||
```
|
||||
|
||||
4. **Verify Deployment:**
|
||||
```bash
|
||||
kubectl get pods -n generalbots
|
||||
kubectl logs -f deployment/botserver -n generalbots
|
||||
```
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### UI Files Not Found Error
|
||||
|
||||
**Symptom:**
|
||||
```
|
||||
Asset 'suite/index.html' not found in embedded binary, falling back to filesystem
|
||||
Failed to load suite UI: No such file or directory
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
|
||||
**For Traditional Deployment:**
|
||||
```bash
|
||||
# Run the deployment script
|
||||
./botserver/deploy/deploy-ui.sh /opt/gbo
|
||||
|
||||
# Verify files exist
|
||||
ls -la /opt/gbo/bin/ui/suite/index.html
|
||||
```
|
||||
|
||||
**For Kubernetes:**
|
||||
```bash
|
||||
# Recreate UI ConfigMap
|
||||
kubectl delete configmap botui-files -n generalbots
|
||||
kubectl create configmap botui-files \
|
||||
--from-file=botui/ui/suite/ \
|
||||
-n generalbots
|
||||
|
||||
# Restart pods
|
||||
kubectl rollout restart deployment/botserver -n generalbots
|
||||
```
|
||||
|
||||
### Port Already in Use
|
||||
|
||||
```bash
|
||||
# Find process using port
|
||||
lsof -ti:8088 | xargs kill -9
|
||||
lsof -ti:3000 | xargs kill -9
|
||||
```
|
||||
|
||||
### Permission Denied
|
||||
|
||||
```bash
|
||||
# Fix ownership and permissions
|
||||
sudo chown -R gbo:gbo /opt/gbo
|
||||
sudo chmod -R 755 /opt/gbo/bin
|
||||
```
|
||||
|
||||
## Maintenance
|
||||
|
||||
### Update UI Files
|
||||
|
||||
**Traditional:**
|
||||
```bash
|
||||
./botserver/deploy/deploy-ui.sh /opt/gbo
|
||||
sudo systemctl restart botui
|
||||
```
|
||||
|
||||
**Kubernetes:**
|
||||
```bash
|
||||
kubectl create configmap botui-files \
|
||||
--from-file=botui/ui/suite/ \
|
||||
-n generalbots \
|
||||
--dry-run=client -o yaml | kubectl apply -f -
|
||||
kubectl rollout restart deployment/botserver -n generalbots
|
||||
```
|
||||
|
||||
### Update Binaries
|
||||
|
||||
1. Build new release
|
||||
2. Stop services
|
||||
3. Replace binaries
|
||||
4. Start services
|
||||
|
||||
### Backup
|
||||
|
||||
```bash
|
||||
# Backup database
|
||||
pg_dump -U postgres -d gb > backup.sql
|
||||
|
||||
# Backup UI files (if customized)
|
||||
tar -czf ui-backup.tar.gz /opt/gbo/bin/ui/
|
||||
|
||||
# Backup configuration
|
||||
cp /opt/gbo/.env /opt/gbo/.env.backup
|
||||
```
|
||||
|
||||
## Monitoring
|
||||
|
||||
### Check Logs
|
||||
|
||||
**Traditional:**
|
||||
```bash
|
||||
tail -f /opt/gbo/logs/botserver.log
|
||||
tail -f /opt/gbo/logs/botui.log
|
||||
```
|
||||
|
||||
**Kubernetes:**
|
||||
```bash
|
||||
kubectl logs -f deployment/botserver -n generalbots
|
||||
```
|
||||
|
||||
### Health Checks
|
||||
|
||||
```bash
|
||||
# Check server health
|
||||
curl http://localhost:8088/health
|
||||
|
||||
# Check botui health
|
||||
curl http://localhost:3000/health
|
||||
```
|
||||
|
||||
## Security
|
||||
|
||||
- Always use HTTPS in production
|
||||
- Rotate secrets regularly
|
||||
- Update dependencies monthly
|
||||
- Review logs for suspicious activity
|
||||
- Use firewall to restrict access
|
||||
|
||||
## Support
|
||||
|
||||
For issues or questions:
|
||||
- Documentation: https://docs.pragmatismo.com.br
|
||||
- GitHub Issues: https://github.com/GeneralBots/BotServer/issues
|
||||
|
|
@ -1,16 +0,0 @@
|
|||
#!/bin/bash
|
||||
|
||||
set -e
|
||||
|
||||
DEPLOY_DIR="${1:-/opt/gbo}"
|
||||
SRC_DIR="$(dirname "$0")/../.."
|
||||
|
||||
echo "Deploying UI files to $DEPLOY_DIR"
|
||||
|
||||
mkdir -p "$DEPLOY_DIR/bin/ui/suite"
|
||||
|
||||
cp -r "$SRC_DIR/botui/ui/suite/"* "$DEPLOY_DIR/bin/ui/suite/"
|
||||
|
||||
echo "UI files deployed successfully"
|
||||
echo "Location: $DEPLOY_DIR/bin/ui/suite"
|
||||
ls -la "$DEPLOY_DIR/bin/ui/suite" | head -20
|
||||
|
|
@ -14,7 +14,7 @@ use std::sync::Arc;
|
|||
use uuid::Uuid;
|
||||
|
||||
use crate::core::shared::schema::{okr_checkins, okr_key_results, okr_objectives, okr_templates};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
|
||||
fn get_bot_context() -> (Uuid, Uuid) {
|
||||
let org_id = std::env::var("DEFAULT_ORG_ID")
|
||||
|
|
|
|||
|
|
@ -11,9 +11,9 @@ use serde::Deserialize;
|
|||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
use crate::bot::get_default_bot;
|
||||
use crate::core::bot::get_default_bot;
|
||||
use crate::core::shared::schema::{okr_checkins, okr_objectives};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
|
||||
#[derive(Debug, Deserialize, Default)]
|
||||
pub struct ObjectivesQuery {
|
||||
|
|
|
|||
|
|
@ -13,8 +13,8 @@ use serde::{Deserialize, Serialize};
|
|||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
use crate::shared::state::AppState;
|
||||
use crate::shared::utils::DbPool;
|
||||
use crate::core::shared::state::AppState;
|
||||
use crate::core::shared::utils::DbPool;
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AppUsage {
|
||||
|
|
|
|||
|
|
@ -7,7 +7,7 @@ pub mod insights;
|
|||
use crate::core::urls::ApiUrls;
|
||||
#[cfg(feature = "llm")]
|
||||
use crate::llm::observability::{ObservabilityConfig, ObservabilityManager, QuickStats};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
extract::State,
|
||||
response::{Html, IntoResponse},
|
||||
|
|
|
|||
File diff suppressed because it is too large
Load diff
567
src/attendance/llm_assist_commands.rs
Normal file
567
src/attendance/llm_assist_commands.rs
Normal file
|
|
@ -0,0 +1,567 @@
|
|||
use super::llm_assist_types::*;
|
||||
use super::llm_assist_helpers::*;
|
||||
use super::llm_assist_handlers::*;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::info;
|
||||
use std::fmt::Write;
|
||||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
pub async fn process_attendant_command(
|
||||
state: &Arc<AppState>,
|
||||
attendant_phone: &str,
|
||||
command: &str,
|
||||
current_session: Option<Uuid>,
|
||||
) -> Result<String, String> {
|
||||
let parts: Vec<&str> = command.split_whitespace().collect();
|
||||
if parts.is_empty() {
|
||||
return Err("Empty command".to_string());
|
||||
}
|
||||
|
||||
let cmd = parts[0].to_lowercase();
|
||||
let args: Vec<&str> = parts[1..].to_vec();
|
||||
|
||||
match cmd.as_str() {
|
||||
"/queue" | "/fila" => handle_queue_command(state).await,
|
||||
"/take" | "/pegar" => handle_take_command(state, attendant_phone).await,
|
||||
"/status" => handle_status_command(state, attendant_phone, args).await,
|
||||
"/transfer" | "/transferir" => handle_transfer_command(state, current_session, args).await,
|
||||
"/resolve" | "/resolver" => handle_resolve_command(state, current_session).await,
|
||||
"/tips" | "/dicas" => handle_tips_command(state, current_session).await,
|
||||
"/polish" | "/polir" => {
|
||||
let message = args.join(" ");
|
||||
handle_polish_command(state, current_session, &message).await
|
||||
}
|
||||
"/replies" | "/respostas" => handle_replies_command(state, current_session).await,
|
||||
"/summary" | "/resumo" => handle_summary_command(state, current_session).await,
|
||||
"/help" | "/ajuda" => Ok(get_help_text()),
|
||||
_ => Err(format!(
|
||||
"Unknown command: {}. Type /help for available commands.",
|
||||
cmd
|
||||
)),
|
||||
}
|
||||
}
|
||||
|
||||
async fn handle_queue_command(state: &Arc<AppState>) -> Result<String, String> {
|
||||
let conn = state.conn.clone();
|
||||
let result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| e.to_string())?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let sessions: Vec<crate::core::shared::models::UserSession> = user_sessions::table
|
||||
.filter(
|
||||
user_sessions::context_data
|
||||
.retrieve_as_text("needs_human")
|
||||
.eq("true"),
|
||||
)
|
||||
.filter(
|
||||
user_sessions::context_data
|
||||
.retrieve_as_text("status")
|
||||
.ne("resolved"),
|
||||
)
|
||||
.order(user_sessions::updated_at.desc())
|
||||
.limit(10)
|
||||
.load(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
Ok::<Vec<crate::core::shared::models::UserSession>, String>(sessions)
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())??;
|
||||
|
||||
if result.is_empty() {
|
||||
return Ok(" *Queue is empty*\nNo conversations waiting for attention.".to_string());
|
||||
}
|
||||
|
||||
let mut response = format!(" *Queue* ({} waiting)\n\n", result.len());
|
||||
|
||||
for (i, session) in result.iter().enumerate() {
|
||||
let name = session
|
||||
.context_data
|
||||
.get("name")
|
||||
.and_then(|v| v.as_str())
|
||||
.unwrap_or("Unknown");
|
||||
let channel = session
|
||||
.context_data
|
||||
.get("channel")
|
||||
.and_then(|v| v.as_str())
|
||||
.unwrap_or("web");
|
||||
let status = session
|
||||
.context_data
|
||||
.get("status")
|
||||
.and_then(|v| v.as_str())
|
||||
.unwrap_or("waiting");
|
||||
|
||||
let _ = write!(
|
||||
response,
|
||||
"{}. *{}* ({})\n Status: {} | ID: {}\n\n",
|
||||
i + 1,
|
||||
name,
|
||||
channel,
|
||||
status,
|
||||
&session.id.to_string()[..8]
|
||||
);
|
||||
}
|
||||
|
||||
response.push_str("Type `/take` to take the next conversation.");
|
||||
|
||||
Ok(response)
|
||||
}
|
||||
|
||||
async fn handle_take_command(
|
||||
state: &Arc<AppState>,
|
||||
attendant_phone: &str,
|
||||
) -> Result<String, String> {
|
||||
let conn = state.conn.clone();
|
||||
let phone = attendant_phone.to_string();
|
||||
|
||||
let result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| e.to_string())?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: Option<crate::core::shared::models::UserSession> = user_sessions::table
|
||||
.filter(
|
||||
user_sessions::context_data
|
||||
.retrieve_as_text("needs_human")
|
||||
.eq("true"),
|
||||
)
|
||||
.filter(
|
||||
user_sessions::context_data
|
||||
.retrieve_as_text("status")
|
||||
.eq("waiting"),
|
||||
)
|
||||
.order(user_sessions::updated_at.asc())
|
||||
.first(&mut db_conn)
|
||||
.optional()
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
if let Some(session) = session {
|
||||
let mut ctx = session.context_data.clone();
|
||||
ctx["assigned_to_phone"] = serde_json::json!(phone);
|
||||
ctx["status"] = serde_json::json!("assigned");
|
||||
ctx["assigned_at"] = serde_json::json!(chrono::Utc::now().to_rfc3339());
|
||||
|
||||
diesel::update(user_sessions::table.filter(user_sessions::id.eq(session.id)))
|
||||
.set(user_sessions::context_data.eq(&ctx))
|
||||
.execute(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
let name = session
|
||||
.context_data
|
||||
.get("name")
|
||||
.and_then(|v| v.as_str())
|
||||
.unwrap_or("Unknown");
|
||||
|
||||
Ok::<String, String>(format!(
|
||||
" *Conversation assigned*\n\nCustomer: *{}*\nSession: {}\n\nYou can now respond to this customer. Their messages will be forwarded to you.",
|
||||
name,
|
||||
&session.id.to_string()[..8]
|
||||
))
|
||||
} else {
|
||||
Ok::<String, String>(" No conversations waiting in queue.".to_string())
|
||||
}
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())??;
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
async fn handle_status_command(
|
||||
state: &Arc<AppState>,
|
||||
attendant_phone: &str,
|
||||
args: Vec<&str>,
|
||||
) -> Result<String, String> {
|
||||
if args.is_empty() {
|
||||
return Ok(
|
||||
" *Status Options*\n\n`/status online` - Available\n`/status busy` - In conversation\n`/status away` - Temporarily away\n`/status offline` - Not available"
|
||||
.to_string(),
|
||||
);
|
||||
}
|
||||
|
||||
let status = args[0].to_lowercase();
|
||||
let (emoji, text, status_value) = match status.as_str() {
|
||||
"online" => ("✅", "Online - Available for conversations", "online"),
|
||||
"busy" => ("🔵", "Busy - Handling conversations", "busy"),
|
||||
"away" => ("🟡", "Away - Temporarily unavailable", "away"),
|
||||
"offline" => ("⚫", "Offline - Not available", "offline"),
|
||||
_ => {
|
||||
return Err(format!(
|
||||
"Invalid status: {}. Use online, busy, away, or offline.",
|
||||
status
|
||||
))
|
||||
}
|
||||
};
|
||||
|
||||
let conn = state.conn.clone();
|
||||
let phone = attendant_phone.to_string();
|
||||
let status_val = status_value.to_string();
|
||||
|
||||
let update_result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| e.to_string())?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let sessions: Vec<crate::core::shared::models::UserSession> = user_sessions::table
|
||||
.filter(
|
||||
user_sessions::context_data
|
||||
.retrieve_as_text("assigned_to_phone")
|
||||
.eq(&phone),
|
||||
)
|
||||
.load(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
let session_count = sessions.len();
|
||||
for session in sessions {
|
||||
let mut ctx = session.context_data.clone();
|
||||
ctx["attendant_status"] = serde_json::json!(status_val);
|
||||
ctx["attendant_status_updated_at"] = serde_json::json!(chrono::Utc::now().to_rfc3339());
|
||||
|
||||
diesel::update(user_sessions::table.filter(user_sessions::id.eq(session.id)))
|
||||
.set(user_sessions::context_data.eq(&ctx))
|
||||
.execute(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
}
|
||||
|
||||
Ok::<usize, String>(session_count)
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
match update_result {
|
||||
Ok(count) => {
|
||||
info!(
|
||||
"Attendant {} set status to {} ({} sessions updated)",
|
||||
attendant_phone, status_value, count
|
||||
);
|
||||
Ok(format!("{} Status set to *{}*", emoji, text))
|
||||
}
|
||||
Err(e) => {
|
||||
log::warn!("Failed to persist status for {}: {}", attendant_phone, e);
|
||||
|
||||
Ok(format!("{} Status set to *{}*", emoji, text))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async fn handle_transfer_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
args: Vec<&str>,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation to transfer")?;
|
||||
|
||||
if args.is_empty() {
|
||||
return Err("Usage: `/transfer @attendant_name` or `/transfer department`".to_string());
|
||||
}
|
||||
|
||||
let target = args.join(" ");
|
||||
let target_clean = target.trim_start_matches('@').to_string();
|
||||
|
||||
let conn = state.conn.clone();
|
||||
let target_attendant = target_clean.clone();
|
||||
|
||||
let transfer_result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| e.to_string())?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: crate::core::shared::models::UserSession = user_sessions::table
|
||||
.find(session_id)
|
||||
.first(&mut db_conn)
|
||||
.map_err(|e| format!("Session not found: {}", e))?;
|
||||
|
||||
let mut ctx = session.context_data;
|
||||
let previous_attendant = ctx
|
||||
.get("assigned_to_phone")
|
||||
.and_then(|v| v.as_str())
|
||||
.unwrap_or("unknown")
|
||||
.to_string();
|
||||
|
||||
ctx["transferred_from"] = serde_json::json!(previous_attendant);
|
||||
ctx["transfer_target"] = serde_json::json!(target_attendant);
|
||||
ctx["transferred_at"] = serde_json::json!(chrono::Utc::now().to_rfc3339());
|
||||
ctx["status"] = serde_json::json!("pending_transfer");
|
||||
|
||||
ctx["assigned_to_phone"] = serde_json::Value::Null;
|
||||
ctx["assigned_to"] = serde_json::Value::Null;
|
||||
|
||||
ctx["needs_human"] = serde_json::json!(true);
|
||||
|
||||
diesel::update(user_sessions::table.filter(user_sessions::id.eq(session_id)))
|
||||
.set((
|
||||
user_sessions::context_data.eq(&ctx),
|
||||
user_sessions::updated_at.eq(chrono::Utc::now()),
|
||||
))
|
||||
.execute(&mut db_conn)
|
||||
.map_err(|e| format!("Failed to update session: {}", e))?;
|
||||
|
||||
Ok::<String, String>(previous_attendant)
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())??;
|
||||
|
||||
info!(
|
||||
"Session {} transferred from {} to {}",
|
||||
session_id, transfer_result, target_clean
|
||||
);
|
||||
|
||||
Ok(format!(
|
||||
" *Transfer initiated*\n\nSession {} is being transferred to *{}*.\n\nThe conversation is now in the queue for the target attendant. They will be notified when they check their queue.",
|
||||
&session_id.to_string()[..8],
|
||||
target_clean
|
||||
))
|
||||
}
|
||||
|
||||
async fn handle_resolve_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation to resolve")?;
|
||||
|
||||
let conn = state.conn.clone();
|
||||
tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| e.to_string())?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: crate::core::shared::models::UserSession = user_sessions::table
|
||||
.find(session_id)
|
||||
.first(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
let mut ctx = session.context_data;
|
||||
ctx["status"] = serde_json::json!("resolved");
|
||||
ctx["needs_human"] = serde_json::json!(false);
|
||||
ctx["resolved_at"] = serde_json::json!(chrono::Utc::now().to_rfc3339());
|
||||
|
||||
diesel::update(user_sessions::table.filter(user_sessions::id.eq(session_id)))
|
||||
.set(user_sessions::context_data.eq(&ctx))
|
||||
.execute(&mut db_conn)
|
||||
.map_err(|e| e.to_string())?;
|
||||
|
||||
Ok::<(), String>(())
|
||||
})
|
||||
.await
|
||||
.map_err(|e| e.to_string())??;
|
||||
|
||||
Ok(format!(
|
||||
" *Conversation resolved*\n\nSession {} has been marked as resolved. The customer will be returned to bot mode.",
|
||||
&session_id.to_string()[..8]
|
||||
))
|
||||
}
|
||||
|
||||
async fn handle_tips_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation. Use /take first.")?;
|
||||
|
||||
let history = load_conversation_history(state, session_id).await;
|
||||
|
||||
if history.is_empty() {
|
||||
return Ok(
|
||||
" No messages yet. Tips will appear when customer sends a message.".to_string(),
|
||||
);
|
||||
}
|
||||
|
||||
let last_customer_msg = history
|
||||
.iter()
|
||||
.rev()
|
||||
.find(|m| m.role == "customer")
|
||||
.map(|m| m.content.clone())
|
||||
.unwrap_or_default();
|
||||
|
||||
let request = TipRequest {
|
||||
session_id,
|
||||
customer_message: last_customer_msg,
|
||||
history,
|
||||
};
|
||||
|
||||
let (_, Json(tip_response)) = generate_tips(State(state.clone()), Json(request)).await;
|
||||
|
||||
if tip_response.tips.is_empty() {
|
||||
return Ok(" No specific tips for this conversation yet.".to_string());
|
||||
}
|
||||
|
||||
let mut result = " *Tips for this conversation*\n\n".to_string();
|
||||
|
||||
for tip in tip_response.tips {
|
||||
let emoji = match tip.tip_type {
|
||||
TipType::Intent
|
||||
| TipType::Action
|
||||
| TipType::Warning
|
||||
| TipType::Knowledge
|
||||
| TipType::History
|
||||
| TipType::General => "💡",
|
||||
};
|
||||
let _ = write!(result, "{} {}\n\n", emoji, tip.content);
|
||||
}
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
async fn handle_polish_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
message: &str,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation")?;
|
||||
|
||||
if message.is_empty() {
|
||||
return Err("Usage: `/polish Your message here`".to_string());
|
||||
}
|
||||
|
||||
let request = PolishRequest {
|
||||
session_id,
|
||||
message: message.to_string(),
|
||||
tone: "professional".to_string(),
|
||||
};
|
||||
|
||||
let (_, Json(polish_response)) = polish_message(State(state.clone()), Json(request)).await;
|
||||
|
||||
if !polish_response.success {
|
||||
return Err(polish_response
|
||||
.error
|
||||
.unwrap_or_else(|| "Failed to polish message".to_string()));
|
||||
}
|
||||
|
||||
let mut result = " *Polished message*\n\n".to_string();
|
||||
{
|
||||
let _ = write!(result, "_{}_\n\n", polish_response.polished);
|
||||
}
|
||||
|
||||
if !polish_response.changes.is_empty() {
|
||||
result.push_str("Changes:\n");
|
||||
for change in polish_response.changes {
|
||||
let _ = writeln!(result, "• {}", change);
|
||||
}
|
||||
}
|
||||
|
||||
result.push_str("\n_Copy and send, or edit as needed._");
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
async fn handle_replies_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation")?;
|
||||
|
||||
let history = load_conversation_history(state, session_id).await;
|
||||
|
||||
let request = SmartRepliesRequest {
|
||||
session_id,
|
||||
history,
|
||||
};
|
||||
|
||||
let (_, Json(replies_response)) =
|
||||
generate_smart_replies(State(state.clone()), Json(request)).await;
|
||||
|
||||
if replies_response.replies.is_empty() {
|
||||
return Ok(" No reply suggestions available.".to_string());
|
||||
}
|
||||
|
||||
let mut result = " *Suggested replies*\n\n".to_string();
|
||||
|
||||
for (i, reply) in replies_response.replies.iter().enumerate() {
|
||||
let _ = write!(
|
||||
result,
|
||||
"*{}. {}*\n_{}_\n\n",
|
||||
i + 1,
|
||||
reply.tone.to_uppercase(),
|
||||
reply.text
|
||||
);
|
||||
}
|
||||
|
||||
result.push_str("_Copy any reply or use as inspiration._");
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
async fn handle_summary_command(
|
||||
state: &Arc<AppState>,
|
||||
current_session: Option<Uuid>,
|
||||
) -> Result<String, String> {
|
||||
let session_id = current_session.ok_or("No active conversation")?;
|
||||
|
||||
let (_, Json(summary_response)) =
|
||||
generate_summary(State(state.clone()), Path(session_id)).await;
|
||||
|
||||
if !summary_response.success {
|
||||
return Err(summary_response
|
||||
.error
|
||||
.unwrap_or_else(|| "Failed to generate summary".to_string()));
|
||||
}
|
||||
|
||||
let summary = summary_response.summary;
|
||||
|
||||
let mut result = " *Conversation Summary*\n\n".to_string();
|
||||
{
|
||||
let _ = write!(result, "{}\n\n", summary.brief);
|
||||
}
|
||||
|
||||
if !summary.key_points.is_empty() {
|
||||
result.push_str("*Key Points:*\n");
|
||||
for point in &summary.key_points {
|
||||
let _ = writeln!(result, "• {}", point);
|
||||
}
|
||||
result.push('\n');
|
||||
}
|
||||
|
||||
if !summary.customer_needs.is_empty() {
|
||||
result.push_str("*Customer Needs:*\n");
|
||||
for need in &summary.customer_needs {
|
||||
let _ = writeln!(result, "• {}", need);
|
||||
}
|
||||
result.push('\n');
|
||||
}
|
||||
|
||||
if !summary.unresolved_issues.is_empty() {
|
||||
result.push_str("*Unresolved:*\n");
|
||||
for issue in &summary.unresolved_issues {
|
||||
let _ = writeln!(result, "• {}", issue);
|
||||
}
|
||||
result.push('\n');
|
||||
}
|
||||
|
||||
{
|
||||
let _ = write!(
|
||||
result,
|
||||
" {} messages | {} minutes | Sentiment: {}",
|
||||
summary.message_count, summary.duration_minutes, summary.sentiment_trend
|
||||
);
|
||||
|
||||
if !summary.recommended_action.is_empty() {
|
||||
let _ = write!(result, "\n\n *Recommended:* {}", summary.recommended_action);
|
||||
}
|
||||
}
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
pub fn get_help_text() -> String {
|
||||
r"*Attendant Commands*
|
||||
|
||||
*Queue Management:*
|
||||
`/queue` - View waiting conversations
|
||||
`/take` - Take next conversation
|
||||
`/transfer @name` - Transfer conversation
|
||||
`/resolve` - Mark as resolved
|
||||
`/status [online|busy|away|offline]`
|
||||
|
||||
*AI Assistance:*
|
||||
`/tips` - Get tips for current conversation
|
||||
`/polish <message>` - Improve your message
|
||||
`/replies` - Get smart reply suggestions
|
||||
`/summary` - Get conversation summary
|
||||
|
||||
*Other:*
|
||||
`/help` - Show this help
|
||||
|
||||
_Portuguese: /fila, /pegar, /transferir, /resolver, /dicas, /polir, /respostas, /resumo, /ajuda_"
|
||||
.to_string()
|
||||
}
|
||||
111
src/attendance/llm_assist_config.rs
Normal file
111
src/attendance/llm_assist_config.rs
Normal file
|
|
@ -0,0 +1,111 @@
|
|||
use super::llm_assist_types::LlmAssistConfig;
|
||||
use log::info;
|
||||
use std::path::PathBuf;
|
||||
use uuid::Uuid;
|
||||
|
||||
impl LlmAssistConfig {
|
||||
pub fn from_config(bot_id: Uuid, work_path: &str) -> Self {
|
||||
let config_path = PathBuf::from(work_path)
|
||||
.join(format!("{}.gbai", bot_id))
|
||||
.join("config.csv");
|
||||
|
||||
let alt_path = PathBuf::from(work_path).join("config.csv");
|
||||
|
||||
let path = if config_path.exists() {
|
||||
config_path
|
||||
} else if alt_path.exists() {
|
||||
alt_path
|
||||
} else {
|
||||
return Self::default();
|
||||
};
|
||||
|
||||
let mut config = Self::default();
|
||||
|
||||
if let Ok(content) = std::fs::read_to_string(&path) {
|
||||
for line in content.lines() {
|
||||
let parts: Vec<&str> = line.split(',').map(|s| s.trim()).collect();
|
||||
|
||||
if parts.len() < 2 {
|
||||
continue;
|
||||
}
|
||||
|
||||
let key = parts[0].to_lowercase();
|
||||
let value = parts[1];
|
||||
|
||||
match key.as_str() {
|
||||
"attendant-llm-tips" => {
|
||||
config.tips_enabled = value.to_lowercase() == "true";
|
||||
}
|
||||
"attendant-polish-message" => {
|
||||
config.polish_enabled = value.to_lowercase() == "true";
|
||||
}
|
||||
"attendant-smart-replies" => {
|
||||
config.smart_replies_enabled = value.to_lowercase() == "true";
|
||||
}
|
||||
"attendant-auto-summary" => {
|
||||
config.auto_summary_enabled = value.to_lowercase() == "true";
|
||||
}
|
||||
"attendant-sentiment-analysis" => {
|
||||
config.sentiment_enabled = value.to_lowercase() == "true";
|
||||
}
|
||||
"bot-description" | "bot_description" => {
|
||||
config.bot_description = Some(value.to_string());
|
||||
}
|
||||
"bot-system-prompt" | "system-prompt" => {
|
||||
config.bot_system_prompt = Some(value.to_string());
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
info!(
|
||||
"LLM Assist config loaded: tips={}, polish={}, replies={}, summary={}, sentiment={}",
|
||||
config.tips_enabled,
|
||||
config.polish_enabled,
|
||||
config.smart_replies_enabled,
|
||||
config.auto_summary_enabled,
|
||||
config.sentiment_enabled
|
||||
);
|
||||
|
||||
config
|
||||
}
|
||||
|
||||
pub fn any_enabled(&self) -> bool {
|
||||
self.tips_enabled
|
||||
|| self.polish_enabled
|
||||
|| self.smart_replies_enabled
|
||||
|| self.auto_summary_enabled
|
||||
|| self.sentiment_enabled
|
||||
}
|
||||
}
|
||||
|
||||
pub fn get_bot_system_prompt(bot_id: Uuid, work_path: &str) -> String {
|
||||
let config = LlmAssistConfig::from_config(bot_id, work_path);
|
||||
if let Some(prompt) = config.bot_system_prompt {
|
||||
return prompt;
|
||||
}
|
||||
|
||||
let start_bas_path = PathBuf::from(work_path)
|
||||
.join(format!("{}.gbai", bot_id))
|
||||
.join(format!("{}.gbdialog", bot_id))
|
||||
.join("start.bas");
|
||||
|
||||
if let Ok(content) = std::fs::read_to_string(&start_bas_path) {
|
||||
let mut description_lines = Vec::new();
|
||||
for line in content.lines() {
|
||||
let trimmed = line.trim();
|
||||
if trimmed.starts_with("REM ") || trimmed.starts_with("' ") {
|
||||
let comment = trimmed.trim_start_matches("REM ").trim_start_matches("' ");
|
||||
description_lines.push(comment);
|
||||
} else if !trimmed.is_empty() {
|
||||
break;
|
||||
}
|
||||
}
|
||||
if !description_lines.is_empty() {
|
||||
return description_lines.join(" ");
|
||||
}
|
||||
}
|
||||
|
||||
"You are a professional customer service assistant. Be helpful, empathetic, and solution-oriented. Maintain a friendly but professional tone.".to_string()
|
||||
}
|
||||
564
src/attendance/llm_assist_handlers.rs
Normal file
564
src/attendance/llm_assist_handlers.rs
Normal file
|
|
@ -0,0 +1,564 @@
|
|||
use super::llm_assist_types::*;
|
||||
use super::llm_assist_config::get_bot_system_prompt;
|
||||
use super::llm_assist_helpers::*;
|
||||
use crate::core::config::ConfigManager;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
extract::{Path, State},
|
||||
http::StatusCode,
|
||||
response::IntoResponse,
|
||||
Json,
|
||||
};
|
||||
use log::{error, info};
|
||||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
pub async fn generate_tips(
|
||||
State(state): State<Arc<AppState>>,
|
||||
Json(request): Json<TipRequest>,
|
||||
) -> (StatusCode, Json<TipResponse>) {
|
||||
info!("Generating tips for session {}", request.session_id);
|
||||
|
||||
let session_result = get_session(&state, request.session_id).await;
|
||||
let session = match session_result {
|
||||
Ok(s) => s,
|
||||
Err(e) => {
|
||||
return (
|
||||
StatusCode::NOT_FOUND,
|
||||
Json(TipResponse {
|
||||
success: false,
|
||||
tips: vec![],
|
||||
error: Some(e),
|
||||
}),
|
||||
)
|
||||
}
|
||||
};
|
||||
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(session.bot_id, &work_path);
|
||||
|
||||
if !config.tips_enabled {
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(TipResponse {
|
||||
success: true,
|
||||
tips: vec![],
|
||||
error: Some("Tips feature is disabled".to_string()),
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let history_context = request
|
||||
.history
|
||||
.iter()
|
||||
.map(|m| format!("{}: {}", m.role, m.content))
|
||||
.collect::<Vec<_>>()
|
||||
.join("\n");
|
||||
|
||||
let bot_prompt = get_bot_system_prompt(session.bot_id, &work_path);
|
||||
|
||||
let system_prompt = format!(
|
||||
r#"You are an AI assistant helping a human customer service attendant.
|
||||
The bot they are replacing has this personality: {}
|
||||
|
||||
Your job is to provide helpful tips to the attendant based on the customer's message.
|
||||
|
||||
Analyze the customer message and provide 2-4 actionable tips. For each tip, classify it as:
|
||||
- intent: What the customer wants
|
||||
- action: Suggested action for attendant
|
||||
- warning: Sentiment or escalation concern
|
||||
- knowledge: Relevant info they should know
|
||||
- history: Insight from conversation history
|
||||
- general: General helpful advice
|
||||
|
||||
Respond in JSON format:
|
||||
{{
|
||||
"tips": [
|
||||
{{"type": "intent", "content": "...", "confidence": 0.9, "priority": 1}},
|
||||
{{"type": "action", "content": "...", "confidence": 0.8, "priority": 2}}
|
||||
]
|
||||
}}"#,
|
||||
bot_prompt
|
||||
);
|
||||
|
||||
let user_prompt = format!(
|
||||
r#"Conversation history:
|
||||
{}
|
||||
|
||||
Latest customer message: "{}"
|
||||
|
||||
Provide tips for the attendant."#,
|
||||
history_context, request.customer_message
|
||||
);
|
||||
|
||||
match execute_llm_with_context(&state, session.bot_id, &system_prompt, &user_prompt).await {
|
||||
Ok(response) => {
|
||||
let tips = parse_tips_response(&response);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(TipResponse {
|
||||
success: true,
|
||||
tips,
|
||||
error: None,
|
||||
}),
|
||||
)
|
||||
}
|
||||
Err(e) => {
|
||||
error!("LLM error generating tips: {}", e);
|
||||
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(TipResponse {
|
||||
success: true,
|
||||
tips: generate_fallback_tips(&request.customer_message),
|
||||
error: Some(format!("LLM unavailable, using fallback: {}", e)),
|
||||
}),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn polish_message(
|
||||
State(state): State<Arc<AppState>>,
|
||||
Json(request): Json<PolishRequest>,
|
||||
) -> (StatusCode, Json<PolishResponse>) {
|
||||
info!("Polishing message for session {}", request.session_id);
|
||||
|
||||
let session_result = get_session(&state, request.session_id).await;
|
||||
let session = match session_result {
|
||||
Ok(s) => s,
|
||||
Err(e) => {
|
||||
return (
|
||||
StatusCode::NOT_FOUND,
|
||||
Json(PolishResponse {
|
||||
success: false,
|
||||
original: request.message.clone(),
|
||||
polished: request.message.clone(),
|
||||
changes: vec![],
|
||||
error: Some(e),
|
||||
}),
|
||||
)
|
||||
}
|
||||
};
|
||||
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(session.bot_id, &work_path);
|
||||
|
||||
if !config.polish_enabled {
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(PolishResponse {
|
||||
success: true,
|
||||
original: request.message.clone(),
|
||||
polished: request.message.clone(),
|
||||
changes: vec![],
|
||||
error: Some("Polish feature is disabled".to_string()),
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let bot_prompt = get_bot_system_prompt(session.bot_id, &work_path);
|
||||
|
||||
let system_prompt = format!(
|
||||
r#"You are a professional editor helping a customer service attendant.
|
||||
The service has this tone: {}
|
||||
|
||||
Your job is to polish the attendant's message to be more {} while:
|
||||
1. Fixing grammar and spelling errors
|
||||
2. Improving clarity and flow
|
||||
3. Maintaining the original meaning
|
||||
4. Keeping it natural (not robotic)
|
||||
|
||||
Respond in JSON format:
|
||||
{{
|
||||
"polished": "The improved message",
|
||||
"changes": ["Changed X to Y", "Fixed grammar in..."]
|
||||
}}"#,
|
||||
bot_prompt, request.tone
|
||||
);
|
||||
|
||||
let user_prompt = format!(
|
||||
r#"Polish this message with a {} tone:
|
||||
|
||||
"{}"#,
|
||||
request.tone, request.message
|
||||
);
|
||||
|
||||
match execute_llm_with_context(&state, session.bot_id, &system_prompt, &user_prompt).await {
|
||||
Ok(response) => {
|
||||
let (polished, changes) = parse_polish_response(&response, &request.message);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(PolishResponse {
|
||||
success: true,
|
||||
original: request.message.clone(),
|
||||
polished,
|
||||
changes,
|
||||
error: None,
|
||||
}),
|
||||
)
|
||||
}
|
||||
Err(e) => {
|
||||
error!("LLM error polishing message: {}", e);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(PolishResponse {
|
||||
success: false,
|
||||
original: request.message.clone(),
|
||||
polished: request.message.clone(),
|
||||
changes: vec![],
|
||||
error: Some(format!("LLM error: {}", e)),
|
||||
}),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn generate_smart_replies(
|
||||
State(state): State<Arc<AppState>>,
|
||||
Json(request): Json<SmartRepliesRequest>,
|
||||
) -> (StatusCode, Json<SmartRepliesResponse>) {
|
||||
info!(
|
||||
"Generating smart replies for session {}",
|
||||
request.session_id
|
||||
);
|
||||
|
||||
let session_result = get_session(&state, request.session_id).await;
|
||||
let session = match session_result {
|
||||
Ok(s) => s,
|
||||
Err(e) => {
|
||||
return (
|
||||
StatusCode::NOT_FOUND,
|
||||
Json(SmartRepliesResponse {
|
||||
success: false,
|
||||
replies: vec![],
|
||||
error: Some(e),
|
||||
}),
|
||||
)
|
||||
}
|
||||
};
|
||||
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(session.bot_id, &work_path);
|
||||
|
||||
if !config.smart_replies_enabled {
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(SmartRepliesResponse {
|
||||
success: true,
|
||||
replies: vec![],
|
||||
error: Some("Smart replies feature is disabled".to_string()),
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let history_context = request
|
||||
.history
|
||||
.iter()
|
||||
.map(|m| format!("{}: {}", m.role, m.content))
|
||||
.collect::<Vec<_>>()
|
||||
.join("\n");
|
||||
|
||||
let bot_prompt = get_bot_system_prompt(session.bot_id, &work_path);
|
||||
|
||||
let system_prompt = format!(
|
||||
r#"You are an AI assistant helping a customer service attendant craft responses.
|
||||
The service has this personality: {}
|
||||
|
||||
Generate exactly 3 reply suggestions that:
|
||||
1. Are contextually appropriate
|
||||
2. Sound natural and human (not robotic)
|
||||
3. Vary in approach (one empathetic, one solution-focused, one follow_up)
|
||||
4. Are ready to send (no placeholders like [name])
|
||||
|
||||
Respond in JSON format:
|
||||
{{
|
||||
"replies": [
|
||||
{{"text": "...", "tone": "empathetic", "confidence": 0.9, "category": "answer"}},
|
||||
{{"text": "...", "tone": "professional", "confidence": 0.85, "category": "solution"}},
|
||||
{{"text": "...", "tone": "friendly", "confidence": 0.8, "category": "follow_up"}}
|
||||
]
|
||||
}}"#,
|
||||
bot_prompt
|
||||
);
|
||||
|
||||
let user_prompt = format!(
|
||||
r"Conversation:
|
||||
{}
|
||||
|
||||
Generate 3 reply options for the attendant.",
|
||||
history_context
|
||||
);
|
||||
|
||||
match execute_llm_with_context(&state, session.bot_id, &system_prompt, &user_prompt).await {
|
||||
Ok(response) => {
|
||||
let replies = parse_smart_replies_response(&response);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SmartRepliesResponse {
|
||||
success: true,
|
||||
replies,
|
||||
error: None,
|
||||
}),
|
||||
)
|
||||
}
|
||||
Err(e) => {
|
||||
error!("LLM error generating smart replies: {}", e);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SmartRepliesResponse {
|
||||
success: true,
|
||||
replies: generate_fallback_replies(),
|
||||
error: Some(format!("LLM unavailable, using fallback: {}", e)),
|
||||
}),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn generate_summary(
|
||||
State(state): State<Arc<AppState>>,
|
||||
Path(session_id): Path<Uuid>,
|
||||
) -> (StatusCode, Json<SummaryResponse>) {
|
||||
info!("Generating summary for session {}", session_id);
|
||||
|
||||
let session_result = get_session(&state, session_id).await;
|
||||
let session = match session_result {
|
||||
Ok(s) => s,
|
||||
Err(e) => {
|
||||
return (
|
||||
StatusCode::NOT_FOUND,
|
||||
Json(SummaryResponse {
|
||||
success: false,
|
||||
summary: ConversationSummary::default(),
|
||||
error: Some(e),
|
||||
}),
|
||||
)
|
||||
}
|
||||
};
|
||||
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(session.bot_id, &work_path);
|
||||
|
||||
if !config.auto_summary_enabled {
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(SummaryResponse {
|
||||
success: true,
|
||||
summary: ConversationSummary::default(),
|
||||
error: Some("Auto-summary feature is disabled".to_string()),
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let history = load_conversation_history(&state, session_id).await;
|
||||
|
||||
if history.is_empty() {
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(SummaryResponse {
|
||||
success: true,
|
||||
summary: ConversationSummary {
|
||||
brief: "No messages in conversation yet".to_string(),
|
||||
..Default::default()
|
||||
},
|
||||
error: None,
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let history_text = history
|
||||
.iter()
|
||||
.map(|m| format!("{}: {}", m.role, m.content))
|
||||
.collect::<Vec<_>>()
|
||||
.join("\n");
|
||||
|
||||
let bot_prompt = get_bot_system_prompt(session.bot_id, &work_path);
|
||||
|
||||
let system_prompt = format!(
|
||||
r#"You are an AI assistant helping a customer service attendant understand a conversation.
|
||||
The bot/service personality is: {}
|
||||
|
||||
Analyze the conversation and provide a comprehensive summary.
|
||||
|
||||
Respond in JSON format:
|
||||
{{
|
||||
"brief": "One sentence summary",
|
||||
"key_points": ["Point 1", "Point 2"],
|
||||
"customer_needs": ["Need 1", "Need 2"],
|
||||
"unresolved_issues": ["Issue 1"],
|
||||
"sentiment_trend": "improving/stable/declining",
|
||||
"recommended_action": "What the attendant should do next"
|
||||
}}"#,
|
||||
bot_prompt
|
||||
);
|
||||
|
||||
let user_prompt = format!(
|
||||
r"Summarize this conversation:
|
||||
|
||||
{}",
|
||||
history_text
|
||||
);
|
||||
|
||||
match execute_llm_with_context(&state, session.bot_id, &system_prompt, &user_prompt).await {
|
||||
Ok(response) => {
|
||||
let mut summary = parse_summary_response(&response);
|
||||
summary.message_count = history.len() as i32;
|
||||
|
||||
if let (Some(first_ts), Some(last_ts)) = (
|
||||
history.first().and_then(|m| m.timestamp.as_ref()),
|
||||
history.last().and_then(|m| m.timestamp.as_ref()),
|
||||
) {
|
||||
if let (Ok(first), Ok(last)) = (
|
||||
chrono::DateTime::parse_from_rfc3339(first_ts),
|
||||
chrono::DateTime::parse_from_rfc3339(last_ts),
|
||||
) {
|
||||
summary.duration_minutes = (last - first).num_minutes() as i32;
|
||||
}
|
||||
}
|
||||
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SummaryResponse {
|
||||
success: true,
|
||||
summary,
|
||||
error: None,
|
||||
}),
|
||||
)
|
||||
}
|
||||
Err(e) => {
|
||||
error!("LLM error generating summary: {}", e);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SummaryResponse {
|
||||
success: false,
|
||||
summary: ConversationSummary {
|
||||
brief: format!("Conversation with {} messages", history.len()),
|
||||
message_count: history.len() as i32,
|
||||
..Default::default()
|
||||
},
|
||||
error: Some(format!("LLM error: {}", e)),
|
||||
}),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn analyze_sentiment(
|
||||
State(state): State<Arc<AppState>>,
|
||||
Json(request): Json<SentimentRequest>,
|
||||
) -> impl IntoResponse {
|
||||
info!("Analyzing sentiment for session {}", request.session_id);
|
||||
|
||||
let session_result = get_session(&state, request.session_id).await;
|
||||
let session = match session_result {
|
||||
Ok(s) => s,
|
||||
Err(e) => {
|
||||
return (
|
||||
StatusCode::NOT_FOUND,
|
||||
Json(SentimentResponse {
|
||||
success: false,
|
||||
sentiment: SentimentAnalysis::default(),
|
||||
error: Some(e),
|
||||
}),
|
||||
)
|
||||
}
|
||||
};
|
||||
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(session.bot_id, &work_path);
|
||||
|
||||
if !config.sentiment_enabled {
|
||||
let sentiment = analyze_sentiment_keywords(&request.message);
|
||||
return (
|
||||
StatusCode::OK,
|
||||
Json(SentimentResponse {
|
||||
success: true,
|
||||
sentiment,
|
||||
error: Some("LLM sentiment disabled, using keyword analysis".to_string()),
|
||||
}),
|
||||
);
|
||||
}
|
||||
|
||||
let history_context = request
|
||||
.history
|
||||
.iter()
|
||||
.take(5)
|
||||
.map(|m| format!("{}: {}", m.role, m.content))
|
||||
.collect::<Vec<_>>()
|
||||
.join("\n");
|
||||
|
||||
let system_prompt = r#"You are a sentiment analysis expert. Analyze the customer's emotional state.
|
||||
|
||||
Consider:
|
||||
1. Overall sentiment (positive/neutral/negative)
|
||||
2. Specific emotions present
|
||||
3. Risk of escalation
|
||||
4. Urgency level
|
||||
|
||||
Respond in JSON format:
|
||||
{
|
||||
"overall": "positive|neutral|negative",
|
||||
"score": 0.5,
|
||||
"emotions": [{"name": "frustration", "intensity": 0.7}],
|
||||
"escalation_risk": "low|medium|high",
|
||||
"urgency": "low|normal|high|urgent",
|
||||
"emoji": "😐"
|
||||
}"#;
|
||||
|
||||
let user_prompt = format!(
|
||||
r#"Recent conversation:
|
||||
{}
|
||||
|
||||
Current message to analyze: "{}"
|
||||
|
||||
Analyze the customer's sentiment."#,
|
||||
history_context, request.message
|
||||
);
|
||||
|
||||
match execute_llm_with_context(&state, session.bot_id, system_prompt, &user_prompt).await {
|
||||
Ok(response) => {
|
||||
let sentiment = parse_sentiment_response(&response);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SentimentResponse {
|
||||
success: true,
|
||||
sentiment,
|
||||
error: None,
|
||||
}),
|
||||
)
|
||||
}
|
||||
Err(e) => {
|
||||
error!("LLM error analyzing sentiment: {}", e);
|
||||
let sentiment = analyze_sentiment_keywords(&request.message);
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(SentimentResponse {
|
||||
success: true,
|
||||
sentiment,
|
||||
error: Some(format!("LLM unavailable, using fallback: {}", e)),
|
||||
}),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn get_llm_config(
|
||||
State(_state): State<Arc<AppState>>,
|
||||
Path(bot_id): Path<Uuid>,
|
||||
) -> impl IntoResponse {
|
||||
let work_path = std::env::var("WORK_PATH").unwrap_or_else(|_| "./work".to_string());
|
||||
let config = crate::attendance::llm_assist_config::LlmAssistConfig::from_config(bot_id, &work_path);
|
||||
|
||||
(
|
||||
StatusCode::OK,
|
||||
Json(serde_json::json!({
|
||||
"tips_enabled": config.tips_enabled,
|
||||
"polish_enabled": config.polish_enabled,
|
||||
"smart_replies_enabled": config.smart_replies_enabled,
|
||||
"auto_summary_enabled": config.auto_summary_enabled,
|
||||
"sentiment_enabled": config.sentiment_enabled,
|
||||
"any_enabled": config.any_enabled()
|
||||
})),
|
||||
)
|
||||
}
|
||||
613
src/attendance/llm_assist_helpers.rs
Normal file
613
src/attendance/llm_assist_helpers.rs
Normal file
|
|
@ -0,0 +1,613 @@
|
|||
use super::llm_assist_types::*;
|
||||
use crate::core::config::ConfigManager;
|
||||
use crate::core::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use serde_json::json;
|
||||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
// ============================================================================
|
||||
// LLM EXECUTION
|
||||
// ============================================================================
|
||||
|
||||
pub async fn execute_llm_with_context(
|
||||
state: &Arc<AppState>,
|
||||
bot_id: Uuid,
|
||||
system_prompt: &str,
|
||||
user_prompt: &str,
|
||||
) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
|
||||
let config_manager = ConfigManager::new(state.conn.clone());
|
||||
|
||||
let model = config_manager
|
||||
.get_config(&bot_id, "llm-model", None)
|
||||
.unwrap_or_else(|_| {
|
||||
config_manager
|
||||
.get_config(&Uuid::nil(), "llm-model", None)
|
||||
.unwrap_or_default()
|
||||
});
|
||||
|
||||
let key = config_manager
|
||||
.get_config(&bot_id, "llm-key", None)
|
||||
.unwrap_or_else(|_| {
|
||||
config_manager
|
||||
.get_config(&Uuid::nil(), "llm-key", None)
|
||||
.unwrap_or_default()
|
||||
});
|
||||
|
||||
let messages = json::json!(<
|
||||
[
|
||||
{
|
||||
"role": "system",
|
||||
"content": system_prompt
|
||||
},
|
||||
{
|
||||
"role": "user",
|
||||
"content": user_prompt
|
||||
}
|
||||
]
|
||||
>);
|
||||
|
||||
let response = state
|
||||
.llm_provider
|
||||
.generate(user_prompt, &messages, &model, &key)
|
||||
.await?;
|
||||
|
||||
let handler = crate::llm::llm_models::get_handler(&model);
|
||||
let processed = handler.process_content(&response);
|
||||
|
||||
Ok(processed)
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// SESSION HELPERS
|
||||
// ============================================================================
|
||||
|
||||
pub async fn get_session(state: &Arc<AppState>, session_id: Uuid) -> Result<UserSession, String> {
|
||||
let conn = state.conn.clone();
|
||||
|
||||
tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| format!("DB error: {}", e))?;
|
||||
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
user_sessions::table
|
||||
.find(session_id)
|
||||
.first::<UserSession>(&mut db_conn)
|
||||
.map_err(|e| format!("Session not found: {}", e))
|
||||
})
|
||||
.await
|
||||
.map_err(|e| format!("Task error: {}", e))?
|
||||
}
|
||||
|
||||
pub async fn load_conversation_history(
|
||||
state: &Arc<AppState>,
|
||||
session_id: Uuid,
|
||||
) -> Vec<ConversationMessage> {
|
||||
let conn = state.conn.clone();
|
||||
|
||||
let result = tokio::task::spawn_blocking(move || {
|
||||
let Ok(mut db_conn) = conn.get() else {
|
||||
return Vec::new();
|
||||
};
|
||||
|
||||
use crate::core::shared::models::schema::message_history;
|
||||
|
||||
let messages: Vec<(String, i32, chrono::NaiveDateTime)> = message_history::table
|
||||
.filter(message_history::session_id.eq(session_id))
|
||||
.select((
|
||||
message_history::content_encrypted,
|
||||
message_history::role,
|
||||
message_history::created_at,
|
||||
))
|
||||
.order(message_history::created_at.asc())
|
||||
.limit(50)
|
||||
.load(&mut db_conn)
|
||||
.unwrap_or_default();
|
||||
|
||||
messages
|
||||
.into_iter()
|
||||
.map(|(content, role, timestamp)| ConversationMessage {
|
||||
role: match role {
|
||||
0 => "customer".to_string(),
|
||||
1 => "bot".to_string(),
|
||||
2 => "attendant".to_string(),
|
||||
_ => "system".to_string(),
|
||||
},
|
||||
content,
|
||||
timestamp: Some(timestamp.and_utc().to_rfc3339()),
|
||||
})
|
||||
.collect()
|
||||
})
|
||||
.await
|
||||
.unwrap_or_default();
|
||||
|
||||
result
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// RESPONSE PARSERS
|
||||
// ============================================================================
|
||||
|
||||
pub fn parse_tips_response(response: &str) -> Vec<AttendantTip> {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(parsed) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
if let Some(tips_array) = parsed.get("tips").and_then(|t| t.as_array()) {
|
||||
return tips_array
|
||||
.iter()
|
||||
.filter_map(|tip| {
|
||||
let tip_type = match tip
|
||||
.get("type")
|
||||
.and_then(|t| t.as_str())
|
||||
.unwrap_or("general")
|
||||
{
|
||||
"intent" => TipType::Intent,
|
||||
"action" => TipType::Action,
|
||||
"warning" => TipType::Warning,
|
||||
"knowledge" => TipType::Knowledge,
|
||||
"history" => TipType::History,
|
||||
_ => TipType::General,
|
||||
};
|
||||
|
||||
Some(AttendantTip {
|
||||
tip_type,
|
||||
content: tip.get("content").and_then(|c| c.as_str())?.to_string(),
|
||||
confidence: tip
|
||||
.get("confidence")
|
||||
.and_then(|c| c.as_f64())
|
||||
.unwrap_or(0.8) as f32,
|
||||
priority: tip.get("priority").and_then(|p| p.as_i64()).unwrap_or(2) as i32,
|
||||
})
|
||||
})
|
||||
.collect();
|
||||
}
|
||||
}
|
||||
|
||||
if response.trim().is_empty() {
|
||||
Vec::new()
|
||||
} else {
|
||||
vec![AttendantTip {
|
||||
tip_type: TipType::General,
|
||||
content: response.trim().to_string(),
|
||||
confidence: 0.7,
|
||||
priority: 2,
|
||||
}]
|
||||
}
|
||||
}
|
||||
|
||||
pub fn parse_polish_response(response: &str, original: &str) -> (String, Vec<String>) {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(parsed) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
let polished = parsed
|
||||
.get("polished")
|
||||
.and_then(|p| p.as_str())
|
||||
.unwrap_or(original)
|
||||
.to_string();
|
||||
|
||||
let changes = parsed
|
||||
.get("changes")
|
||||
.and_then(|c| c.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|v| v.as_str().map(String::from))
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default();
|
||||
|
||||
return (polished, changes);
|
||||
}
|
||||
|
||||
(
|
||||
response.trim().to_string(),
|
||||
vec!["Message improved".to_string()],
|
||||
)
|
||||
}
|
||||
|
||||
pub fn parse_smart_replies_response(response: &str) -> Vec<SmartReply> {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(parsed) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
if let Some(replies_array) = parsed.get("replies").and_then(|r| r.as_array()) {
|
||||
return replies_array
|
||||
.iter()
|
||||
.filter_map(|reply| {
|
||||
Some(SmartReply {
|
||||
text: reply.get("text").and_then(|t| t.as_str())?.to_string(),
|
||||
tone: reply
|
||||
.get("tone")
|
||||
.and_then(|t| t.as_str())
|
||||
.unwrap_or("professional")
|
||||
.to_string(),
|
||||
confidence: reply
|
||||
.get("confidence")
|
||||
.and_then(|c| c.as_f64())
|
||||
.unwrap_or(0.8) as f32,
|
||||
category: reply
|
||||
.get("category")
|
||||
.and_then(|c| c.as_str())
|
||||
.unwrap_or("answer")
|
||||
.to_string(),
|
||||
})
|
||||
})
|
||||
.collect();
|
||||
}
|
||||
}
|
||||
|
||||
generate_fallback_replies()
|
||||
}
|
||||
|
||||
pub fn parse_summary_response(response: &str) -> ConversationSummary {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(parsed) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
return ConversationSummary {
|
||||
brief: parsed
|
||||
.get("brief")
|
||||
.and_then(|b| b.as_str())
|
||||
.unwrap_or("Conversation summary")
|
||||
.to_string(),
|
||||
key_points: parsed
|
||||
.get("key_points")
|
||||
.and_then(|k| k.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|v| v.as_str().map(String::from))
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default(),
|
||||
customer_needs: parsed
|
||||
.get("customer_needs")
|
||||
.and_then(|c| c.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|v| v.as_str().map(String::from))
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default(),
|
||||
unresolved_issues: parsed
|
||||
.get("unresolved_issues")
|
||||
.and_then(|u| u.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|v| v.as_str().map(String::from))
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default(),
|
||||
sentiment_trend: parsed
|
||||
.get("sentiment_trend")
|
||||
.and_then(|s| s.as_str())
|
||||
.unwrap_or("stable")
|
||||
.to_string(),
|
||||
recommended_action: parsed
|
||||
.get("recommended_action")
|
||||
.and_then(|r| r.as_str())
|
||||
.unwrap_or("")
|
||||
.to_string(),
|
||||
..Default::default()
|
||||
};
|
||||
}
|
||||
|
||||
ConversationSummary {
|
||||
brief: response.trim().to_string(),
|
||||
..Default::default()
|
||||
}
|
||||
}
|
||||
|
||||
pub fn parse_sentiment_response(response: &str) -> SentimentAnalysis {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(parsed) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
let emotions = parsed
|
||||
.get("emotions")
|
||||
.and_then(|e| e.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|e| {
|
||||
Some(Emotion {
|
||||
name: e.get("name").and_then(|n| n.as_str())?.to_string(),
|
||||
intensity: e.get("intensity").and_then(|i| i.as_f64()).unwrap_or(0.5)
|
||||
as f32,
|
||||
})
|
||||
})
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default();
|
||||
|
||||
return SentimentAnalysis {
|
||||
overall: parsed
|
||||
.get("overall")
|
||||
.and_then(|o| o.as_str())
|
||||
.unwrap_or("neutral")
|
||||
.to_string(),
|
||||
score: parsed.get("score").and_then(|s| s.as_f64()).unwrap_or(0.0) as f32,
|
||||
emotions,
|
||||
escalation_risk: parsed
|
||||
.get("escalation_risk")
|
||||
.and_then(|e| e.as_str())
|
||||
.unwrap_or("low")
|
||||
.to_string(),
|
||||
urgency: parsed
|
||||
.get("urgency")
|
||||
.and_then(|u| u.as_str())
|
||||
.unwrap_or("normal")
|
||||
.to_string(),
|
||||
emoji: parsed
|
||||
.get("emoji")
|
||||
.and_then(|e| e.as_str())
|
||||
.unwrap_or("😐")
|
||||
.to_string(),
|
||||
};
|
||||
}
|
||||
|
||||
SentimentAnalysis::default()
|
||||
}
|
||||
|
||||
pub fn extract_json(response: &str) -> String {
|
||||
if let Some(start) = response.find('{') {
|
||||
if let Some(end) = response.rfind('}') {
|
||||
if end > start {
|
||||
return response[start..=end].to_string();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(start) = response.find('[') {
|
||||
if let Some(end) = response.rfind(']') {
|
||||
if end > start {
|
||||
return response[start..=end].to_string();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
response.to_string()
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// FALLBACK FUNCTIONS
|
||||
// ============================================================================
|
||||
|
||||
pub fn generate_fallback_tips(message: &str) -> Vec<AttendantTip> {
|
||||
let msg_lower = message.to_lowercase();
|
||||
let mut tips = Vec::new();
|
||||
|
||||
if msg_lower.contains("urgent")
|
||||
|| msg_lower.contains("asap")
|
||||
|| msg_lower.contains("immediately")
|
||||
|| msg_lower.contains("emergency")
|
||||
{
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::Warning,
|
||||
content: "Customer indicates urgency - prioritize quick response".to_string(),
|
||||
confidence: 0.9,
|
||||
priority: 1,
|
||||
});
|
||||
}
|
||||
|
||||
if msg_lower.contains("frustrated")
|
||||
|| msg_lower.contains("angry")
|
||||
|| msg_lower.contains("ridiculous")
|
||||
|| msg_lower.contains("unacceptable")
|
||||
{
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::Warning,
|
||||
content: "Customer may be frustrated - use empathetic language".to_string(),
|
||||
confidence: 0.85,
|
||||
priority: 1,
|
||||
});
|
||||
}
|
||||
|
||||
if message.contains('?') {
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::Intent,
|
||||
content: "Customer is asking a question - provide clear, direct answer".to_string(),
|
||||
confidence: 0.8,
|
||||
priority: 2,
|
||||
});
|
||||
}
|
||||
|
||||
if msg_lower.contains("problem")
|
||||
|| msg_lower.contains("issue")
|
||||
|| msg_lower.contains("not working")
|
||||
|| msg_lower.contains("broken")
|
||||
{
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::Action,
|
||||
content: "Customer reporting an issue - acknowledge and gather details".to_string(),
|
||||
confidence: 0.8,
|
||||
priority: 2,
|
||||
});
|
||||
}
|
||||
|
||||
if msg_lower.contains("thank")
|
||||
|| msg_lower.contains("great")
|
||||
|| msg_lower.contains("perfect")
|
||||
|| msg_lower.contains("awesome")
|
||||
{
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::General,
|
||||
content: "Customer is expressing satisfaction - good opportunity to close or upsell"
|
||||
.to_string(),
|
||||
confidence: 0.85,
|
||||
priority: 3,
|
||||
});
|
||||
}
|
||||
|
||||
if tips.is_empty() {
|
||||
tips.push(AttendantTip {
|
||||
tip_type: TipType::General,
|
||||
content: "Read message carefully and respond helpfully".to_string(),
|
||||
confidence: 0.5,
|
||||
priority: 3,
|
||||
});
|
||||
}
|
||||
|
||||
tips
|
||||
}
|
||||
|
||||
pub fn generate_fallback_replies() -> Vec<SmartReply> {
|
||||
vec![
|
||||
SmartReply {
|
||||
text: "Thank you for reaching out! I'd be happy to help you with that. Could you provide me with a bit more detail?".to_string(),
|
||||
tone: "friendly".to_string(),
|
||||
confidence: 0.7,
|
||||
category: "greeting".to_string(),
|
||||
},
|
||||
SmartReply {
|
||||
text: "I understand your concern. Let me look into this for you right away.".to_string(),
|
||||
tone: "empathetic".to_string(),
|
||||
confidence: 0.7,
|
||||
category: "acknowledgment".to_string(),
|
||||
},
|
||||
SmartReply {
|
||||
text: "Is there anything else I can help you with today?".to_string(),
|
||||
tone: "professional".to_string(),
|
||||
confidence: 0.7,
|
||||
category: "follow_up".to_string(),
|
||||
},
|
||||
]
|
||||
}
|
||||
|
||||
pub fn analyze_sentiment_keywords(message: &str) -> SentimentAnalysis {
|
||||
let msg_lower = message.to_lowercase();
|
||||
|
||||
let positive_words = [
|
||||
"thank", "great", "perfect", "awesome", "excellent", "good", "happy", "love", "appreciate",
|
||||
"wonderful", "fantastic", "amazing", "helpful",
|
||||
];
|
||||
let negative_words = [
|
||||
"angry", "frustrated", "terrible", "awful", "horrible", "worst", "hate", "disappointed",
|
||||
"unacceptable", "ridiculous", "stupid", "problem", "issue", "broken", "failed", "error",
|
||||
];
|
||||
let urgent_words = ["urgent", "asap", "immediately", "emergency", "now", "critical"];
|
||||
|
||||
let positive_count = positive_words.iter().filter(|w| msg_lower.contains(*w)).count();
|
||||
let negative_count = negative_words.iter().filter(|w| msg_lower.contains(*w)).count();
|
||||
let urgent_count = urgent_words.iter().filter(|w| msg_lower.contains(*w)).count();
|
||||
|
||||
let score = match positive_count.cmp(&negative_count) {
|
||||
std::cmp::Ordering::Greater => 0.3 + (positive_count as f32 * 0.2).min(0.7),
|
||||
std::cmp::Ordering::Less => -0.3 - (negative_count as f32 * 0.2).min(0.7),
|
||||
std::cmp::Ordering::Equal => 0.0,
|
||||
};
|
||||
|
||||
let overall = if score > 0.2 {
|
||||
"positive"
|
||||
} else if score < -0.2 {
|
||||
"negative"
|
||||
} else {
|
||||
"neutral"
|
||||
};
|
||||
|
||||
let escalation_risk = if negative_count >= 3 {
|
||||
"high"
|
||||
} else if negative_count >= 1 {
|
||||
"medium"
|
||||
} else {
|
||||
"low"
|
||||
};
|
||||
|
||||
let urgency = if urgent_count >= 2 {
|
||||
"urgent"
|
||||
} else if urgent_count >= 1 {
|
||||
"high"
|
||||
} else {
|
||||
"normal"
|
||||
};
|
||||
|
||||
let emoji = match overall {
|
||||
"positive" => "😊",
|
||||
"negative" => "😟",
|
||||
_ => "😐",
|
||||
};
|
||||
|
||||
let mut emotions = Vec::new();
|
||||
if negative_count > 0 {
|
||||
emotions.push(Emotion {
|
||||
name: "frustration".to_string(),
|
||||
intensity: (negative_count as f32 * 0.3).min(1.0),
|
||||
});
|
||||
}
|
||||
if positive_count > 0 {
|
||||
emotions.push(Emotion {
|
||||
name: "satisfaction".to_string(),
|
||||
intensity: (positive_count as f32 * 0.3).min(1.0),
|
||||
});
|
||||
}
|
||||
if urgent_count > 0 {
|
||||
emotions.push(Emotion {
|
||||
name: "anxiety".to_string(),
|
||||
intensity: (urgent_count as f32 * 0.4).min(1.0),
|
||||
});
|
||||
}
|
||||
|
||||
SentimentAnalysis {
|
||||
overall: overall.to_string(),
|
||||
score,
|
||||
emotions,
|
||||
escalation_risk: escalation_risk.to_string(),
|
||||
urgency: urgency.to_string(),
|
||||
emoji: emoji.to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// TESTS
|
||||
// ============================================================================
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn test_fallback_tips_urgent() {
|
||||
let tips = generate_fallback_tips("This is URGENT! I need help immediately!");
|
||||
assert!(!tips.is_empty());
|
||||
assert!(tips.iter().any(|t| matches!(t.tip_type, TipType::Warning)));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_fallback_tips_question() {
|
||||
let tips = generate_fallback_tips("How do I reset my password?");
|
||||
assert!(!tips.is_empty());
|
||||
assert!(tips.iter().any(|t| matches!(t.tip_type, TipType::Intent)));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_sentiment_positive() {
|
||||
let sentiment = analyze_sentiment_keywords("Thank you so much! This is great!");
|
||||
assert_eq!(sentiment.overall, "positive");
|
||||
assert!(sentiment.score > 0.0);
|
||||
assert_eq!(sentiment.escalation_risk, "low");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_sentiment_negative() {
|
||||
let sentiment =
|
||||
analyze_sentiment_keywords("This is terrible! I'm very frustrated with this problem.");
|
||||
assert_eq!(sentiment.overall, "negative");
|
||||
assert!(sentiment.score < 0.0);
|
||||
assert!(sentiment.escalation_risk == "medium" || sentiment.escalation_risk == "high");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_sentiment_urgent() {
|
||||
let sentiment = analyze_sentiment_keywords("I need help ASAP! This is urgent!");
|
||||
assert!(sentiment.urgency == "high" || sentiment.urgency == "urgent");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_extract_json() {
|
||||
let response = "Here is the result: {\"key\": \"value\"} and some more text.";
|
||||
let json = extract_json(&response);
|
||||
assert_eq!(json, "{\"key\": \"value\"}");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_fallback_replies() {
|
||||
let replies = generate_fallback_replies();
|
||||
assert_eq!(replies.len(), 3);
|
||||
assert!(replies.iter().any(|r| r.category == "greeting"));
|
||||
assert!(replies.iter().any(|r| r.category == "follow_up"));
|
||||
}
|
||||
}
|
||||
173
src/attendance/llm_assist_types.rs
Normal file
173
src/attendance/llm_assist_types.rs
Normal file
|
|
@ -0,0 +1,173 @@
|
|||
use serde::{Deserialize, Serialize};
|
||||
use uuid::Uuid;
|
||||
|
||||
// ============================================================================
|
||||
// CONFIG TYPES
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Default)]
|
||||
pub struct LlmAssistConfig {
|
||||
pub tips_enabled: bool,
|
||||
pub polish_enabled: bool,
|
||||
pub smart_replies_enabled: bool,
|
||||
pub auto_summary_enabled: bool,
|
||||
pub sentiment_enabled: bool,
|
||||
pub bot_system_prompt: Option<String>,
|
||||
pub bot_description: Option<String>,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// REQUEST TYPES
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct TipRequest {
|
||||
pub session_id: Uuid,
|
||||
pub customer_message: String,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct PolishRequest {
|
||||
pub session_id: Uuid,
|
||||
pub message: String,
|
||||
#[serde(default = "default_tone")]
|
||||
pub tone: String,
|
||||
}
|
||||
|
||||
fn default_tone() -> String {
|
||||
"professional".to_string()
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SmartRepliesRequest {
|
||||
pub session_id: Uuid,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SummaryRequest {
|
||||
pub session_id: Uuid,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SentimentRequest {
|
||||
pub session_id: Uuid,
|
||||
pub message: String,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// RESPONSE TYPES
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct TipResponse {
|
||||
pub success: bool,
|
||||
pub tips: Vec<AttendantTip>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct PolishResponse {
|
||||
pub success: bool,
|
||||
pub original: String,
|
||||
pub polished: String,
|
||||
pub changes: Vec<String>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SmartRepliesResponse {
|
||||
pub success: bool,
|
||||
pub replies: Vec<SmartReply>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SummaryResponse {
|
||||
pub success: bool,
|
||||
pub summary: ConversationSummary,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SentimentResponse {
|
||||
pub success: bool,
|
||||
pub sentiment: SentimentAnalysis,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// COMMON DATA TYPES
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct ConversationMessage {
|
||||
pub role: String,
|
||||
pub content: String,
|
||||
pub timestamp: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct AttendantTip {
|
||||
pub tip_type: TipType,
|
||||
pub content: String,
|
||||
pub confidence: f32,
|
||||
pub priority: i32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(rename_all = "snake_case")]
|
||||
pub enum TipType {
|
||||
Intent,
|
||||
Action,
|
||||
Warning,
|
||||
Knowledge,
|
||||
History,
|
||||
General,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct SmartReply {
|
||||
pub text: String,
|
||||
pub tone: String,
|
||||
pub confidence: f32,
|
||||
pub category: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Default)]
|
||||
pub struct ConversationSummary {
|
||||
pub brief: String,
|
||||
pub key_points: Vec<String>,
|
||||
pub customer_needs: Vec<String>,
|
||||
pub unresolved_issues: Vec<String>,
|
||||
pub sentiment_trend: String,
|
||||
pub recommended_action: String,
|
||||
pub message_count: i32,
|
||||
pub duration_minutes: i32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Default)]
|
||||
pub struct SentimentAnalysis {
|
||||
pub overall: String,
|
||||
pub score: f32,
|
||||
pub emotions: Vec<Emotion>,
|
||||
pub escalation_risk: String,
|
||||
pub urgency: String,
|
||||
pub emoji: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct Emotion {
|
||||
pub name: String,
|
||||
pub intensity: f32,
|
||||
}
|
||||
168
src/attendance/llm_parser.rs
Normal file
168
src/attendance/llm_parser.rs
Normal file
|
|
@ -0,0 +1,168 @@
|
|||
//! Response parsing utilities for LLM assist
|
||||
//!
|
||||
//! Extracted from llm_assist.rs
|
||||
|
||||
use serde::{Deserialize, Serialize};
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AttendantTip {
|
||||
pub content: String,
|
||||
pub rationale: String,
|
||||
pub tone: String,
|
||||
pub applicable_context: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct SmartReply {
|
||||
pub content: String,
|
||||
pub rationale: String,
|
||||
pub tone: String,
|
||||
pub confidence: Option<f32>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct ConversationSummary {
|
||||
pub summary: String,
|
||||
pub key_points: Vec<String>,
|
||||
pub action_items: Vec<String>,
|
||||
pub sentiment: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct SentimentAnalysis {
|
||||
pub sentiment: String,
|
||||
pub confidence: f32,
|
||||
pub key_emotions: Vec<String>,
|
||||
pub suggested_response_tone: String,
|
||||
}
|
||||
|
||||
/// Parse tips from LLM response
|
||||
pub fn parse_tips_response(response: &str) -> Vec<AttendantTip> {
|
||||
// Try to extract JSON array
|
||||
let json_str = extract_json(response);
|
||||
if let Ok(tips) = serde_json::from_str::<Vec<AttendantTip>>(&json_str) {
|
||||
return tips;
|
||||
}
|
||||
|
||||
// Fallback: parse line by line
|
||||
response
|
||||
.lines()
|
||||
.filter_map(|line| {
|
||||
let line = line.trim();
|
||||
if line.starts_with("- ") || line.starts_with("* ") {
|
||||
Some(AttendantTip {
|
||||
content: line[2..].to_string(),
|
||||
rationale: String::new(),
|
||||
tone: "neutral".to_string(),
|
||||
applicable_context: None,
|
||||
})
|
||||
} else {
|
||||
None
|
||||
}
|
||||
})
|
||||
.collect()
|
||||
}
|
||||
|
||||
/// Parse polish response
|
||||
pub fn parse_polish_response(response: &str, original: &str) -> (String, Vec<String>) {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
// Try to parse as JSON object with "polished" field
|
||||
if let Ok(value) = serde_json::from_str::<serde_json::Value>(&json_str) {
|
||||
let polished = value["polished"].as_str().unwrap_or(response).to_string();
|
||||
let suggestions: Vec<String> = value["suggestions"]
|
||||
.as_array()
|
||||
.map(|arr| arr.iter().filter_map(|v| v.as_str().map(String::from)).collect())
|
||||
.unwrap_or_default();
|
||||
return (polished, suggestions);
|
||||
}
|
||||
|
||||
// Fallback: use response as-is
|
||||
(response.to_string(), Vec::new())
|
||||
}
|
||||
|
||||
/// Parse smart replies response
|
||||
pub fn parse_smart_replies_response(response: &str) -> Vec<SmartReply> {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(replies) = serde_json::from_str::<Vec<SmartReply>>(&json_str) {
|
||||
return replies;
|
||||
}
|
||||
|
||||
// Fallback replies
|
||||
vec![
|
||||
SmartReply {
|
||||
content: "I understand. Let me help you with that.".to_string(),
|
||||
rationale: "Default acknowledgement".to_string(),
|
||||
tone: "professional".to_string(),
|
||||
confidence: None,
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
/// Parse summary response
|
||||
pub fn parse_summary_response(response: &str) -> ConversationSummary {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(summary) = serde_json::from_str::<ConversationSummary>(&json_str) {
|
||||
return summary;
|
||||
}
|
||||
|
||||
// Fallback summary
|
||||
ConversationSummary {
|
||||
summary: response.lines().take(3).collect::<Vec<_>>().join(" "),
|
||||
key_points: Vec::new(),
|
||||
action_items: Vec::new(),
|
||||
sentiment: "neutral".to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
/// Parse sentiment response
|
||||
pub fn parse_sentiment_response(response: &str) -> SentimentAnalysis {
|
||||
let json_str = extract_json(response);
|
||||
|
||||
if let Ok(analysis) = serde_json::from_str::<SentimentAnalysis>(&json_str) {
|
||||
return analysis;
|
||||
}
|
||||
|
||||
// Fallback: keyword-based analysis
|
||||
let response_lower = response.to_lowercase();
|
||||
let (sentiment, confidence) = if response_lower.contains("positive") || response_lower.contains("happy") {
|
||||
("positive".to_string(), 0.7)
|
||||
} else if response_lower.contains("negative") || response_lower.contains("angry") {
|
||||
("negative".to_string(), 0.7)
|
||||
} else {
|
||||
("neutral".to_string(), 0.5)
|
||||
};
|
||||
|
||||
SentimentAnalysis {
|
||||
sentiment,
|
||||
confidence,
|
||||
key_emotions: Vec::new(),
|
||||
suggested_response_tone: "professional".to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
/// Extract JSON from response (handles code blocks and plain JSON)
|
||||
pub fn extract_json(response: &str) -> String {
|
||||
// Remove code fences if present
|
||||
let response = response.trim();
|
||||
|
||||
if let Some(start) = response.find("```") {
|
||||
if let Some(json_start) = response[start..].find('{') {
|
||||
let json_part = &response[start + json_start..];
|
||||
if let Some(end) = json_part.find("```") {
|
||||
return json_part[..end].trim().to_string();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Try to find first { and last }
|
||||
if let Some(start) = response.find('{') {
|
||||
if let Some(end) = response.rfind('}') {
|
||||
return response[start..=end].to_string();
|
||||
}
|
||||
}
|
||||
|
||||
response.to_string()
|
||||
}
|
||||
158
src/attendance/llm_types.rs
Normal file
158
src/attendance/llm_types.rs
Normal file
|
|
@ -0,0 +1,158 @@
|
|||
// LLM assist types extracted from llm_assist.rs
|
||||
use serde::{Deserialize, Serialize};
|
||||
use uuid::Uuid;
|
||||
|
||||
#[derive(Debug, Clone, Default)]
|
||||
pub struct LlmAssistConfig {
|
||||
pub tips_enabled: bool,
|
||||
pub polish_enabled: bool,
|
||||
pub smart_replies_enabled: bool,
|
||||
pub auto_summary_enabled: bool,
|
||||
pub sentiment_enabled: bool,
|
||||
pub bot_system_prompt: Option<String>,
|
||||
pub bot_description: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct TipRequest {
|
||||
pub session_id: Uuid,
|
||||
pub customer_message: String,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct PolishRequest {
|
||||
pub session_id: Uuid,
|
||||
pub message: String,
|
||||
#[serde(default = "default_tone")]
|
||||
pub tone: String,
|
||||
}
|
||||
|
||||
fn default_tone() -> String {
|
||||
"professional".to_string()
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SmartRepliesRequest {
|
||||
pub session_id: Uuid,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SummaryRequest {
|
||||
pub session_id: Uuid,
|
||||
}
|
||||
|
||||
#[derive(Debug, Deserialize)]
|
||||
pub struct SentimentRequest {
|
||||
pub session_id: Uuid,
|
||||
pub message: String,
|
||||
#[serde(default)]
|
||||
pub history: Vec<ConversationMessage>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct ConversationMessage {
|
||||
pub role: String,
|
||||
pub content: String,
|
||||
pub timestamp: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct TipResponse {
|
||||
pub success: bool,
|
||||
pub tips: Vec<AttendantTip>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct AttendantTip {
|
||||
pub tip_type: TipType,
|
||||
pub content: String,
|
||||
pub confidence: f32,
|
||||
pub priority: i32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(rename_all = "snake_case")]
|
||||
pub enum TipType {
|
||||
Intent,
|
||||
Action,
|
||||
Warning,
|
||||
Knowledge,
|
||||
History,
|
||||
General,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct PolishResponse {
|
||||
pub success: bool,
|
||||
pub original: String,
|
||||
pub polished: String,
|
||||
pub changes: Vec<String>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SmartRepliesResponse {
|
||||
pub success: bool,
|
||||
pub replies: Vec<SmartReply>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct SmartReply {
|
||||
pub text: String,
|
||||
pub tone: String,
|
||||
pub confidence: f32,
|
||||
pub category: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SummaryResponse {
|
||||
pub success: bool,
|
||||
pub summary: ConversationSummary,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Default)]
|
||||
pub struct ConversationSummary {
|
||||
pub brief: String,
|
||||
pub key_points: Vec<String>,
|
||||
pub customer_needs: Vec<String>,
|
||||
pub unresolved_issues: Vec<String>,
|
||||
pub sentiment_trend: String,
|
||||
pub recommended_action: String,
|
||||
pub message_count: i32,
|
||||
pub duration_minutes: i32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize)]
|
||||
pub struct SentimentResponse {
|
||||
pub success: bool,
|
||||
pub sentiment: SentimentAnalysis,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Default)]
|
||||
pub struct SentimentAnalysis {
|
||||
pub overall: String,
|
||||
pub score: f32,
|
||||
pub emotions: Vec<Emotion>,
|
||||
pub escalation_risk: String,
|
||||
pub urgency: String,
|
||||
pub emoji: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize)]
|
||||
pub struct Emotion {
|
||||
pub name: String,
|
||||
pub intensity: f32,
|
||||
}
|
||||
|
|
@ -1,7 +1,21 @@
|
|||
pub mod drive;
|
||||
pub mod keyword_services;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_types;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist_types;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist_config;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist_handlers;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist_commands;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_assist_helpers;
|
||||
#[cfg(feature = "llm")]
|
||||
pub mod llm_parser;
|
||||
pub mod queue;
|
||||
|
||||
pub use drive::{AttendanceDriveConfig, AttendanceDriveService, RecordMetadata, SyncResult};
|
||||
|
|
@ -10,11 +24,15 @@ pub use keyword_services::{
|
|||
KeywordParser, ParsedCommand,
|
||||
};
|
||||
#[cfg(feature = "llm")]
|
||||
pub use llm_assist::{
|
||||
AttendantTip, ConversationMessage, ConversationSummary, LlmAssistConfig, PolishRequest,
|
||||
PolishResponse, SentimentAnalysis, SentimentResponse, SmartRepliesRequest,
|
||||
SmartRepliesResponse, SmartReply, SummaryRequest, SummaryResponse, TipRequest, TipResponse,
|
||||
TipType,
|
||||
pub use llm_assist_types::*;
|
||||
#[cfg(feature = "llm")]
|
||||
pub use llm_assist::*;
|
||||
#[cfg(feature = "llm")]
|
||||
pub use llm_parser::{
|
||||
AttendantTip, SmartReply,
|
||||
ConversationSummary, SentimentAnalysis,
|
||||
parse_tips_response, parse_polish_response, parse_smart_replies_response,
|
||||
parse_summary_response, parse_sentiment_response, extract_json,
|
||||
};
|
||||
pub use queue::{
|
||||
AssignRequest, AttendantStats, AttendantStatus, QueueFilters, QueueItem, QueueStatus,
|
||||
|
|
@ -24,8 +42,8 @@ pub use queue::{
|
|||
use crate::core::bot::channels::whatsapp::WhatsAppAdapter;
|
||||
use crate::core::bot::channels::ChannelAdapter;
|
||||
use crate::core::urls::ApiUrls;
|
||||
use crate::shared::models::{BotResponse, UserSession};
|
||||
use crate::shared::state::{AppState, AttendantNotification};
|
||||
use crate::core::shared::models::{BotResponse, UserSession};
|
||||
use crate::core::shared::state::{AppState, AttendantNotification};
|
||||
use axum::{
|
||||
extract::{
|
||||
ws::{Message, WebSocket, WebSocketUpgrade},
|
||||
|
|
@ -122,7 +140,7 @@ pub async fn attendant_respond(
|
|||
let conn = state.conn.clone();
|
||||
let session_result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().ok()?;
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
user_sessions::table
|
||||
.find(session_id)
|
||||
.first::<UserSession>(&mut db_conn)
|
||||
|
|
@ -277,7 +295,7 @@ async fn save_message_to_history(
|
|||
tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().map_err(|e| format!("DB error: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::message_history;
|
||||
use crate::core::shared::models::schema::message_history;
|
||||
|
||||
diesel::insert_into(message_history::table)
|
||||
.values((
|
||||
|
|
@ -519,7 +537,7 @@ async fn handle_attendant_message(
|
|||
let conn = state.conn.clone();
|
||||
if let Some(session) = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().ok()?;
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
user_sessions::table
|
||||
.find(uuid)
|
||||
.first::<UserSession>(&mut db_conn)
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
extract::{Path, Query, State},
|
||||
http::StatusCode,
|
||||
|
|
@ -307,8 +307,8 @@ pub async fn list_queue(
|
|||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::shared::models::schema::users;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::users;
|
||||
|
||||
let sessions_data: Vec<UserSession> = user_sessions::table
|
||||
.order(user_sessions::created_at.desc())
|
||||
|
|
@ -399,7 +399,7 @@ pub async fn list_attendants(
|
|||
let conn = state.conn.clone();
|
||||
let result = tokio::task::spawn_blocking(move || {
|
||||
let mut db_conn = conn.get().ok()?;
|
||||
use crate::shared::models::schema::bots;
|
||||
use crate::core::shared::models::schema::bots;
|
||||
bots::table
|
||||
.filter(bots::is_active.eq(true))
|
||||
.select(bots::id)
|
||||
|
|
@ -463,7 +463,7 @@ pub async fn assign_conversation(
|
|||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = user_sessions::table
|
||||
.filter(user_sessions::id.eq(session_id))
|
||||
|
|
@ -538,7 +538,7 @@ pub async fn transfer_conversation(
|
|||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = user_sessions::table
|
||||
.filter(user_sessions::id.eq(session_id))
|
||||
|
|
@ -618,7 +618,7 @@ pub async fn resolve_conversation(
|
|||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = user_sessions::table
|
||||
.filter(user_sessions::id.eq(session_id))
|
||||
|
|
@ -688,7 +688,7 @@ pub async fn get_insights(
|
|||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
|
||||
use crate::shared::models::schema::message_history;
|
||||
use crate::core::shared::models::schema::message_history;
|
||||
|
||||
let messages: Vec<(String, i32)> = message_history::table
|
||||
.filter(message_history::session_id.eq(session_id))
|
||||
|
|
|
|||
|
|
@ -13,13 +13,13 @@ use serde::{Deserialize, Serialize};
|
|||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
use crate::bot::get_default_bot;
|
||||
use crate::core::bot::get_default_bot;
|
||||
use crate::core::shared::schema::{
|
||||
attendant_agent_status, attendant_canned_responses, attendant_queue_agents, attendant_queues,
|
||||
attendant_session_messages, attendant_sessions, attendant_tags, attendant_transfers,
|
||||
attendant_wrap_up_codes,
|
||||
};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize, Queryable, Insertable, AsChangeset)]
|
||||
#[diesel(table_name = attendant_queues)]
|
||||
|
|
@ -998,7 +998,10 @@ pub async fn get_attendant_stats(
|
|||
})?;
|
||||
|
||||
let (org_id, bot_id) = get_bot_context(&state);
|
||||
let today = Utc::now().date_naive().and_hms_opt(0, 0, 0).unwrap();
|
||||
let today = Utc::now().date_naive().and_hms_opt(0, 0, 0).unwrap_or_else(|| {
|
||||
// Fallback to midnight (0,0,0 should always be valid)
|
||||
chrono::NaiveTime::from_hms_opt(0, 0, 0).unwrap_or_else(|| chrono::NaiveTime::MIN)
|
||||
});
|
||||
let today_utc = DateTime::<Utc>::from_naive_utc_and_offset(today, Utc);
|
||||
|
||||
let total_sessions_today: i64 = attendant_sessions::table
|
||||
|
|
|
|||
|
|
@ -10,11 +10,11 @@ use serde::Deserialize;
|
|||
use std::sync::Arc;
|
||||
use uuid::Uuid;
|
||||
|
||||
use crate::bot::get_default_bot;
|
||||
use crate::core::bot::get_default_bot;
|
||||
use crate::core::shared::schema::{
|
||||
attendant_agent_status, attendant_queues, attendant_sessions,
|
||||
};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
|
||||
#[derive(Debug, Deserialize, Default)]
|
||||
pub struct SessionListQuery {
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::core::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
|
||||
fn is_sensitive_config_key(key: &str) -> bool {
|
||||
let key_lower = key.to_lowercase();
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@ use crate::auto_task::task_types::{
|
|||
use crate::auto_task::intent_classifier::IntentClassifier;
|
||||
use crate::auto_task::intent_compiler::IntentCompiler;
|
||||
use crate::auto_task::safety_layer::{SafetyLayer, SimulationResult};
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
extract::{Path, Query, State},
|
||||
http::StatusCode,
|
||||
|
|
@ -1345,8 +1345,8 @@ pub async fn simulate_plan_handler(
|
|||
|
||||
fn get_current_session(
|
||||
state: &Arc<AppState>,
|
||||
) -> Result<crate::shared::models::UserSession, Box<dyn std::error::Error + Send + Sync>> {
|
||||
use crate::shared::models::user_sessions::dsl::*;
|
||||
) -> Result<crate::core::shared::models::UserSession, Box<dyn std::error::Error + Send + Sync>> {
|
||||
use crate::core::shared::models::user_sessions::dsl::*;
|
||||
use diesel::prelude::*;
|
||||
|
||||
let mut conn = state
|
||||
|
|
@ -1356,7 +1356,7 @@ fn get_current_session(
|
|||
|
||||
let session = user_sessions
|
||||
.order(created_at.desc())
|
||||
.first::<crate::shared::models::UserSession>(&mut conn)
|
||||
.first::<crate::core::shared::models::UserSession>(&mut conn)
|
||||
.optional()
|
||||
.map_err(|e| format!("DB query error: {}", e))?
|
||||
.ok_or("No active session found")?;
|
||||
|
|
@ -1366,7 +1366,7 @@ fn get_current_session(
|
|||
|
||||
fn create_auto_task_from_plan(
|
||||
_state: &Arc<AppState>,
|
||||
session: &crate::shared::models::UserSession,
|
||||
session: &crate::core::shared::models::UserSession,
|
||||
plan_id: &str,
|
||||
execution_mode: ExecutionMode,
|
||||
priority: TaskPriority,
|
||||
|
|
@ -1701,7 +1701,7 @@ fn update_task_status(
|
|||
fn create_task_record(
|
||||
state: &Arc<AppState>,
|
||||
task_id: Uuid,
|
||||
session: &crate::shared::models::UserSession,
|
||||
session: &crate::core::shared::models::UserSession,
|
||||
intent: &str,
|
||||
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
|
||||
let mut conn = state.conn.get()?;
|
||||
|
|
@ -1799,7 +1799,7 @@ fn simulate_task_execution(
|
|||
_state: &Arc<AppState>,
|
||||
safety_layer: &SafetyLayer,
|
||||
task_id: &str,
|
||||
session: &crate::shared::models::UserSession,
|
||||
session: &crate::core::shared::models::UserSession,
|
||||
) -> Result<SimulationResult, Box<dyn std::error::Error + Send + Sync>> {
|
||||
info!("Simulating task execution task_id={task_id}");
|
||||
safety_layer.simulate_execution(task_id, session)
|
||||
|
|
@ -1809,7 +1809,7 @@ fn simulate_plan_execution(
|
|||
_state: &Arc<AppState>,
|
||||
safety_layer: &SafetyLayer,
|
||||
plan_id: &str,
|
||||
session: &crate::shared::models::UserSession,
|
||||
session: &crate::core::shared::models::UserSession,
|
||||
) -> Result<SimulationResult, Box<dyn std::error::Error + Send + Sync>> {
|
||||
info!("Simulating plan execution plan_id={plan_id}");
|
||||
safety_layer.simulate_execution(plan_id, session)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{DateTime, Utc};
|
||||
use diesel::prelude::*;
|
||||
use diesel::sql_query;
|
||||
|
|
|
|||
|
|
@ -2,8 +2,8 @@ use crate::auto_task::app_generator::AppGenerator;
|
|||
use crate::auto_task::intent_compiler::IntentCompiler;
|
||||
use crate::basic::ScriptService;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
#[cfg(feature = "llm")]
|
||||
use crate::core::config::ConfigManager;
|
||||
use chrono::{DateTime, Utc};
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
#[cfg(feature = "llm")]
|
||||
use crate::core::config::ConfigManager;
|
||||
use chrono::{DateTime, Utc};
|
||||
|
|
|
|||
|
|
@ -40,7 +40,7 @@ pub use intent_compiler::{CompiledIntent, IntentCompiler};
|
|||
pub use safety_layer::{AuditEntry, ConstraintCheckResult, SafetyLayer, SimulationResult};
|
||||
|
||||
use crate::core::urls::ApiUrls;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
extract::{
|
||||
ws::{Message, WebSocket, WebSocketUpgrade},
|
||||
|
|
@ -53,7 +53,7 @@ use log::{debug, error, info, warn};
|
|||
use std::collections::HashMap;
|
||||
use std::sync::Arc;
|
||||
|
||||
pub fn configure_autotask_routes() -> axum::Router<std::sync::Arc<crate::shared::state::AppState>> {
|
||||
pub fn configure_autotask_routes() -> axum::Router<std::sync::Arc<crate::core::shared::state::AppState>> {
|
||||
use axum::routing::{get, post};
|
||||
|
||||
axum::Router::new()
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{DateTime, Utc};
|
||||
use diesel::prelude::*;
|
||||
use log::{info, trace, warn};
|
||||
|
|
|
|||
|
|
@ -2,8 +2,8 @@
|
|||
use crate::basic::keywords::set_schedule::execute_set_schedule;
|
||||
use crate::basic::keywords::table_definition::process_table_definitions;
|
||||
use crate::basic::keywords::webhook::execute_webhook_registration;
|
||||
use crate::shared::models::TriggerKind;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::TriggerKind;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::ExpressionMethods;
|
||||
use diesel::QueryDsl;
|
||||
use diesel::RunQueryDsl;
|
||||
|
|
@ -424,10 +424,10 @@ impl BasicCompiler {
|
|||
.conn
|
||||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {e}"))?;
|
||||
use crate::shared::models::system_automations::dsl::*;
|
||||
use crate::core::shared::models::system_automations::dsl::*;
|
||||
diesel::delete(
|
||||
system_automations
|
||||
.filter(bot_id.eq(bot_uuid))
|
||||
.filter(bot_id.eq(&bot_uuid))
|
||||
.filter(kind.eq(TriggerKind::Scheduled as i32))
|
||||
.filter(param.eq(&script_name)),
|
||||
)
|
||||
|
|
@ -505,7 +505,13 @@ impl BasicCompiler {
|
|||
}
|
||||
|
||||
if trimmed.to_uppercase().starts_with("USE WEBSITE") {
|
||||
let re = Regex::new(r#"(?i)USE\s+WEBSITE\s+"([^"]+)"(?:\s+REFRESH\s+"([^"]+)")?"#).unwrap();
|
||||
let re = match Regex::new(r#"(?i)USE\s+WEBSITE\s+"([^"]+)"(?:\s+REFRESH\s+"([^"]+)")?"#) {
|
||||
Ok(re) => re,
|
||||
Err(e) => {
|
||||
log::warn!("Invalid regex pattern: {}", e);
|
||||
continue;
|
||||
}
|
||||
};
|
||||
if let Some(caps) = re.captures(&normalized) {
|
||||
if let Some(url_match) = caps.get(1) {
|
||||
let url = url_match.as_str();
|
||||
|
|
@ -548,10 +554,10 @@ impl BasicCompiler {
|
|||
.conn
|
||||
.get()
|
||||
.map_err(|e| format!("Failed to get database connection: {}", e))?;
|
||||
use crate::shared::models::system_automations::dsl::*;
|
||||
use crate::core::shared::models::system_automations::dsl::*;
|
||||
diesel::delete(
|
||||
system_automations
|
||||
.filter(bot_id.eq(bot_uuid))
|
||||
.filter(bot_id.eq(&bot_uuid))
|
||||
.filter(kind.eq(TriggerKind::Scheduled as i32))
|
||||
.filter(param.eq(&script_name)),
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{info, trace, warn};
|
||||
use rhai::{Dynamic, Engine};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{info, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::Utc;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, info, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use serde_json::json;
|
||||
|
|
@ -26,7 +26,7 @@ pub fn clear_suggestions_keyword(
|
|||
.register_custom_syntax(["CLEAR", "SUGGESTIONS"], true, move |_context, _inputs| {
|
||||
if let Some(cache_client) = &cache {
|
||||
let redis_key = format!("suggestions:{}:{}", user_session.user_id, user_session.id);
|
||||
let mut conn = match cache_client.get_connection() {
|
||||
let mut conn: redis::Connection = match cache_client.get_connection() {
|
||||
Ok(conn) => conn,
|
||||
Err(e) => {
|
||||
error!("Failed to connect to cache: {}", e);
|
||||
|
|
@ -366,7 +366,7 @@ pub fn get_suggestions(
|
|||
cache: Option<&Arc<redis::Client>>,
|
||||
user_id: &str,
|
||||
session_id: &str,
|
||||
) -> Vec<crate::shared::models::Suggestion> {
|
||||
) -> Vec<crate::core::shared::models::Suggestion> {
|
||||
let mut suggestions = Vec::new();
|
||||
|
||||
if let Some(cache_client) = cache {
|
||||
|
|
@ -391,7 +391,7 @@ pub fn get_suggestions(
|
|||
Ok(items) => {
|
||||
for item in items {
|
||||
if let Ok(json) = serde_json::from_str::<serde_json::Value>(&item) {
|
||||
let suggestion = crate::shared::models::Suggestion {
|
||||
let suggestion = crate::core::shared::models::Suggestion {
|
||||
text: json["text"].as_str().unwrap_or("").to_string(),
|
||||
context: json["context"].as_str().map(|s| s.to_string()),
|
||||
action: json.get("action").and_then(|v| serde_json::to_string(v).ok()),
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{info, trace, warn};
|
||||
use rhai::{Dynamic, Engine};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{debug, trace};
|
||||
use rhai::{Dynamic, Engine, EvalAltResult, Map, Position};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,4 +1,4 @@
|
|||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, info, trace, warn};
|
||||
use serde::{Deserialize, Serialize};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::core::shared::get_content_type;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::state::AppState;
|
||||
use axum::{
|
||||
body::Body,
|
||||
extract::{Path, State},
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -4,8 +4,8 @@ pub mod slice;
|
|||
pub mod sort;
|
||||
pub mod unique;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Engine};
|
||||
use std::collections::HashSet;
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
use crate::core::shared::schema::calendar_events;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{DateTime, Duration, Timelike, Utc};
|
||||
use diesel::prelude::*;
|
||||
use log::{error, info, trace};
|
||||
|
|
@ -11,7 +11,7 @@ use uuid::Uuid;
|
|||
|
||||
#[derive(Debug)]
|
||||
pub struct CalendarEngine {
|
||||
_db: crate::shared::utils::DbPool,
|
||||
_db: crate::core::shared::utils::DbPool,
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
|
|
@ -49,7 +49,7 @@ pub struct RecurrenceRule {
|
|||
|
||||
impl CalendarEngine {
|
||||
#[must_use]
|
||||
pub fn new(db: crate::shared::utils::DbPool) -> Self {
|
||||
pub fn new(db: crate::core::shared::utils::DbPool) -> Self {
|
||||
Self { _db: db }
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
|
|
@ -24,7 +24,7 @@ pub fn set_bot_memory_keyword(state: Arc<AppState>, user: UserSession, engine: &
|
|||
let value_clone = value;
|
||||
|
||||
tokio::spawn(async move {
|
||||
use crate::shared::models::bot_memories;
|
||||
use crate::core::shared::models::bot_memories;
|
||||
|
||||
let mut conn = match state_for_spawn.conn.get() {
|
||||
Ok(conn) => conn,
|
||||
|
|
@ -78,7 +78,7 @@ pub fn set_bot_memory_keyword(state: Arc<AppState>, user: UserSession, engine: &
|
|||
}
|
||||
}
|
||||
} else {
|
||||
let new_memory = crate::shared::models::BotMemory {
|
||||
let new_memory = crate::core::shared::models::BotMemory {
|
||||
id: Uuid::new_v4(),
|
||||
bot_id: bot_uuid,
|
||||
key: key_clone.clone(),
|
||||
|
|
@ -121,7 +121,7 @@ pub fn set_bot_memory_keyword(state: Arc<AppState>, user: UserSession, engine: &
|
|||
let value_clone = value;
|
||||
|
||||
tokio::spawn(async move {
|
||||
use crate::shared::models::bot_memories;
|
||||
use crate::core::shared::models::bot_memories;
|
||||
|
||||
let mut conn = match state_for_spawn.conn.get() {
|
||||
Ok(conn) => conn,
|
||||
|
|
@ -206,7 +206,7 @@ pub fn set_bot_memory_keyword(state: Arc<AppState>, user: UserSession, engine: &
|
|||
let state_clone3 = Arc::clone(&state);
|
||||
let user_clone3 = user.clone();
|
||||
engine.register_fn("GET_BOT_MEMORY", move |key_param: String| -> String {
|
||||
use crate::shared::models::bot_memories;
|
||||
use crate::core::shared::models::bot_memories;
|
||||
|
||||
let state = Arc::clone(&state_clone3);
|
||||
let conn_result = state.conn.get();
|
||||
|
|
@ -235,7 +235,7 @@ pub fn get_bot_memory_keyword(state: Arc<AppState>, user: UserSession, engine: &
|
|||
|
||||
|
||||
engine.register_fn("GET BOT MEMORY", move |key_param: String| -> String {
|
||||
use crate::shared::models::bot_memories;
|
||||
use crate::core::shared::models::bot_memories;
|
||||
|
||||
let state = Arc::clone(&state_clone);
|
||||
let conn_result = state.conn.get();
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, info};
|
||||
use rhai::{Dynamic, Engine, EvalAltResult};
|
||||
|
|
@ -93,7 +93,7 @@ pub fn register_clear_kb_keyword(
|
|||
}
|
||||
|
||||
fn clear_specific_kb(
|
||||
conn_pool: crate::shared::utils::DbPool,
|
||||
conn_pool: crate::core::shared::utils::DbPool,
|
||||
session_id: Uuid,
|
||||
kb_name: &str,
|
||||
) -> Result<(), String> {
|
||||
|
|
@ -129,7 +129,7 @@ fn clear_specific_kb(
|
|||
}
|
||||
|
||||
fn clear_all_kbs(
|
||||
conn_pool: crate::shared::utils::DbPool,
|
||||
conn_pool: crate::core::shared::utils::DbPool,
|
||||
session_id: Uuid,
|
||||
) -> Result<usize, String> {
|
||||
let mut conn = conn_pool
|
||||
|
|
@ -158,7 +158,7 @@ fn clear_all_kbs(
|
|||
}
|
||||
|
||||
pub fn get_active_kb_count(
|
||||
conn_pool: &crate::shared::utils::DbPool,
|
||||
conn_pool: &crate::core::shared::utils::DbPool,
|
||||
session_id: Uuid,
|
||||
) -> Result<i64, String> {
|
||||
let mut conn = conn_pool
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
use crate::basic::keywords::use_tool::clear_session_tools;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
use crate::security::command_guard::SafeCommand;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{trace, warn};
|
||||
use rhai::{Dynamic, Engine};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{debug, info};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use rhai::Dynamic;
|
||||
use rhai::Engine;
|
||||
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
#[cfg(feature = "llm")]
|
||||
use crate::llm::LLMProvider;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{debug, info};
|
||||
#[cfg(feature = "llm")]
|
||||
use log::warn;
|
||||
|
|
@ -38,7 +38,7 @@ pub fn create_site_keyword(state: &AppState, user: UserSession, engine: &mut Eng
|
|||
let prompt = context.eval_expression_tree(&inputs[2])?;
|
||||
|
||||
let config = match state_clone.config.as_ref() {
|
||||
Some(c) => c.clone(),
|
||||
Some(c) => <crate::core::config::AppConfig as Clone>::clone(c),
|
||||
None => {
|
||||
return Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"Config must be initialized".into(),
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{DateTime, Duration, NaiveDate, Utc};
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::Utc;
|
||||
use diesel::prelude::*;
|
||||
use log::{debug, error, info};
|
||||
|
|
@ -74,7 +74,7 @@ pub fn get_queue_impl(state: &Arc<AppState>, filter: Option<String>) -> Dynamic
|
|||
}
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let mut query = user_sessions::table
|
||||
.filter(
|
||||
|
|
@ -214,7 +214,7 @@ pub fn next_in_queue_impl(state: &Arc<AppState>) -> Dynamic {
|
|||
Err(e) => return create_error_result(&format!("DB error: {}", e)),
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: Option<UserSession> = user_sessions::table
|
||||
.filter(
|
||||
|
|
@ -327,7 +327,7 @@ pub fn assign_conversation_impl(
|
|||
return create_error_result("DB error: failed to get connection");
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = match user_sessions::table.find(session_uuid).first(&mut db_conn)
|
||||
{
|
||||
|
|
@ -414,7 +414,7 @@ pub fn resolve_conversation_impl(
|
|||
return create_error_result("DB error: failed to get connection");
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = match user_sessions::table.find(session_uuid).first(&mut db_conn)
|
||||
{
|
||||
|
|
@ -501,7 +501,7 @@ pub fn set_priority_impl(state: &Arc<AppState>, session_id: &str, priority: Dyna
|
|||
return create_error_result("DB error: failed to get connection");
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = match user_sessions::table.find(session_uuid).first(&mut db_conn)
|
||||
{
|
||||
|
|
@ -682,7 +682,7 @@ pub fn get_attendant_stats_impl(state: &Arc<AppState>, attendant_id: &str) -> Dy
|
|||
Err(e) => return create_error_result(&format!("DB error: {}", e)),
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let today = Utc::now().date_naive();
|
||||
let today_start = today.and_hms_opt(0, 0, 0).unwrap_or_else(|| today.and_hms_opt(0, 0, 1).unwrap_or_default());
|
||||
|
|
@ -966,7 +966,7 @@ pub fn get_summary_impl(state: &Arc<AppState>, session_id: &str) -> Dynamic {
|
|||
Err(e) => return create_error_result(&format!("DB error: {}", e)),
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::message_history;
|
||||
use crate::core::shared::models::schema::message_history;
|
||||
|
||||
let message_count: i64 = message_history::table
|
||||
.filter(message_history::session_id.eq(session_uuid))
|
||||
|
|
@ -1133,7 +1133,7 @@ pub fn tag_conversation_impl(
|
|||
return create_error_result("DB error: failed to get connection");
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = match user_sessions::table.find(session_uuid).first(&mut db_conn)
|
||||
{
|
||||
|
|
@ -1227,7 +1227,7 @@ pub fn add_note_impl(
|
|||
return create_error_result("DB error: failed to get connection");
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let session: UserSession = match user_sessions::table.find(session_uuid).first(&mut db_conn)
|
||||
{
|
||||
|
|
@ -1302,7 +1302,7 @@ pub fn get_customer_history_impl(state: &Arc<AppState>, user_id: &str) -> Dynami
|
|||
Err(e) => return create_error_result(&format!("DB error: {}", e)),
|
||||
};
|
||||
|
||||
use crate::shared::models::schema::user_sessions;
|
||||
use crate::core::shared::models::schema::user_sessions;
|
||||
|
||||
let sessions: Vec<UserSession> = user_sessions::table
|
||||
.filter(user_sessions::user_id.eq(user_uuid))
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
pub mod attendance;
|
||||
pub mod score_lead;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
use crate::core::shared::schema::bot_memories;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::Utc;
|
||||
use diesel::prelude::*;
|
||||
use log::{debug, error, info, trace};
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
use super::table_access::{check_table_access, AccessType, UserRoles};
|
||||
use crate::core::shared::{sanitize_identifier, sanitize_sql_value};
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::shared::utils::{json_value_to_dynamic, to_array};
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use crate::core::shared::utils::{json_value_to_dynamic, to_array};
|
||||
use diesel::prelude::*;
|
||||
use diesel::sql_query;
|
||||
use diesel::sql_types::Text;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{Datelike, Duration, NaiveDate, NaiveDateTime};
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{Datelike, NaiveDate, NaiveDateTime, Timelike};
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
|
|
|
|||
|
|
@ -3,8 +3,8 @@ pub mod datediff;
|
|||
pub mod extract;
|
||||
pub mod now;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use chrono::{Datelike, Local, Timelike, Utc};
|
||||
use log::debug;
|
||||
use rhai::{Dynamic, Engine, Map};
|
||||
|
|
|
|||
|
|
@ -143,7 +143,7 @@ fn find_bot_by_name(
|
|||
conn: &mut PgConnection,
|
||||
bot_name: &str,
|
||||
) -> Result<Uuid, Box<dyn std::error::Error + Send + Sync>> {
|
||||
use crate::shared::models::bots;
|
||||
use crate::core::shared::models::bots;
|
||||
|
||||
let bot_id: Uuid = bots::table
|
||||
.filter(bots::name.eq(bot_name))
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
pub mod on_error;
|
||||
pub mod throw;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Dynamic, Engine, EvalAltResult, Map, Position};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{debug, trace};
|
||||
use rhai::{Dynamic, Engine, EvalAltResult, Position};
|
||||
use std::cell::RefCell;
|
||||
|
|
|
|||
163
src/basic/keywords/face_api/azure.rs
Normal file
163
src/basic/keywords/face_api/azure.rs
Normal file
|
|
@ -0,0 +1,163 @@
|
|||
//! Azure Face API Types
|
||||
//!
|
||||
//! This module contains Azure-specific response types and conversions.
|
||||
|
||||
use crate::botmodels::{BoundingBox, DetectedFace, EmotionScores, FaceAttributes, FaceLandmarks, Gender, GlassesType, Point2D};
|
||||
use serde::Deserialize;
|
||||
use uuid::Uuid;
|
||||
|
||||
// ============================================================================
|
||||
// Azure API Response Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
pub(crate) struct AzureFaceResponse {
|
||||
face_id: Option<String>,
|
||||
face_rectangle: AzureFaceRectangle,
|
||||
face_landmarks: Option<AzureFaceLandmarks>,
|
||||
face_attributes: Option<AzureFaceAttributes>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceRectangle {
|
||||
top: f32,
|
||||
left: f32,
|
||||
width: f32,
|
||||
height: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceLandmarks {
|
||||
pupil_left: Option<AzurePoint>,
|
||||
pupil_right: Option<AzurePoint>,
|
||||
nose_tip: Option<AzurePoint>,
|
||||
mouth_left: Option<AzurePoint>,
|
||||
mouth_right: Option<AzurePoint>,
|
||||
eyebrow_left_outer: Option<AzurePoint>,
|
||||
eyebrow_left_inner: Option<AzurePoint>,
|
||||
eyebrow_right_outer: Option<AzurePoint>,
|
||||
eyebrow_right_inner: Option<AzurePoint>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
struct AzurePoint {
|
||||
x: f32,
|
||||
y: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceAttributes {
|
||||
age: Option<f32>,
|
||||
gender: Option<String>,
|
||||
smile: Option<f32>,
|
||||
glasses: Option<String>,
|
||||
emotion: Option<AzureEmotion>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
struct AzureEmotion {
|
||||
anger: f32,
|
||||
contempt: f32,
|
||||
disgust: f32,
|
||||
fear: f32,
|
||||
happiness: f32,
|
||||
neutral: f32,
|
||||
sadness: f32,
|
||||
surprise: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
pub(crate) struct AzureVerifyResponse {
|
||||
pub confidence: f64,
|
||||
}
|
||||
|
||||
impl AzureFaceResponse {
|
||||
pub(crate) fn into_detected_face(self) -> DetectedFace {
|
||||
let face_id = self.face_id
|
||||
.and_then(|id| Uuid::parse_str(&id).ok())
|
||||
.unwrap_or_else(Uuid::new_v4);
|
||||
|
||||
let landmarks = self.face_landmarks.map(|lm| {
|
||||
FaceLandmarks {
|
||||
left_eye: lm.pupil_left.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
right_eye: lm.pupil_right.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
nose_tip: lm.nose_tip.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
mouth_left: lm.mouth_left.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
mouth_right: lm.mouth_right.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
left_eyebrow_left: lm.eyebrow_left_outer.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
left_eyebrow_right: lm.eyebrow_left_inner.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
right_eyebrow_left: lm.eyebrow_right_inner.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
right_eyebrow_right: lm.eyebrow_right_outer.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
}
|
||||
});
|
||||
|
||||
let attributes = self.face_attributes.map(|attrs| {
|
||||
let gender = attrs.gender.as_ref().map(|g| {
|
||||
match g.to_lowercase().as_str() {
|
||||
"male" => Gender::Male,
|
||||
"female" => Gender::Female,
|
||||
_ => Gender::Unknown,
|
||||
}
|
||||
});
|
||||
|
||||
let emotion = attrs.emotion.map(|e| EmotionScores {
|
||||
anger: e.anger,
|
||||
contempt: e.contempt,
|
||||
disgust: e.disgust,
|
||||
fear: e.fear,
|
||||
happiness: e.happiness,
|
||||
neutral: e.neutral,
|
||||
sadness: e.sadness,
|
||||
surprise: e.surprise,
|
||||
});
|
||||
|
||||
let glasses = attrs.glasses.as_ref().map(|g| {
|
||||
match g.to_lowercase().as_str() {
|
||||
"noглasses" | "noglasses" => GlassesType::NoGlasses,
|
||||
"readingglasses" => GlassesType::ReadingGlasses,
|
||||
"sunglasses" => GlassesType::Sunglasses,
|
||||
"swimminggoggles" => GlassesType::SwimmingGoggles,
|
||||
_ => GlassesType::NoGlasses,
|
||||
}
|
||||
});
|
||||
|
||||
FaceAttributes {
|
||||
age: attrs.age,
|
||||
gender,
|
||||
emotion,
|
||||
glasses,
|
||||
facial_hair: None,
|
||||
head_pose: None,
|
||||
smile: attrs.smile,
|
||||
blur: None,
|
||||
exposure: None,
|
||||
noise: None,
|
||||
occlusion: None,
|
||||
}
|
||||
});
|
||||
|
||||
DetectedFace {
|
||||
id: face_id,
|
||||
bounding_box: BoundingBox {
|
||||
left: self.face_rectangle.left,
|
||||
top: self.face_rectangle.top,
|
||||
width: self.face_rectangle.width,
|
||||
height: self.face_rectangle.height,
|
||||
},
|
||||
confidence: 1.0,
|
||||
landmarks,
|
||||
attributes,
|
||||
embedding: None,
|
||||
}
|
||||
}
|
||||
}
|
||||
34
src/basic/keywords/face_api/error.rs
Normal file
34
src/basic/keywords/face_api/error.rs
Normal file
|
|
@ -0,0 +1,34 @@
|
|||
//! Face API Error Types
|
||||
//!
|
||||
//! This module contains error types for Face API operations.
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum FaceApiError {
|
||||
ConfigError(String),
|
||||
NetworkError(String),
|
||||
ApiError(String),
|
||||
ParseError(String),
|
||||
InvalidInput(String),
|
||||
NoFaceFound,
|
||||
NotImplemented(String),
|
||||
RateLimited,
|
||||
Unauthorized,
|
||||
}
|
||||
|
||||
impl std::fmt::Display for FaceApiError {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
match self {
|
||||
Self::ConfigError(msg) => write!(f, "Configuration error: {}", msg),
|
||||
Self::NetworkError(msg) => write!(f, "Network error: {}", msg),
|
||||
Self::ApiError(msg) => write!(f, "API error: {}", msg),
|
||||
Self::ParseError(msg) => write!(f, "Parse error: {}", msg),
|
||||
Self::InvalidInput(msg) => write!(f, "Invalid input: {}", msg),
|
||||
Self::NoFaceFound => write!(f, "No face found in image"),
|
||||
Self::NotImplemented(provider) => write!(f, "{} provider not implemented", provider),
|
||||
Self::RateLimited => write!(f, "Rate limit exceeded"),
|
||||
Self::Unauthorized => write!(f, "Unauthorized - check API credentials"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl std::error::Error for FaceApiError {}
|
||||
105
src/basic/keywords/face_api/executor.rs
Normal file
105
src/basic/keywords/face_api/executor.rs
Normal file
|
|
@ -0,0 +1,105 @@
|
|||
//! Face API BASIC Keyword Executors
|
||||
//!
|
||||
//! This module contains functions to execute Face API keywords from BASIC code.
|
||||
|
||||
use super::results::{FaceAnalysisResult, FaceDetectionResult, FaceVerificationResult};
|
||||
use super::service::FaceApiService;
|
||||
use super::types::{AnalysisOptions, DetectionOptions, FaceAttributeType, VerificationOptions};
|
||||
|
||||
// ============================================================================
|
||||
// BASIC Keyword Executor
|
||||
// ============================================================================
|
||||
|
||||
/// Execute DETECT FACES keyword
|
||||
pub async fn execute_detect_faces(
|
||||
service: &FaceApiService,
|
||||
image_url: &str,
|
||||
options: Option<DetectionOptions>,
|
||||
) -> Result<FaceDetectionResult, super::error::FaceApiError> {
|
||||
let image = super::types::ImageSource::Url(image_url.to_string());
|
||||
let opts = options.unwrap_or_default();
|
||||
service.detect_faces(&image, &opts).await
|
||||
}
|
||||
|
||||
/// Execute VERIFY FACE keyword
|
||||
pub async fn execute_verify_face(
|
||||
service: &FaceApiService,
|
||||
face1_url: &str,
|
||||
face2_url: &str,
|
||||
options: Option<VerificationOptions>,
|
||||
) -> Result<FaceVerificationResult, super::error::FaceApiError> {
|
||||
let face1 = super::types::FaceSource::Image(super::types::ImageSource::Url(face1_url.to_string()));
|
||||
let face2 = super::types::FaceSource::Image(super::types::ImageSource::Url(face2_url.to_string()));
|
||||
let opts = options.unwrap_or_default();
|
||||
service.verify_faces(&face1, &face2, &opts).await
|
||||
}
|
||||
|
||||
/// Execute ANALYZE FACE keyword
|
||||
pub async fn execute_analyze_face(
|
||||
service: &FaceApiService,
|
||||
image_url: &str,
|
||||
attributes: Option<Vec<FaceAttributeType>>,
|
||||
options: Option<AnalysisOptions>,
|
||||
) -> Result<FaceAnalysisResult, super::error::FaceApiError> {
|
||||
let source = super::types::FaceSource::Image(super::types::ImageSource::Url(image_url.to_string()));
|
||||
let attrs = attributes.unwrap_or_else(|| vec![
|
||||
FaceAttributeType::Age,
|
||||
FaceAttributeType::Gender,
|
||||
FaceAttributeType::Emotion,
|
||||
FaceAttributeType::Smile,
|
||||
]);
|
||||
let opts: AnalysisOptions = options.unwrap_or_default();
|
||||
service.analyze_face(&source, &attrs, &opts).await
|
||||
}
|
||||
|
||||
/// Convert detection result to BASIC-friendly format
|
||||
pub fn detection_to_basic_value(result: &FaceDetectionResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"face_count": result.face_count,
|
||||
"faces": result.faces.iter().map(|f| {
|
||||
serde_json::json!({
|
||||
"id": f.id.to_string(),
|
||||
"bounds": {
|
||||
"left": f.bounding_box.left,
|
||||
"top": f.bounding_box.top,
|
||||
"width": f.bounding_box.width,
|
||||
"height": f.bounding_box.height
|
||||
},
|
||||
"confidence": f.confidence,
|
||||
"age": f.attributes.as_ref().and_then(|a| a.age),
|
||||
"gender": f.attributes.as_ref().and_then(|a| a.gender).map(|g| format!("{:?}", g).to_lowercase()),
|
||||
"emotion": f.attributes.as_ref().and_then(|a| a.emotion.as_ref()).map(|e| e.dominant_emotion()),
|
||||
"smile": f.attributes.as_ref().and_then(|a| a.smile)
|
||||
})
|
||||
}).collect::<Vec<_>>(),
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
|
||||
/// Convert verification result to BASIC-friendly format
|
||||
pub fn verification_to_basic_value(result: &FaceVerificationResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"is_match": result.is_match,
|
||||
"confidence": result.confidence,
|
||||
"threshold": result.threshold,
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
|
||||
/// Convert analysis result to BASIC-friendly format
|
||||
pub fn analysis_to_basic_value(result: &FaceAnalysisResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"age": result.estimated_age,
|
||||
"gender": result.gender,
|
||||
"emotion": result.dominant_emotion,
|
||||
"smile": result.smile_intensity,
|
||||
"quality": result.quality_score,
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
44
src/basic/keywords/face_api/mod.rs
Normal file
44
src/basic/keywords/face_api/mod.rs
Normal file
|
|
@ -0,0 +1,44 @@
|
|||
//! Face API BASIC Keywords
|
||||
//!
|
||||
//! Provides face detection, verification, and analysis capabilities through BASIC keywords.
|
||||
//! Supports Azure Face API, AWS Rekognition, and local OpenCV fallback.
|
||||
|
||||
mod azure;
|
||||
mod error;
|
||||
mod executor;
|
||||
mod results;
|
||||
mod service;
|
||||
mod types;
|
||||
|
||||
// Re-export all public types
|
||||
pub use error::FaceApiError;
|
||||
pub use executor::{
|
||||
analysis_to_basic_value,
|
||||
detection_to_basic_value,
|
||||
execute_analyze_face,
|
||||
execute_detect_faces,
|
||||
execute_verify_face,
|
||||
verification_to_basic_value,
|
||||
};
|
||||
pub use results::{
|
||||
FaceAnalysisResult,
|
||||
FaceDetectionResult,
|
||||
FaceGroup,
|
||||
SimilarFaceResult,
|
||||
FaceVerificationResult,
|
||||
};
|
||||
pub use service::FaceApiService;
|
||||
pub use types::{
|
||||
AnalyzeFaceKeyword,
|
||||
AnalysisOptions,
|
||||
DetectFacesKeyword,
|
||||
DetectionOptions,
|
||||
FaceAttributeType,
|
||||
FaceSource,
|
||||
FindSimilarFacesKeyword,
|
||||
GroupFacesKeyword,
|
||||
GroupingOptions,
|
||||
ImageSource,
|
||||
VerifyFaceKeyword,
|
||||
VerificationOptions,
|
||||
};
|
||||
174
src/basic/keywords/face_api/results.rs
Normal file
174
src/basic/keywords/face_api/results.rs
Normal file
|
|
@ -0,0 +1,174 @@
|
|||
//! Face API Result Types
|
||||
//!
|
||||
//! This module contains all result types returned by Face API operations.
|
||||
|
||||
use crate::botmodels::{DetectedFace, FaceAttributes};
|
||||
use serde::{Deserialize, Serialize};
|
||||
use std::collections::HashMap;
|
||||
use uuid::Uuid;
|
||||
|
||||
// ============================================================================
|
||||
// Result Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceDetectionResult {
|
||||
pub success: bool,
|
||||
pub faces: Vec<DetectedFace>,
|
||||
pub face_count: usize,
|
||||
pub image_width: Option<u32>,
|
||||
pub image_height: Option<u32>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceDetectionResult {
|
||||
pub fn success(faces: Vec<DetectedFace>, processing_time_ms: u64) -> Self {
|
||||
let face_count = faces.len();
|
||||
Self {
|
||||
success: true,
|
||||
faces,
|
||||
face_count,
|
||||
image_width: None,
|
||||
image_height: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
faces: Vec::new(),
|
||||
face_count: 0,
|
||||
image_width: None,
|
||||
image_height: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn with_image_size(mut self, width: u32, height: u32) -> Self {
|
||||
self.image_width = Some(width);
|
||||
self.image_height = Some(height);
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceVerificationResult {
|
||||
pub success: bool,
|
||||
pub is_match: bool,
|
||||
pub confidence: f64,
|
||||
pub threshold: f64,
|
||||
pub face1_id: Option<Uuid>,
|
||||
pub face2_id: Option<Uuid>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceVerificationResult {
|
||||
pub fn match_found(confidence: f64, threshold: f64, processing_time_ms: u64) -> Self {
|
||||
Self {
|
||||
success: true,
|
||||
is_match: confidence >= threshold,
|
||||
confidence,
|
||||
threshold,
|
||||
face1_id: None,
|
||||
face2_id: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
is_match: false,
|
||||
confidence: 0.0,
|
||||
threshold: 0.0,
|
||||
face1_id: None,
|
||||
face2_id: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn with_face_ids(mut self, face1_id: Uuid, face2_id: Uuid) -> Self {
|
||||
self.face1_id = Some(face1_id);
|
||||
self.face2_id = Some(face2_id);
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceAnalysisResult {
|
||||
pub success: bool,
|
||||
pub face: Option<DetectedFace>,
|
||||
pub attributes: Option<FaceAttributes>,
|
||||
pub dominant_emotion: Option<String>,
|
||||
pub estimated_age: Option<f32>,
|
||||
pub gender: Option<String>,
|
||||
pub smile_intensity: Option<f32>,
|
||||
pub quality_score: Option<f32>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceAnalysisResult {
|
||||
pub fn success(face: DetectedFace, processing_time_ms: u64) -> Self {
|
||||
let attributes = face.attributes.clone();
|
||||
let dominant_emotion = attributes.as_ref()
|
||||
.and_then(|a| a.emotion.as_ref())
|
||||
.map(|e| e.dominant_emotion().to_string());
|
||||
let estimated_age = attributes.as_ref().and_then(|a| a.age);
|
||||
let gender = attributes.as_ref()
|
||||
.and_then(|a| a.gender)
|
||||
.map(|g| format!("{:?}", g).to_lowercase());
|
||||
let smile_intensity = attributes.as_ref().and_then(|a| a.smile);
|
||||
|
||||
Self {
|
||||
success: true,
|
||||
face: Some(face),
|
||||
attributes,
|
||||
dominant_emotion,
|
||||
estimated_age,
|
||||
gender,
|
||||
smile_intensity,
|
||||
quality_score: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
face: None,
|
||||
attributes: None,
|
||||
dominant_emotion: None,
|
||||
estimated_age: None,
|
||||
gender: None,
|
||||
smile_intensity: None,
|
||||
quality_score: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct SimilarFaceResult {
|
||||
pub face_id: Uuid,
|
||||
pub confidence: f64,
|
||||
pub person_id: Option<String>,
|
||||
pub metadata: Option<HashMap<String, serde_json::Value>>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceGroup {
|
||||
pub group_id: Uuid,
|
||||
pub face_ids: Vec<Uuid>,
|
||||
pub representative_face_id: Option<Uuid>,
|
||||
pub confidence: f64,
|
||||
}
|
||||
|
|
@ -1,432 +1,18 @@
|
|||
//! Face API BASIC Keywords
|
||||
//! Face API Service
|
||||
//!
|
||||
//! Provides face detection, verification, and analysis capabilities through BASIC keywords.
|
||||
//! Supports Azure Face API, AWS Rekognition, and local OpenCV fallback.
|
||||
//! This module contains the main FaceApiService implementation with support for
|
||||
//! multiple providers: Azure Face API, AWS Rekognition, OpenCV, and InsightFace.
|
||||
|
||||
use crate::botmodels::{GlassesType, FaceLandmarks, Point2D};
|
||||
use serde::{Deserialize, Serialize};
|
||||
use super::azure::AzureFaceResponse;
|
||||
use super::error::FaceApiError;
|
||||
use super::results::{FaceAnalysisResult, FaceDetectionResult, FaceVerificationResult};
|
||||
use super::types::{AnalysisOptions, DetectionOptions, FaceAttributeType, FaceSource, ImageSource, VerificationOptions};
|
||||
use crate::botmodels::{BoundingBox, DetectedFace, EmotionScores, FaceApiConfig, FaceApiProvider, FaceAttributes, FaceLandmarks, Gender, GlassesType, Point2D};
|
||||
use std::collections::HashMap;
|
||||
use std::sync::Arc;
|
||||
use tokio::sync::RwLock;
|
||||
use uuid::Uuid;
|
||||
|
||||
use crate::botmodels::{
|
||||
DetectedFace, EmotionScores, FaceApiConfig, FaceApiProvider, FaceAttributes,
|
||||
Gender, BoundingBox,
|
||||
};
|
||||
|
||||
// ============================================================================
|
||||
// Keyword Definitions
|
||||
// ============================================================================
|
||||
|
||||
/// DETECT FACES keyword - Detect faces in an image
|
||||
///
|
||||
/// Syntax:
|
||||
/// faces = DETECT FACES image_url
|
||||
/// faces = DETECT FACES image_url WITH OPTIONS options
|
||||
///
|
||||
/// Examples:
|
||||
/// faces = DETECT FACES "https://example.com/photo.jpg"
|
||||
/// faces = DETECT FACES photo WITH OPTIONS { "return_landmarks": true, "return_attributes": true }
|
||||
///
|
||||
/// Returns: Array of detected faces with bounding boxes and optional attributes
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct DetectFacesKeyword {
|
||||
pub image_source: ImageSource,
|
||||
pub options: DetectionOptions,
|
||||
}
|
||||
|
||||
/// VERIFY FACE keyword - Verify if two faces belong to the same person
|
||||
///
|
||||
/// Syntax:
|
||||
/// result = VERIFY FACE face1 AGAINST face2
|
||||
/// result = VERIFY FACE image1 AGAINST image2
|
||||
///
|
||||
/// Examples:
|
||||
/// match = VERIFY FACE saved_face AGAINST new_photo
|
||||
/// result = VERIFY FACE "https://example.com/id.jpg" AGAINST camera_capture
|
||||
///
|
||||
/// Returns: Verification result with confidence score
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct VerifyFaceKeyword {
|
||||
pub face1: FaceSource,
|
||||
pub face2: FaceSource,
|
||||
pub options: VerificationOptions,
|
||||
}
|
||||
|
||||
/// ANALYZE FACE keyword - Analyze face attributes in detail
|
||||
///
|
||||
/// Syntax:
|
||||
/// analysis = ANALYZE FACE image_url
|
||||
/// analysis = ANALYZE FACE face_id WITH ATTRIBUTES attributes_list
|
||||
///
|
||||
/// Examples:
|
||||
/// analysis = ANALYZE FACE photo WITH ATTRIBUTES ["age", "emotion", "gender"]
|
||||
/// result = ANALYZE FACE captured_image
|
||||
///
|
||||
/// Returns: Detailed face analysis including emotions, age, gender, etc.
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AnalyzeFaceKeyword {
|
||||
pub source: FaceSource,
|
||||
pub attributes: Vec<FaceAttributeType>,
|
||||
pub options: AnalysisOptions,
|
||||
}
|
||||
|
||||
/// FIND SIMILAR FACES keyword - Find similar faces in a collection
|
||||
///
|
||||
/// Syntax:
|
||||
/// similar = FIND SIMILAR FACES TO face IN collection
|
||||
///
|
||||
/// Examples:
|
||||
/// matches = FIND SIMILAR FACES TO suspect_photo IN employee_database
|
||||
///
|
||||
/// Returns: Array of similar faces with similarity scores
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FindSimilarFacesKeyword {
|
||||
pub target_face: FaceSource,
|
||||
pub collection_name: String,
|
||||
pub max_results: usize,
|
||||
pub min_confidence: f32,
|
||||
}
|
||||
|
||||
/// GROUP FACES keyword - Group faces by similarity
|
||||
///
|
||||
/// Syntax:
|
||||
/// groups = GROUP FACES face_list
|
||||
///
|
||||
/// Examples:
|
||||
/// groups = GROUP FACES detected_faces
|
||||
///
|
||||
/// Returns: Groups of similar faces
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct GroupFacesKeyword {
|
||||
pub faces: Vec<FaceSource>,
|
||||
pub options: GroupingOptions,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Supporting Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(untagged)]
|
||||
pub enum ImageSource {
|
||||
Url(String),
|
||||
Base64(String),
|
||||
FilePath(String),
|
||||
Variable(String),
|
||||
Binary(Vec<u8>),
|
||||
Bytes(Vec<u8>),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(untagged)]
|
||||
pub enum FaceSource {
|
||||
Image(ImageSource),
|
||||
FaceId(Uuid),
|
||||
DetectedFace(Box<DetectedFace>),
|
||||
Embedding(Vec<f32>),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct DetectionOptions {
|
||||
#[serde(default = "default_true")]
|
||||
pub return_face_id: bool,
|
||||
#[serde(default)]
|
||||
pub return_landmarks: Option<bool>,
|
||||
#[serde(default)]
|
||||
pub return_attributes: Option<bool>,
|
||||
#[serde(default)]
|
||||
pub return_embedding: bool,
|
||||
#[serde(default)]
|
||||
pub detection_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub max_faces: Option<usize>,
|
||||
#[serde(default = "default_min_face_size")]
|
||||
pub min_face_size: u32,
|
||||
}
|
||||
|
||||
fn default_true() -> bool {
|
||||
true
|
||||
}
|
||||
|
||||
fn _default_max_faces() -> usize {
|
||||
100
|
||||
}
|
||||
|
||||
fn default_min_face_size() -> u32 {
|
||||
36
|
||||
}
|
||||
|
||||
impl Default for DetectionOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
return_face_id: true,
|
||||
return_landmarks: Some(false),
|
||||
return_attributes: Some(false),
|
||||
return_embedding: false,
|
||||
detection_model: None,
|
||||
recognition_model: None,
|
||||
max_faces: Some(100),
|
||||
min_face_size: 36,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct VerificationOptions {
|
||||
#[serde(default = "default_confidence_threshold")]
|
||||
pub confidence_threshold: f64,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub threshold: Option<f64>,
|
||||
}
|
||||
|
||||
fn default_confidence_threshold() -> f64 {
|
||||
0.6
|
||||
}
|
||||
|
||||
impl Default for VerificationOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
confidence_threshold: 0.8,
|
||||
recognition_model: None,
|
||||
threshold: Some(0.8),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AnalysisOptions {
|
||||
#[serde(default = "default_true")]
|
||||
pub return_landmarks: bool,
|
||||
#[serde(default)]
|
||||
pub detection_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
}
|
||||
|
||||
impl Default for AnalysisOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
return_landmarks: true,
|
||||
detection_model: None,
|
||||
recognition_model: None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct GroupingOptions {
|
||||
#[serde(default = "default_similarity_threshold")]
|
||||
pub similarity_threshold: f32,
|
||||
}
|
||||
|
||||
fn default_similarity_threshold() -> f32 {
|
||||
0.5
|
||||
}
|
||||
|
||||
impl Default for GroupingOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
similarity_threshold: 0.5,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Copy, Serialize, Deserialize, PartialEq, Eq, Hash)]
|
||||
#[serde(rename_all = "snake_case")]
|
||||
pub enum FaceAttributeType {
|
||||
Age,
|
||||
Gender,
|
||||
Emotion,
|
||||
Smile,
|
||||
Glasses,
|
||||
FacialHair,
|
||||
HeadPose,
|
||||
Blur,
|
||||
Exposure,
|
||||
Noise,
|
||||
Occlusion,
|
||||
Accessories,
|
||||
Hair,
|
||||
Makeup,
|
||||
QualityForRecognition,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Result Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceDetectionResult {
|
||||
pub success: bool,
|
||||
pub faces: Vec<DetectedFace>,
|
||||
pub face_count: usize,
|
||||
pub image_width: Option<u32>,
|
||||
pub image_height: Option<u32>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceDetectionResult {
|
||||
pub fn success(faces: Vec<DetectedFace>, processing_time_ms: u64) -> Self {
|
||||
let face_count = faces.len();
|
||||
Self {
|
||||
success: true,
|
||||
faces,
|
||||
face_count,
|
||||
image_width: None,
|
||||
image_height: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
faces: Vec::new(),
|
||||
face_count: 0,
|
||||
image_width: None,
|
||||
image_height: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn with_image_size(mut self, width: u32, height: u32) -> Self {
|
||||
self.image_width = Some(width);
|
||||
self.image_height = Some(height);
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceVerificationResult {
|
||||
pub success: bool,
|
||||
pub is_match: bool,
|
||||
pub confidence: f64,
|
||||
pub threshold: f64,
|
||||
pub face1_id: Option<Uuid>,
|
||||
pub face2_id: Option<Uuid>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceVerificationResult {
|
||||
pub fn match_found(confidence: f64, threshold: f64, processing_time_ms: u64) -> Self {
|
||||
Self {
|
||||
success: true,
|
||||
is_match: confidence >= threshold,
|
||||
confidence,
|
||||
threshold,
|
||||
face1_id: None,
|
||||
face2_id: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
is_match: false,
|
||||
confidence: 0.0,
|
||||
threshold: 0.0,
|
||||
face1_id: None,
|
||||
face2_id: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn with_face_ids(mut self, face1_id: Uuid, face2_id: Uuid) -> Self {
|
||||
self.face1_id = Some(face1_id);
|
||||
self.face2_id = Some(face2_id);
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceAnalysisResult {
|
||||
pub success: bool,
|
||||
pub face: Option<DetectedFace>,
|
||||
pub attributes: Option<FaceAttributes>,
|
||||
pub dominant_emotion: Option<String>,
|
||||
pub estimated_age: Option<f32>,
|
||||
pub gender: Option<String>,
|
||||
pub smile_intensity: Option<f32>,
|
||||
pub quality_score: Option<f32>,
|
||||
pub processing_time_ms: u64,
|
||||
pub error: Option<String>,
|
||||
}
|
||||
|
||||
impl FaceAnalysisResult {
|
||||
pub fn success(face: DetectedFace, processing_time_ms: u64) -> Self {
|
||||
let attributes = face.attributes.clone();
|
||||
let dominant_emotion = attributes.as_ref()
|
||||
.and_then(|a| a.emotion.as_ref())
|
||||
.map(|e| e.dominant_emotion().to_string());
|
||||
let estimated_age = attributes.as_ref().and_then(|a| a.age);
|
||||
let gender = attributes.as_ref()
|
||||
.and_then(|a| a.gender)
|
||||
.map(|g| format!("{:?}", g).to_lowercase());
|
||||
let smile_intensity = attributes.as_ref().and_then(|a| a.smile);
|
||||
|
||||
Self {
|
||||
success: true,
|
||||
face: Some(face),
|
||||
attributes,
|
||||
dominant_emotion,
|
||||
estimated_age,
|
||||
gender,
|
||||
smile_intensity,
|
||||
quality_score: None,
|
||||
processing_time_ms,
|
||||
error: None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn error(message: String) -> Self {
|
||||
Self {
|
||||
success: false,
|
||||
face: None,
|
||||
attributes: None,
|
||||
dominant_emotion: None,
|
||||
estimated_age: None,
|
||||
gender: None,
|
||||
smile_intensity: None,
|
||||
quality_score: None,
|
||||
processing_time_ms: 0,
|
||||
error: Some(message),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct SimilarFaceResult {
|
||||
pub face_id: Uuid,
|
||||
pub confidence: f64,
|
||||
pub person_id: Option<String>,
|
||||
pub metadata: Option<HashMap<String, serde_json::Value>>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FaceGroup {
|
||||
pub group_id: Uuid,
|
||||
pub face_ids: Vec<Uuid>,
|
||||
pub representative_face_id: Option<Uuid>,
|
||||
pub confidence: f64,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Helper Functions
|
||||
// ============================================================================
|
||||
|
||||
/// Calculate cosine similarity between two embedding vectors
|
||||
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
|
||||
if a.len() != b.len() || a.is_empty() {
|
||||
|
|
@ -444,10 +30,6 @@ fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
|
|||
(dot_product / (norm_a * norm_b)).clamp(0.0, 1.0)
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Face API Service
|
||||
// ============================================================================
|
||||
|
||||
pub struct FaceApiService {
|
||||
config: FaceApiConfig,
|
||||
client: reqwest::Client,
|
||||
|
|
@ -667,7 +249,7 @@ impl FaceApiService {
|
|||
return Err(FaceApiError::ApiError(error_text));
|
||||
}
|
||||
|
||||
let result: AzureVerifyResponse = response.json().await
|
||||
let result: super::azure::AzureVerifyResponse = response.json().await
|
||||
.map_err(|e| FaceApiError::ParseError(e.to_string()))?;
|
||||
|
||||
Ok(FaceVerificationResult::match_found(
|
||||
|
|
@ -1293,294 +875,3 @@ impl FaceApiService {
|
|||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Azure API Response Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceResponse {
|
||||
face_id: Option<String>,
|
||||
face_rectangle: AzureFaceRectangle,
|
||||
face_landmarks: Option<AzureFaceLandmarks>,
|
||||
face_attributes: Option<AzureFaceAttributes>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceRectangle {
|
||||
top: f32,
|
||||
left: f32,
|
||||
width: f32,
|
||||
height: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceLandmarks {
|
||||
pupil_left: Option<AzurePoint>,
|
||||
pupil_right: Option<AzurePoint>,
|
||||
nose_tip: Option<AzurePoint>,
|
||||
mouth_left: Option<AzurePoint>,
|
||||
mouth_right: Option<AzurePoint>,
|
||||
eyebrow_left_outer: Option<AzurePoint>,
|
||||
eyebrow_left_inner: Option<AzurePoint>,
|
||||
eyebrow_right_outer: Option<AzurePoint>,
|
||||
eyebrow_right_inner: Option<AzurePoint>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
struct AzurePoint {
|
||||
x: f32,
|
||||
y: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureFaceAttributes {
|
||||
age: Option<f32>,
|
||||
gender: Option<String>,
|
||||
smile: Option<f32>,
|
||||
glasses: Option<String>,
|
||||
emotion: Option<AzureEmotion>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
struct AzureEmotion {
|
||||
anger: f32,
|
||||
contempt: f32,
|
||||
disgust: f32,
|
||||
fear: f32,
|
||||
happiness: f32,
|
||||
neutral: f32,
|
||||
sadness: f32,
|
||||
surprise: f32,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Deserialize)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
struct AzureVerifyResponse {
|
||||
confidence: f64,
|
||||
}
|
||||
|
||||
impl AzureFaceResponse {
|
||||
fn into_detected_face(self) -> DetectedFace {
|
||||
use crate::botmodels::{FaceLandmarks, Point2D, GlassesType};
|
||||
|
||||
let face_id = self.face_id
|
||||
.and_then(|id| Uuid::parse_str(&id).ok())
|
||||
.unwrap_or_else(Uuid::new_v4);
|
||||
|
||||
let landmarks = self.face_landmarks.map(|lm| {
|
||||
FaceLandmarks {
|
||||
left_eye: lm.pupil_left.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
right_eye: lm.pupil_right.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
nose_tip: lm.nose_tip.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
mouth_left: lm.mouth_left.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
mouth_right: lm.mouth_right.map(|p| Point2D { x: p.x, y: p.y })
|
||||
.unwrap_or(Point2D { x: 0.0, y: 0.0 }),
|
||||
left_eyebrow_left: lm.eyebrow_left_outer.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
left_eyebrow_right: lm.eyebrow_left_inner.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
right_eyebrow_left: lm.eyebrow_right_inner.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
right_eyebrow_right: lm.eyebrow_right_outer.map(|p| Point2D { x: p.x, y: p.y }),
|
||||
}
|
||||
});
|
||||
|
||||
let attributes = self.face_attributes.map(|attrs| {
|
||||
let gender = attrs.gender.as_ref().map(|g| {
|
||||
match g.to_lowercase().as_str() {
|
||||
"male" => Gender::Male,
|
||||
"female" => Gender::Female,
|
||||
_ => Gender::Unknown,
|
||||
}
|
||||
});
|
||||
|
||||
let emotion = attrs.emotion.map(|e| EmotionScores {
|
||||
anger: e.anger,
|
||||
contempt: e.contempt,
|
||||
disgust: e.disgust,
|
||||
fear: e.fear,
|
||||
happiness: e.happiness,
|
||||
neutral: e.neutral,
|
||||
sadness: e.sadness,
|
||||
surprise: e.surprise,
|
||||
});
|
||||
|
||||
let glasses = attrs.glasses.as_ref().map(|g| {
|
||||
match g.to_lowercase().as_str() {
|
||||
"noглasses" | "noglasses" => GlassesType::NoGlasses,
|
||||
"readingglasses" => GlassesType::ReadingGlasses,
|
||||
"sunglasses" => GlassesType::Sunglasses,
|
||||
"swimminggoggles" => GlassesType::SwimmingGoggles,
|
||||
_ => GlassesType::NoGlasses,
|
||||
}
|
||||
});
|
||||
|
||||
FaceAttributes {
|
||||
age: attrs.age,
|
||||
gender,
|
||||
emotion,
|
||||
glasses,
|
||||
facial_hair: None,
|
||||
head_pose: None,
|
||||
smile: attrs.smile,
|
||||
blur: None,
|
||||
exposure: None,
|
||||
noise: None,
|
||||
occlusion: None,
|
||||
}
|
||||
});
|
||||
|
||||
DetectedFace {
|
||||
id: face_id,
|
||||
bounding_box: BoundingBox {
|
||||
left: self.face_rectangle.left,
|
||||
top: self.face_rectangle.top,
|
||||
width: self.face_rectangle.width,
|
||||
height: self.face_rectangle.height,
|
||||
},
|
||||
confidence: 1.0,
|
||||
landmarks,
|
||||
attributes,
|
||||
embedding: None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Error Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum FaceApiError {
|
||||
ConfigError(String),
|
||||
NetworkError(String),
|
||||
ApiError(String),
|
||||
ParseError(String),
|
||||
InvalidInput(String),
|
||||
NoFaceFound,
|
||||
NotImplemented(String),
|
||||
RateLimited,
|
||||
Unauthorized,
|
||||
}
|
||||
|
||||
impl std::fmt::Display for FaceApiError {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
match self {
|
||||
Self::ConfigError(msg) => write!(f, "Configuration error: {}", msg),
|
||||
Self::NetworkError(msg) => write!(f, "Network error: {}", msg),
|
||||
Self::ApiError(msg) => write!(f, "API error: {}", msg),
|
||||
Self::ParseError(msg) => write!(f, "Parse error: {}", msg),
|
||||
Self::InvalidInput(msg) => write!(f, "Invalid input: {}", msg),
|
||||
Self::NoFaceFound => write!(f, "No face found in image"),
|
||||
Self::NotImplemented(provider) => write!(f, "{} provider not implemented", provider),
|
||||
Self::RateLimited => write!(f, "Rate limit exceeded"),
|
||||
Self::Unauthorized => write!(f, "Unauthorized - check API credentials"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl std::error::Error for FaceApiError {}
|
||||
|
||||
// ============================================================================
|
||||
// BASIC Keyword Executor
|
||||
// ============================================================================
|
||||
|
||||
/// Execute DETECT FACES keyword
|
||||
pub async fn execute_detect_faces(
|
||||
service: &FaceApiService,
|
||||
image_url: &str,
|
||||
options: Option<DetectionOptions>,
|
||||
) -> Result<FaceDetectionResult, FaceApiError> {
|
||||
let image = ImageSource::Url(image_url.to_string());
|
||||
let opts = options.unwrap_or_default();
|
||||
service.detect_faces(&image, &opts).await
|
||||
}
|
||||
|
||||
/// Execute VERIFY FACE keyword
|
||||
pub async fn execute_verify_face(
|
||||
service: &FaceApiService,
|
||||
face1_url: &str,
|
||||
face2_url: &str,
|
||||
options: Option<VerificationOptions>,
|
||||
) -> Result<FaceVerificationResult, FaceApiError> {
|
||||
let face1 = FaceSource::Image(ImageSource::Url(face1_url.to_string()));
|
||||
let face2 = FaceSource::Image(ImageSource::Url(face2_url.to_string()));
|
||||
let opts = options.unwrap_or_default();
|
||||
service.verify_faces(&face1, &face2, &opts).await
|
||||
}
|
||||
|
||||
/// Execute ANALYZE FACE keyword
|
||||
pub async fn execute_analyze_face(
|
||||
service: &FaceApiService,
|
||||
image_url: &str,
|
||||
attributes: Option<Vec<FaceAttributeType>>,
|
||||
options: Option<AnalysisOptions>,
|
||||
) -> Result<FaceAnalysisResult, FaceApiError> {
|
||||
let source = FaceSource::Image(ImageSource::Url(image_url.to_string()));
|
||||
let attrs = attributes.unwrap_or_else(|| vec![
|
||||
FaceAttributeType::Age,
|
||||
FaceAttributeType::Gender,
|
||||
FaceAttributeType::Emotion,
|
||||
FaceAttributeType::Smile,
|
||||
]);
|
||||
let opts = options.unwrap_or_default();
|
||||
service.analyze_face(&source, &attrs, &opts).await
|
||||
}
|
||||
|
||||
/// Convert detection result to BASIC-friendly format
|
||||
pub fn detection_to_basic_value(result: &FaceDetectionResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"face_count": result.face_count,
|
||||
"faces": result.faces.iter().map(|f| {
|
||||
serde_json::json!({
|
||||
"id": f.id.to_string(),
|
||||
"bounds": {
|
||||
"left": f.bounding_box.left,
|
||||
"top": f.bounding_box.top,
|
||||
"width": f.bounding_box.width,
|
||||
"height": f.bounding_box.height
|
||||
},
|
||||
"confidence": f.confidence,
|
||||
"age": f.attributes.as_ref().and_then(|a| a.age),
|
||||
"gender": f.attributes.as_ref().and_then(|a| a.gender).map(|g| format!("{:?}", g).to_lowercase()),
|
||||
"emotion": f.attributes.as_ref().and_then(|a| a.emotion.as_ref()).map(|e| e.dominant_emotion()),
|
||||
"smile": f.attributes.as_ref().and_then(|a| a.smile)
|
||||
})
|
||||
}).collect::<Vec<_>>(),
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
|
||||
/// Convert verification result to BASIC-friendly format
|
||||
pub fn verification_to_basic_value(result: &FaceVerificationResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"is_match": result.is_match,
|
||||
"confidence": result.confidence,
|
||||
"threshold": result.threshold,
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
|
||||
/// Convert analysis result to BASIC-friendly format
|
||||
pub fn analysis_to_basic_value(result: &FaceAnalysisResult) -> serde_json::Value {
|
||||
serde_json::json!({
|
||||
"success": result.success,
|
||||
"age": result.estimated_age,
|
||||
"gender": result.gender,
|
||||
"emotion": result.dominant_emotion,
|
||||
"smile": result.smile_intensity,
|
||||
"quality": result.quality_score,
|
||||
"processing_time_ms": result.processing_time_ms,
|
||||
"error": result.error
|
||||
})
|
||||
}
|
||||
250
src/basic/keywords/face_api/types.rs
Normal file
250
src/basic/keywords/face_api/types.rs
Normal file
|
|
@ -0,0 +1,250 @@
|
|||
//! Face API Types
|
||||
//!
|
||||
//! This module contains all type definitions for the Face API keywords including
|
||||
//! image sources, face sources, detection options, and attribute types.
|
||||
|
||||
use crate::botmodels::DetectedFace;
|
||||
use serde::{Deserialize, Serialize};
|
||||
use uuid::Uuid;
|
||||
|
||||
// ============================================================================
|
||||
// Keyword Definitions
|
||||
// ============================================================================
|
||||
|
||||
/// DETECT FACES keyword - Detect faces in an image
|
||||
///
|
||||
/// Syntax:
|
||||
/// faces = DETECT FACES image_url
|
||||
/// faces = DETECT FACES image_url WITH OPTIONS options
|
||||
///
|
||||
/// Examples:
|
||||
/// faces = DETECT FACES "https://example.com/photo.jpg"
|
||||
/// faces = DETECT FACES photo WITH OPTIONS { "return_landmarks": true, "return_attributes": true }
|
||||
///
|
||||
/// Returns: Array of detected faces with bounding boxes and optional attributes
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct DetectFacesKeyword {
|
||||
pub image_source: ImageSource,
|
||||
pub options: DetectionOptions,
|
||||
}
|
||||
|
||||
/// VERIFY FACE keyword - Verify if two faces belong to the same person
|
||||
///
|
||||
/// Syntax:
|
||||
/// result = VERIFY FACE face1 AGAINST face2
|
||||
/// result = VERIFY FACE image1 AGAINST image2
|
||||
///
|
||||
/// Examples:
|
||||
/// match = VERIFY FACE saved_face AGAINST new_photo
|
||||
/// result = VERIFY FACE "https://example.com/id.jpg" AGAINST camera_capture
|
||||
///
|
||||
/// Returns: Verification result with confidence score
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct VerifyFaceKeyword {
|
||||
pub face1: FaceSource,
|
||||
pub face2: FaceSource,
|
||||
pub options: VerificationOptions,
|
||||
}
|
||||
|
||||
/// ANALYZE FACE keyword - Analyze face attributes in detail
|
||||
///
|
||||
/// Syntax:
|
||||
/// analysis = ANALYZE FACE image_url
|
||||
/// analysis = ANALYZE FACE face_id WITH ATTRIBUTES attributes_list
|
||||
///
|
||||
/// Examples:
|
||||
/// analysis = ANALYZE FACE photo WITH ATTRIBUTES ["age", "emotion", "gender"]
|
||||
/// result = ANALYZE FACE captured_image
|
||||
///
|
||||
/// Returns: Detailed face analysis including emotions, age, gender, etc.
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AnalyzeFaceKeyword {
|
||||
pub source: FaceSource,
|
||||
pub attributes: Vec<FaceAttributeType>,
|
||||
pub options: AnalysisOptions,
|
||||
}
|
||||
|
||||
/// FIND SIMILAR FACES keyword - Find similar faces in a collection
|
||||
///
|
||||
/// Syntax:
|
||||
/// similar = FIND SIMILAR FACES TO face IN collection
|
||||
///
|
||||
/// Examples:
|
||||
/// matches = FIND SIMILAR FACES TO suspect_photo IN employee_database
|
||||
///
|
||||
/// Returns: Array of similar faces with similarity scores
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct FindSimilarFacesKeyword {
|
||||
pub target_face: FaceSource,
|
||||
pub collection_name: String,
|
||||
pub max_results: usize,
|
||||
pub min_confidence: f32,
|
||||
}
|
||||
|
||||
/// GROUP FACES keyword - Group faces by similarity
|
||||
///
|
||||
/// Syntax:
|
||||
/// groups = GROUP FACES face_list
|
||||
///
|
||||
/// Examples:
|
||||
/// groups = GROUP FACES detected_faces
|
||||
///
|
||||
/// Returns: Groups of similar faces
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct GroupFacesKeyword {
|
||||
pub faces: Vec<FaceSource>,
|
||||
pub options: GroupingOptions,
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Supporting Types
|
||||
// ============================================================================
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(untagged)]
|
||||
pub enum ImageSource {
|
||||
Url(String),
|
||||
Base64(String),
|
||||
FilePath(String),
|
||||
Variable(String),
|
||||
Binary(Vec<u8>),
|
||||
Bytes(Vec<u8>),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
#[serde(untagged)]
|
||||
pub enum FaceSource {
|
||||
Image(ImageSource),
|
||||
FaceId(Uuid),
|
||||
DetectedFace(Box<DetectedFace>),
|
||||
Embedding(Vec<f32>),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct DetectionOptions {
|
||||
#[serde(default = "default_true")]
|
||||
pub return_face_id: bool,
|
||||
#[serde(default)]
|
||||
pub return_landmarks: Option<bool>,
|
||||
#[serde(default)]
|
||||
pub return_attributes: Option<bool>,
|
||||
#[serde(default)]
|
||||
pub return_embedding: bool,
|
||||
#[serde(default)]
|
||||
pub detection_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub max_faces: Option<usize>,
|
||||
#[serde(default = "default_min_face_size")]
|
||||
pub min_face_size: u32,
|
||||
}
|
||||
|
||||
fn default_true() -> bool {
|
||||
true
|
||||
}
|
||||
|
||||
fn _default_max_faces() -> usize {
|
||||
100
|
||||
}
|
||||
|
||||
fn default_min_face_size() -> u32 {
|
||||
36
|
||||
}
|
||||
|
||||
impl Default for DetectionOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
return_face_id: true,
|
||||
return_landmarks: Some(false),
|
||||
return_attributes: Some(false),
|
||||
return_embedding: false,
|
||||
detection_model: None,
|
||||
recognition_model: None,
|
||||
max_faces: Some(100),
|
||||
min_face_size: 36,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct VerificationOptions {
|
||||
#[serde(default = "default_confidence_threshold")]
|
||||
pub confidence_threshold: f64,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub threshold: Option<f64>,
|
||||
}
|
||||
|
||||
fn default_confidence_threshold() -> f64 {
|
||||
0.6
|
||||
}
|
||||
|
||||
impl Default for VerificationOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
confidence_threshold: 0.8,
|
||||
recognition_model: None,
|
||||
threshold: Some(0.8),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct AnalysisOptions {
|
||||
#[serde(default = "default_true")]
|
||||
pub return_landmarks: bool,
|
||||
#[serde(default)]
|
||||
pub detection_model: Option<String>,
|
||||
#[serde(default)]
|
||||
pub recognition_model: Option<String>,
|
||||
}
|
||||
|
||||
impl Default for AnalysisOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
return_landmarks: true,
|
||||
detection_model: None,
|
||||
recognition_model: None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct GroupingOptions {
|
||||
#[serde(default = "default_similarity_threshold")]
|
||||
pub similarity_threshold: f32,
|
||||
}
|
||||
|
||||
fn default_similarity_threshold() -> f32 {
|
||||
0.5
|
||||
}
|
||||
|
||||
impl Default for GroupingOptions {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
similarity_threshold: 0.5,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Copy, Serialize, Deserialize, PartialEq, Eq, Hash)]
|
||||
#[serde(rename_all = "snake_case")]
|
||||
pub enum FaceAttributeType {
|
||||
Age,
|
||||
Gender,
|
||||
Emotion,
|
||||
Smile,
|
||||
Glasses,
|
||||
FacialHair,
|
||||
HeadPose,
|
||||
Blur,
|
||||
Exposure,
|
||||
Noise,
|
||||
Occlusion,
|
||||
Accessories,
|
||||
Hair,
|
||||
Makeup,
|
||||
QualityForRecognition,
|
||||
}
|
||||
File diff suppressed because it is too large
Load diff
221
src/basic/keywords/file_ops/archive.rs
Normal file
221
src/basic/keywords/file_ops/archive.rs
Normal file
|
|
@ -0,0 +1,221 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use flate2::read::GzDecoder;
|
||||
use log::{error, trace};
|
||||
use std::error::Error;
|
||||
use std::fs::{self, File};
|
||||
use std::io::{Read, Write};
|
||||
use std::path::Path;
|
||||
use tar::Archive;
|
||||
use zip::{write::FileOptions, ZipArchive, ZipWriter};
|
||||
|
||||
use super::basic_io::execute_read;
|
||||
|
||||
pub async fn execute_compress(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
files: &[String],
|
||||
archive_name: &str,
|
||||
) -> Result<String, Box<dyn Error + Send + Sync>> {
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let temp_dir = std::env::temp_dir();
|
||||
let archive_path = temp_dir.join(archive_name);
|
||||
let file = File::create(&archive_path)?;
|
||||
let mut zip = ZipWriter::new(file);
|
||||
|
||||
let options = FileOptions::<()>::default().compression_method(zip::CompressionMethod::Deflated);
|
||||
|
||||
for file_path in files {
|
||||
let content = execute_read(state, user, file_path).await?;
|
||||
let file_name = Path::new(file_path)
|
||||
.file_name()
|
||||
.and_then(|n| n.to_str())
|
||||
.unwrap_or(file_path);
|
||||
|
||||
zip.start_file(file_name, options)?;
|
||||
zip.write_all(content.as_bytes())?;
|
||||
}
|
||||
|
||||
zip.finish()?;
|
||||
|
||||
let archive_content = fs::read(&archive_path)?;
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{archive_name}");
|
||||
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.body(archive_content.into())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 put failed: {e}"))?;
|
||||
|
||||
fs::remove_file(&archive_path).ok();
|
||||
|
||||
trace!("COMPRESS successful: {archive_name}");
|
||||
Ok(archive_name.to_string())
|
||||
}
|
||||
|
||||
pub fn has_zip_extension(archive: &str) -> bool {
|
||||
Path::new(archive)
|
||||
.extension()
|
||||
.is_some_and(|ext| ext.eq_ignore_ascii_case("zip"))
|
||||
}
|
||||
|
||||
pub fn has_tar_gz_extension(archive: &str) -> bool {
|
||||
let path = Path::new(archive);
|
||||
if let Some(ext) = path.extension() {
|
||||
if ext.eq_ignore_ascii_case("tgz") {
|
||||
return true;
|
||||
}
|
||||
if ext.eq_ignore_ascii_case("gz") {
|
||||
if let Some(stem) = path.file_stem() {
|
||||
return Path::new(stem)
|
||||
.extension()
|
||||
.is_some_and(|e| e.eq_ignore_ascii_case("tar"));
|
||||
}
|
||||
}
|
||||
}
|
||||
false
|
||||
}
|
||||
|
||||
pub async fn execute_extract(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
archive: &str,
|
||||
destination: &str,
|
||||
) -> Result<Vec<String>, Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let archive_key = format!("{bot_name}.gbdrive/{archive}");
|
||||
|
||||
let response = client
|
||||
.get_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&archive_key)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 get failed: {e}"))?;
|
||||
|
||||
let data = response.body.collect().await?.into_bytes();
|
||||
|
||||
let temp_dir = std::env::temp_dir();
|
||||
let archive_path = temp_dir.join(archive);
|
||||
fs::write(&archive_path, &data)?;
|
||||
|
||||
let mut extracted_files = Vec::new();
|
||||
|
||||
if has_zip_extension(archive) {
|
||||
let file = File::open(&archive_path)?;
|
||||
let mut zip = ZipArchive::new(file)?;
|
||||
|
||||
for i in 0..zip.len() {
|
||||
let mut zip_file = zip.by_index(i)?;
|
||||
let file_name = zip_file.name().to_string();
|
||||
|
||||
let mut content = Vec::new();
|
||||
zip_file.read_to_end(&mut content)?;
|
||||
|
||||
let dest_path = format!("{}/{file_name}", destination.trim_end_matches('/'));
|
||||
|
||||
let dest_key = format!("{bot_name}.gbdrive/{dest_path}");
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&dest_key)
|
||||
.body(content.into())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 put failed: {e}"))?;
|
||||
|
||||
extracted_files.push(dest_path);
|
||||
}
|
||||
} else if has_tar_gz_extension(archive) {
|
||||
let file = File::open(&archive_path)?;
|
||||
let decoder = GzDecoder::new(file);
|
||||
let mut tar = Archive::new(decoder);
|
||||
|
||||
for entry in tar.entries()? {
|
||||
let mut entry = entry?;
|
||||
let file_name = entry.path()?.to_string_lossy().to_string();
|
||||
|
||||
let mut content = Vec::new();
|
||||
entry.read_to_end(&mut content)?;
|
||||
|
||||
let dest_path = format!("{}/{file_name}", destination.trim_end_matches('/'));
|
||||
|
||||
let dest_key = format!("{bot_name}.gbdrive/{dest_path}");
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&dest_key)
|
||||
.body(content.into())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 put failed: {e}"))?;
|
||||
|
||||
extracted_files.push(dest_path);
|
||||
}
|
||||
}
|
||||
|
||||
fs::remove_file(&archive_path).ok();
|
||||
|
||||
trace!("EXTRACT successful: {} files", extracted_files.len());
|
||||
Ok(extracted_files)
|
||||
}
|
||||
186
src/basic/keywords/file_ops/basic_io.rs
Normal file
186
src/basic/keywords/file_ops/basic_io.rs
Normal file
|
|
@ -0,0 +1,186 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
use std::error::Error;
|
||||
|
||||
pub async fn execute_read(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
) -> Result<String, Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
let response = client
|
||||
.get_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 get failed: {e}"))?;
|
||||
|
||||
let data = response.body.collect().await?.into_bytes();
|
||||
let content =
|
||||
String::from_utf8(data.to_vec()).map_err(|_| "File content is not valid UTF-8")?;
|
||||
|
||||
trace!("READ successful: {} bytes", content.len());
|
||||
Ok(content)
|
||||
}
|
||||
|
||||
pub async fn execute_write(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
content: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.body(content.as_bytes().to_vec().into())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 put failed: {e}"))?;
|
||||
|
||||
trace!("WRITE successful: {} bytes to {path}", content.len());
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn execute_delete_file(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
client
|
||||
.delete_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 delete failed: {e}"))?;
|
||||
|
||||
trace!("DELETE_FILE successful: {path}");
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn execute_list(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
) -> Result<Vec<String>, Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let prefix = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
let response = client
|
||||
.list_objects_v2()
|
||||
.bucket(&bucket_name)
|
||||
.prefix(&prefix)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 list failed: {e}"))?;
|
||||
|
||||
let files: Vec<String> = response
|
||||
.contents()
|
||||
.iter()
|
||||
.filter_map(|obj| {
|
||||
obj.key().map(|k| {
|
||||
k.strip_prefix(&format!("{bot_name}.gbdrive/"))
|
||||
.unwrap_or(k)
|
||||
.to_string()
|
||||
})
|
||||
})
|
||||
.collect();
|
||||
|
||||
trace!("LIST successful: {} files", files.len());
|
||||
Ok(files)
|
||||
}
|
||||
269
src/basic/keywords/file_ops/copy_move.rs
Normal file
269
src/basic/keywords/file_ops/copy_move.rs
Normal file
|
|
@ -0,0 +1,269 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::basic::keywords::use_account::{
|
||||
get_account_credentials, is_account_path, parse_account_path,
|
||||
};
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::trace;
|
||||
use std::error::Error;
|
||||
|
||||
use super::basic_io::{execute_delete_file, execute_read, execute_write};
|
||||
|
||||
pub async fn execute_copy(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
source: &str,
|
||||
destination: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let source_is_account = is_account_path(source);
|
||||
let dest_is_account = is_account_path(destination);
|
||||
|
||||
if source_is_account || dest_is_account {
|
||||
return execute_copy_with_account(state, user, source, destination).await;
|
||||
}
|
||||
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
log::error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let source_key = format!("{bot_name}.gbdrive/{source}");
|
||||
let dest_key = format!("{bot_name}.gbdrive/{destination}");
|
||||
|
||||
let copy_source = format!("{bucket_name}/{source_key}");
|
||||
|
||||
client
|
||||
.copy_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&dest_key)
|
||||
.copy_source(©_source)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 copy failed: {e}"))?;
|
||||
|
||||
trace!("COPY successful: {source} -> {destination}");
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn execute_copy_with_account(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
source: &str,
|
||||
destination: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let source_is_account = is_account_path(source);
|
||||
let dest_is_account = is_account_path(destination);
|
||||
|
||||
let content = if source_is_account {
|
||||
let (email, path) = parse_account_path(source).ok_or("Invalid account:// path format")?;
|
||||
let creds = get_account_credentials(&state.conn, &email, user.bot_id)
|
||||
.await
|
||||
.map_err(|e| format!("Failed to get credentials: {e}"))?;
|
||||
download_from_account(&creds, &path).await?
|
||||
} else {
|
||||
read_from_local(state, user, source).await?
|
||||
};
|
||||
|
||||
if dest_is_account {
|
||||
let (email, path) =
|
||||
parse_account_path(destination).ok_or("Invalid account:// path format")?;
|
||||
let creds = get_account_credentials(&state.conn, &email, user.bot_id)
|
||||
.await
|
||||
.map_err(|e| format!("Failed to get credentials: {e}"))?;
|
||||
upload_to_account(&creds, &path, &content).await?;
|
||||
} else {
|
||||
write_to_local(state, user, destination, &content).await?;
|
||||
}
|
||||
|
||||
trace!("COPY with account successful: {source} -> {destination}");
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn download_from_account(
|
||||
creds: &crate::basic::keywords::use_account::AccountCredentials,
|
||||
path: &str,
|
||||
) -> Result<Vec<u8>, Box<dyn Error + Send + Sync>> {
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
match creds.provider.as_str() {
|
||||
"gmail" | "google" => {
|
||||
let url = format!(
|
||||
"https://www.googleapis.com/drive/v3/files/{}?alt=media",
|
||||
urlencoding::encode(path)
|
||||
);
|
||||
let resp = client
|
||||
.get(&url)
|
||||
.bearer_auth(&creds.access_token)
|
||||
.send()
|
||||
.await?;
|
||||
if !resp.status().is_success() {
|
||||
return Err(format!("Google Drive download failed: {}", resp.status()).into());
|
||||
}
|
||||
Ok(resp.bytes().await?.to_vec())
|
||||
}
|
||||
"outlook" | "microsoft" => {
|
||||
let url = format!(
|
||||
"https://graph.microsoft.com/v1.0/me/drive/root:/{}:/content",
|
||||
urlencoding::encode(path)
|
||||
);
|
||||
let resp = client
|
||||
.get(&url)
|
||||
.bearer_auth(&creds.access_token)
|
||||
.send()
|
||||
.await?;
|
||||
if !resp.status().is_success() {
|
||||
return Err(format!("OneDrive download failed: {}", resp.status()).into());
|
||||
}
|
||||
Ok(resp.bytes().await?.to_vec())
|
||||
}
|
||||
_ => Err(format!("Unsupported provider: {}", creds.provider).into()),
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn upload_to_account(
|
||||
creds: &crate::basic::keywords::use_account::AccountCredentials,
|
||||
path: &str,
|
||||
content: &[u8],
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
match creds.provider.as_str() {
|
||||
"gmail" | "google" => {
|
||||
let url = format!(
|
||||
"https://www.googleapis.com/upload/drive/v3/files?uploadType=media&name={}",
|
||||
urlencoding::encode(path)
|
||||
);
|
||||
let resp = client
|
||||
.post(&url)
|
||||
.bearer_auth(&creds.access_token)
|
||||
.body(content.to_vec())
|
||||
.send()
|
||||
.await?;
|
||||
if !resp.status().is_success() {
|
||||
return Err(format!("Google Drive upload failed: {}", resp.status()).into());
|
||||
}
|
||||
}
|
||||
"outlook" | "microsoft" => {
|
||||
let url = format!(
|
||||
"https://graph.microsoft.com/v1.0/me/drive/root:/{}:/content",
|
||||
urlencoding::encode(path)
|
||||
);
|
||||
let resp = client
|
||||
.put(&url)
|
||||
.bearer_auth(&creds.access_token)
|
||||
.body(content.to_vec())
|
||||
.send()
|
||||
.await?;
|
||||
if !resp.status().is_success() {
|
||||
return Err(format!("OneDrive upload failed: {}", resp.status()).into());
|
||||
}
|
||||
}
|
||||
_ => return Err(format!("Unsupported provider: {}", creds.provider).into()),
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn read_from_local(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
) -> Result<Vec<u8>, Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get()?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)?
|
||||
};
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
let result = client
|
||||
.get_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.send()
|
||||
.await?;
|
||||
let bytes = result.body.collect().await?.into_bytes();
|
||||
Ok(bytes.to_vec())
|
||||
}
|
||||
|
||||
pub async fn write_to_local(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
path: &str,
|
||||
content: &[u8],
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get()?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)?
|
||||
};
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{path}");
|
||||
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.body(content.to_vec().into())
|
||||
.send()
|
||||
.await?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn execute_move(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
source: &str,
|
||||
destination: &str,
|
||||
) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
execute_copy(state, user, source, destination).await?;
|
||||
|
||||
execute_delete_file(state, user, source).await?;
|
||||
|
||||
trace!("MOVE successful: {source} -> {destination}");
|
||||
Ok(())
|
||||
}
|
||||
744
src/basic/keywords/file_ops/handlers.rs
Normal file
744
src/basic/keywords/file_ops/handlers.rs
Normal file
|
|
@ -0,0 +1,744 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
||||
use super::archive::*;
|
||||
use super::basic_io::*;
|
||||
use super::copy_move::*;
|
||||
use super::pdf::*;
|
||||
use super::transfer::*;
|
||||
use super::utils::dynamic_to_file_data;
|
||||
|
||||
pub fn register_file_operations(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
register_read_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_write_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_delete_file_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_copy_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_move_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_list_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_compress_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_extract_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_upload_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_download_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_generate_pdf_keyword(Arc::clone(&state), user.clone(), engine);
|
||||
register_merge_pdf_keyword(state, user, engine);
|
||||
}
|
||||
|
||||
pub fn register_read_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
engine
|
||||
.register_custom_syntax(["READ", "$expr$"], false, move |context, inputs| {
|
||||
let path = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
|
||||
trace!("READ file: {path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state);
|
||||
let user_for_task = user.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_read(&state_for_task, &user_for_task, &path).await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send READ result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(30)) {
|
||||
Ok(Ok(content)) => Ok(Dynamic::from(content)),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("READ failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"READ timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("READ thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
})
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_write_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["WRITE", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let path = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let data = context.eval_expression_tree(&inputs[1])?;
|
||||
|
||||
trace!("WRITE to file: {path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
let data_str = if data.is_string() {
|
||||
data.to_string()
|
||||
} else {
|
||||
serde_json::to_string(&super::utils::dynamic_to_json(&data)).unwrap_or_default()
|
||||
};
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_write(&state_for_task, &user_for_task, &path, &data_str).await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send WRITE result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(30)) {
|
||||
Ok(Ok(_)) => Ok(Dynamic::UNIT),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("WRITE failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"WRITE timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("WRITE thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_delete_file_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user.clone();
|
||||
let state_clone2 = Arc::clone(&state);
|
||||
let user_clone2 = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["DELETE", "FILE", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let path = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
|
||||
trace!("DELETE FILE: {path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_delete_file(&state_for_task, &user_for_task, &path).await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send DELETE FILE result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(30)) {
|
||||
Ok(Ok(_)) => Ok(Dynamic::UNIT),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DELETE FILE failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"DELETE FILE timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DELETE FILE thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["DELETE", "FILE", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let path = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
|
||||
trace!("DELETE FILE: {path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone2);
|
||||
let user_for_task = user_clone2.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_delete_file(&state_for_task, &user_for_task, &path).await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send DELETE FILE result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(30)) {
|
||||
Ok(Ok(_)) => Ok(Dynamic::UNIT),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DELETE FILE failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"DELETE FILE timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DELETE FILE thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_copy_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["COPY", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let source = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let destination = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("COPY from {source} to {destination}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_copy(&state_for_task, &user_for_task, &source, &destination)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send COPY result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(60)) {
|
||||
Ok(Ok(_)) => Ok(Dynamic::UNIT),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("COPY failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"COPY timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("COPY thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_move_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["MOVE", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let source = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let destination = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("MOVE from {source} to {destination}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_move(&state_for_task, &user_for_task, &source, &destination)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send MOVE result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(60)) {
|
||||
Ok(Ok(_)) => Ok(Dynamic::UNIT),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("MOVE failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"MOVE timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("MOVE thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_list_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(["LIST", "$expr$"], false, move |context, inputs| {
|
||||
let path = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
|
||||
trace!("LIST directory: {path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_list(&state_for_task, &user_for_task, &path).await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send LIST result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(30)) {
|
||||
Ok(Ok(files)) => {
|
||||
let array: rhai::Array = files.iter().map(|f| Dynamic::from(f.clone())).collect();
|
||||
Ok(Dynamic::from(array))
|
||||
}
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("LIST failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"LIST timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("LIST thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
})
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_compress_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["COMPRESS", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let files = context.eval_expression_tree(&inputs[0])?;
|
||||
let archive_name = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("COMPRESS to: {archive_name}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let file_list: Vec<String> = if files.is_array() {
|
||||
files
|
||||
.into_array()
|
||||
.unwrap_or_default()
|
||||
.iter()
|
||||
.map(|f| f.to_string())
|
||||
.collect()
|
||||
} else {
|
||||
vec![files.to_string()]
|
||||
};
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_compress(
|
||||
&state_for_task,
|
||||
&user_for_task,
|
||||
&file_list,
|
||||
&archive_name,
|
||||
)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send COMPRESS result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(120)) {
|
||||
Ok(Ok(path)) => Ok(Dynamic::from(path)),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("COMPRESS failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"COMPRESS timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("COMPRESS thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_extract_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["EXTRACT", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let archive = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let destination = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("EXTRACT {archive} to {destination}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_extract(&state_for_task, &user_for_task, &archive, &destination)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send EXTRACT result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(120)) {
|
||||
Ok(Ok(files)) => {
|
||||
let array: rhai::Array = files.iter().map(|f| Dynamic::from(f.clone())).collect();
|
||||
Ok(Dynamic::from(array))
|
||||
}
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("EXTRACT failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"EXTRACT timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("EXTRACT thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_upload_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["UPLOAD", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let file = context.eval_expression_tree(&inputs[0])?;
|
||||
let destination = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("UPLOAD to: {destination}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
let file_data = dynamic_to_file_data(&file);
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_upload(&state_for_task, &user_for_task, file_data, &destination)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send UPLOAD result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(300)) {
|
||||
Ok(Ok(url)) => Ok(Dynamic::from(url)),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("UPLOAD failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"UPLOAD timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("UPLOAD thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_download_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["DOWNLOAD", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let url = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let local_path = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("DOWNLOAD {url} to {local_path}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_download(&state_for_task, &user_for_task, &url, &local_path)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
error!("Failed to send DOWNLOAD result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(300)) {
|
||||
Ok(Ok(path)) => Ok(Dynamic::from(path)),
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DOWNLOAD failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"DOWNLOAD timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("DOWNLOAD thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
47
src/basic/keywords/file_ops/mod.rs
Normal file
47
src/basic/keywords/file_ops/mod.rs
Normal file
|
|
@ -0,0 +1,47 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
// Re-export all public functions for backward compatibility
|
||||
pub mod archive;
|
||||
pub mod basic_io;
|
||||
pub mod copy_move;
|
||||
pub mod handlers;
|
||||
pub mod pdf;
|
||||
pub mod transfer;
|
||||
pub mod utils;
|
||||
|
||||
// Re-export all public functions from each module
|
||||
pub use archive::*;
|
||||
pub use basic_io::*;
|
||||
pub use copy_move::*;
|
||||
pub use handlers::*;
|
||||
pub use pdf::*;
|
||||
pub use transfer::*;
|
||||
pub use utils::*;
|
||||
277
src/basic/keywords/file_ops/pdf.rs
Normal file
277
src/basic/keywords/file_ops/pdf.rs
Normal file
|
|
@ -0,0 +1,277 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::trace;
|
||||
use rhai::{Dynamic, Engine, Map};
|
||||
use serde_json::Value;
|
||||
use std::error::Error;
|
||||
use std::fmt::Write as FmtWrite;
|
||||
use std::sync::Arc;
|
||||
|
||||
use super::basic_io::{execute_read, execute_write};
|
||||
use super::utils::dynamic_to_json;
|
||||
|
||||
pub struct PdfResult {
|
||||
pub url: String,
|
||||
pub local_name: String,
|
||||
}
|
||||
|
||||
pub fn register_generate_pdf_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["GENERATE", "PDF", "$expr$", ",", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let template = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let data = context.eval_expression_tree(&inputs[1])?;
|
||||
let output = context.eval_expression_tree(&inputs[2])?.to_string();
|
||||
|
||||
trace!("GENERATE PDF template: {template}, output: {output}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
let data_json = dynamic_to_json(&data);
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_generate_pdf(
|
||||
&state_for_task,
|
||||
&user_for_task,
|
||||
&template,
|
||||
data_json,
|
||||
&output,
|
||||
)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
log::error!("Failed to send GENERATE PDF result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(120)) {
|
||||
Ok(Ok(result)) => {
|
||||
let mut map: Map = Map::new();
|
||||
map.insert("url".into(), Dynamic::from(result.url));
|
||||
map.insert("localName".into(), Dynamic::from(result.local_name));
|
||||
Ok(Dynamic::from(map))
|
||||
}
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("GENERATE PDF failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"GENERATE PDF timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("GENERATE PDF thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub fn register_merge_pdf_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user;
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["MERGE", "PDF", "$expr$", ",", "$expr$"],
|
||||
false,
|
||||
move |context, inputs| {
|
||||
let files = context.eval_expression_tree(&inputs[0])?;
|
||||
let output = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("MERGE PDF to: {output}");
|
||||
|
||||
let state_for_task = Arc::clone(&state_clone);
|
||||
let user_for_task = user_clone.clone();
|
||||
|
||||
let file_list: Vec<String> = if files.is_array() {
|
||||
files
|
||||
.into_array()
|
||||
.unwrap_or_default()
|
||||
.iter()
|
||||
.map(|f| f.to_string())
|
||||
.collect()
|
||||
} else {
|
||||
vec![files.to_string()]
|
||||
};
|
||||
|
||||
let (tx, rx) = std::sync::mpsc::channel();
|
||||
|
||||
std::thread::spawn(move || {
|
||||
let rt = tokio::runtime::Builder::new_multi_thread()
|
||||
.worker_threads(2)
|
||||
.enable_all()
|
||||
.build();
|
||||
|
||||
let send_err = if let Ok(rt) = rt {
|
||||
let result = rt.block_on(async move {
|
||||
execute_merge_pdf(&state_for_task, &user_for_task, &file_list, &output)
|
||||
.await
|
||||
});
|
||||
tx.send(result).err()
|
||||
} else {
|
||||
tx.send(Err("Failed to build tokio runtime".into())).err()
|
||||
};
|
||||
|
||||
if send_err.is_some() {
|
||||
log::error!("Failed to send MERGE PDF result from thread");
|
||||
}
|
||||
});
|
||||
|
||||
match rx.recv_timeout(std::time::Duration::from_secs(120)) {
|
||||
Ok(Ok(result)) => {
|
||||
let mut map: Map = Map::new();
|
||||
map.insert("url".into(), Dynamic::from(result.url));
|
||||
map.insert("localName".into(), Dynamic::from(result.local_name));
|
||||
Ok(Dynamic::from(map))
|
||||
}
|
||||
Ok(Err(e)) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("MERGE PDF failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
Err(std::sync::mpsc::RecvTimeoutError::Timeout) => {
|
||||
Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
"MERGE PDF timed out".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
}
|
||||
Err(e) => Err(Box::new(rhai::EvalAltResult::ErrorRuntime(
|
||||
format!("MERGE PDF thread failed: {e}").into(),
|
||||
rhai::Position::NONE,
|
||||
))),
|
||||
}
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
pub async fn execute_generate_pdf(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
template: &str,
|
||||
data: Value,
|
||||
output: &str,
|
||||
) -> Result<PdfResult, Box<dyn Error + Send + Sync>> {
|
||||
let template_content = execute_read(state, user, template).await?;
|
||||
|
||||
let mut html_content = template_content;
|
||||
if let Value::Object(obj) = &data {
|
||||
for (key, value) in obj {
|
||||
let placeholder = format!("{{{{{key}}}}}");
|
||||
let value_str = match value {
|
||||
Value::String(s) => s.clone(),
|
||||
_ => value.to_string(),
|
||||
};
|
||||
html_content = html_content.replace(&placeholder, &value_str);
|
||||
}
|
||||
}
|
||||
|
||||
let mut pdf_content = String::from("<!-- PDF Content Generated from Template: ");
|
||||
let _ = writeln!(pdf_content, "{template} -->\n{html_content}");
|
||||
|
||||
execute_write(state, user, output, &pdf_content).await?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)?
|
||||
};
|
||||
|
||||
let url = format!("s3://{bot_name}.gbai/{bot_name}.gbdrive/{output}");
|
||||
|
||||
trace!("GENERATE_PDF successful: {output}");
|
||||
Ok(PdfResult {
|
||||
url,
|
||||
local_name: output.to_string(),
|
||||
})
|
||||
}
|
||||
|
||||
pub async fn execute_merge_pdf(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
files: &[String],
|
||||
output: &str,
|
||||
) -> Result<PdfResult, Box<dyn Error + Send + Sync>> {
|
||||
let mut merged_content = String::from("<!-- Merged PDF -->\n");
|
||||
|
||||
for file in files {
|
||||
let content = execute_read(state, user, file).await?;
|
||||
let _ = writeln!(merged_content, "\n<!-- From: {file} -->\n{content}");
|
||||
}
|
||||
|
||||
execute_write(state, user, output, &merged_content).await?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)?
|
||||
};
|
||||
|
||||
let url = format!("s3://{bot_name}.gbai/{bot_name}.gbdrive/{output}");
|
||||
|
||||
trace!(
|
||||
"MERGE_PDF successful: {} files merged to {output}",
|
||||
files.len()
|
||||
);
|
||||
Ok(PdfResult {
|
||||
url,
|
||||
local_name: output.to_string(),
|
||||
})
|
||||
}
|
||||
112
src/basic/keywords/file_ops/transfer.rs
Normal file
112
src/basic/keywords/file_ops/transfer.rs
Normal file
|
|
@ -0,0 +1,112 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
use std::error::Error;
|
||||
|
||||
use super::basic_io::execute_write;
|
||||
|
||||
pub struct FileData {
|
||||
pub content: Vec<u8>,
|
||||
pub filename: String,
|
||||
}
|
||||
|
||||
pub async fn execute_upload(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
file_data: FileData,
|
||||
destination: &str,
|
||||
) -> Result<String, Box<dyn Error + Send + Sync>> {
|
||||
let client = state.drive.as_ref().ok_or("S3 client not configured")?;
|
||||
|
||||
let bot_name: String = {
|
||||
let mut db_conn = state.conn.get().map_err(|e| format!("DB error: {e}"))?;
|
||||
bots.filter(id.eq(&user.bot_id))
|
||||
.select(name)
|
||||
.first(&mut *db_conn)
|
||||
.map_err(|e| {
|
||||
error!("Failed to query bot name: {e}");
|
||||
e
|
||||
})?
|
||||
};
|
||||
|
||||
let bucket_name = format!("{bot_name}.gbai");
|
||||
let key = format!("{bot_name}.gbdrive/{destination}");
|
||||
|
||||
let content_disposition = format!("attachment; filename=\"{}\"", file_data.filename);
|
||||
|
||||
trace!(
|
||||
"Uploading file '{}' to {bucket_name}/{key} ({} bytes)",
|
||||
file_data.filename,
|
||||
file_data.content.len()
|
||||
);
|
||||
|
||||
client
|
||||
.put_object()
|
||||
.bucket(&bucket_name)
|
||||
.key(&key)
|
||||
.content_disposition(&content_disposition)
|
||||
.body(file_data.content.into())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("S3 put failed: {e}"))?;
|
||||
|
||||
let url = format!("s3://{bucket_name}/{key}");
|
||||
trace!(
|
||||
"UPLOAD successful: {url} (original filename: {})",
|
||||
file_data.filename
|
||||
);
|
||||
Ok(url)
|
||||
}
|
||||
|
||||
pub async fn execute_download(
|
||||
state: &AppState,
|
||||
user: &UserSession,
|
||||
url: &str,
|
||||
local_path: &str,
|
||||
) -> Result<String, Box<dyn Error + Send + Sync>> {
|
||||
let client = reqwest::Client::new();
|
||||
let response = client
|
||||
.get(url)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Download failed: {e}"))?;
|
||||
|
||||
let content = response.bytes().await?;
|
||||
|
||||
execute_write(state, user, local_path, &String::from_utf8_lossy(&content)).await?;
|
||||
|
||||
trace!("DOWNLOAD successful: {url} -> {local_path}");
|
||||
Ok(local_path.to_string())
|
||||
}
|
||||
109
src/basic/keywords/file_ops/utils.rs
Normal file
109
src/basic/keywords/file_ops/utils.rs
Normal file
|
|
@ -0,0 +1,109 @@
|
|||
/*****************************************************************************\
|
||||
| █████ █████ ██ █ █████ █████ ████ ██ ████ █████ █████ ███ ® |
|
||||
| ██ █ ███ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| ██ ███ ████ █ ██ █ ████ █████ ██████ ██ ████ █ █ █ ██ |
|
||||
| ██ ██ █ █ ██ █ █ ██ ██ ██ ██ ██ ██ █ ██ ██ █ █ |
|
||||
| █████ █████ █ ███ █████ ██ ██ ██ ██ █████ ████ █████ █ ███ |
|
||||
| |
|
||||
| General Bots Copyright (c) pragmatismo.com.br. All rights reserved. |
|
||||
| Licensed under the AGPL-3.0. |
|
||||
| |
|
||||
| According to our dual licensing model, this program can be used either |
|
||||
| under the terms of the GNU Affero General Public License, version 3, |
|
||||
| or under a proprietary license. |
|
||||
| |
|
||||
| The texts of the GNU Affero General Public License with an additional |
|
||||
| permission and of our proprietary license can be found at and |
|
||||
| in the LICENSE file you have received along with this program. |
|
||||
| |
|
||||
| This program is distributed in the hope that it will be useful, |
|
||||
| but WITHOUT ANY WARRANTY, without even the implied warranty of |
|
||||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
||||
| GNU Affero General Public License for more details. |
|
||||
| |
|
||||
| "General Bots" is a registered trademark of pragmatismo.com.br. |
|
||||
| The licensing of the program under the AGPLv3 does not imply a |
|
||||
| trademark license. Therefore any rights, title and interest in |
|
||||
| our trademarks remain entirely with us. |
|
||||
| |
|
||||
\*****************************************************************************/
|
||||
|
||||
use rhai::{Dynamic, Map};
|
||||
use serde_json::Value;
|
||||
|
||||
use super::transfer::FileData;
|
||||
|
||||
pub fn dynamic_to_json(value: &Dynamic) -> Value {
|
||||
if value.is_unit() {
|
||||
Value::Null
|
||||
} else if value.is_bool() {
|
||||
Value::Bool(value.as_bool().unwrap_or(false))
|
||||
} else if value.is_int() {
|
||||
Value::Number(value.as_int().unwrap_or(0).into())
|
||||
} else if value.is_float() {
|
||||
if let Ok(f) = value.as_float() {
|
||||
serde_json::Number::from_f64(f)
|
||||
.map(Value::Number)
|
||||
.unwrap_or(Value::Null)
|
||||
} else {
|
||||
Value::Null
|
||||
}
|
||||
} else if value.is_string() {
|
||||
Value::String(value.to_string())
|
||||
} else if value.is_array() {
|
||||
let arr = value.clone().into_array().unwrap_or_default();
|
||||
Value::Array(arr.iter().map(dynamic_to_json).collect())
|
||||
} else if value.is_map() {
|
||||
let map = value.clone().try_cast::<Map>().unwrap_or_default();
|
||||
let obj: serde_json::Map<String, Value> = map
|
||||
.iter()
|
||||
.map(|(k, v)| (k.to_string(), dynamic_to_json(v)))
|
||||
.collect();
|
||||
Value::Object(obj)
|
||||
} else {
|
||||
Value::String(value.to_string())
|
||||
}
|
||||
}
|
||||
|
||||
pub fn dynamic_to_file_data(value: &Dynamic) -> FileData {
|
||||
if value.is_map() {
|
||||
let map = value.clone().try_cast::<Map>().unwrap_or_default();
|
||||
let content = map
|
||||
.get("data")
|
||||
.map(|v| v.to_string().into_bytes())
|
||||
.unwrap_or_default();
|
||||
let filename = map
|
||||
.get("filename")
|
||||
.map(|v| v.to_string())
|
||||
.unwrap_or_else(|| "file".to_string());
|
||||
|
||||
FileData { content, filename }
|
||||
} else {
|
||||
FileData {
|
||||
content: value.to_string().into_bytes(),
|
||||
filename: "file".to_string(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
use rhai::Dynamic;
|
||||
use serde_json::Value;
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_to_json() {
|
||||
let dynamic = Dynamic::from("hello");
|
||||
let json = dynamic_to_json(&dynamic);
|
||||
assert_eq!(json, Value::String("hello".to_string()));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_to_file_data() {
|
||||
let dynamic = Dynamic::from("test content");
|
||||
let file_data = dynamic_to_file_data(&dynamic);
|
||||
assert_eq!(file_data.filename, "file");
|
||||
assert!(!file_data.content.is_empty());
|
||||
}
|
||||
}
|
||||
|
|
@ -1,9 +1,9 @@
|
|||
use super::table_access::{check_table_access, filter_fields_by_role, AccessType, UserRoles};
|
||||
use crate::security::sql_guard::sanitize_identifier;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::shared::utils;
|
||||
use crate::shared::utils::to_array;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use crate::core::shared::utils;
|
||||
use crate::core::shared::utils::to_array;
|
||||
use diesel::pg::PgConnection;
|
||||
use diesel::prelude::*;
|
||||
use diesel::sql_types::Text;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use rhai::Dynamic;
|
||||
use rhai::Engine;
|
||||
pub fn for_keyword(_state: &AppState, _user: UserSession, engine: &mut Engine) {
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
use crate::shared::models::schema::bots::dsl::*;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::schema::bots::dsl::*;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use diesel::prelude::*;
|
||||
use log::{error, trace};
|
||||
use reqwest::{self, Client};
|
||||
|
|
|
|||
File diff suppressed because it is too large
Load diff
11
src/basic/keywords/hearing/mod.rs
Normal file
11
src/basic/keywords/hearing/mod.rs
Normal file
|
|
@ -0,0 +1,11 @@
|
|||
mod processing;
|
||||
mod talk;
|
||||
mod syntax;
|
||||
mod types;
|
||||
mod validators;
|
||||
|
||||
pub use processing::{process_hear_input, process_audio_to_text, process_qrcode, process_video_description};
|
||||
pub use talk::{execute_talk, talk_keyword};
|
||||
pub use syntax::hear_keyword;
|
||||
pub use types::{InputType, ValidationResult};
|
||||
pub use validators::validate_input;
|
||||
282
src/basic/keywords/hearing/processing.rs
Normal file
282
src/basic/keywords/hearing/processing.rs
Normal file
|
|
@ -0,0 +1,282 @@
|
|||
use crate::core::shared::models::Attachment;
|
||||
use crate::core::shared::state::AppState;
|
||||
use uuid::Uuid;
|
||||
|
||||
use super::types::InputType;
|
||||
use super::validators::validate_input;
|
||||
|
||||
pub async fn process_hear_input(
|
||||
state: &AppState,
|
||||
session_id: Uuid,
|
||||
variable_name: &str,
|
||||
input: &str,
|
||||
attachments: Option<Vec<Attachment>>,
|
||||
) -> Result<(String, Option<serde_json::Value>), String> {
|
||||
let wait_data = if let Some(redis_client) = &state.cache {
|
||||
if let Ok(mut conn) = redis_client.get_multiplexed_async_connection().await {
|
||||
let key = format!("hear:{session_id}:{variable_name}");
|
||||
|
||||
let data: Result<String, _> = redis::cmd("GET").arg(&key).query_async(&mut conn).await;
|
||||
|
||||
match data {
|
||||
Ok(json_str) => serde_json::from_str::<serde_json::Value>(&json_str).ok(),
|
||||
Err(_) => None,
|
||||
}
|
||||
} else {
|
||||
None
|
||||
}
|
||||
} else {
|
||||
None
|
||||
};
|
||||
|
||||
let input_type = wait_data
|
||||
.as_ref()
|
||||
.and_then(|d| d.get("type"))
|
||||
.and_then(|t| t.as_str())
|
||||
.unwrap_or("any");
|
||||
|
||||
let options = wait_data
|
||||
.as_ref()
|
||||
.and_then(|d| d.get("options"))
|
||||
.and_then(|o| o.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|v| v.as_str().map(String::from))
|
||||
.collect::<Vec<_>>()
|
||||
});
|
||||
|
||||
let validation_type = if let Some(opts) = options {
|
||||
InputType::Menu(opts)
|
||||
} else {
|
||||
InputType::parse_type(input_type)
|
||||
};
|
||||
|
||||
match validation_type {
|
||||
InputType::Image | InputType::QrCode => {
|
||||
if let Some(atts) = &attachments {
|
||||
if let Some(img) = atts
|
||||
.iter()
|
||||
.find(|a| a.mime_type.as_deref().unwrap_or("").starts_with("image/"))
|
||||
{
|
||||
if validation_type == InputType::QrCode {
|
||||
return process_qrcode(state, &img.url).await;
|
||||
}
|
||||
return Ok((
|
||||
img.url.clone(),
|
||||
Some(serde_json::json!({ "attachment": img })),
|
||||
));
|
||||
}
|
||||
}
|
||||
return Err(validation_type.error_message());
|
||||
}
|
||||
InputType::Audio => {
|
||||
if let Some(atts) = &attachments {
|
||||
if let Some(audio) = atts
|
||||
.iter()
|
||||
.find(|a| a.mime_type.as_deref().unwrap_or("").starts_with("audio/"))
|
||||
{
|
||||
return process_audio_to_text(state, &audio.url).await;
|
||||
}
|
||||
}
|
||||
return Err(validation_type.error_message());
|
||||
}
|
||||
InputType::Video => {
|
||||
if let Some(atts) = &attachments {
|
||||
if let Some(video) = atts
|
||||
.iter()
|
||||
.find(|a| a.mime_type.as_deref().unwrap_or("").starts_with("video/"))
|
||||
{
|
||||
return process_video_description(state, &video.url).await;
|
||||
}
|
||||
}
|
||||
return Err(validation_type.error_message());
|
||||
}
|
||||
InputType::File | InputType::Document => {
|
||||
if let Some(atts) = &attachments {
|
||||
if let Some(doc) = atts.first() {
|
||||
return Ok((
|
||||
doc.url.clone(),
|
||||
Some(serde_json::json!({ "attachment": doc })),
|
||||
));
|
||||
}
|
||||
}
|
||||
return Err(validation_type.error_message());
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
|
||||
let result = validate_input(input, &validation_type);
|
||||
|
||||
if result.is_valid {
|
||||
if let Some(redis_client) = &state.cache {
|
||||
if let Ok(mut conn) = redis_client.get_multiplexed_async_connection().await {
|
||||
let key = format!("hear:{session_id}:{variable_name}");
|
||||
let _: Result<(), _> = redis::cmd("DEL").arg(&key).query_async(&mut conn).await;
|
||||
}
|
||||
}
|
||||
|
||||
Ok((result.normalized_value, result.metadata))
|
||||
} else {
|
||||
Err(result
|
||||
.error_message
|
||||
.unwrap_or_else(|| validation_type.error_message()))
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn process_qrcode(
|
||||
state: &AppState,
|
||||
image_url: &str,
|
||||
) -> Result<(String, Option<serde_json::Value>), String> {
|
||||
let botmodels_url = {
|
||||
let config_url = state.conn.get().ok().and_then(|mut conn| {
|
||||
use crate::core::shared::models::schema::bot_memories::dsl::*;
|
||||
use diesel::prelude::*;
|
||||
bot_memories
|
||||
.filter(key.eq("botmodels-url"))
|
||||
.select(value)
|
||||
.first::<String>(&mut conn)
|
||||
.ok()
|
||||
});
|
||||
config_url.unwrap_or_else(|| {
|
||||
std::env::var("BOTMODELS_URL").unwrap_or_else(|_| "http://localhost:8001".to_string())
|
||||
})
|
||||
};
|
||||
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
let image_data = client
|
||||
.get(image_url)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to download image: {}", e))?
|
||||
.bytes()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to fetch image: {e}"))?;
|
||||
|
||||
let response = client
|
||||
.post(format!("{botmodels_url}/api/vision/qrcode"))
|
||||
.header("Content-Type", "application/octet-stream")
|
||||
.body(image_data.to_vec())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to call botmodels: {}", e))?;
|
||||
|
||||
if response.status().is_success() {
|
||||
let result: serde_json::Value = response
|
||||
.json()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to read image: {e}"))?;
|
||||
|
||||
if let Some(qr_data) = result.get("data").and_then(|d| d.as_str()) {
|
||||
Ok((
|
||||
qr_data.to_string(),
|
||||
Some(serde_json::json!({
|
||||
"type": "qrcode",
|
||||
"raw": result
|
||||
})),
|
||||
))
|
||||
} else {
|
||||
Err("No QR code found in image".to_string())
|
||||
}
|
||||
} else {
|
||||
Err("Failed to read QR code".to_string())
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn process_audio_to_text(
|
||||
_state: &AppState,
|
||||
audio_url: &str,
|
||||
) -> Result<(String, Option<serde_json::Value>), String> {
|
||||
let botmodels_url =
|
||||
std::env::var("BOTMODELS_URL").unwrap_or_else(|_| "http://localhost:8001".to_string());
|
||||
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
let audio_data = client
|
||||
.get(audio_url)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to download audio: {}", e))?
|
||||
.bytes()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to read audio: {e}"))?;
|
||||
|
||||
let response = client
|
||||
.post(format!("{botmodels_url}/api/speech/to-text"))
|
||||
.header("Content-Type", "application/octet-stream")
|
||||
.body(audio_data.to_vec())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to call botmodels: {}", e))?;
|
||||
|
||||
if response.status().is_success() {
|
||||
let result: serde_json::Value = response
|
||||
.json()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to parse response: {}", e))?;
|
||||
|
||||
if let Some(text) = result.get("text").and_then(|t| t.as_str()) {
|
||||
Ok((
|
||||
text.to_string(),
|
||||
Some(serde_json::json!({
|
||||
"type": "audio_transcription",
|
||||
"language": result.get("language"),
|
||||
"confidence": result.get("confidence")
|
||||
})),
|
||||
))
|
||||
} else {
|
||||
Err("Could not transcribe audio".to_string())
|
||||
}
|
||||
} else {
|
||||
Err("Failed to process audio".to_string())
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn process_video_description(
|
||||
_state: &AppState,
|
||||
video_url: &str,
|
||||
) -> Result<(String, Option<serde_json::Value>), String> {
|
||||
let botmodels_url =
|
||||
std::env::var("BOTMODELS_URL").unwrap_or_else(|_| "http://localhost:8001".to_string());
|
||||
|
||||
let client = reqwest::Client::new();
|
||||
|
||||
let video_data = client
|
||||
.get(video_url)
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to download video: {}", e))?
|
||||
.bytes()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to fetch video: {e}"))?;
|
||||
|
||||
let response = client
|
||||
.post(format!("{botmodels_url}/api/vision/describe-video"))
|
||||
.header("Content-Type", "application/octet-stream")
|
||||
.body(video_data.to_vec())
|
||||
.send()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to read video: {e}"))?;
|
||||
|
||||
if response.status().is_success() {
|
||||
let result: serde_json::Value = response
|
||||
.json()
|
||||
.await
|
||||
.map_err(|e| format!("Failed to parse response: {}", e))?;
|
||||
|
||||
if let Some(description) = result.get("description").and_then(|d| d.as_str()) {
|
||||
Ok((
|
||||
description.to_string(),
|
||||
Some(serde_json::json!({
|
||||
"type": "video_description",
|
||||
"frame_count": result.get("frame_count"),
|
||||
"url": video_url
|
||||
})),
|
||||
))
|
||||
} else {
|
||||
Err("Could not describe video".to_string())
|
||||
}
|
||||
} else {
|
||||
Err("Failed to process video".to_string())
|
||||
}
|
||||
}
|
||||
252
src/basic/keywords/hearing/syntax.rs
Normal file
252
src/basic/keywords/hearing/syntax.rs
Normal file
|
|
@ -0,0 +1,252 @@
|
|||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::trace;
|
||||
use rhai::{Dynamic, Engine, EvalAltResult};
|
||||
use serde_json::json;
|
||||
use std::sync::Arc;
|
||||
|
||||
use super::types::InputType;
|
||||
|
||||
pub fn hear_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
register_hear_basic(Arc::clone(&state), user.clone(), engine);
|
||||
|
||||
register_hear_as_type(Arc::clone(&state), user.clone(), engine);
|
||||
|
||||
register_hear_as_menu(state, user, engine);
|
||||
}
|
||||
|
||||
fn register_hear_basic(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let session_id = user.id;
|
||||
let state_clone = Arc::clone(&state);
|
||||
|
||||
engine
|
||||
.register_custom_syntax(["HEAR", "$ident$"], true, move |_context, inputs| {
|
||||
let variable_name = inputs[0]
|
||||
.get_string_value()
|
||||
.ok_or_else(|| Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Expected identifier as string".into(),
|
||||
rhai::Position::NONE,
|
||||
)))?
|
||||
.to_lowercase();
|
||||
|
||||
trace!(
|
||||
"HEAR command waiting for user input to store in variable: {}",
|
||||
variable_name
|
||||
);
|
||||
|
||||
let state_for_spawn = Arc::clone(&state_clone);
|
||||
let session_id_clone = session_id;
|
||||
|
||||
tokio::spawn(async move {
|
||||
trace!(
|
||||
"HEAR: Setting session {} to wait for input for variable '{}'",
|
||||
session_id_clone,
|
||||
variable_name
|
||||
);
|
||||
|
||||
{
|
||||
let mut session_manager = state_for_spawn.session_manager.lock().await;
|
||||
session_manager.mark_waiting(session_id_clone);
|
||||
}
|
||||
|
||||
if let Some(redis_client) = &state_for_spawn.cache {
|
||||
if let Ok(conn) = redis_client.get_multiplexed_async_connection().await {
|
||||
let mut conn = conn;
|
||||
let key = format!("hear:{session_id_clone}:{variable_name}");
|
||||
let wait_data = json!({
|
||||
"variable": variable_name,
|
||||
"type": "any",
|
||||
"waiting": true,
|
||||
"retry_count": 0
|
||||
});
|
||||
let _: Result<(), _> = redis::cmd("SET")
|
||||
.arg(key)
|
||||
.arg(wait_data.to_string())
|
||||
.arg("EX")
|
||||
.arg(3600)
|
||||
.query_async(&mut conn)
|
||||
.await;
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
Err(Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Waiting for user input".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
})
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
fn register_hear_as_type(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let session_id = user.id;
|
||||
let state_clone = Arc::clone(&state);
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["HEAR", "$ident$", "AS", "$ident$"],
|
||||
true,
|
||||
move |_context, inputs| {
|
||||
let variable_name = inputs[0]
|
||||
.get_string_value()
|
||||
.ok_or_else(|| Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Expected identifier for variable".into(),
|
||||
rhai::Position::NONE,
|
||||
)))?
|
||||
.to_lowercase();
|
||||
let type_name = inputs[1]
|
||||
.get_string_value()
|
||||
.ok_or_else(|| Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Expected identifier for type".into(),
|
||||
rhai::Position::NONE,
|
||||
)))?
|
||||
.to_string();
|
||||
|
||||
let _input_type = InputType::parse_type(&type_name);
|
||||
|
||||
trace!("HEAR {variable_name} AS {type_name} - waiting for validated input");
|
||||
|
||||
let state_for_spawn = Arc::clone(&state_clone);
|
||||
let session_id_clone = session_id;
|
||||
let var_name_clone = variable_name;
|
||||
let type_clone = type_name;
|
||||
|
||||
tokio::spawn(async move {
|
||||
{
|
||||
let mut session_manager = state_for_spawn.session_manager.lock().await;
|
||||
session_manager.mark_waiting(session_id_clone);
|
||||
}
|
||||
|
||||
if let Some(redis_client) = &state_for_spawn.cache {
|
||||
if let Ok(mut conn) = redis_client.get_multiplexed_async_connection().await
|
||||
{
|
||||
let key = format!("hear:{session_id_clone}:{var_name_clone}");
|
||||
let wait_data = json!({
|
||||
"variable": var_name_clone,
|
||||
"type": type_clone.to_lowercase(),
|
||||
"waiting": true,
|
||||
"retry_count": 0,
|
||||
"max_retries": 3
|
||||
});
|
||||
let _: Result<(), _> = redis::cmd("SET")
|
||||
.arg(key)
|
||||
.arg(wait_data.to_string())
|
||||
.arg("EX")
|
||||
.arg(3600)
|
||||
.query_async(&mut conn)
|
||||
.await;
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
Err(Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Waiting for user input".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
|
||||
fn register_hear_as_menu(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let session_id = user.id;
|
||||
let state_clone = Arc::clone(&state);
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["HEAR", "$ident$", "AS", "$expr$"],
|
||||
true,
|
||||
move |context, inputs| {
|
||||
let variable_name = inputs[0]
|
||||
.get_string_value()
|
||||
.ok_or_else(|| Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Expected identifier for variable".into(),
|
||||
rhai::Position::NONE,
|
||||
)))?
|
||||
.to_lowercase();
|
||||
|
||||
let options_expr = context.eval_expression_tree(&inputs[1])?;
|
||||
let options_str = options_expr.to_string();
|
||||
|
||||
let input_type = InputType::parse_type(&options_str);
|
||||
if input_type != InputType::Any {
|
||||
return Err(Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Use HEAR AS TYPE syntax".into(),
|
||||
rhai::Position::NONE,
|
||||
)));
|
||||
}
|
||||
|
||||
let options: Vec<String> = if options_str.starts_with('[') {
|
||||
serde_json::from_str(&options_str).unwrap_or_default()
|
||||
} else {
|
||||
options_str
|
||||
.split(',')
|
||||
.map(|s| s.trim().trim_matches('"').to_string())
|
||||
.filter(|s| !s.is_empty())
|
||||
.collect()
|
||||
};
|
||||
|
||||
if options.is_empty() {
|
||||
return Err(Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Menu requires at least one option".into(),
|
||||
rhai::Position::NONE,
|
||||
)));
|
||||
}
|
||||
|
||||
trace!("HEAR {} AS MENU with options: {:?}", variable_name, options);
|
||||
|
||||
let state_for_spawn = Arc::clone(&state_clone);
|
||||
let session_id_clone = session_id;
|
||||
let var_name_clone = variable_name;
|
||||
let options_clone = options;
|
||||
|
||||
tokio::spawn(async move {
|
||||
{
|
||||
let mut session_manager = state_for_spawn.session_manager.lock().await;
|
||||
session_manager.mark_waiting(session_id_clone);
|
||||
}
|
||||
|
||||
if let Some(redis_client) = &state_for_spawn.cache {
|
||||
if let Ok(mut conn) = redis_client.get_multiplexed_async_connection().await
|
||||
{
|
||||
let key = format!("hear:{session_id_clone}:{var_name_clone}");
|
||||
let wait_data = json!({
|
||||
"variable": var_name_clone,
|
||||
"type": "menu",
|
||||
"options": options_clone,
|
||||
"waiting": true,
|
||||
"retry_count": 0
|
||||
});
|
||||
let _: Result<(), _> = redis::cmd("SET")
|
||||
.arg(key)
|
||||
.arg(wait_data.to_string())
|
||||
.arg("EX")
|
||||
.arg(3600)
|
||||
.query_async(&mut conn)
|
||||
.await;
|
||||
|
||||
let suggestions_key =
|
||||
format!("suggestions:{session_id_clone}:{session_id_clone}");
|
||||
for opt in &options_clone {
|
||||
let suggestion = json!({
|
||||
"text": opt,
|
||||
"value": opt
|
||||
});
|
||||
let _: Result<(), _> = redis::cmd("RPUSH")
|
||||
.arg(&suggestions_key)
|
||||
.arg(suggestion.to_string())
|
||||
.query_async(&mut conn)
|
||||
.await;
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
Err(Box::new(EvalAltResult::ErrorRuntime(
|
||||
"Waiting for user input".into(),
|
||||
rhai::Position::NONE,
|
||||
)))
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
121
src/basic/keywords/hearing/talk.rs
Normal file
121
src/basic/keywords/hearing/talk.rs
Normal file
|
|
@ -0,0 +1,121 @@
|
|||
use crate::core::shared::message_types::MessageType;
|
||||
use crate::core::shared::models::{BotResponse, UserSession};
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
||||
use super::super::universal_messaging::send_message_to_recipient;
|
||||
|
||||
pub async fn execute_talk(
|
||||
state: Arc<AppState>,
|
||||
user_session: UserSession,
|
||||
message: String,
|
||||
) -> Result<BotResponse, Box<dyn std::error::Error + Send + Sync>> {
|
||||
let mut suggestions = Vec::new();
|
||||
|
||||
if let Some(redis_client) = &state.cache {
|
||||
if let Ok(mut conn) = redis_client.get_multiplexed_async_connection().await {
|
||||
let redis_key = format!("suggestions:{}:{}", user_session.user_id, user_session.id);
|
||||
|
||||
let suggestions_json: Result<Vec<String>, _> = redis::cmd("LRANGE")
|
||||
.arg(redis_key.as_str())
|
||||
.arg(0)
|
||||
.arg(-1)
|
||||
.query_async(&mut conn)
|
||||
.await;
|
||||
|
||||
if let Ok(suggestions_list) = suggestions_json {
|
||||
suggestions = suggestions_list
|
||||
.into_iter()
|
||||
.filter_map(|s| serde_json::from_str(&s).ok())
|
||||
.collect();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
let response = BotResponse {
|
||||
bot_id: user_session.bot_id.to_string(),
|
||||
user_id: user_session.user_id.to_string(),
|
||||
session_id: user_session.id.to_string(),
|
||||
channel: "web".to_string(),
|
||||
content: message,
|
||||
message_type: MessageType::BOT_RESPONSE,
|
||||
stream_token: None,
|
||||
is_complete: true,
|
||||
suggestions,
|
||||
context_name: None,
|
||||
context_length: 0,
|
||||
context_max_length: 0,
|
||||
};
|
||||
|
||||
let user_id = user_session.id.to_string();
|
||||
let response_clone = response.clone();
|
||||
|
||||
let web_adapter = Arc::clone(&state.web_adapter);
|
||||
tokio::spawn(async move {
|
||||
if let Err(e) = web_adapter
|
||||
.send_message_to_session(&user_id, response_clone)
|
||||
.await
|
||||
{
|
||||
error!("Failed to send TALK message via web adapter: {}", e);
|
||||
} else {
|
||||
trace!("TALK message sent via web adapter");
|
||||
}
|
||||
});
|
||||
|
||||
Ok(response)
|
||||
}
|
||||
|
||||
pub fn talk_keyword(state: Arc<AppState>, user: UserSession, engine: &mut Engine) {
|
||||
let state_clone = Arc::clone(&state);
|
||||
let user_clone = user.clone();
|
||||
|
||||
// Register TALK TO "recipient", "message" syntax FIRST (more specific pattern)
|
||||
let state_clone2 = Arc::clone(&state);
|
||||
let user_clone2 = user.clone();
|
||||
|
||||
engine
|
||||
.register_custom_syntax(
|
||||
["TALK", "TO", "$expr$", ",", "$expr$"],
|
||||
true,
|
||||
move |context, inputs| {
|
||||
let recipient = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let message = context.eval_expression_tree(&inputs[1])?.to_string();
|
||||
|
||||
trace!("TALK TO: Sending message to {}", recipient);
|
||||
|
||||
let state_for_send = Arc::clone(&state_clone2);
|
||||
let user_for_send = user_clone2.clone();
|
||||
|
||||
tokio::spawn(async move {
|
||||
if let Err(e) =
|
||||
send_message_to_recipient(state_for_send, &user_for_send, &recipient, &message)
|
||||
.await
|
||||
{
|
||||
error!("Failed to send TALK TO message: {}", e);
|
||||
}
|
||||
});
|
||||
|
||||
Ok(Dynamic::UNIT)
|
||||
},
|
||||
)
|
||||
.expect("valid syntax registration");
|
||||
|
||||
// Register simple TALK "message" syntax SECOND (fallback pattern)
|
||||
engine
|
||||
.register_custom_syntax(["TALK", "$expr$"], true, move |context, inputs| {
|
||||
let message = context.eval_expression_tree(&inputs[0])?.to_string();
|
||||
let state_for_talk = Arc::clone(&state_clone);
|
||||
let user_for_talk = user_clone.clone();
|
||||
|
||||
tokio::spawn(async move {
|
||||
if let Err(e) = execute_talk(state_for_talk, user_for_talk, message).await {
|
||||
error!("Error executing TALK command: {}", e);
|
||||
}
|
||||
});
|
||||
|
||||
Ok(Dynamic::UNIT)
|
||||
})
|
||||
.expect("valid syntax registration");
|
||||
}
|
||||
141
src/basic/keywords/hearing/types.rs
Normal file
141
src/basic/keywords/hearing/types.rs
Normal file
|
|
@ -0,0 +1,141 @@
|
|||
use serde::{Deserialize, Serialize};
|
||||
|
||||
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
|
||||
pub enum InputType {
|
||||
Any,
|
||||
Email,
|
||||
Date,
|
||||
Name,
|
||||
Integer,
|
||||
Float,
|
||||
Boolean,
|
||||
Hour,
|
||||
Money,
|
||||
Mobile,
|
||||
Zipcode,
|
||||
Language,
|
||||
Cpf,
|
||||
Cnpj,
|
||||
QrCode,
|
||||
Login,
|
||||
Menu(Vec<String>),
|
||||
File,
|
||||
Image,
|
||||
Audio,
|
||||
Video,
|
||||
Document,
|
||||
Url,
|
||||
Uuid,
|
||||
Color,
|
||||
CreditCard,
|
||||
Password,
|
||||
}
|
||||
|
||||
impl InputType {
|
||||
#[must_use]
|
||||
pub fn error_message(&self) -> String {
|
||||
match self {
|
||||
Self::Any => String::new(),
|
||||
Self::Email => {
|
||||
"Please enter a valid email address (e.g., user@example.com)".to_string()
|
||||
}
|
||||
Self::Date => "Please enter a valid date (e.g., 25/12/2024 or 2024-12-25)".to_string(),
|
||||
Self::Name => "Please enter a valid name (letters and spaces only)".to_string(),
|
||||
Self::Integer => "Please enter a valid whole number".to_string(),
|
||||
Self::Float => "Please enter a valid number".to_string(),
|
||||
Self::Boolean => "Please answer yes or no".to_string(),
|
||||
Self::Hour => "Please enter a valid time (e.g., 14:30 or 2:30 PM)".to_string(),
|
||||
Self::Money => "Please enter a valid amount (e.g., 100.00 or R$ 100,00)".to_string(),
|
||||
Self::Mobile => "Please enter a valid mobile number".to_string(),
|
||||
Self::Zipcode => "Please enter a valid ZIP/postal code".to_string(),
|
||||
Self::Language => "Please enter a valid language code (e.g., en, pt, es)".to_string(),
|
||||
Self::Cpf => "Please enter a valid CPF (11 digits)".to_string(),
|
||||
Self::Cnpj => "Please enter a valid CNPJ (14 digits)".to_string(),
|
||||
Self::QrCode => "Please send an image containing a QR code".to_string(),
|
||||
Self::Login => "Please complete the authentication process".to_string(),
|
||||
Self::Menu(options) => format!("Please select one of: {}", options.join(", ")),
|
||||
Self::File => "Please upload a file".to_string(),
|
||||
Self::Image => "Please send an image".to_string(),
|
||||
Self::Audio => "Please send an audio file or voice message".to_string(),
|
||||
Self::Video => "Please send a video".to_string(),
|
||||
Self::Document => "Please send a document (PDF, Word, etc.)".to_string(),
|
||||
Self::Url => "Please enter a valid URL".to_string(),
|
||||
Self::Uuid => "Please enter a valid UUID".to_string(),
|
||||
Self::Color => "Please enter a valid color (e.g., #FF0000 or red)".to_string(),
|
||||
Self::CreditCard => "Please enter a valid credit card number".to_string(),
|
||||
Self::Password => "Please enter a password (minimum 8 characters)".to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
#[must_use]
|
||||
pub fn parse_type(s: &str) -> Self {
|
||||
match s.to_uppercase().as_str() {
|
||||
"EMAIL" => Self::Email,
|
||||
"DATE" => Self::Date,
|
||||
"NAME" => Self::Name,
|
||||
"INTEGER" | "INT" | "NUMBER" => Self::Integer,
|
||||
"FLOAT" | "DECIMAL" | "DOUBLE" => Self::Float,
|
||||
"BOOLEAN" | "BOOL" => Self::Boolean,
|
||||
"HOUR" | "TIME" => Self::Hour,
|
||||
"MONEY" | "CURRENCY" | "AMOUNT" => Self::Money,
|
||||
"MOBILE" | "PHONE" | "TELEPHONE" => Self::Mobile,
|
||||
"ZIPCODE" | "ZIP" | "CEP" | "POSTALCODE" => Self::Zipcode,
|
||||
"LANGUAGE" | "LANG" => Self::Language,
|
||||
"CPF" => Self::Cpf,
|
||||
"CNPJ" => Self::Cnpj,
|
||||
"QRCODE" | "QR" => Self::QrCode,
|
||||
"LOGIN" | "AUTH" => Self::Login,
|
||||
"FILE" => Self::File,
|
||||
"IMAGE" | "PHOTO" | "PICTURE" => Self::Image,
|
||||
"AUDIO" | "VOICE" | "SOUND" => Self::Audio,
|
||||
"VIDEO" => Self::Video,
|
||||
"DOCUMENT" | "DOC" | "PDF" => Self::Document,
|
||||
"URL" | "LINK" => Self::Url,
|
||||
"UUID" | "GUID" => Self::Uuid,
|
||||
"COLOR" | "COLOUR" => Self::Color,
|
||||
"CREDITCARD" | "CARD" => Self::CreditCard,
|
||||
"PASSWORD" | "PASS" | "SECRET" => Self::Password,
|
||||
_ => Self::Any,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct ValidationResult {
|
||||
pub is_valid: bool,
|
||||
pub normalized_value: String,
|
||||
pub error_message: Option<String>,
|
||||
pub metadata: Option<serde_json::Value>,
|
||||
}
|
||||
|
||||
impl ValidationResult {
|
||||
#[must_use]
|
||||
pub fn valid(value: String) -> Self {
|
||||
Self {
|
||||
is_valid: true,
|
||||
normalized_value: value,
|
||||
error_message: None,
|
||||
metadata: None,
|
||||
}
|
||||
}
|
||||
|
||||
#[must_use]
|
||||
pub fn valid_with_metadata(value: String, metadata: serde_json::Value) -> Self {
|
||||
Self {
|
||||
is_valid: true,
|
||||
normalized_value: value,
|
||||
error_message: None,
|
||||
metadata: Some(metadata),
|
||||
}
|
||||
}
|
||||
|
||||
#[must_use]
|
||||
pub fn invalid(error: String) -> Self {
|
||||
Self {
|
||||
is_valid: false,
|
||||
normalized_value: String::new(),
|
||||
error_message: Some(error),
|
||||
metadata: None,
|
||||
}
|
||||
}
|
||||
}
|
||||
655
src/basic/keywords/hearing/validators.rs
Normal file
655
src/basic/keywords/hearing/validators.rs
Normal file
|
|
@ -0,0 +1,655 @@
|
|||
use super::types::{InputType, ValidationResult};
|
||||
use log::trace;
|
||||
use regex::Regex;
|
||||
use uuid::Uuid;
|
||||
|
||||
#[must_use]
|
||||
pub fn validate_input(input: &str, input_type: &InputType) -> ValidationResult {
|
||||
let trimmed = input.trim();
|
||||
|
||||
match input_type {
|
||||
InputType::Any
|
||||
| InputType::QrCode
|
||||
| InputType::File
|
||||
| InputType::Image
|
||||
| InputType::Audio
|
||||
| InputType::Video
|
||||
| InputType::Document
|
||||
| InputType::Login => ValidationResult::valid(trimmed.to_string()),
|
||||
InputType::Email => validate_email(trimmed),
|
||||
InputType::Date => validate_date(trimmed),
|
||||
InputType::Name => validate_name(trimmed),
|
||||
InputType::Integer => validate_integer(trimmed),
|
||||
InputType::Float => validate_float(trimmed),
|
||||
InputType::Boolean => validate_boolean(trimmed),
|
||||
InputType::Hour => validate_hour(trimmed),
|
||||
InputType::Money => validate_money(trimmed),
|
||||
InputType::Mobile => validate_mobile(trimmed),
|
||||
InputType::Zipcode => validate_zipcode(trimmed),
|
||||
InputType::Language => validate_language(trimmed),
|
||||
InputType::Cpf => validate_cpf(trimmed),
|
||||
InputType::Cnpj => validate_cnpj(trimmed),
|
||||
InputType::Url => validate_url(trimmed),
|
||||
InputType::Uuid => validate_uuid(trimmed),
|
||||
InputType::Color => validate_color(trimmed),
|
||||
InputType::CreditCard => validate_credit_card(trimmed),
|
||||
InputType::Password => validate_password(trimmed),
|
||||
InputType::Menu(options) => validate_menu(trimmed, options),
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_email(input: &str) -> ValidationResult {
|
||||
let email_regex = Regex::new(r"^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$").expect("valid regex");
|
||||
|
||||
if email_regex.is_match(input) {
|
||||
ValidationResult::valid(input.to_lowercase())
|
||||
} else {
|
||||
ValidationResult::invalid(InputType::Email.error_message())
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_date(input: &str) -> ValidationResult {
|
||||
let formats = [
|
||||
"%d/%m/%Y", "%d-%m-%Y", "%Y-%m-%d", "%Y/%m/%d", "%d.%m.%Y", "%m/%d/%Y", "%d %b %Y",
|
||||
"%d %B %Y",
|
||||
];
|
||||
|
||||
for format in &formats {
|
||||
if let Ok(date) = chrono::NaiveDate::parse_from_str(input, format) {
|
||||
return ValidationResult::valid_with_metadata(
|
||||
date.format("%Y-%m-%d").to_string(),
|
||||
serde_json::json!({
|
||||
"original": input,
|
||||
"parsed_format": *format
|
||||
}),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
let lower = input.to_lowercase();
|
||||
let today = chrono::Local::now().date_naive();
|
||||
|
||||
if lower == "today" || lower == "hoje" {
|
||||
return ValidationResult::valid(today.format("%Y-%m-%d").to_string());
|
||||
}
|
||||
if lower == "tomorrow" || lower == "amanhã" || lower == "amanha" {
|
||||
return ValidationResult::valid(
|
||||
(today + chrono::Duration::days(1))
|
||||
.format("%Y-%m-%d")
|
||||
.to_string(),
|
||||
);
|
||||
}
|
||||
if lower == "yesterday" || lower == "ontem" {
|
||||
return ValidationResult::valid(
|
||||
(today - chrono::Duration::days(1))
|
||||
.format("%Y-%m-%d")
|
||||
.to_string(),
|
||||
);
|
||||
}
|
||||
|
||||
ValidationResult::invalid(InputType::Date.error_message())
|
||||
}
|
||||
|
||||
fn validate_name(input: &str) -> ValidationResult {
|
||||
let name_regex = Regex::new(r"^[\p{L}\s\-']+$").expect("valid regex");
|
||||
|
||||
if input.len() < 2 {
|
||||
return ValidationResult::invalid("Name must be at least 2 characters".to_string());
|
||||
}
|
||||
|
||||
if input.len() > 100 {
|
||||
return ValidationResult::invalid("Name is too long".to_string());
|
||||
}
|
||||
|
||||
if name_regex.is_match(input) {
|
||||
let normalized = input
|
||||
.split_whitespace()
|
||||
.map(|word| {
|
||||
let mut chars = word.chars();
|
||||
match chars.next() {
|
||||
None => String::new(),
|
||||
Some(first) => first.to_uppercase().collect::<String>() + chars.as_str(),
|
||||
}
|
||||
})
|
||||
.collect::<Vec<_>>()
|
||||
.join(" ");
|
||||
ValidationResult::valid(normalized)
|
||||
} else {
|
||||
ValidationResult::invalid(InputType::Name.error_message())
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_integer(input: &str) -> ValidationResult {
|
||||
let cleaned = input.replace([',', '.', ' '], "").trim().to_string();
|
||||
|
||||
match cleaned.parse::<i64>() {
|
||||
Ok(num) => ValidationResult::valid_with_metadata(
|
||||
num.to_string(),
|
||||
serde_json::json!({ "value": num }),
|
||||
),
|
||||
Err(_) => ValidationResult::invalid(InputType::Integer.error_message()),
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_float(input: &str) -> ValidationResult {
|
||||
let cleaned = input.replace(' ', "").replace(',', ".").trim().to_string();
|
||||
|
||||
match cleaned.parse::<f64>() {
|
||||
Ok(num) => ValidationResult::valid_with_metadata(
|
||||
format!("{:.2}", num),
|
||||
serde_json::json!({ "value": num }),
|
||||
),
|
||||
Err(_) => ValidationResult::invalid(InputType::Float.error_message()),
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_boolean(input: &str) -> ValidationResult {
|
||||
let lower = input.to_lowercase();
|
||||
|
||||
let true_values = [
|
||||
"yes",
|
||||
"y",
|
||||
"true",
|
||||
"1",
|
||||
"sim",
|
||||
"s",
|
||||
"si",
|
||||
"oui",
|
||||
"ja",
|
||||
"da",
|
||||
"ok",
|
||||
"yeah",
|
||||
"yep",
|
||||
"sure",
|
||||
"confirm",
|
||||
"confirmed",
|
||||
"accept",
|
||||
"agreed",
|
||||
"agree",
|
||||
];
|
||||
|
||||
let false_values = [
|
||||
"no", "n", "false", "0", "não", "nao", "non", "nein", "net", "nope", "cancel", "deny",
|
||||
"denied", "reject", "declined", "disagree",
|
||||
];
|
||||
|
||||
if true_values.contains(&lower.as_str()) {
|
||||
ValidationResult::valid_with_metadata(
|
||||
"true".to_string(),
|
||||
serde_json::json!({ "value": true }),
|
||||
)
|
||||
} else if false_values.contains(&lower.as_str()) {
|
||||
ValidationResult::valid_with_metadata(
|
||||
"false".to_string(),
|
||||
serde_json::json!({ "value": false }),
|
||||
)
|
||||
} else {
|
||||
ValidationResult::invalid(InputType::Boolean.error_message())
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_hour(input: &str) -> ValidationResult {
|
||||
let time_24_regex = Regex::new(r"^([01]?\d|2[0-3]):([0-5]\d)$").expect("valid regex");
|
||||
if let Some(caps) = time_24_regex.captures(input) {
|
||||
let hour: u32 = caps[1].parse().unwrap_or_default();
|
||||
let minute: u32 = caps[2].parse().unwrap_or_default();
|
||||
return ValidationResult::valid_with_metadata(
|
||||
format!("{:02}:{:02}", hour, minute),
|
||||
serde_json::json!({ "hour": hour, "minute": minute }),
|
||||
);
|
||||
}
|
||||
|
||||
let time_12_regex =
|
||||
Regex::new(r"^(1[0-2]|0?[1-9]):([0-5]\d)\s*(AM|PM|am|pm|a\.m\.|p\.m\.)$").expect("valid regex");
|
||||
if let Some(caps) = time_12_regex.captures(input) {
|
||||
let mut hour: u32 = caps[1].parse().unwrap_or_default();
|
||||
let minute: u32 = caps[2].parse().unwrap_or_default();
|
||||
let period = caps[3].to_uppercase();
|
||||
|
||||
if period.starts_with('P') && hour != 12 {
|
||||
hour += 12;
|
||||
} else if period.starts_with('A') && hour == 12 {
|
||||
hour = 0;
|
||||
}
|
||||
|
||||
return ValidationResult::valid_with_metadata(
|
||||
format!("{:02}:{:02}", hour, minute),
|
||||
serde_json::json!({ "hour": hour, "minute": minute }),
|
||||
);
|
||||
}
|
||||
|
||||
ValidationResult::invalid(InputType::Hour.error_message())
|
||||
}
|
||||
|
||||
fn validate_money(input: &str) -> ValidationResult {
|
||||
let cleaned = input
|
||||
.replace("R$", "")
|
||||
.replace(['$', '€', '£', '¥', ' '], "")
|
||||
.trim()
|
||||
.to_string();
|
||||
|
||||
let normalized = if cleaned.contains(',') && cleaned.contains('.') {
|
||||
let last_comma = cleaned.rfind(',').unwrap_or(0);
|
||||
let last_dot = cleaned.rfind('.').unwrap_or(0);
|
||||
|
||||
if last_comma > last_dot {
|
||||
cleaned.replace('.', "").replace(',', ".")
|
||||
} else {
|
||||
cleaned.replace(',', "")
|
||||
}
|
||||
} else if cleaned.contains(',') {
|
||||
cleaned.replace(',', ".")
|
||||
} else {
|
||||
cleaned
|
||||
};
|
||||
|
||||
match normalized.parse::<f64>() {
|
||||
Ok(amount) if amount >= 0.0 => ValidationResult::valid_with_metadata(
|
||||
format!("{:.2}", amount),
|
||||
serde_json::json!({ "value": amount }),
|
||||
),
|
||||
_ => ValidationResult::invalid(InputType::Money.error_message()),
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_mobile(input: &str) -> ValidationResult {
|
||||
let digits: String = input.chars().filter(|c| c.is_ascii_digit()).collect();
|
||||
|
||||
if digits.len() < 10 || digits.len() > 15 {
|
||||
return ValidationResult::invalid(InputType::Mobile.error_message());
|
||||
}
|
||||
|
||||
let formatted = match digits.len() {
|
||||
11 => format!("({}) {}-{}", &digits[0..2], &digits[2..7], &digits[7..11]),
|
||||
10 => format!("({}) {}-{}", &digits[0..3], &digits[3..6], &digits[6..10]),
|
||||
_ => format!("+{digits}"),
|
||||
};
|
||||
|
||||
ValidationResult::valid_with_metadata(
|
||||
formatted.clone(),
|
||||
serde_json::json!({ "digits": digits, "formatted": formatted }),
|
||||
)
|
||||
}
|
||||
|
||||
fn validate_zipcode(input: &str) -> ValidationResult {
|
||||
let cleaned: String = input
|
||||
.chars()
|
||||
.filter(|c| c.is_ascii_alphanumeric())
|
||||
.collect();
|
||||
|
||||
if cleaned.len() == 8 && cleaned.chars().all(|c| c.is_ascii_digit()) {
|
||||
let formatted = format!("{}-{}", &cleaned[0..5], &cleaned[5..8]);
|
||||
return ValidationResult::valid_with_metadata(
|
||||
formatted.clone(),
|
||||
serde_json::json!({ "digits": cleaned, "formatted": formatted, "country": "BR" }),
|
||||
);
|
||||
}
|
||||
|
||||
if (cleaned.len() == 5 || cleaned.len() == 9) && cleaned.chars().all(|c| c.is_ascii_digit()) {
|
||||
let formatted = if cleaned.len() == 9 {
|
||||
format!("{}-{}", &cleaned[0..5], &cleaned[5..8])
|
||||
} else {
|
||||
cleaned.clone()
|
||||
};
|
||||
return ValidationResult::valid_with_metadata(
|
||||
formatted.clone(),
|
||||
serde_json::json!({ "digits": cleaned, "formatted": formatted, "country": "US" }),
|
||||
);
|
||||
}
|
||||
|
||||
let uk_regex = Regex::new(r"^[A-Z]{1,2}\d[A-Z\d]?\s?\d[A-Z]{2}$").expect("valid regex");
|
||||
if uk_regex.is_match(&cleaned.to_uppercase()) {
|
||||
return ValidationResult::valid_with_metadata(
|
||||
cleaned.to_uppercase(),
|
||||
serde_json::json!({ "formatted": cleaned.to_uppercase(), "country": "UK" }),
|
||||
);
|
||||
}
|
||||
|
||||
ValidationResult::invalid(InputType::Zipcode.error_message())
|
||||
}
|
||||
|
||||
fn validate_language(input: &str) -> ValidationResult {
|
||||
let lower = input.to_lowercase().trim().to_string();
|
||||
|
||||
let languages = [
|
||||
("en", "english", "inglês", "ingles"),
|
||||
("pt", "portuguese", "português", "portugues"),
|
||||
("es", "spanish", "espanhol", "español"),
|
||||
("fr", "french", "francês", "frances"),
|
||||
("de", "german", "alemão", "alemao"),
|
||||
("it", "italian", "italiano", ""),
|
||||
("ja", "japanese", "japonês", "japones"),
|
||||
("zh", "chinese", "chinês", "chines"),
|
||||
("ko", "korean", "coreano", ""),
|
||||
("ru", "russian", "russo", ""),
|
||||
("ar", "arabic", "árabe", "arabe"),
|
||||
("hi", "hindi", "", ""),
|
||||
("nl", "dutch", "holandês", "holandes"),
|
||||
("pl", "polish", "polonês", "polones"),
|
||||
("tr", "turkish", "turco", ""),
|
||||
];
|
||||
|
||||
for entry in &languages {
|
||||
let code = entry.0;
|
||||
let variants = [entry.1, entry.2, entry.3];
|
||||
if lower.as_str() == code
|
||||
|| variants
|
||||
.iter()
|
||||
.any(|v| !v.is_empty() && lower.as_str() == *v)
|
||||
{
|
||||
return ValidationResult::valid_with_metadata(
|
||||
code.to_string(),
|
||||
serde_json::json!({ "code": code, "input": input }),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
if lower.len() == 2 && lower.chars().all(|c| c.is_ascii_lowercase()) {
|
||||
return ValidationResult::valid(lower);
|
||||
}
|
||||
|
||||
ValidationResult::invalid(InputType::Language.error_message())
|
||||
}
|
||||
|
||||
fn validate_cpf(input: &str) -> ValidationResult {
|
||||
let digits: String = input.chars().filter(|c| c.is_ascii_digit()).collect();
|
||||
|
||||
if digits.len() != 11 {
|
||||
return ValidationResult::invalid(InputType::Cpf.error_message());
|
||||
}
|
||||
|
||||
if let Some(first_char) = digits.chars().next() {
|
||||
if digits.chars().all(|c| c == first_char) {
|
||||
return ValidationResult::invalid("Invalid CPF".to_string());
|
||||
}
|
||||
}
|
||||
|
||||
let digits_vec: Vec<u32> = digits.chars().filter_map(|c| c.to_digit(10)).collect();
|
||||
|
||||
let sum1: u32 = digits_vec[0..9]
|
||||
.iter()
|
||||
.enumerate()
|
||||
.map(|(i, &d)| d * (10 - i as u32))
|
||||
.sum();
|
||||
let check1 = (sum1 * 10) % 11;
|
||||
let check1 = if check1 == 10 { 0 } else { check1 };
|
||||
|
||||
if check1 != digits_vec[9] {
|
||||
return ValidationResult::invalid("Invalid CPF".to_string());
|
||||
}
|
||||
|
||||
let sum2: u32 = digits_vec[0..10]
|
||||
.iter()
|
||||
.enumerate()
|
||||
.map(|(i, &d)| d * (11 - i as u32))
|
||||
.sum();
|
||||
let check2 = (sum2 * 10) % 11;
|
||||
let check2 = if check2 == 10 { 0 } else { check2 };
|
||||
|
||||
if check2 != digits_vec[10] {
|
||||
return ValidationResult::invalid("Invalid CPF".to_string());
|
||||
}
|
||||
|
||||
let formatted = format!(
|
||||
"{}.{}.{}-{}",
|
||||
&digits[0..3], &digits[3..6], &digits[6..9], &digits[9..11]
|
||||
);
|
||||
|
||||
ValidationResult::valid_with_metadata(
|
||||
formatted.clone(),
|
||||
serde_json::json!({ "digits": digits, "formatted": formatted }),
|
||||
)
|
||||
}
|
||||
|
||||
fn validate_cnpj(input: &str) -> ValidationResult {
|
||||
let digits: String = input.chars().filter(|c| c.is_ascii_digit()).collect();
|
||||
|
||||
if digits.len() != 14 {
|
||||
return ValidationResult::invalid(InputType::Cnpj.error_message());
|
||||
}
|
||||
|
||||
let digits_vec: Vec<u32> = digits.chars().filter_map(|c| c.to_digit(10)).collect();
|
||||
|
||||
let weights1 = [5, 4, 3, 2, 9, 8, 7, 6, 5, 4, 3, 2];
|
||||
let sum1: u32 = digits_vec[0..12]
|
||||
.iter()
|
||||
.zip(weights1.iter())
|
||||
.map(|(&d, &w)| d * w)
|
||||
.sum();
|
||||
let check1 = sum1 % 11;
|
||||
let check1 = if check1 < 2 { 0 } else { 11 - check1 };
|
||||
|
||||
if check1 != digits_vec[12] {
|
||||
return ValidationResult::invalid("Invalid CNPJ".to_string());
|
||||
}
|
||||
|
||||
let weights2 = [6, 5, 4, 3, 2, 9, 8, 7, 6, 5, 4, 3, 2];
|
||||
let sum2: u32 = digits_vec[0..13]
|
||||
.iter()
|
||||
.zip(weights2.iter())
|
||||
.map(|(&d, &w)| d * w)
|
||||
.sum();
|
||||
let check2 = sum2 % 11;
|
||||
let check2 = if check2 < 2 { 0 } else { 11 - check2 };
|
||||
|
||||
if check2 != digits_vec[13] {
|
||||
return ValidationResult::invalid("Invalid CNPJ".to_string());
|
||||
}
|
||||
|
||||
let formatted = format!(
|
||||
"{}.{}.{}/{}-{}",
|
||||
&digits[0..2], &digits[2..5], &digits[5..8], &digits[8..12], &digits[12..14]
|
||||
);
|
||||
|
||||
ValidationResult::valid_with_metadata(
|
||||
formatted.clone(),
|
||||
serde_json::json!({ "digits": digits, "formatted": formatted }),
|
||||
)
|
||||
}
|
||||
|
||||
fn validate_url(input: &str) -> ValidationResult {
|
||||
let url_str = if !input.starts_with("http://") && !input.starts_with("https://") {
|
||||
format!("https://{input}")
|
||||
} else {
|
||||
input.to_string()
|
||||
};
|
||||
|
||||
let url_regex = Regex::new(r"^https?://[a-zA-Z0-9][-a-zA-Z0-9]*(\.[a-zA-Z0-9][-a-zA-Z0-9]*)+(/[-a-zA-Z0-9()@:%_\+.~#?&/=]*)?$").expect("valid regex");
|
||||
|
||||
if url_regex.is_match(&url_str) {
|
||||
ValidationResult::valid(url_str)
|
||||
} else {
|
||||
ValidationResult::invalid(InputType::Url.error_message())
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_uuid(input: &str) -> ValidationResult {
|
||||
match Uuid::parse_str(input.trim()) {
|
||||
Ok(uuid) => ValidationResult::valid(uuid.to_string()),
|
||||
Err(_) => ValidationResult::invalid(InputType::Uuid.error_message()),
|
||||
}
|
||||
}
|
||||
|
||||
fn validate_color(input: &str) -> ValidationResult {
|
||||
let lower = input.to_lowercase().trim().to_string();
|
||||
|
||||
let named_colors = [
|
||||
("red", "#FF0000"),
|
||||
("green", "#00FF00"),
|
||||
("blue", "#0000FF"),
|
||||
("white", "#FFFFFF"),
|
||||
("black", "#000000"),
|
||||
("yellow", "#FFFF00"),
|
||||
("orange", "#FFA500"),
|
||||
("purple", "#800080"),
|
||||
("pink", "#FFC0CB"),
|
||||
("gray", "#808080"),
|
||||
("grey", "#808080"),
|
||||
("brown", "#A52A2A"),
|
||||
("cyan", "#00FFFF"),
|
||||
("magenta", "#FF00FF"),
|
||||
];
|
||||
|
||||
for (name, hex) in &named_colors {
|
||||
if lower == *name {
|
||||
return ValidationResult::valid_with_metadata(
|
||||
(*hex).to_owned(),
|
||||
serde_json::json!({ "name": name, "hex": hex }),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
let hex_regex = Regex::new(r"^#?([A-Fa-f0-9]{6}|[A-Fa-f0-9]{3})$").expect("valid regex");
|
||||
if let Some(caps) = hex_regex.captures(&lower) {
|
||||
let hex = caps[1].to_uppercase();
|
||||
let full_hex = if hex.len() == 3 {
|
||||
let mut result = String::with_capacity(6);
|
||||
for c in hex.chars() {
|
||||
result.push(c);
|
||||
result.push(c);
|
||||
}
|
||||
result
|
||||
} else {
|
||||
hex
|
||||
};
|
||||
return ValidationResult::valid(format!("#{}", full_hex));
|
||||
}
|
||||
|
||||
let rgb_regex =
|
||||
Regex::new(r"^rgb\s*\(\s*(\d{1,3})\s*,\s*(\d{1,3})\s*,\s*(\d{1,3})\s*\)$").expect("valid regex");
|
||||
if let Some(caps) = rgb_regex.captures(&lower) {
|
||||
let r: u8 = caps[1].parse().unwrap_or(0);
|
||||
let g: u8 = caps[2].parse().unwrap_or(0);
|
||||
let b: u8 = caps[3].parse().unwrap_or(0);
|
||||
return ValidationResult::valid(format!("#{:02X}{:02X}{:02X}", r, g, b));
|
||||
}
|
||||
|
||||
ValidationResult::invalid(InputType::Color.error_message())
|
||||
}
|
||||
|
||||
fn validate_credit_card(input: &str) -> ValidationResult {
|
||||
let digits: String = input.chars().filter(|c| c.is_ascii_digit()).collect();
|
||||
|
||||
if digits.len() < 13 || digits.len() > 19 {
|
||||
return ValidationResult::invalid(InputType::CreditCard.error_message());
|
||||
}
|
||||
|
||||
let mut sum = 0;
|
||||
let mut double = false;
|
||||
|
||||
for c in digits.chars().rev() {
|
||||
let mut digit = c.to_digit(10).unwrap_or(0);
|
||||
if double {
|
||||
digit *= 2;
|
||||
if digit > 9 {
|
||||
digit -= 9;
|
||||
}
|
||||
}
|
||||
sum += digit;
|
||||
double = !double;
|
||||
}
|
||||
|
||||
if sum % 10 != 0 {
|
||||
return ValidationResult::invalid("Invalid card number".to_string());
|
||||
}
|
||||
|
||||
let card_type = if digits.starts_with('4') {
|
||||
"Visa"
|
||||
} else if digits.starts_with("51")
|
||||
|| digits.starts_with("52")
|
||||
|| digits.starts_with("53")
|
||||
|| digits.starts_with("54")
|
||||
|| digits.starts_with("55")
|
||||
{
|
||||
"Mastercard"
|
||||
} else if digits.starts_with("34") || digits.starts_with("37") {
|
||||
"American Express"
|
||||
} else if digits.starts_with("36") || digits.starts_with("38") {
|
||||
"Diners Club"
|
||||
} else if digits.starts_with("6011") || digits.starts_with("65") {
|
||||
"Discover"
|
||||
} else {
|
||||
"Unknown"
|
||||
};
|
||||
|
||||
let masked = format!(
|
||||
"{} **** **** {}",
|
||||
&digits[0..4],
|
||||
&digits[digits.len() - 4..]
|
||||
);
|
||||
|
||||
ValidationResult::valid_with_metadata(
|
||||
masked.clone(),
|
||||
serde_json::json!({
|
||||
"masked": masked,
|
||||
"last_four": &digits[digits.len()-4..],
|
||||
"card_type": card_type
|
||||
}),
|
||||
)
|
||||
}
|
||||
|
||||
fn validate_password(input: &str) -> ValidationResult {
|
||||
if input.len() < 8 {
|
||||
return ValidationResult::invalid("Password must be at least 8 characters".to_string());
|
||||
}
|
||||
|
||||
let has_upper = input.chars().any(|c| c.is_uppercase());
|
||||
let has_lower = input.chars().any(|c| c.is_lowercase());
|
||||
let has_digit = input.chars().any(|c| c.is_ascii_digit());
|
||||
let has_special = input.chars().any(|c| !c.is_alphanumeric());
|
||||
|
||||
let strength = match (has_upper, has_lower, has_digit, has_special) {
|
||||
(true, true, true, true) => "strong",
|
||||
(true, true, true, false) | (true, true, false, true) | (true, false, true, true) => {
|
||||
"medium"
|
||||
}
|
||||
_ => "weak",
|
||||
};
|
||||
|
||||
ValidationResult::valid_with_metadata(
|
||||
"[PASSWORD SET]".to_string(),
|
||||
serde_json::json!({
|
||||
"strength": strength,
|
||||
"length": input.len()
|
||||
}),
|
||||
)
|
||||
}
|
||||
|
||||
fn validate_menu(input: &str, options: &[String]) -> ValidationResult {
|
||||
let lower_input = input.to_lowercase().trim().to_string();
|
||||
|
||||
for (i, opt) in options.iter().enumerate() {
|
||||
if opt.to_lowercase() == lower_input {
|
||||
return ValidationResult::valid_with_metadata(
|
||||
opt.clone(),
|
||||
serde_json::json!({ "index": i, "value": opt }),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
if let Ok(num) = lower_input.parse::<usize>() {
|
||||
if num >= 1 && num <= options.len() {
|
||||
let selected = &options[num - 1];
|
||||
return ValidationResult::valid_with_metadata(
|
||||
selected.clone(),
|
||||
serde_json::json!({ "index": num - 1, "value": selected }),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
let matches: Vec<&String> = options
|
||||
.iter()
|
||||
.filter(|opt| opt.to_lowercase().contains(&lower_input))
|
||||
.collect();
|
||||
|
||||
if matches.len() == 1 {
|
||||
let idx = options.iter().position(|o| o == matches[0]).unwrap_or(0);
|
||||
return ValidationResult::valid_with_metadata(
|
||||
matches[0].clone(),
|
||||
serde_json::json!({ "index": idx, "value": matches[0] }),
|
||||
);
|
||||
}
|
||||
|
||||
let opts = options.join(", ");
|
||||
ValidationResult::invalid(format!("Please select one of: {opts}"))
|
||||
}
|
||||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use reqwest::{header::HeaderMap, header::HeaderName, header::HeaderValue, Client, Method};
|
||||
use rhai::{Dynamic, Engine, Map};
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use rhai::{Array, Dynamic, Engine, Map};
|
||||
use serde_json::Value;
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
use crate::core::config::ConfigManager;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::shared::utils::create_tls_client;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use crate::core::shared::utils::create_tls_client;
|
||||
use log::{error, trace};
|
||||
use rhai::{Dynamic, Engine};
|
||||
use serde::{Deserialize, Serialize};
|
||||
|
|
|
|||
|
|
@ -24,8 +24,8 @@
|
|||
|
||||
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::error;
|
||||
use rhai::{Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -29,8 +29,8 @@
|
|||
\*****************************************************************************/
|
||||
|
||||
use crate::core::config::ConfigManager;
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::{error, trace};
|
||||
use rhai::{Array, Dynamic, Engine, Map};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::{Array, Dynamic, Engine};
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
|
|
@ -5,8 +5,8 @@ pub mod random;
|
|||
pub mod round;
|
||||
pub mod trig;
|
||||
|
||||
use crate::shared::models::UserSession;
|
||||
use crate::shared::state::AppState;
|
||||
use crate::core::shared::models::UserSession;
|
||||
use crate::core::shared::state::AppState;
|
||||
use log::debug;
|
||||
use rhai::Engine;
|
||||
use std::sync::Arc;
|
||||
|
|
|
|||
Some files were not shown because too many files have changed in this diff Show more
Loading…
Add table
Reference in a new issue