- Fixing compilation errors.

This commit is contained in:
Rodrigo Rodriguez (Pragmatismo) 2025-10-11 13:29:38 -03:00
parent 283774aa0f
commit ecddc10244
4 changed files with 71 additions and 76 deletions

View file

@@ -1,36 +1,18 @@
-You are fixing Rust code in a Cargo project. The user is providing problematic code that needs to be corrected.
+You are fixing Rust code in a Cargo project. The user supplies problematic source files that need correction.
 ## Your Task
-Fix ALL compiler errors and logical issues while maintaining the original intent. Return the COMPLETE corrected files as a SINGLE .sh script that can be executed from project root.
-Use Cargo.toml as reference, do not change it.
-Only return input files, all other files already exists.
-If something, need to be added to a external file, inform it separated.
+- Detect **all** compiler errors and logical issues in the provided Rust files.
+- Use **Cargo.toml** as the single source of truth for dependencies, edition, and feature flags; **do not modify** it.
+- Generate a **single, minimal `.diff` patch** per file that needs changes.
+- Only modify the lines required to resolve the errors.
+- Keep the patch as small as possible to minimise impact.
+- Return **only** the patch files; all other project files already exist and should not be echoed back.
+- If a new external file must be created, list its name and required content **separately** after the patch list.
 ## Critical Requirements
-1. **Return as SINGLE .sh script** - Output must be a complete shell script using `cat > file << 'EOF'` pattern
-2. **Include ALL files** - Every corrected file must be included in the script
-3. **Respect Cargo.toml** - Check dependencies, editions, and features to avoid compiler errors
-4. **Type safety** - Ensure all types match and trait bounds are satisfied
-5. **Ownership rules** - Fix borrowing, ownership, and lifetime issues
+1. **Respect Cargo.toml**: verify versions, edition, and enabled features to avoid new compile-time problems.
+2. **Type safety**: all types must line up; trait bounds must be satisfied.
+3. **Ownership & lifetimes**: correct borrowing, moving, and lifetime annotations.
+4. **Patch format**: use standard unified diff syntax (`--- a/path.rs`, `+++ b/path.rs`, `@@` hunk headers, `-` removals, `+` additions).
 ## Output Format Requirements
-You MUST return exactly this example format:
-```sh
-#!/bin/bash
-# Restore fixed Rust project
-cat > src/<filenamehere>.rs << 'EOF'
-use std::io;
-// test
-EOF
-cat > src/<anotherfile>.rs << 'EOF'
-// Fixed library code
-pub fn add(a: i32, b: i32) -> i32 {
-    a + b
-}
-EOF
-```
-----
+**IMPORTANT:** The output must be a plain list of `.diff` patch files (and, if needed, a separate list of new files) with no additional explanatory text. This keeps the response minimal and ready for direct application with `git apply` or `patch`.
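A patch in the format the updated prompt mandates might look like the following; the path and the one-line fix are hypothetical, chosen only to show the required syntax:

```diff
--- a/src/lib.rs
+++ b/src/lib.rs
@@ -1,3 +1,3 @@
 pub fn add(a: i32, b: i32) -> i32 {
-    a + b;
+    a + b
 }
```

A patch like this applies cleanly from the project root with `git apply fix.diff` or `patch -p1 < fix.diff`.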

View file

@@ -1,7 +0,0 @@
-Return only the modified files as a single `.sh` script using `cat`, so the code can be restored directly.
-No placeholders, no comments, no explanations, no filler text.
-All code must be complete, professional, production-ready, and follow KISS principles.
-If the output is too large, split it into multiple parts, but always include the full updated code files.
-Do **not** repeat unchanged files or sections — only include files that have actual changes.
-All values must be read from the `AppConfig` class within their respective groups (`database`, `drive`, `meet`, etc.); never use hardcoded or magic values.
-Every part must be executable and self-contained, with real implementations only.

View file

@@ -7,9 +7,10 @@ rm $OUTPUT_FILE
 echo "Consolidated LLM Context" > "$OUTPUT_FILE"
 prompts=(
     "../../prompts/dev/general.md"
     "../../prompts/dev/shared.md"
+    "../../Cargo.toml"
-    # "../../prompts/dev/fix.md"
+    "../../prompts/dev/fix.md"
     #"../../prompts/dev/generation.md"
 )
 for file in "${prompts[@]}"; do
@@ -19,25 +20,25 @@ done
 dirs=(
     #"auth"
-    #"automation"
-    #"basic"
+    "automation"
+    "basic"
     "bot"
     #"channels"
-    "config"
+    #"config"
     "context"
     #"email"
-    #"file"
-    "llm"
+    "file"
+    #"llm"
     #"llm_legacy"
     #"org"
-    #"session"
+    "session"
     "shared"
     #"tests"
-    #"tools"
+    "tools"
     #"web_automation"
     #"whatsapp"
 )
 dirs=() # disabled.
 for dir in "${dirs[@]}"; do
     find "$PROJECT_ROOT/src/$dir" -name "*.rs" | while read file; do
         cat "$file" >> "$OUTPUT_FILE"
@@ -45,6 +46,21 @@ for dir in "${dirs[@]}"; do
     done
 done
+# Extract unique .rs file paths from error messages and append them
+cargo build --message-format=short 2>&1 | grep -E 'error' | grep -oE '[^ ]+\.rs' | sort -u | while read -r file_path; do
+    # Convert to absolute path if relative
+    if [[ ! "$file_path" = /* ]]; then
+        file_path="$PROJECT_ROOT/$file_path"
+    fi
+    # Check if file exists and append it
+    if [[ -f "$file_path" ]]; then
+        echo "=== Appending error file: $file_path ===" >> "$OUTPUT_FILE"
+        cat "$file_path" >> "$OUTPUT_FILE"
+        echo -e "\n\n" >> "$OUTPUT_FILE"
+    fi
+done
 # Also append the specific files you mentioned
 cat "$PROJECT_ROOT/src/main.rs" >> "$OUTPUT_FILE"
 cat "$PROJECT_ROOT/src/basic/keywords/hear_talk.rs" >> "$OUTPUT_FILE"
 cat "$PROJECT_ROOT/templates/annoucements.gbai/annoucements.gbdialog/start.bas" >> "$OUTPUT_FILE"
@@ -52,13 +68,5 @@ cat "$PROJECT_ROOT/templates/annoucements.gbai/annoucements.gbdialog/start.bas"
 echo "" >> "$OUTPUT_FILE"
 cd "$PROJECT_ROOT"
-find "$PROJECT_ROOT/src" -type f -name "*.rs" ! -path "*/target/*" ! -name "*.lock" -print0 |
-while IFS= read -r -d '' file; do
-    echo "File: ${file#$PROJECT_ROOT/}" >> "$OUTPUT_FILE"
-    grep -E '^\s*(pub\s+)?(fn|struct)\s' "$file" >> "$OUTPUT_FILE"
-    echo "" >> "$OUTPUT_FILE"
-done
-# cargo build 2>> "$OUTPUT_FILE"
+cargo build --message-format=short 2>&1 | grep -E 'error' >> "$OUTPUT_FILE"
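The error-path extraction added in this commit can be exercised in isolation. The cargo output below is a fabricated sample (real paths and messages will differ); it only demonstrates what the `grep -E 'error' | grep -oE '[^ ]+\.rs' | sort -u` stage of the script picks out:

```shell
#!/bin/bash
# Fabricated sample of `cargo build --message-format=short` output.
sample='src/main.rs:10:5: error[E0308]: mismatched types
src/basic/keywords/find.rs:21:9: error[E0599]: no method named `get`
src/main.rs:44:1: warning: unused import'
# Same filtering the script applies: keep error lines, pull out .rs paths, dedupe.
printf '%s\n' "$sample" | grep -E 'error' | grep -oE '[^ ]+\.rs' | sort -u
```

The warning line is dropped and the duplicate `src/main.rs` collapses to one entry, so only the two files that actually failed to compile get appended to the context file.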

View file

@@ -4,8 +4,8 @@ use rhai::Dynamic;
 use rhai::Engine;
 use serde_json::{json, Value};
-use crate::shared::state::AppState;
 use crate::shared::models::UserSession;
+use crate::shared::state::AppState;
 use crate::shared::utils;
 use crate::shared::utils::row_to_json;
 use crate::shared::utils::to_array;
@@ -13,28 +13,40 @@ use crate::shared::utils::to_array;
 pub fn find_keyword(state: &AppState, user: UserSession, engine: &mut Engine) {
     let state_clone = state.clone();
-    engine
-        .register_custom_syntax(&["FIND", "$expr$", ",", "$expr$"], false, {
-            move |context, inputs| {
-                let table_name = context.eval_expression_tree(&inputs[0])?;
-                let filter = context.eval_expression_tree(&inputs[1])?;
+    // Register the custom FIND syntax. Any registration error is logged but does not panic.
+    if let Err(e) = engine.register_custom_syntax(
+        &["FIND", "$expr$", ",", "$expr$"],
+        false,
+        move |context, inputs| {
+            // Evaluate the two expressions supplied to the FIND command.
+            let table_name = context.eval_expression_tree(&inputs[0])?;
+            let filter = context.eval_expression_tree(&inputs[1])?;
-                let table_str = table_name.to_string();
-                let filter_str = filter.to_string();
+            let table_str = table_name.to_string();
+            let filter_str = filter.to_string();
-                let conn = state_clone.conn.lock().unwrap().clone();
-                let result = execute_find(&conn, &table_str, &filter_str)
-                    .map_err(|e| format!("DB error: {}", e))?;
+            // Acquire a DB connection from the shared state.
+            let conn = state_clone
+                .conn
+                .lock()
+                .map_err(|e| format!("Lock error: {}", e))?
+                .clone();
-                if let Some(results) = result.get("results") {
-                    let array = to_array(utils::json_value_to_dynamic(results));
-                    Ok(Dynamic::from(array))
-                } else {
-                    Err("No results".into())
-                }
+            // Run the actual find query.
+            let result = execute_find(&conn, &table_str, &filter_str)
+                .map_err(|e| format!("DB error: {}", e))?;
+            // Return the results as a Dynamic array, or an error if none were found.
+            if let Some(results) = result.get("results") {
+                let array = to_array(utils::json_value_to_dynamic(results));
+                Ok(Dynamic::from(array))
+            } else {
+                Err("No results".into())
+            }
-        })
-        .unwrap();
+        },
+    ) {
+        error!("Failed to register FIND syntax: {}", e);
+    }
 }
 pub fn execute_find(
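The switch above from `.lock().unwrap()` to `.lock().map_err(...)` is the standard way to turn a poisoned mutex into a recoverable error instead of a panic. A minimal stand-alone sketch of the pattern — the `read_conn` function and the `String` connection stand-in are illustrative, not the project's actual types:

```rust
use std::sync::{Arc, Mutex};

// Illustrative stand-in for the shared connection handle held in AppState.
fn read_conn(shared: &Arc<Mutex<String>>) -> Result<String, String> {
    // As in the FIND handler: a poisoned mutex becomes an error string
    // the caller can surface, rather than a panic.
    let guard = shared.lock().map_err(|e| format!("Lock error: {}", e))?;
    Ok(guard.clone())
}

fn main() {
    let shared = Arc::new(Mutex::new(String::from("db-conn")));
    match read_conn(&shared) {
        Ok(conn) => println!("got {}", conn),
        Err(err) => eprintln!("{}", err),
    }
}
```

Inside the rhai closure the `String` error is further converted by `?` into rhai's error type, which is why the handler can use the same `map_err` shape for both the lock and the DB call.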