Add 'Call OpenRouter' node to LogWhisperer_Ingest workflow

New Node Features:
- Model: openai/gpt-4o-mini via OpenRouter API
- System prompt with Metodo Sacchi (Safety First, Little Often, Double Check)
- Timeout: 10 seconds with AbortController
- Log truncation: max 2000 characters
- Required headers: Authorization, HTTP-Referer, X-Title
- Error handling with graceful fallback response
- Output: JSON with ai_analysis, ai_status, ai_timestamp, ai_model

Workflow Flow: Webhook → HMAC Validation → Data Validation → Store Log → Call OpenRouter → Critical Severity Check → Response

Test Suite (workflows/test_openrouter.js) - 10 comprehensive tests covering:
- Input/output structure validation
- Log truncation logic
- OpenRouter API payload format
- Required HTTP headers
- AI response structure
- Fallback error handling
- Timeout configuration
- Dangerous command patterns
- System Prompt Metodo Sacchi validation
- Workflow connections

Environment Variables Required:
- OPENROUTER_API_KEY
- OPENROUTER_SITE_URL (optional, defaults to https://logwhisperer.ai)
- OPENROUTER_APP_NAME (optional, defaults to LogWhispererAI)

Next Steps:
1. Configure environment variables in n8n
2. Import updated workflow to n8n instance
3. Configure PostgreSQL credentials
4. Test with sample log payload

Refs: docs/specs/ai_pipeline.md (section 4.1)
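The truncation, timeout, headers and fallback behaviour listed above can be sketched as plain Node.js helpers. This is an illustrative sketch only: the helper names (truncateLog, buildRequest, fallbackResponse) and the exact system-prompt wording are assumptions, not the actual contents of the n8n Code node.

```javascript
// Sketch of the "Call OpenRouter" node logic (hypothetical helper names;
// the real n8n Code node may be structured differently).
const MAX_LOG_CHARS = 2000;

// Truncate the raw log before sending it to the model (max 2000 characters).
function truncateLog(rawLog) {
  return rawLog.length > MAX_LOG_CHARS ? rawLog.slice(0, MAX_LOG_CHARS) : rawLog;
}

// Build the OpenRouter request: model, required headers and a 10 s
// AbortController-based timeout.
function buildRequest(rawLog, apiKey, siteUrl, appName) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 10_000); // 10 s timeout
  return {
    url: 'https://openrouter.ai/api/v1/chat/completions',
    options: {
      method: 'POST',
      signal: controller.signal,
      headers: {
        'Authorization': `Bearer ${apiKey}`,
        'HTTP-Referer': siteUrl || 'https://logwhisperer.ai',
        'X-Title': appName || 'LogWhispererAI',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'openai/gpt-4o-mini',
        messages: [
          // Assumed wording; the actual Metodo Sacchi system prompt is longer.
          { role: 'system', content: 'Metodo Sacchi: Safety First, Little Often, Double Check.' },
          { role: 'user', content: truncateLog(rawLog) },
        ],
      }),
    },
    cancelTimer: () => clearTimeout(timer), // clear the timeout once a response arrives
  };
}

// Graceful fallback returned when the API call fails or times out.
function fallbackResponse() {
  return {
    ai_analysis: null,
    ai_status: 'error',
    ai_timestamp: new Date().toISOString(),
    ai_model: 'openai/gpt-4o-mini',
  };
}
```

The request would then be sent with `fetch(req.url, req.options)` inside a try/catch that returns `fallbackResponse()` on any error.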
LogWhisperer AI - n8n Workflow
Workflow for secure log ingestion with HMAC-SHA256 validation.
📋 Workflow Description
The LogWhisperer_Ingest workflow receives logs from the secure_logwhisperer.sh client, validates the HMAC authentication, and stores them in PostgreSQL.
Workflow Nodes
- Webhook Trigger - Receives POST on /webhook/logwhisperer/ingest
- HMAC Validation - Verifies the HMAC-SHA256 signature
- HMAC Valid? - Condition: is the HMAC valid?
- Data Validation - Validates required fields
- Store Log - Inserts into PostgreSQL
- Critical Severity? - Condition: severity = critical?
- AI Processing - Placeholder for AI processing (Sprint 3)
- Responses - HTTP responses (200, 400, 401)
🚀 Installation
Prerequisites
- n8n installed and reachable (http://192.168.254.12:5678)
- PostgreSQL with credentials configured in n8n
- LOGWHISPERER_SECRET environment variable configured
Step 1: Configure the Environment Variable
# SSH into the n8n container, or set it via the n8n UI
export LOGWHISPERER_SECRET="your-32-char-secret-here-minimum"
Step 2: Import the Workflow
Method A: Via n8n UI
- Open http://192.168.254.12:5678
- Go to Workflows → Import from File
- Select workflows/logwhisperer_ingest.json
- Click Save
Method B: Via API (curl)
# Get an API key from the n8n UI → Settings → API
curl -X POST http://192.168.254.12:5678/api/v1/workflows \
-H "Content-Type: application/json" \
-H "X-N8N-API-KEY: your-api-key" \
-d @workflows/logwhisperer_ingest.json
Step 3: Configure PostgreSQL Credentials
- In the n8n UI, go to Settings → Credentials
- Create a new PostgreSQL credential
- Name: PostgreSQL LogWhisperer
- Enter host, port, database, user, password
- Click Save
Step 4: Activate the Workflow
- Open the LogWhisperer_Ingest workflow
- Click Activate (toggle at the top right)
- Verify that the status is Active
🧪 Testing
Run the Test Suite
cd /home/google/Sources/LucaSacchiNet/LogWhispererAI
./workflows/test_workflow.sh
Manual Testing with curl
Valid HMAC test:
# Generate the HMAC
TIMESTAMP=$(date +%s)
PAYLOAD='{"client_id":"550e8400-e29b-41d4-a716-446655440000","hostname":"test-server","source":"/var/log/syslog","severity":"critical","raw_log":"Apr 2 10:30:00 kernel: Out of memory","matched_pattern":"OOM"}'
SIGNATURE=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" | openssl dgst -sha256 -hmac "test-secret-32-chars-long-minimum" | sed 's/^.* //')
# Send the request
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \
-H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \
-d "$PAYLOAD"
Invalid HMAC test (must return 401):
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: invalid-signature" \
-H "X-LogWhisperer-Timestamp: $(date +%s)" \
-d '{"client_id":"test","severity":"critical","raw_log":"test"}'
🔒 Security
HMAC Validation
- Algorithm: HMAC-SHA256
- Format: timestamp:signature
- Anti-replay: timestamp may differ by at most 5 minutes
- Timing-safe comparison
Data Validation
- client_id: required, UUID v4
- raw_log: must not be empty
- severity: one of low, medium, critical
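The three checks above can be expressed as one small validator. The helper name and error-message wording are assumptions; the rules themselves (UUID v4, non-empty raw_log, severity whitelist) are the ones listed.

```javascript
// Sketch of the Data Validation node checks (hypothetical helper name).
const UUID_V4 =
  /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
const SEVERITIES = ['low', 'medium', 'critical'];

function validateLogEntry(entry) {
  const errors = [];
  if (!UUID_V4.test(entry.client_id || '')) errors.push('client_id must be a UUID v4');
  if (!entry.raw_log) errors.push('raw_log must not be empty');
  if (!SEVERITIES.includes(entry.severity)) {
    errors.push('severity must be one of low, medium, critical');
  }
  // A 400 response would carry the errors array back to the client.
  return { valid: errors.length === 0, errors };
}
```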
Security Notes
- Do not log the full raw_log in the AI nodes
- Use HTTPS in production
- Rate limiting: max 10 req/min per client_id
- LOGWHISPERER_SECRET: at least 32 characters
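The 10 req/min limit could be enforced with a fixed-window counter keyed by client_id. This helper is an assumption for illustration, not part of the workflow; in production a Redis-backed limiter or a reverse proxy would be more typical.

```javascript
// Minimal fixed-window rate limiter: max 10 requests per minute per client_id.
// Hypothetical sketch; state lives in process memory, so it resets on restart.
const WINDOW_MS = 60_000;
const MAX_PER_WINDOW = 10;
const windows = new Map(); // client_id -> { start, count }

function allowRequest(clientId, nowMs) {
  const w = windows.get(clientId);
  if (!w || nowMs - w.start >= WINDOW_MS) {
    // First request of a new window: reset the counter.
    windows.set(clientId, { start: nowMs, count: 1 });
    return true;
  }
  w.count += 1;
  return w.count <= MAX_PER_WINDOW; // reject request 11 and beyond
}
```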
📊 Database Schema
CREATE TABLE logs (
id SERIAL PRIMARY KEY,
client_id VARCHAR(36) NOT NULL,
hostname VARCHAR(255),
source VARCHAR(500),
severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),
timestamp TIMESTAMP WITH TIME ZONE,
raw_log TEXT,
matched_pattern VARCHAR(100),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX idx_logs_client_id ON logs(client_id);
CREATE INDEX idx_logs_severity ON logs(severity);
CREATE INDEX idx_logs_timestamp ON logs(timestamp);
🔧 Troubleshooting
Workflow not responding
- Check that n8n is up: curl http://192.168.254.12:5678/healthz
- Check that the workflow is activated
- Check the n8n logs: docker logs n8n
401 Unauthorized error
- Check that LOGWHISPERER_SECRET is configured
- Check the signature format: timestamp:signature
- Check that the timestamp is recent (< 5 min)
400 Bad Request error
- Check the UUID format of client_id
- Check that severity is valid
- Make sure raw_log is not empty
Database error
- Check the PostgreSQL credentials in n8n
- Check that the table has been created
- Check the database user's permissions
📝 Changelog
2026-04-02 - Sprint 2 Feature 2
- Initial workflow creation
- HMAC validation implemented
- PostgreSQL integration
- Conditional alerting for critical severity
🛡️ Metodo Sacchi Applied
- Safety first: HMAC validation before any operation
- Little often: one node per function, each testable individually
- Double check: data validation after HMAC, DB save verification