Add Telegram Bot notification node to n8n workflow

New Features:
- Telegram notification node for critical severity logs
- Italian message template with emoji and MarkdownV2 formatting
- Smart routing: Telegram only for critical logs
- Error handling: continueOnFail prevents workflow interruption
- Environment-based configuration (TELEGRAM_CHAT_ID)

Message Template Includes:
- 🚨 Alert header with severity
- 📍 Server hostname identification
- 📝 AI-generated problem summary
- 💡 Suggested bash command in code block
- ⚠️ Confirmation required flag
- 📝 Additional notes from AI
- 📊 AI processing status
- 🤖 Model used (openai/gpt-4o-mini)
- ⏰ Localized Italian timestamp

Workflow Flow:
Webhook → HMAC → Validation → PostgreSQL → OpenRouter → Critical? → Telegram → Response
                                                           ↓ FALSE
                                                     Success Response

Configuration Required:
1. Create Telegram Bot via @BotFather
2. Get TELEGRAM_BOT_TOKEN
3. Get TELEGRAM_CHAT_ID via @userinfobot
4. Configure credentials in n8n UI
5. Set TELEGRAM_CHAT_ID environment variable

Documentation:
- docs/telegram_setup.md: Setup instructions
- .env.example: Environment variables template
- .gitignore: Protect sensitive telegram_setup.md
- docs/prd.md: Updated Sprint 2 completion status

Sprint 2 Complete:
✅ Secure log ingestion (bash)
✅ n8n webhook workflow
✅ OpenRouter AI integration
✅ PostgreSQL storage
✅ HMAC authentication
✅ Telegram notifications

Refs: docs/specs/ai_pipeline.md, docs/specs/bash_ingestion_secure.md
LogWhisperer AI - n8n Workflow
Workflow for secure log ingestion with HMAC-SHA256 validation.
📋 Workflow Description
The LogWhisperer_Ingest workflow receives logs from the secure_logwhisperer.sh client, validates the HMAC authentication, and stores them in PostgreSQL.
Workflow Nodes
- Webhook Trigger - receives POST on /webhook/logwhisperer/ingest
- HMAC Validation - verifies the HMAC-SHA256 signature
- HMAC Valid? - condition: is the HMAC valid?
- Data Validation - validates the required fields
- Store Log - inserts the record into PostgreSQL
- Critical Severity? - condition: severity = critical?
- AI Processing - placeholder for AI processing (Sprint 3)
- Responses - HTTP responses (200, 400, 401)
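For reference, the check performed by the HMAC Validation node can be reproduced in shell. This is a sketch assuming the `timestamp:payload` signing scheme used by the client below; the secret and values are illustrative:

```shell
#!/bin/sh
# Recompute the HMAC-SHA256 over "timestamp:payload" and compare it with the
# signature received in the X-LogWhisperer-Signature header.
SECRET="test-secret-32-chars-long-minimum"   # LOGWHISPERER_SECRET
TIMESTAMP="1712050200"
PAYLOAD='{"client_id":"550e8400-e29b-41d4-a716-446655440000","severity":"critical","raw_log":"test"}'

# In the real workflow, RECEIVED comes from the request header; here we
# simulate a client that signed the payload correctly.
RECEIVED=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" \
  | openssl dgst -sha256 -hmac "$SECRET" | sed 's/^.* //')

# Server-side recomputation with the shared secret
EXPECTED=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" \
  | openssl dgst -sha256 -hmac "$SECRET" | sed 's/^.* //')

if [ "$RECEIVED" = "$EXPECTED" ]; then
  echo "HMAC valid"
else
  echo "HMAC invalid" >&2
fi
```

Note that inside n8n the comparison should be timing-safe (see the Security section); a plain shell string comparison is shown here only to illustrate the data flow.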
🚀 Installation
Prerequisites
- n8n installed and reachable (http://192.168.254.12:5678)
- PostgreSQL with credentials configured in n8n
- LOGWHISPERER_SECRET environment variable configured
Step 1: Configure the Environment Variable
# SSH into the n8n container, or set it via the n8n UI
export LOGWHISPERER_SECRET="your-32-char-secret-here-minimum"
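Since the workflow requires a secret of at least 32 characters, a quick sanity check can be run before activating it (a sketch; the placeholder value matches the example above):

```shell
# Fail fast if the secret is shorter than the 32-character minimum
LOGWHISPERER_SECRET="your-32-char-secret-here-minimum"
if [ "${#LOGWHISPERER_SECRET}" -lt 32 ]; then
  echo "ERROR: LOGWHISPERER_SECRET must be at least 32 characters" >&2
else
  echo "Secret length OK (${#LOGWHISPERER_SECRET} chars)"
fi
```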
Step 2: Import the Workflow
Method A: via the n8n UI
- Open http://192.168.254.12:5678
- Go to Workflows → Import from File
- Select workflows/logwhisperer_ingest.json
- Click Save
Method B: via the API (curl)
# Get an API key from n8n UI → Settings → API
curl -X POST http://192.168.254.12:5678/api/v1/workflows \
-H "Content-Type: application/json" \
-H "X-N8N-API-KEY: your-api-key" \
-d @workflows/logwhisperer_ingest.json
Step 3: Configure PostgreSQL Credentials
- In the n8n UI, go to Settings → Credentials
- Create a new PostgreSQL credential
- Name: PostgreSQL LogWhisperer
- Enter host, port, database, user, password
- Click Save
Step 4: Activate the Workflow
- Open the LogWhisperer_Ingest workflow
- Click Activate (toggle at the top right)
- Verify that the status is Active
🧪 Testing
Run the Test Suite
cd /home/google/Sources/LucaSacchiNet/LogWhispererAI
./workflows/test_workflow.sh
Manual Test with curl
Valid HMAC test:
# Generate the HMAC
TIMESTAMP=$(date +%s)
PAYLOAD='{"client_id":"550e8400-e29b-41d4-a716-446655440000","hostname":"test-server","source":"/var/log/syslog","severity":"critical","raw_log":"Apr 2 10:30:00 kernel: Out of memory","matched_pattern":"OOM"}'
SIGNATURE=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" | openssl dgst -sha256 -hmac "test-secret-32-chars-long-minimum" | sed 's/^.* //')
# Send the request
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \
-H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \
-d "$PAYLOAD"
Invalid HMAC test (must return 401):
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: invalid-signature" \
-H "X-LogWhisperer-Timestamp: $(date +%s)" \
-d '{"client_id":"test","severity":"critical","raw_log":"test"}'
🔒 Security
HMAC Validation
- Algorithm: HMAC-SHA256
- Format: timestamp:signature
- Anti-replay: timestamp at most 5 minutes old
- Timing-safe comparison
Data Validation
- client_id: UUID v4, required
- raw_log: non-empty
- severity: one of low, medium, critical
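The same three checks can be sketched in shell (illustrative only; in the workflow they are performed by the Data Validation node, and the variable names here are hypothetical):

```shell
# Validate client_id (UUID v4), severity, and a non-empty raw_log
CLIENT_ID="550e8400-e29b-41d4-a716-446655440000"
SEVERITY="critical"
RAW_LOG="Apr 2 10:30:00 kernel: Out of memory"

# UUID v4: version nibble is 4, variant nibble is 8, 9, a, or b
UUID_RE='^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$'

VALID=yes
printf '%s' "$CLIENT_ID" | grep -Eq "$UUID_RE" || VALID=no
case "$SEVERITY" in low|medium|critical) ;; *) VALID=no ;; esac
[ -n "$RAW_LOG" ] || VALID=no
echo "payload valid: $VALID"
```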
Security Notes
- Do not log the full raw_log in the AI nodes
- Use HTTPS in production
- Rate limiting: max 10 req/min per client_id
- LOGWHISPERER_SECRET must be at least 32 characters
📊 Database Schema
CREATE TABLE logs (
id SERIAL PRIMARY KEY,
client_id VARCHAR(36) NOT NULL,
hostname VARCHAR(255),
source VARCHAR(500),
severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),
timestamp TIMESTAMP WITH TIME ZONE,
raw_log TEXT,
matched_pattern VARCHAR(100),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX idx_logs_client_id ON logs(client_id);
CREATE INDEX idx_logs_severity ON logs(severity);
CREATE INDEX idx_logs_timestamp ON logs(timestamp);
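To sanity-check the schema manually, a test insert consistent with the table definition above can be built and run via psql. This is a sketch: the column list mirrors the schema, but the connection parameters are placeholders:

```shell
# Build a test INSERT matching the logs table defined above
SQL="INSERT INTO logs (client_id, hostname, source, severity, raw_log, matched_pattern)
VALUES ('550e8400-e29b-41d4-a716-446655440000', 'test-server',
        '/var/log/syslog', 'critical', 'OOM test entry', 'OOM');"
echo "$SQL"

# Run it against the database (placeholder host/user/db):
# psql -h 192.168.254.12 -U logwhisperer -d logwhisperer -c "$SQL"
```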
🔧 Troubleshooting
Workflow not responding
- Verify that n8n is up: curl http://192.168.254.12:5678/healthz
- Check that the workflow is activated
- Check the n8n logs: docker logs n8n
401 Unauthorized error
- Verify that LOGWHISPERER_SECRET is configured
- Check the signature format: timestamp:signature
- Verify that the timestamp is recent (< 5 min)
400 Bad Request error
- Check the UUID format of client_id
- Check that severity is valid
- Make sure raw_log is not empty
Database error
- Verify the PostgreSQL credentials in n8n
- Check that the table has been created
- Verify the database user's permissions
📝 Changelog
2026-04-02 - Sprint 2 Feature 2
- Initial workflow creation
- HMAC validation implemented
- PostgreSQL integration
- Conditional alerting for critical severity
🛡️ Metodo Sacchi Applied
- Safety first: HMAC validation before every operation
- Little often: one node per function, individually testable
- Double check: data validation after HMAC, DB write verification