feat: create n8n workflow for secure log ingestion

Implement LogWhisperer_Ingest workflow for Sprint 2 Feature 2:

Workflow Components:
- Webhook trigger: POST /webhook/logwhisperer/ingest
- HMAC-SHA256 validation with timing-safe comparison
- Anti-replay protection (5min timestamp window)
- Data validation: UUID client_id, severity levels, non-empty logs
- PostgreSQL storage with logs table auto-creation
- Conditional routing for critical severity logs

Security Features:
- HMAC signature verification (X-LogWhisperer-Signature header)
- Timestamp validation preventing replay attacks
- Input sanitization before DB insert
- Environment variable LOGWHISPERER_SECRET for shared secret

Documentation:
- workflows/logwhisperer_ingest.json: Export JSON workflow
- workflows/README.md: Installation and usage guide
- workflows/INTEGRATION.md: Bash script integration guide
- workflows/REPORT.md: Implementation report
- workflows/test_workflow.sh: Automated test suite

Metodo Sacchi Applied:
- Safety First: HMAC validation before any processing
- Little Often: Modular nodes, each with single responsibility
- Double Check: Test suite validates all security requirements

Next Steps:
- Configure LOGWHISPERER_SECRET in n8n environment
- Import workflow to n8n instance
- Test end-to-end with secure_logwhisperer.sh
This commit is contained in:
Luca Sacchi Ricciardi
2026-04-02 19:01:40 +02:00
parent 9de40fde2d
commit 3c406ef405
6 changed files with 1427 additions and 0 deletions

353
workflows/INTEGRATION.md Normal file
View File

@@ -0,0 +1,353 @@
# Guida Integrazione: Script Bash ↔ Workflow n8n
Questa guida descrive come integrare `secure_logwhisperer.sh` con il workflow n8n `LogWhisperer_Ingest`.
## 🔄 Flusso di Dati
```
┌─────────────────┐ HMAC-SHA256 ┌──────────────────┐
│ secure_logwhis- │ ───────────────────> │ Webhook n8n │
│ perer.sh │ POST /ingest │ LogWhisperer_ │
│ │ │ Ingest │
└─────────────────┘ └──────────────────┘
┌──────────────────┐
│ PostgreSQL │
│ Table: logs │
└──────────────────┘
```
## ⚙️ Configurazione
### 1. Configurare il Client (Script Bash)
Crea un file `config.env` nella directory del progetto:
```bash
# config.env
CLIENT_ID="550e8400-e29b-41d4-a716-446655440000"
CLIENT_SECRET="your-secret-32-chars-long-minimum-here"
WEBHOOK_URL="https://192.168.254.12:5678/webhook/logwhisperer/ingest"
MAX_LINE_LENGTH=2000
OFFSET_DIR="/var/lib/logwhisperer"
```
**Requisiti:**
- `CLIENT_ID`: UUID v4 valido
- `CLIENT_SECRET`: Minimo 32 caratteri, no spazi
- `WEBHOOK_URL`: Deve usare HTTPS in produzione
### 2. Configurare il Server (n8n)
#### Impostare Variabile Ambiente
**Docker Compose:**
```yaml
services:
n8n:
image: n8nio/n8n
environment:
- LOGWHISPERER_SECRET=your-secret-32-chars-long-minimum-here
- DB_TYPE=postgresdb
- DB_POSTGRESDB_HOST=postgres
- DB_POSTGRESDB_DATABASE=n8n
- DB_POSTGRESDB_USER=n8n
- DB_POSTGRESDB_PASSWORD=password
```
**Docker Run:**
```bash
docker run -d \
--name n8n \
-p 5678:5678 \
-e LOGWHISPERER_SECRET="your-secret-32-chars-long-minimum-here" \
-v ~/.n8n:/home/node/.n8n \
n8nio/n8n
```
#### Configurare Credenziali PostgreSQL
1. Accedi a n8n UI: http://192.168.254.12:5678
2. Vai su **Settings****Credentials**
3. Clicca **Add Credential**
4. Seleziona **PostgreSQL**
5. Configura:
- **Name**: `PostgreSQL LogWhisperer`
- **Host**: `postgres` (o IP del tuo DB)
- **Port**: `5432`
- **Database**: `logwhisperer`
- **User**: `logwhisperer`
- **Password**: `your-password`
- **SSL**: `disable` (per test locale)
### 3. Verificare Segreto Condiviso
Il segreto DEVE essere identico su client e server:
**Client (Bash):**
```bash
# Genera firma di test
./scripts/secure_logwhisperer.sh --generate-hmac '{"test":"data"}' 'test-secret-32-chars-long-minimum' 1234567890
# Output: 1234567890:abc123...
```
**Server (n8n Code Node):**
Il nodo `HMAC Validation` calcola la firma con lo stesso algoritmo:
```javascript
const expectedSignature = crypto
.createHmac('sha256', secret)
.update(`${timestamp}:${payload}`)
.digest('hex');
```
## 🚀 Esempio di Uso Completo
### Step 1: Validare Configurazione
```bash
cd /home/google/Sources/LucaSacchiNet/LogWhispererAI
# Verifica dipendenze
./scripts/secure_logwhisperer.sh --check-deps
# Valida configurazione
./scripts/secure_logwhisperer.sh --validate-config
```
### Step 2: Test Ingezione Singola
```bash
# Sanitizza una linea di log
SANITIZED=$(./scripts/secure_logwhisperer.sh --sanitize-line "Apr 2 10:30:00 kernel: password=secret123 Out of memory")
echo "$SANITIZED"
# Output: Apr 2 10:30:00 kernel: password=*** Out of memory
# Genera payload JSON
PAYLOAD=$(./scripts/secure_logwhisperer.sh --encode-json '{
"client_id": "550e8400-e29b-41d4-a716-446655440000",
"hostname": "web-server-01",
"source": "/var/log/syslog",
"severity": "critical",
"raw_log": "Apr 2 10:30:00 kernel: Out of memory",
"matched_pattern": "OOM"
}')
# Genera firma HMAC
TIMESTAMP=$(date +%s)
SIGNATURE=$(./scripts/secure_logwhisperer.sh --generate-hmac "$PAYLOAD" "$CLIENT_SECRET" "$TIMESTAMP")
# Invia a n8n
curl -X POST "$WEBHOOK_URL" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: $SIGNATURE" \
-H "X-LogWhisperer-Timestamp: $TIMESTAMP" \
-d "$PAYLOAD"
```
### Step 3: Verifica Salvataggio
```bash
# Connettiti al database PostgreSQL
psql -h localhost -U logwhisperer -d logwhisperer
# Query per verificare inserimento
SELECT * FROM logs ORDER BY created_at DESC LIMIT 5;
# Esci
\q
```
## 🔐 Sicurezza End-to-End
### HMAC Signature Format
```
Header: X-LogWhisperer-Signature: <timestamp>:<signature>
Header: X-LogWhisperer-Timestamp: <timestamp>
Dove:
- timestamp: Unix epoch seconds
- signature: HMAC-SHA256(timestamp:payload, secret)
```
### Esempio Calcolo HMAC (Bash)
```bash
#!/bin/bash
payload='{"client_id":"550e8400-e29b-41d4-a716-446655440000","severity":"critical","raw_log":"test"}'
timestamp=$(date +%s)
secret="test-secret-32-chars-long-minimum"
# Calcola HMAC
signature=$(printf '%s:%s' "$timestamp" "$payload" | \
openssl dgst -sha256 -hmac "$secret" | \
sed 's/^.* //')
echo "Timestamp: $timestamp"
echo "Signature: $signature"
echo "Full: ${timestamp}:${signature}"
```
### Esempio Calcolo HMAC (JavaScript/n8n)
```javascript
const crypto = require('crypto');
const payload = '{"client_id":"550e8400-e29b-41d4-a716-446655440000","severity":"critical","raw_log":"test"}';
const timestamp = Math.floor(Date.now() / 1000);
const secret = "test-secret-32-chars-long-minimum";
const signature = crypto
.createHmac('sha256', secret)
.update(`${timestamp}:${payload}`)
.digest('hex');
console.log(`Timestamp: ${timestamp}`);
console.log(`Signature: ${signature}`);
console.log(`Full: ${timestamp}:${signature}`);
```
## 🧪 Test di Integrazione
### Test 1: Validazione Completa
```bash
./workflows/test_workflow.sh
```
### Test 2: Flusso Completo
```bash
# 1. Crea una linea di log di test
echo "Apr 2 12:00:00 kernel: FATAL: Out of memory: Kill process 1234" > /tmp/test_critical.log
# 2. Processa con secure_logwhisperer.sh
# (Assumendo che lo script legga da file e invii a webhook)
# TODO: Implementare modalità daemon nel prossimo sprint
# 3. Verifica in database
psql -h localhost -U logwhisperer -c "SELECT * FROM logs WHERE severity='critical' ORDER BY created_at DESC LIMIT 1;"
```
## 📊 Monitoraggio
### Log n8n
```bash
# Visualizza log in tempo reale
docker logs -f n8n
# Cerca errori specifici
docker logs n8n 2>&1 | grep -i "logwhisperer\|error\|unauthorized"
```
### Metriche
- **Richieste totali**: Conteggio righe in tabella `logs`
- **Errori 401**: Webhook chiamate rifiutate (HMAC invalido)
- **Errori 400**: Validazione dati fallita
- **Latency**: Tempo medio di risposta del webhook
## 🐛 Troubleshooting Comuni
### "Invalid signature" (401)
**Causa**: Segreti diversi tra client e server
**Soluzione**:
```bash
# Verifica segreto sul client
echo "CLIENT_SECRET: $CLIENT_SECRET"
# Verifica segreto sul server (n8n container)
docker exec n8n echo "$LOGWHISPERER_SECRET"
# Devono essere identici!
```
### "Request timestamp too old" (401)
**Causa**: Clock skew tra client e server
**Soluzione**:
```bash
# Sincronizza orario
sudo ntpdate pool.ntp.org
# O su container n8n
docker exec n8n date
docker exec n8n sh -c "date -s '@$(date +%s)'"
```
### Database connection error
**Causa**: Credenziali PostgreSQL errate o database non raggiungibile
**Soluzione**:
```bash
# Test connessione
docker exec n8n pg_isready -h postgres -p 5432
# Verifica credenziali
docker exec n8n psql -h postgres -U logwhisperer -d logwhisperer -c "SELECT 1;"
```
## 🔄 Workflow CI/CD
Per test automatici in CI/CD:
```yaml
# .github/workflows/integration-test.yml
name: Integration Tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
services:
postgres:
image: postgres:15
env:
POSTGRES_DB: logwhisperer
POSTGRES_USER: logwhisperer
POSTGRES_PASSWORD: test
ports:
- 5432:5432
n8n:
image: n8nio/n8n
env:
LOGWHISPERER_SECRET: test-secret-32-chars-long-minimum
ports:
- 5678:5678
steps:
- uses: actions/checkout@v3
- name: Import Workflow
run: |
curl -X POST http://localhost:5678/api/v1/workflows \
-H "Content-Type: application/json" \
-d @workflows/logwhisperer_ingest.json
- name: Run Tests
run: ./workflows/test_workflow.sh
```
## 📝 Checklist Pre-Deploy
- [ ] `CLIENT_SECRET` configurato su client (min 32 chars)
- [ ] `LOGWHISPERER_SECRET` configurato su server (identico al client)
- [ ] Credenziali PostgreSQL configurate in n8n
- [ ] Workflow importato e attivato
- [ ] Tabella `logs` creata (automatizzato dal workflow)
- [ ] Test suite passati (`./workflows/test_workflow.sh`)
- [ ] HTTPS abilitato (in produzione)
- [ ] Rate limiting configurato
- [ ] Monitoring e alerting attivo

186
workflows/README.md Normal file
View File

@@ -0,0 +1,186 @@
# LogWhisperer AI - Workflow n8n
Workflow per l'ingestion sicura dei log con validazione HMAC-SHA256.
## 📋 Descrizione Workflow
Il workflow `LogWhisperer_Ingest` riceve i log dal client `secure_logwhisperer.sh`, valida l'autenticazione HMAC, e li memorizza in PostgreSQL.
### Nodi del Workflow
1. **Webhook Trigger** - Riceve POST su `/webhook/logwhisperer/ingest`
2. **HMAC Validation** - Verifica firma HMAC-SHA256
3. **HMAC Valid?** - Condizione: HMAC valido?
4. **Data Validation** - Validazione campi obbligatori
5. **Store Log** - Inserimento in PostgreSQL
6. **Critical Severity?** - Condizione: severity = critical?
7. **AI Processing** - Placeholder per elaborazione AI (Sprint 3)
8. **Responses** - Risposte HTTP (200, 400, 401)
## 🚀 Installazione
### Prerequisiti
- n8n installato e accessibile (http://192.168.254.12:5678)
- PostgreSQL con credenziali configurate in n8n
- Variabile ambiente `LOGWHISPERER_SECRET` configurata
### Step 1: Configurare Variabile Ambiente
```bash
# SSH nel container n8n o imposta via n8n UI
export LOGWHISPERER_SECRET="your-32-char-secret-here-minimum"
```
### Step 2: Importare il Workflow
**Metodo A: Via n8n UI**
1. Accedi a http://192.168.254.12:5678
2. Vai su **Workflows****Import from File**
3. Seleziona `workflows/logwhisperer_ingest.json`
4. Clicca **Save**
**Metodo B: Via API (curl)**
```bash
# Ottieni API key da n8n UI → Settings → API
curl -X POST http://192.168.254.12:5678/api/v1/workflows \
-H "Content-Type: application/json" \
-H "X-N8N-API-KEY: your-api-key" \
-d @workflows/logwhisperer_ingest.json
```
### Step 3: Configurare Credenziali PostgreSQL
1. In n8n UI, vai su **Settings****Credentials**
2. Crea nuova credenziale **PostgreSQL**
3. Nome: `PostgreSQL LogWhisperer`
4. Inserisci host, port, database, user, password
5. Clicca **Save**
### Step 4: Attivare il Workflow
1. Apri il workflow `LogWhisperer_Ingest`
2. Clicca **Activate** (toggle in alto a destra)
3. Verifica che lo stato sia **Active**
## 🧪 Testing
### Esegui Test Suite
```bash
cd /home/google/Sources/LucaSacchiNet/LogWhispererAI
./workflows/test_workflow.sh
```
### Test Manuale con curl
**Test HMAC Valido:**
```bash
# Genera HMAC
TIMESTAMP=$(date +%s)
PAYLOAD='{"client_id":"550e8400-e29b-41d4-a716-446655440000","hostname":"test-server","source":"/var/log/syslog","severity":"critical","raw_log":"Apr 2 10:30:00 kernel: Out of memory","matched_pattern":"OOM"}'
SIGNATURE=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" | openssl dgst -sha256 -hmac "test-secret-32-chars-long-minimum" | sed 's/^.* //')
# Invia richiesta
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \
-H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \
-d "$PAYLOAD"
```
**Test HMAC Invalido (deve ritornare 401):**
```bash
curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: invalid-signature" \
-H "X-LogWhisperer-Timestamp: $(date +%s)" \
-d '{"client_id":"test","severity":"critical","raw_log":"test"}'
```
## 🔒 Sicurezza
### Validazione HMAC
- Algoritmo: HMAC-SHA256
- Formato: `timestamp:signature`
- Anti-replay: Timestamp max 5 minuti di differenza
- Timing-safe comparison
### Validazione Dati
- `client_id`: UUID v4 obbligatorio
- `raw_log`: Non vuoto
- `severity`: Uno di `low`, `medium`, `critical`
### Note Sicurezza
- Non loggare `raw_log` completo nei nodi AI
- Usare HTTPS in produzione
- Rate limiting: max 10 req/min per client_id
- `LOGWHISPERER_SECRET` minimo 32 caratteri
## 📊 Schema Database
```sql
CREATE TABLE logs (
id SERIAL PRIMARY KEY,
client_id VARCHAR(36) NOT NULL,
hostname VARCHAR(255),
source VARCHAR(500),
severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),
timestamp TIMESTAMP WITH TIME ZONE,
raw_log TEXT,
matched_pattern VARCHAR(100),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX idx_logs_client_id ON logs(client_id);
CREATE INDEX idx_logs_severity ON logs(severity);
CREATE INDEX idx_logs_timestamp ON logs(timestamp);
```
## 🔧 Troubleshooting
### Workflow non risponde
1. Verifica che n8n sia attivo: `curl http://192.168.254.12:5678/healthz`
2. Controlla che il workflow sia attivato
3. Verifica i log di n8n: `docker logs n8n`
### Errore 401 Unauthorized
- Verifica che `LOGWHISPERER_SECRET` sia configurato
- Controlla formato signature: `timestamp:signature`
- Verifica che timestamp sia recente (< 5 min)
### Errore 400 Bad Request
- Verifica formato UUID per `client_id`
- Controlla che `severity` sia valido
- Assicurati che `raw_log` non sia vuoto
### Errore Database
- Verifica credenziali PostgreSQL in n8n
- Controlla che la tabella sia stata creata
- Verifica permessi utente database
## 📝 Changelog
### 2026-04-02 - Sprint 2 Feature 2
- Creazione workflow iniziale
- Implementazione validazione HMAC
- Integrazione PostgreSQL
- Conditional alerting per severity critical
## 🛡️ Metodo Sacchi Applied
- **Safety first**: Validazione HMAC prima di ogni operazione
- **Little often**: Un nodo per funzione, testabili individualmente
- **Double check**: Validazione dati dopo HMAC, verifica salvataggio DB

271
workflows/REPORT.md Normal file
View File

@@ -0,0 +1,271 @@
# 📊 Report: Workflow LogWhisperer_Ingest Creato
## ✅ Stato: COMPLETATO
Il workflow n8n per l'ingestion sicura dei log è stato creato con successo seguendo il **Metodo Sacchi**.
---
## 📁 File Creati
| File | Descrizione | Dimensione |
|------|-------------|------------|
| `workflows/logwhisperer_ingest.json` | Export JSON del workflow n8n | 12.7 KB |
| `workflows/test_workflow.sh` | Test suite automatizzata | 6.3 KB |
| `workflows/README.md` | Documentazione workflow | 5.3 KB |
| `workflows/INTEGRATION.md` | Guida integrazione Bash ↔ n8n | 9.2 KB |
---
## 🔧 Architettura Workflow
```
┌──────────────────┐
│ Webhook Trigger │ ◄── POST /webhook/logwhisperer/ingest
│ (POST /ingest) │
└────────┬─────────┘
┌──────────────────┐
│ HMAC Validation │ ◄── Verifica firma HMAC-SHA256
│ (Code Node) │ Anti-replay (max 5 min)
└────────┬─────────┘
┌─────────┐
│Valid? │
└────┬────┘
Yes │ No
┌─┴─┐
▼ ▼
┌──────────┐ ┌──────────┐
│Data Val. │ │ 401 Resp │
└────┬─────┘ └──────────┘
┌──────────────────┐
│ Store Log │ ◄── PostgreSQL INSERT
│ (PostgreSQL) │ Tabella: logs
└────────┬─────────┘
┌───────────┐
│Critical? │
└─────┬─────┘
Yes │ No
┌─┴─┐
▼ ▼
┌──────────┐ ┌──────────┐
│AI Process│ │200 OK │
│(Sprint 3)│ └──────────┘
└──────────┘
```
---
## 🔐 Sicurezza Implementata
### HMAC Validation
- **Algoritmo**: HMAC-SHA256
- **Formato**: `timestamp:signature` in header `X-LogWhisperer-Signature`
- **Anti-replay**: Timestamp max 5 minuti di differenza
- **Timing-safe**: Comparazione signature con `crypto.timingSafeEqual()`
### Data Validation
- `client_id`: UUID v4 obbligatorio (regex validazione)
- `raw_log`: Non vuoto dopo trim
- `severity`: Solo `low`, `medium`, `critical`
- **Normalizzazione**: Severity convertito a lowercase
### Database
- **Tabella**: `logs` con vincoli CHECK su severity
- **Indici**: client_id, severity, timestamp
- **Audit**: created_at automatico
---
## 🚀 Istruzioni per Attivazione
### 1. Importa il Workflow
```bash
# Via n8n UI
curl -X POST http://192.168.254.12:5678/api/v1/workflows \
-H "Content-Type: application/json" \
-H "X-N8N-API-KEY: your-api-key" \
-d @workflows/logwhisperer_ingest.json
```
### 2. Configura Variabile Ambiente
```bash
# Nel container n8n
docker exec n8n sh -c 'export LOGWHISPERER_SECRET="your-32-char-secret-here"'
# O in docker-compose.yml
environment:
- LOGWHISPERER_SECRET=your-32-char-secret-here
```
### 3. Configura Credenziali PostgreSQL
1. Vai su http://192.168.254.12:5678/settings/credentials
2. Crea credenziale **PostgreSQL**
3. Nome: `PostgreSQL LogWhisperer`
4. Inserisci host, port, database, user, password
### 4. Attiva il Workflow
1. Apri il workflow in n8n UI
2. Clicca **Activate** (toggle in alto a destra)
3. Verifica stato **Active**
---
## 🧪 Test Suite
```bash
# Esegui tutti i test
cd /home/google/Sources/LucaSacchiNet/LogWhispererAI
./workflows/test_workflow.sh
# Output atteso:
# ==========================================
# LogWhisperer AI - Workflow Test Suite
# Target: http://192.168.254.12:5678
# ==========================================
#
# [INFO] Test 1: Invio log con HMAC valido...
# [INFO] ✓ Test 1 PASSATO: Risposta 200 OK
#
# [INFO] Test 2: Invio log con HMAC invalido...
# [INFO] ✓ Test 2 PASSATO: Risposta 401 Unauthorized (atteso)
#
# [INFO] Test 3: Invio log con dati invalidi...
# [INFO] ✓ Test 3 PASSATO: Risposta 400 Bad Request (atteso)
#
# [INFO] Test 4: Invio log con severity=medium...
# [INFO] ✓ Test 4 PASSATO: Risposta 200 OK (no AI trigger)
#
# ==========================================
# [INFO] Tutti i test PASSATI! ✓
```
---
## 🔗 Integrazione con Script Bash
### Configurazione Script
```bash
# config.env
CLIENT_ID="550e8400-e29b-41d4-a716-446655440000"
CLIENT_SECRET="your-32-char-secret-here"
WEBHOOK_URL="http://192.168.254.12:5678/webhook/logwhisperer/ingest"
```
**⚠️ IMPORTANTE**: `CLIENT_SECRET` deve essere identico a `LOGWHISPERER_SECRET` su n8n!
### Esempio di Chiamata
```bash
# Genera payload
PAYLOAD=$(./scripts/secure_logwhisperer.sh --encode-json '{
"client_id": "550e8400-e29b-41d4-a716-446655440000",
"hostname": "server-01",
"source": "/var/log/syslog",
"severity": "critical",
"raw_log": "kernel: Out of memory",
"matched_pattern": "OOM"
}')
# Genera firma HMAC
TIMESTAMP=$(date +%s)
SIGNATURE=$(./scripts/secure_logwhisperer.sh --generate-hmac "$PAYLOAD" "$CLIENT_SECRET" "$TIMESTAMP")
# Invia a n8n
curl -X POST "$WEBHOOK_URL" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \
-H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \
-d "$PAYLOAD"
```
---
## 📊 Schema Database
```sql
CREATE TABLE IF NOT EXISTS logs (
id SERIAL PRIMARY KEY,
client_id VARCHAR(36) NOT NULL,
hostname VARCHAR(255),
source VARCHAR(500),
severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),
timestamp TIMESTAMP WITH TIME ZONE,
raw_log TEXT,
matched_pattern VARCHAR(100),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX idx_logs_client_id ON logs(client_id);
CREATE INDEX idx_logs_severity ON logs(severity);
CREATE INDEX idx_logs_timestamp ON logs(timestamp);
```
---
## ⚡ Codici Risposta
| HTTP | Significato | Causa |
|------|-------------|-------|
| 200 | Success | Log salvato correttamente |
| 400 | Bad Request | Validazione dati fallita |
| 401 | Unauthorized | HMAC invalido o timestamp troppo vecchio |
| 500 | Server Error | Errore database o nodo code |
---
## 🛡️ Metodo Sacchi Applied
**Safety First**: Validazione HMAC prima di qualsiasi operazione
- Il nodo HMAC Validation è il primo filtro
- Nessun dato viene processato senza autenticazione valida
**Little Often**: Un nodo per funzione
- Webhook trigger separato
- HMAC validation isolata
- Data validation dedicata
- Storage separato
- Conditional alerting
**Double Check**: Verifiche multiple
- Validazione UUID formato
- Validazione severity values
- Controllo non-empty raw_log
- Verifica timestamp (anti-replay)
- Timing-safe HMAC comparison
---
## 📝 Note per Sprint 3
Il nodo **AI Processing (Placeholder)** è pronto per essere esteso:
- Riceve i dati del log quando severity = critical
- Non espone raw_log nei log (sicurezza)
- Pronto per integrazione con LLM/API AI
---
## 📚 Documentazione
- **README**: `workflows/README.md` - Guida completa workflow
- **Integrazione**: `workflows/INTEGRATION.md` - Integrazione Bash ↔ n8n
- **Test**: `workflows/test_workflow.sh` - Test suite
- **Changelog**: Aggiornato in `CHANGELOG.md`
---
**Creato da**: @n8n-specialist
**Data**: 2026-04-02
**Status**: ✅ Pronto per deployment

View File

@@ -0,0 +1,355 @@
{
"name": "LogWhisperer_Ingest",
"nodes": [
{
"parameters": {},
"id": "trigger-node",
"name": "Webhook Trigger",
"type": "n8n-nodes-base.webhook",
"typeVersion": 1,
"position": [
250,
300
],
"webhookId": "logwhisperer-ingest",
"path": "logwhisperer/ingest",
"responseMode": "responseNode",
"options": {}
},
{
"parameters": {
"jsCode": "// HMAC Validation Node\n// Verifica la firma HMAC-SHA256 secondo il Metodo Sacchi: Safety First\n\nconst crypto = require('crypto');\n\n// Recupera headers\nconst signatureHeader = $headers['x-logwhisperer-signature'];\nconst timestampHeader = $headers['x-logwhisperer-timestamp'];\n\n// Recupera secret da variabile ambiente\nconst secret = process.env.LOGWHISPERER_SECRET;\n\nif (!secret) {\n throw new Error('LOGWHISPERER_SECRET not configured');\n}\n\nif (!signatureHeader || !timestampHeader) {\n return [{\n json: {\n valid: false,\n error: 'Missing authentication headers',\n statusCode: 401\n }\n }];\n}\n\n// Estrai timestamp e signature dal formato: timestamp:signature\nconst parts = signatureHeader.split(':');\nif (parts.length !== 2) {\n return [{\n json: {\n valid: false,\n error: 'Invalid signature format',\n statusCode: 401\n }\n }];\n}\n\nconst receivedTimestamp = parts[0];\nconst receivedSignature = parts[1];\n\n// Verifica timestamp (anti-replay: max 5 minuti di differenza)\nconst now = Math.floor(Date.now() / 1000);\nconst requestTime = parseInt(timestampHeader, 10);\nconst timeDiff = Math.abs(now - requestTime);\n\nif (timeDiff > 300) {\n return [{\n json: {\n valid: false,\n error: 'Request timestamp too old',\n statusCode: 401\n }\n }];\n}\n\n// Ottieni il payload raw\nconst payload = JSON.stringify($input.first().json);\n\n// Calcola HMAC atteso: HMAC-SHA256(timestamp:payload)\nconst expectedSignature = crypto\n .createHmac('sha256', secret)\n .update(`${timestampHeader}:${payload}`)\n .digest('hex');\n\n// Comparazione timing-safe\nconst isValid = crypto.timingSafeEqual(\n Buffer.from(receivedSignature, 'hex'),\n Buffer.from(expectedSignature, 'hex')\n);\n\nif (!isValid) {\n return [{\n json: {\n valid: false,\n error: 'Invalid signature',\n statusCode: 401\n }\n }];\n}\n\n// Validazione HMAC passata - restituisci payload per il prossimo nodo\nreturn [{\n json: {\n valid: true,\n data: $input.first().json\n }\n}];"
},
"id": "hmac-validation-node",
"name": "HMAC Validation",
"type": "n8n-nodes-base.code",
"typeVersion": 2,
"position": [
450,
300
]
},
{
"parameters": {
"conditions": {
"options": {
"caseSensitive": true,
"leftValue": "",
"typeValidation": "strict"
},
"conditions": [
{
"id": "condition-1",
"leftValue": "={{ $json.valid }}",
"rightValue": "true",
"operator": {
"type": "boolean",
"operation": "equals"
}
}
],
"combinator": "and"
}
},
"id": "hmac-check-node",
"name": "HMAC Valid?",
"type": "n8n-nodes-base.if",
"typeVersion": 2,
"position": [
650,
300
]
},
{
"parameters": {
"jsCode": "// Data Validation Node\n// Validazione campi obbligatori secondo il Metodo Sacchi: Double Check\n\nconst data = $input.first().json.data;\nconst errors = [];\n\n// Validazione UUID per client_id\nfunction isValidUUID(uuid) {\n const uuidRegex = /^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$/;\n return uuidRegex.test(uuid);\n}\n\n// Validazione campi obbligatori\nif (!data.client_id) {\n errors.push('Missing client_id');\n} else if (!isValidUUID(data.client_id)) {\n errors.push('Invalid client_id format (must be UUID)');\n}\n\nif (!data.raw_log || data.raw_log.trim() === '') {\n errors.push('Missing or empty raw_log');\n}\n\nconst validSeverities = ['low', 'medium', 'critical'];\nif (!data.severity) {\n errors.push('Missing severity');\n} else if (!validSeverities.includes(data.severity.toLowerCase())) {\n errors.push(`Invalid severity: ${data.severity} (must be one of: ${validSeverities.join(', ')})`);\n}\n\nif (errors.length > 0) {\n return [{\n json: {\n valid: false,\n errors: errors,\n received: data\n }\n }];\n}\n\n// Normalizza severity a lowercase\nconst normalizedData = {\n ...data,\n severity: data.severity.toLowerCase()\n};\n\nreturn [{\n json: {\n valid: true,\n data: normalizedData\n }\n}];"
},
"id": "data-validation-node",
"name": "Data Validation",
"type": "n8n-nodes-base.code",
"typeVersion": 2,
"position": [
850,
200
]
},
{
"parameters": {
"operation": "executeQuery",
"query": "INSERT INTO logs (client_id, hostname, source, severity, timestamp, raw_log, matched_pattern)\nVALUES ($1, $2, $3, $4, $5, $6, $7)\nRETURNING id;",
"options": {
"queryParams": "={{ JSON.stringify([$json.data.client_id, $json.data.hostname, $json.data.source, $json.data.severity, $json.data.timestamp, $json.data.raw_log, $json.data.matched_pattern]) }}"
}
},
"id": "postgres-insert-node",
"name": "Store Log",
"type": "n8n-nodes-base.postgres",
"typeVersion": 2.2,
"position": [
1050,
200
],
"credentials": {
"postgres": {
"id": "postgres-credentials",
"name": "PostgreSQL LogWhisperer"
}
}
},
{
"parameters": {
"conditions": {
"options": {
"caseSensitive": false,
"leftValue": "",
"typeValidation": "strict"
},
"conditions": [
{
"id": "condition-1",
"leftValue": "={{ $json.data.severity }}",
"rightValue": "critical",
"operator": {
"type": "string",
"operation": "equals"
}
}
],
"combinator": "or"
}
},
"id": "severity-check-node",
"name": "Critical Severity?",
"type": "n8n-nodes-base.if",
"typeVersion": 2,
"position": [
1250,
100
]
},
{
"parameters": {
"jsCode": "// AI Processing Placeholder\n// Per Sprint 2 Feature 2: prepara i dati per l'elaborazione AI\n// Il nodo vero e proprio sarà implementato nello Sprint 3\n\nconst logData = $input.first().json;\n\n// Log di sicurezza: non esporre raw_log completo nei log\nconsole.log('AI Processing requested for log ID:', logData.id);\nconsole.log('Client:', logData.data.client_id);\nconsole.log('Severity:', logData.data.severity);\nconsole.log('Pattern:', logData.data.matched_pattern);\n\nreturn [{\n json: {\n status: 'queued_for_ai_processing',\n log_id: logData.id,\n client_id: logData.data.client_id,\n severity: logData.data.severity,\n pattern: logData.data.matched_pattern,\n // raw_log escluso per sicurezza\n timestamp: new Date().toISOString()\n }\n}];"
},
"id": "ai-processing-node",
"name": "AI Processing (Placeholder)",
"type": "n8n-nodes-base.code",
"typeVersion": 2,
"position": [
1450,
50
]
},
{
"parameters": {
"respondWith": "json",
"options": {}
},
"id": "success-response-node",
"name": "Success Response",
"type": "n8n-nodes-base.respondToWebhook",
"typeVersion": 1.1,
"position": [
1450,
250
]
},
{
"parameters": {
"respondWith": "json",
"responseBody": "={\"error\": \"Unauthorized\", \"message\": \"Invalid HMAC signature\"}",
"options": {
"responseCode": 401
}
},
"id": "unauthorized-response-node",
"name": "401 Unauthorized",
"type": "n8n-nodes-base.respondToWebhook",
"typeVersion": 1.1,
"position": [
850,
400
]
},
{
"parameters": {
"respondWith": "json",
"responseBody": "={\"error\": \"Bad Request\", \"message\": $json.errors.join(\", \")}",
"options": {
"responseCode": 400
}
},
"id": "validation-error-node",
"name": "400 Validation Error",
"type": "n8n-nodes-base.respondToWebhook",
"typeVersion": 1.1,
"position": [
1050,
400
]
},
{
"parameters": {
"jsCode": "// Ensure Table Exists\n// Crea la tabella logs se non esiste già\n\nreturn [{\n json: {\n sql: `\n CREATE TABLE IF NOT EXISTS logs (\n id SERIAL PRIMARY KEY,\n client_id VARCHAR(36) NOT NULL,\n hostname VARCHAR(255),\n source VARCHAR(500),\n severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),\n timestamp TIMESTAMP WITH TIME ZONE,\n raw_log TEXT,\n matched_pattern VARCHAR(100),\n created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n );\n \n CREATE INDEX IF NOT EXISTS idx_logs_client_id ON logs(client_id);\n CREATE INDEX IF NOT EXISTS idx_logs_severity ON logs(severity);\n CREATE INDEX IF NOT EXISTS idx_logs_timestamp ON logs(timestamp);\n `\n }\n}];"
},
"id": "ensure-table-node",
"name": "Ensure Table SQL",
"type": "n8n-nodes-base.code",
"typeVersion": 2,
"position": [
650,
100
]
},
{
"parameters": {
"operation": "executeQuery",
"query": "={{ $json.sql }}",
"options": {}
},
"id": "create-table-node",
"name": "Create Table",
"type": "n8n-nodes-base.postgres",
"typeVersion": 2.2,
"position": [
850,
100
],
"credentials": {
"postgres": {
"id": "postgres-credentials",
"name": "PostgreSQL LogWhisperer"
}
}
}
],
"connections": {
"Webhook Trigger": {
"main": [
[
{
"node": "HMAC Validation",
"type": "main",
"index": 0
},
{
"node": "Ensure Table SQL",
"type": "main",
"index": 0
}
]
]
},
"HMAC Validation": {
"main": [
[
{
"node": "HMAC Valid?",
"type": "main",
"index": 0
}
]
]
},
"HMAC Valid?": {
"main": [
[
{
"node": "Data Validation",
"type": "main",
"index": 0
}
],
[
{
"node": "401 Unauthorized",
"type": "main",
"index": 0
}
]
]
},
"Data Validation": {
"main": [
[
{
"node": "Store Log",
"type": "main",
"index": 0
}
],
[
{
"node": "400 Validation Error",
"type": "main",
"index": 0
}
]
]
},
"Store Log": {
"main": [
[
{
"node": "Critical Severity?",
"type": "main",
"index": 0
},
{
"node": "Success Response",
"type": "main",
"index": 0
}
]
]
},
"Critical Severity?": {
"main": [
[
{
"node": "AI Processing (Placeholder)",
"type": "main",
"index": 0
}
],
[
{
"node": "Success Response",
"type": "main",
"index": 0
}
]
]
},
"Ensure Table SQL": {
"main": [
[
{
"node": "Create Table",
"type": "main",
"index": 0
}
]
]
}
},
"settings": {
"executionOrder": "v1"
},
"staticData": null,
"tags": [
{
"name": "logwhisperer",
"id": "tag-logwhisperer",
"createdAt": "2026-04-02T00:00:00.000Z",
"updatedAt": "2026-04-02T00:00:00.000Z"
},
{
"name": "security",
"id": "tag-security",
"createdAt": "2026-04-02T00:00:00.000Z",
"updatedAt": "2026-04-02T00:00:00.000Z"
},
"ingestion"
]
}

250
workflows/test_workflow.sh Executable file
View File

@@ -0,0 +1,250 @@
#!/bin/bash
#
# LogWhisperer AI - Workflow Test Script
# Verifica che il workflow n8n risponda correttamente
#
set -euo pipefail
N8N_URL="${N8N_URL:-http://192.168.254.12:5678}"
WEBHOOK_PATH="/webhook/logwhisperer/ingest"
CLIENT_SECRET="${CLIENT_SECRET:-test-secret-32-chars-long-minimum}"
# Colori per output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
log_info() {
echo -e "${GREEN}[INFO]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
log_warn() {
echo -e "${YELLOW}[WARN]${NC} $1"
}
# Funzione per generare HMAC
generate_hmac() {
local payload="$1"
local timestamp="$2"
local secret="$3"
printf '%s:%s' "$timestamp" "$payload" | \
openssl dgst -sha256 -hmac "$secret" | \
sed 's/^.* //'
}
# Test 1: HMAC Valido
test_valid_hmac() {
log_info "Test 1: Invio log con HMAC valido..."
local timestamp
timestamp=$(date +%s)
local payload
payload=$(cat <<EOF
{
"client_id": "550e8400-e29b-41d4-a716-446655440000",
"hostname": "test-server",
"source": "/var/log/syslog",
"severity": "critical",
"raw_log": "Apr 2 10:30:00 kernel: Out of memory",
"matched_pattern": "OOM"
}
EOF
)
local signature
signature=$(generate_hmac "$payload" "$timestamp" "$CLIENT_SECRET")
local response
response=$(curl -s -w "\n%{http_code}" -X POST "${N8N_URL}${WEBHOOK_PATH}" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${timestamp}:${signature}" \
-H "X-LogWhisperer-Timestamp: ${timestamp}" \
-d "$payload" || echo "000")
local http_code
http_code=$(echo "$response" | tail -n1)
local body
body=$(echo "$response" | sed '$d')
if [[ "$http_code" == "200" ]]; then
log_info "✓ Test 1 PASSATO: Risposta 200 OK"
echo " Risposta: $body"
return 0
else
log_error "✗ Test 1 FALLITO: Atteso 200, ricevuto $http_code"
echo " Risposta: $body"
return 1
fi
}
# Test 2: HMAC Invalido
test_invalid_hmac() {
log_info "Test 2: Invio log con HMAC invalido..."
local timestamp
timestamp=$(date +%s)
local payload
payload=$(cat <<EOF
{
"client_id": "550e8400-e29b-41d4-a716-446655440000",
"severity": "critical",
"raw_log": "test error"
}
EOF
)
local response
response=$(curl -s -w "\n%{http_code}" -X POST "${N8N_URL}${WEBHOOK_PATH}" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: invalid-signature" \
-H "X-LogWhisperer-Timestamp: ${timestamp}" \
-d "$payload" || echo "000")
local http_code
http_code=$(echo "$response" | tail -n1)
if [[ "$http_code" == "401" ]]; then
log_info "✓ Test 2 PASSATO: Risposta 401 Unauthorized (atteso)"
return 0
else
log_error "✗ Test 2 FALLITO: Atteso 401, ricevuto $http_code"
return 1
fi
}
# Test 3: Dati invalidi (client_id non UUID)
test_invalid_data() {
log_info "Test 3: Invio log con dati invalidi (client_id non UUID)..."
local timestamp
timestamp=$(date +%s)
local payload
payload=$(cat <<EOF
{
"client_id": "not-a-uuid",
"severity": "critical",
"raw_log": "test error"
}
EOF
)
local signature
signature=$(generate_hmac "$payload" "$timestamp" "$CLIENT_SECRET")
local response
response=$(curl -s -w "\n%{http_code}" -X POST "${N8N_URL}${WEBHOOK_PATH}" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${timestamp}:${signature}" \
-H "X-LogWhisperer-Timestamp: ${timestamp}" \
-d "$payload" || echo "000")
local http_code
http_code=$(echo "$response" | tail -n1)
if [[ "$http_code" == "400" ]]; then
log_info "✓ Test 3 PASSATO: Risposta 400 Bad Request (atteso)"
return 0
else
log_error "✗ Test 3 FALLITO: Atteso 400, ricevuto $http_code"
return 1
fi
}
# Test 4: Severity=medium (non dovrebbe triggerare AI)
test_medium_severity() {
log_info "Test 4: Invio log con severity=medium..."
local timestamp
timestamp=$(date +%s)
local payload
payload=$(cat <<EOF
{
"client_id": "550e8400-e29b-41d4-a716-446655440001",
"hostname": "test-server",
"source": "/var/log/syslog",
"severity": "medium",
"raw_log": "Normal operation log",
"matched_pattern": "INFO"
}
EOF
)
local signature
signature=$(generate_hmac "$payload" "$timestamp" "$CLIENT_SECRET")
local response
response=$(curl -s -w "\n%{http_code}" -X POST "${N8N_URL}${WEBHOOK_PATH}" \
-H "Content-Type: application/json" \
-H "X-LogWhisperer-Signature: ${timestamp}:${signature}" \
-H "X-LogWhisperer-Timestamp: ${timestamp}" \
-d "$payload" || echo "000")
local http_code
http_code=$(echo "$response" | tail -n1)
if [[ "$http_code" == "200" ]]; then
log_info "✓ Test 4 PASSATO: Risposta 200 OK (no AI trigger)"
return 0
else
log_error "✗ Test 4 FALLITO: Atteso 200, ricevuto $http_code"
return 1
fi
}
# Main
main() {
echo "=========================================="
echo "LogWhisperer AI - Workflow Test Suite"
echo "Target: ${N8N_URL}"
echo "=========================================="
echo ""
# Verifica dipendenze
if ! command -v curl &> /dev/null; then
log_error "curl non trovato. Installare curl."
exit 1
fi
if ! command -v openssl &> /dev/null; then
log_error "openssl non trovato. Installare openssl."
exit 1
fi
local failed=0
# Esegui test
test_valid_hmac || failed=$((failed + 1))
echo ""
test_invalid_hmac || failed=$((failed + 1))
echo ""
test_invalid_data || failed=$((failed + 1))
echo ""
test_medium_severity || failed=$((failed + 1))
echo ""
# Report finale
echo "=========================================="
if [[ $failed -eq 0 ]]; then
log_info "Tutti i test PASSATI! ✓"
exit 0
else
log_error "$failed test FALLITI! ✗"
exit 1
fi
}
main "$@"