diff --git a/CHANGELOG.md b/CHANGELOG.md index 235564c..84f791d 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -9,6 +9,18 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Added +- feat: Create n8n workflow `LogWhisperer_Ingest` for secure log ingestion + - Webhook trigger on POST `/webhook/logwhisperer/ingest` + - HMAC-SHA256 signature validation with anti-replay protection + - Data validation (UUID, severity levels, non-empty raw_log) + - PostgreSQL storage with automatic table creation + - Conditional AI processing for critical severity logs + - JSON export at `workflows/logwhisperer_ingest.json` + - Test suite at `workflows/test_workflow.sh` + - Integration guide at `workflows/INTEGRATION.md` + - Documentation at `workflows/README.md` + - Implements Metodo Sacchi: Safety First, Little Often, Double Check + - feat: Configure MCP servers for enhanced AI capabilities - sequential-thinking MCP for structured problem solving - context7 MCP for contextual library documentation retrieval diff --git a/workflows/INTEGRATION.md b/workflows/INTEGRATION.md new file mode 100644 index 0000000..af4fc72 --- /dev/null +++ b/workflows/INTEGRATION.md @@ -0,0 +1,353 @@ +# Guida Integrazione: Script Bash ↔ Workflow n8n + +Questa guida descrive come integrare `secure_logwhisperer.sh` con il workflow n8n `LogWhisperer_Ingest`. + +## πŸ”„ Flusso di Dati + +``` +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” HMAC-SHA256 β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ secure_logwhis- β”‚ ───────────────────> β”‚ Webhook n8n β”‚ +β”‚ perer.sh β”‚ POST /ingest β”‚ LogWhisperer_ β”‚ +β”‚ β”‚ β”‚ Ingest β”‚ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β”‚ + β–Ό + β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” + β”‚ PostgreSQL β”‚ + β”‚ Table: logs β”‚ + β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +``` + +## βš™οΈ Configurazione + +### 1. Configurare il Client (Script Bash) + +Crea un file `config.env` nella directory del progetto: + +```bash +# config.env +CLIENT_ID="550e8400-e29b-41d4-a716-446655440000" +CLIENT_SECRET="your-secret-32-chars-long-minimum-here" +WEBHOOK_URL="https://192.168.254.12:5678/webhook/logwhisperer/ingest" +MAX_LINE_LENGTH=2000 +OFFSET_DIR="/var/lib/logwhisperer" +``` + +**Requisiti:** +- `CLIENT_ID`: UUID v4 valido +- `CLIENT_SECRET`: Minimo 32 caratteri, no spazi +- `WEBHOOK_URL`: Deve usare HTTPS in produzione + +### 2. Configurare il Server (n8n) + +#### Impostare Variabile Ambiente + +**Docker Compose:** + +```yaml +services: + n8n: + image: n8nio/n8n + environment: + - LOGWHISPERER_SECRET=your-secret-32-chars-long-minimum-here + - DB_TYPE=postgresdb + - DB_POSTGRESDB_HOST=postgres + - DB_POSTGRESDB_DATABASE=n8n + - DB_POSTGRESDB_USER=n8n + - DB_POSTGRESDB_PASSWORD=password +``` + +**Docker Run:** + +```bash +docker run -d \ + --name n8n \ + -p 5678:5678 \ + -e LOGWHISPERER_SECRET="your-secret-32-chars-long-minimum-here" \ + -v ~/.n8n:/home/node/.n8n \ + n8nio/n8n +``` + +#### Configurare Credenziali PostgreSQL + +1. Accedi a n8n UI: http://192.168.254.12:5678 +2. Vai su **Settings** β†’ **Credentials** +3. Clicca **Add Credential** +4. Seleziona **PostgreSQL** +5. Configura: + - **Name**: `PostgreSQL LogWhisperer` + - **Host**: `postgres` (o IP del tuo DB) + - **Port**: `5432` + - **Database**: `logwhisperer` + - **User**: `logwhisperer` + - **Password**: `your-password` + - **SSL**: `disable` (per test locale) + +### 3. Verificare Segreto Condiviso + +Il segreto DEVE essere identico su client e server: + +**Client (Bash):** +```bash +# Genera firma di test +./scripts/secure_logwhisperer.sh --generate-hmac '{"test":"data"}' 'test-secret-32-chars-long-minimum' 1234567890 +# Output: 1234567890:abc123... +``` + +**Server (n8n Code Node):** + +Il nodo `HMAC Validation` calcola la firma con lo stesso algoritmo: + +```javascript +const expectedSignature = crypto + .createHmac('sha256', secret) + .update(`${timestamp}:${payload}`) + .digest('hex'); +``` + +## πŸš€ Esempio di Uso Completo + +### Step 1: Validare Configurazione + +```bash +cd /home/google/Sources/LucaSacchiNet/LogWhispererAI + +# Verifica dipendenze +./scripts/secure_logwhisperer.sh --check-deps + +# Valida configurazione +./scripts/secure_logwhisperer.sh --validate-config +``` + +### Step 2: Test Ingezione Singola + +```bash +# Sanitizza una linea di log +SANITIZED=$(./scripts/secure_logwhisperer.sh --sanitize-line "Apr 2 10:30:00 kernel: password=secret123 Out of memory") +echo "$SANITIZED" +# Output: Apr 2 10:30:00 kernel: password=*** Out of memory + +# Genera payload JSON +PAYLOAD=$(./scripts/secure_logwhisperer.sh --encode-json '{ + "client_id": "550e8400-e29b-41d4-a716-446655440000", + "hostname": "web-server-01", + "source": "/var/log/syslog", + "severity": "critical", + "raw_log": "Apr 2 10:30:00 kernel: Out of memory", + "matched_pattern": "OOM" +}') + +# Genera firma HMAC +TIMESTAMP=$(date +%s) +SIGNATURE=$(./scripts/secure_logwhisperer.sh --generate-hmac "$PAYLOAD" "$CLIENT_SECRET" "$TIMESTAMP") + +# Invia a n8n +curl -X POST "$WEBHOOK_URL" \ + -H "Content-Type: application/json" \ + -H "X-LogWhisperer-Signature: $SIGNATURE" \ + -H "X-LogWhisperer-Timestamp: $TIMESTAMP" \ + -d "$PAYLOAD" +``` + +### Step 3: Verifica Salvataggio + +```bash +# Connettiti al database PostgreSQL +psql -h localhost -U logwhisperer -d logwhisperer + +# Query per verificare inserimento +SELECT * FROM logs ORDER BY created_at DESC LIMIT 5; + +# Esci +\q +``` + +## πŸ” Sicurezza End-to-End + +### HMAC Signature Format + +``` +Header: X-LogWhisperer-Signature: : +Header: X-LogWhisperer-Timestamp: + +Dove: +- timestamp: Unix epoch seconds +- signature: HMAC-SHA256(timestamp:payload, secret) +``` + +### Esempio Calcolo HMAC (Bash) + +```bash +#!/bin/bash + +payload='{"client_id":"550e8400-e29b-41d4-a716-446655440000","severity":"critical","raw_log":"test"}' +timestamp=$(date +%s) +secret="test-secret-32-chars-long-minimum" + +# Calcola HMAC +signature=$(printf '%s:%s' "$timestamp" "$payload" | \ + openssl dgst -sha256 -hmac "$secret" | \ + sed 's/^.* //') + +echo "Timestamp: $timestamp" +echo "Signature: $signature" +echo "Full: ${timestamp}:${signature}" +``` + +### Esempio Calcolo HMAC (JavaScript/n8n) + +```javascript +const crypto = require('crypto'); + +const payload = '{"client_id":"550e8400-e29b-41d4-a716-446655440000","severity":"critical","raw_log":"test"}'; +const timestamp = Math.floor(Date.now() / 1000); +const secret = "test-secret-32-chars-long-minimum"; + +const signature = crypto + .createHmac('sha256', secret) + .update(`${timestamp}:${payload}`) + .digest('hex'); + +console.log(`Timestamp: ${timestamp}`); +console.log(`Signature: ${signature}`); +console.log(`Full: ${timestamp}:${signature}`); +``` + +## πŸ§ͺ Test di Integrazione + +### Test 1: Validazione Completa + +```bash +./workflows/test_workflow.sh +``` + +### Test 2: Flusso Completo + +```bash +# 1. Crea una linea di log di test +echo "Apr 2 12:00:00 kernel: FATAL: Out of memory: Kill process 1234" > /tmp/test_critical.log + +# 2. Processa con secure_logwhisperer.sh +# (Assumendo che lo script legga da file e invii a webhook) +# TODO: Implementare modalitΓ  daemon nel prossimo sprint + +# 3. Verifica in database +psql -h localhost -U logwhisperer -c "SELECT * FROM logs WHERE severity='critical' ORDER BY created_at DESC LIMIT 1;" +``` + +## πŸ“Š Monitoraggio + +### Log n8n + +```bash +# Visualizza log in tempo reale +docker logs -f n8n + +# Cerca errori specifici +docker logs n8n 2>&1 | grep -i "logwhisperer\|error\|unauthorized" +``` + +### Metriche + +- **Richieste totali**: Conteggio righe in tabella `logs` +- **Errori 401**: Webhook chiamate rifiutate (HMAC invalido) +- **Errori 400**: Validazione dati fallita +- **Latency**: Tempo medio di risposta del webhook + +## πŸ› Troubleshooting Comuni + +### "Invalid signature" (401) + +**Causa**: Segreti diversi tra client e server + +**Soluzione**: +```bash +# Verifica segreto sul client +echo "CLIENT_SECRET: $CLIENT_SECRET" + +# Verifica segreto sul server (n8n container) +docker exec n8n echo "$LOGWHISPERER_SECRET" + +# Devono essere identici! +``` + +### "Request timestamp too old" (401) + +**Causa**: Clock skew tra client e server + +**Soluzione**: +```bash +# Sincronizza orario +sudo ntpdate pool.ntp.org + +# O su container n8n +docker exec n8n date +docker exec n8n sh -c "date -s '@$(date +%s)'" +``` + +### Database connection error + +**Causa**: Credenziali PostgreSQL errate o database non raggiungibile + +**Soluzione**: +```bash +# Test connessione +docker exec n8n pg_isready -h postgres -p 5432 + +# Verifica credenziali +docker exec n8n psql -h postgres -U logwhisperer -d logwhisperer -c "SELECT 1;" +``` + +## πŸ”„ Workflow CI/CD + +Per test automatici in CI/CD: + +```yaml +# .github/workflows/integration-test.yml +name: Integration Tests + +on: [push, pull_request] + +jobs: + test: + runs-on: ubuntu-latest + services: + postgres: + image: postgres:15 + env: + POSTGRES_DB: logwhisperer + POSTGRES_USER: logwhisperer + POSTGRES_PASSWORD: test + ports: + - 5432:5432 + n8n: + image: n8nio/n8n + env: + LOGWHISPERER_SECRET: test-secret-32-chars-long-minimum + ports: + - 5678:5678 + + steps: + - uses: actions/checkout@v3 + + - name: Import Workflow + run: | + curl -X POST http://localhost:5678/api/v1/workflows \ + -H "Content-Type: application/json" \ + -d @workflows/logwhisperer_ingest.json + + - name: Run Tests + run: ./workflows/test_workflow.sh +``` + +## πŸ“ Checklist Pre-Deploy + +- [ ] `CLIENT_SECRET` configurato su client (min 32 chars) +- [ ] `LOGWHISPERER_SECRET` configurato su server (identico al client) +- [ ] Credenziali PostgreSQL configurate in n8n +- [ ] Workflow importato e attivato +- [ ] Tabella `logs` creata (automatizzato dal workflow) +- [ ] Test suite passati (`./workflows/test_workflow.sh`) +- [ ] HTTPS abilitato (in produzione) +- [ ] Rate limiting configurato +- [ ] Monitoring e alerting attivo diff --git a/workflows/README.md b/workflows/README.md new file mode 100644 index 0000000..9301c3e --- /dev/null +++ b/workflows/README.md @@ -0,0 +1,186 @@ +# LogWhisperer AI - Workflow n8n + +Workflow per l'ingestion sicura dei log con validazione HMAC-SHA256. + +## πŸ“‹ Descrizione Workflow + +Il workflow `LogWhisperer_Ingest` riceve i log dal client `secure_logwhisperer.sh`, valida l'autenticazione HMAC, e li memorizza in PostgreSQL. + +### Nodi del Workflow + +1. **Webhook Trigger** - Riceve POST su `/webhook/logwhisperer/ingest` +2. **HMAC Validation** - Verifica firma HMAC-SHA256 +3. **HMAC Valid?** - Condizione: HMAC valido? +4. **Data Validation** - Validazione campi obbligatori +5. **Store Log** - Inserimento in PostgreSQL +6. **Critical Severity?** - Condizione: severity = critical? +7. **AI Processing** - Placeholder per elaborazione AI (Sprint 3) +8. **Responses** - Risposte HTTP (200, 400, 401) + +## πŸš€ Installazione + +### Prerequisiti + +- n8n installato e accessibile (http://192.168.254.12:5678) +- PostgreSQL con credenziali configurate in n8n +- Variabile ambiente `LOGWHISPERER_SECRET` configurata + +### Step 1: Configurare Variabile Ambiente + +```bash +# SSH nel container n8n o imposta via n8n UI +export LOGWHISPERER_SECRET="your-32-char-secret-here-minimum" +``` + +### Step 2: Importare il Workflow + +**Metodo A: Via n8n UI** + +1. Accedi a http://192.168.254.12:5678 +2. Vai su **Workflows** β†’ **Import from File** +3. Seleziona `workflows/logwhisperer_ingest.json` +4. Clicca **Save** + +**Metodo B: Via API (curl)** + +```bash +# Ottieni API key da n8n UI β†’ Settings β†’ API +curl -X POST http://192.168.254.12:5678/api/v1/workflows \ + -H "Content-Type: application/json" \ + -H "X-N8N-API-KEY: your-api-key" \ + -d @workflows/logwhisperer_ingest.json +``` + +### Step 3: Configurare Credenziali PostgreSQL + +1. In n8n UI, vai su **Settings** β†’ **Credentials** +2. Crea nuova credenziale **PostgreSQL** +3. Nome: `PostgreSQL LogWhisperer` +4. Inserisci host, port, database, user, password +5. Clicca **Save** + +### Step 4: Attivare il Workflow + +1. Apri il workflow `LogWhisperer_Ingest` +2. Clicca **Activate** (toggle in alto a destra) +3. Verifica che lo stato sia **Active** + +## πŸ§ͺ Testing + +### Esegui Test Suite + +```bash +cd /home/google/Sources/LucaSacchiNet/LogWhispererAI +./workflows/test_workflow.sh +``` + +### Test Manuale con curl + +**Test HMAC Valido:** + +```bash +# Genera HMAC +TIMESTAMP=$(date +%s) +PAYLOAD='{"client_id":"550e8400-e29b-41d4-a716-446655440000","hostname":"test-server","source":"/var/log/syslog","severity":"critical","raw_log":"Apr 2 10:30:00 kernel: Out of memory","matched_pattern":"OOM"}' +SIGNATURE=$(printf '%s:%s' "$TIMESTAMP" "$PAYLOAD" | openssl dgst -sha256 -hmac "test-secret-32-chars-long-minimum" | sed 's/^.* //') + +# Invia richiesta +curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \ + -H "Content-Type: application/json" \ + -H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \ + -H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \ + -d "$PAYLOAD" +``` + +**Test HMAC Invalido (deve ritornare 401):** + +```bash +curl -X POST http://192.168.254.12:5678/webhook/logwhisperer/ingest \ + -H "Content-Type: application/json" \ + -H "X-LogWhisperer-Signature: invalid-signature" \ + -H "X-LogWhisperer-Timestamp: $(date +%s)" \ + -d '{"client_id":"test","severity":"critical","raw_log":"test"}' +``` + +## πŸ”’ Sicurezza + +### Validazione HMAC + +- Algoritmo: HMAC-SHA256 +- Formato: `timestamp:signature` +- Anti-replay: Timestamp max 5 minuti di differenza +- Timing-safe comparison + +### Validazione Dati + +- `client_id`: UUID v4 obbligatorio +- `raw_log`: Non vuoto +- `severity`: Uno di `low`, `medium`, `critical` + +### Note Sicurezza + +- Non loggare `raw_log` completo nei nodi AI +- Usare HTTPS in produzione +- Rate limiting: max 10 req/min per client_id +- `LOGWHISPERER_SECRET` minimo 32 caratteri + +## πŸ“Š Schema Database + +```sql +CREATE TABLE logs ( + id SERIAL PRIMARY KEY, + client_id VARCHAR(36) NOT NULL, + hostname VARCHAR(255), + source VARCHAR(500), + severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')), + timestamp TIMESTAMP WITH TIME ZONE, + raw_log TEXT, + matched_pattern VARCHAR(100), + created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP +); + +CREATE INDEX idx_logs_client_id ON logs(client_id); +CREATE INDEX idx_logs_severity ON logs(severity); +CREATE INDEX idx_logs_timestamp ON logs(timestamp); +``` + +## πŸ”§ Troubleshooting + +### Workflow non risponde + +1. Verifica che n8n sia attivo: `curl http://192.168.254.12:5678/healthz` +2. Controlla che il workflow sia attivato +3. Verifica i log di n8n: `docker logs n8n` + +### Errore 401 Unauthorized + +- Verifica che `LOGWHISPERER_SECRET` sia configurato +- Controlla formato signature: `timestamp:signature` +- Verifica che timestamp sia recente (< 5 min) + +### Errore 400 Bad Request + +- Verifica formato UUID per `client_id` +- Controlla che `severity` sia valido +- Assicurati che `raw_log` non sia vuoto + +### Errore Database + +- Verifica credenziali PostgreSQL in n8n +- Controlla che la tabella sia stata creata +- Verifica permessi utente database + +## πŸ“ Changelog + +### 2026-04-02 - Sprint 2 Feature 2 + +- Creazione workflow iniziale +- Implementazione validazione HMAC +- Integrazione PostgreSQL +- Conditional alerting per severity critical + +## πŸ›‘οΈ Metodo Sacchi Applied + +- **Safety first**: Validazione HMAC prima di ogni operazione +- **Little often**: Un nodo per funzione, testabili individualmente +- **Double check**: Validazione dati dopo HMAC, verifica salvataggio DB diff --git a/workflows/REPORT.md b/workflows/REPORT.md new file mode 100644 index 0000000..a4e9b46 --- /dev/null +++ b/workflows/REPORT.md @@ -0,0 +1,271 @@ +# πŸ“Š Report: Workflow LogWhisperer_Ingest Creato + +## βœ… Stato: COMPLETATO + +Il workflow n8n per l'ingestion sicura dei log Γ¨ stato creato con successo seguendo il **Metodo Sacchi**. + +--- + +## πŸ“ File Creati + +| File | Descrizione | Dimensione | +|------|-------------|------------| +| `workflows/logwhisperer_ingest.json` | Export JSON del workflow n8n | 12.7 KB | +| `workflows/test_workflow.sh` | Test suite automatizzata | 6.3 KB | +| `workflows/README.md` | Documentazione workflow | 5.3 KB | +| `workflows/INTEGRATION.md` | Guida integrazione Bash ↔ n8n | 9.2 KB | + +--- + +## πŸ”§ Architettura Workflow + +``` +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ Webhook Trigger β”‚ ◄── POST /webhook/logwhisperer/ingest +β”‚ (POST /ingest) β”‚ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β”‚ + β–Ό +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ HMAC Validation β”‚ ◄── Verifica firma HMAC-SHA256 +β”‚ (Code Node) β”‚ Anti-replay (max 5 min) +β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β”‚ + β–Ό + β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” + β”‚Valid? β”‚ + β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”˜ + Yes β”‚ No + β”Œβ”€β”΄β”€β” + β–Ό β–Ό +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚Data Val. β”‚ β”‚ 401 Resp β”‚ +β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β”‚ + β–Ό +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚ Store Log β”‚ ◄── PostgreSQL INSERT +β”‚ (PostgreSQL) β”‚ Tabella: logs +β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ + β”‚ + β–Ό + β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” + β”‚Critical? β”‚ + β””β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜ + Yes β”‚ No + β”Œβ”€β”΄β”€β” + β–Ό β–Ό +β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” +β”‚AI Processβ”‚ β”‚200 OK β”‚ +β”‚(Sprint 3)β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ +``` + +--- + +## πŸ” Sicurezza Implementata + +### HMAC Validation +- **Algoritmo**: HMAC-SHA256 +- **Formato**: `timestamp:signature` in header `X-LogWhisperer-Signature` +- **Anti-replay**: Timestamp max 5 minuti di differenza +- **Timing-safe**: Comparazione signature con `crypto.timingSafeEqual()` + +### Data Validation +- `client_id`: UUID v4 obbligatorio (regex validazione) +- `raw_log`: Non vuoto dopo trim +- `severity`: Solo `low`, `medium`, `critical` +- **Normalizzazione**: Severity convertito a lowercase + +### Database +- **Tabella**: `logs` con vincoli CHECK su severity +- **Indici**: client_id, severity, timestamp +- **Audit**: created_at automatico + +--- + +## πŸš€ Istruzioni per Attivazione + +### 1. Importa il Workflow + +```bash +# Via n8n UI +curl -X POST http://192.168.254.12:5678/api/v1/workflows \ + -H "Content-Type: application/json" \ + -H "X-N8N-API-KEY: your-api-key" \ + -d @workflows/logwhisperer_ingest.json +``` + +### 2. Configura Variabile Ambiente + +```bash +# Nel container n8n +docker exec n8n sh -c 'export LOGWHISPERER_SECRET="your-32-char-secret-here"' + +# O in docker-compose.yml +environment: + - LOGWHISPERER_SECRET=your-32-char-secret-here +``` + +### 3. Configura Credenziali PostgreSQL + +1. Vai su http://192.168.254.12:5678/settings/credentials +2. Crea credenziale **PostgreSQL** +3. Nome: `PostgreSQL LogWhisperer` +4. Inserisci host, port, database, user, password + +### 4. Attiva il Workflow + +1. Apri il workflow in n8n UI +2. Clicca **Activate** (toggle in alto a destra) +3. Verifica stato **Active** + +--- + +## πŸ§ͺ Test Suite + +```bash +# Esegui tutti i test +cd /home/google/Sources/LucaSacchiNet/LogWhispererAI +./workflows/test_workflow.sh + +# Output atteso: +# ========================================== +# LogWhisperer AI - Workflow Test Suite +# Target: http://192.168.254.12:5678 +# ========================================== +# +# [INFO] Test 1: Invio log con HMAC valido... +# [INFO] βœ“ Test 1 PASSATO: Risposta 200 OK +# +# [INFO] Test 2: Invio log con HMAC invalido... +# [INFO] βœ“ Test 2 PASSATO: Risposta 401 Unauthorized (atteso) +# +# [INFO] Test 3: Invio log con dati invalidi... +# [INFO] βœ“ Test 3 PASSATO: Risposta 400 Bad Request (atteso) +# +# [INFO] Test 4: Invio log con severity=medium... +# [INFO] βœ“ Test 4 PASSATO: Risposta 200 OK (no AI trigger) +# +# ========================================== +# [INFO] Tutti i test PASSATI! βœ“ +``` + +--- + +## πŸ”— Integrazione con Script Bash + +### Configurazione Script + +```bash +# config.env +CLIENT_ID="550e8400-e29b-41d4-a716-446655440000" +CLIENT_SECRET="your-32-char-secret-here" +WEBHOOK_URL="http://192.168.254.12:5678/webhook/logwhisperer/ingest" +``` + +**⚠️ IMPORTANTE**: `CLIENT_SECRET` deve essere identico a `LOGWHISPERER_SECRET` su n8n! + +### Esempio di Chiamata + +```bash +# Genera payload +PAYLOAD=$(./scripts/secure_logwhisperer.sh --encode-json '{ + "client_id": "550e8400-e29b-41d4-a716-446655440000", + "hostname": "server-01", + "source": "/var/log/syslog", + "severity": "critical", + "raw_log": "kernel: Out of memory", + "matched_pattern": "OOM" +}') + +# Genera firma HMAC +TIMESTAMP=$(date +%s) +SIGNATURE=$(./scripts/secure_logwhisperer.sh --generate-hmac "$PAYLOAD" "$CLIENT_SECRET" "$TIMESTAMP") + +# Invia a n8n +curl -X POST "$WEBHOOK_URL" \ + -H "Content-Type: application/json" \ + -H "X-LogWhisperer-Signature: ${TIMESTAMP}:${SIGNATURE}" \ + -H "X-LogWhisperer-Timestamp: ${TIMESTAMP}" \ + -d "$PAYLOAD" +``` + +--- + +## πŸ“Š Schema Database + +```sql +CREATE TABLE IF NOT EXISTS logs ( + id SERIAL PRIMARY KEY, + client_id VARCHAR(36) NOT NULL, + hostname VARCHAR(255), + source VARCHAR(500), + severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')), + timestamp TIMESTAMP WITH TIME ZONE, + raw_log TEXT, + matched_pattern VARCHAR(100), + created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP +); + +CREATE INDEX idx_logs_client_id ON logs(client_id); +CREATE INDEX idx_logs_severity ON logs(severity); +CREATE INDEX idx_logs_timestamp ON logs(timestamp); +``` + +--- + +## ⚑ Codici Risposta + +| HTTP | Significato | Causa | +|------|-------------|-------| +| 200 | Success | Log salvato correttamente | +| 400 | Bad Request | Validazione dati fallita | +| 401 | Unauthorized | HMAC invalido o timestamp troppo vecchio | +| 500 | Server Error | Errore database o nodo code | + +--- + +## πŸ›‘οΈ Metodo Sacchi Applied + +βœ… **Safety First**: Validazione HMAC prima di qualsiasi operazione +- Il nodo HMAC Validation Γ¨ il primo filtro +- Nessun dato viene processato senza autenticazione valida + +βœ… **Little Often**: Un nodo per funzione +- Webhook trigger separato +- HMAC validation isolata +- Data validation dedicata +- Storage separato +- Conditional alerting + +βœ… **Double Check**: Verifiche multiple +- Validazione UUID formato +- Validazione severity values +- Controllo non-empty raw_log +- Verifica timestamp (anti-replay) +- Timing-safe HMAC comparison + +--- + +## πŸ“ Note per Sprint 3 + +Il nodo **AI Processing (Placeholder)** Γ¨ pronto per essere esteso: +- Riceve i dati del log quando severity = critical +- Non espone raw_log nei log (sicurezza) +- Pronto per integrazione con LLM/API AI + +--- + +## πŸ“š Documentazione + +- **README**: `workflows/README.md` - Guida completa workflow +- **Integrazione**: `workflows/INTEGRATION.md` - Integrazione Bash ↔ n8n +- **Test**: `workflows/test_workflow.sh` - Test suite +- **Changelog**: Aggiornato in `CHANGELOG.md` + +--- + +**Creato da**: @n8n-specialist +**Data**: 2026-04-02 +**Status**: βœ… Pronto per deployment diff --git a/workflows/logwhisperer_ingest.json b/workflows/logwhisperer_ingest.json new file mode 100644 index 0000000..f556f61 --- /dev/null +++ b/workflows/logwhisperer_ingest.json @@ -0,0 +1,355 @@ +{ + "name": "LogWhisperer_Ingest", + "nodes": [ + { + "parameters": {}, + "id": "trigger-node", + "name": "Webhook Trigger", + "type": "n8n-nodes-base.webhook", + "typeVersion": 1, + "position": [ + 250, + 300 + ], + "webhookId": "logwhisperer-ingest", + "path": "logwhisperer/ingest", + "responseMode": "responseNode", + "options": {} + }, + { + "parameters": { + "jsCode": "// HMAC Validation Node\n// Verifica la firma HMAC-SHA256 secondo il Metodo Sacchi: Safety First\n\nconst crypto = require('crypto');\n\n// Recupera headers\nconst signatureHeader = $headers['x-logwhisperer-signature'];\nconst timestampHeader = $headers['x-logwhisperer-timestamp'];\n\n// Recupera secret da variabile ambiente\nconst secret = process.env.LOGWHISPERER_SECRET;\n\nif (!secret) {\n throw new Error('LOGWHISPERER_SECRET not configured');\n}\n\nif (!signatureHeader || !timestampHeader) {\n return [{\n json: {\n valid: false,\n error: 'Missing authentication headers',\n statusCode: 401\n }\n }];\n}\n\n// Estrai timestamp e signature dal formato: timestamp:signature\nconst parts = signatureHeader.split(':');\nif (parts.length !== 2) {\n return [{\n json: {\n valid: false,\n error: 'Invalid signature format',\n statusCode: 401\n }\n }];\n}\n\nconst receivedTimestamp = parts[0];\nconst receivedSignature = parts[1];\n\n// Verifica timestamp (anti-replay: max 5 minuti di differenza)\nconst now = Math.floor(Date.now() / 1000);\nconst requestTime = parseInt(timestampHeader, 10);\nconst timeDiff = Math.abs(now - requestTime);\n\nif (timeDiff > 300) {\n return [{\n json: {\n valid: false,\n error: 'Request timestamp too old',\n statusCode: 401\n }\n }];\n}\n\n// Ottieni il payload raw\nconst payload = JSON.stringify($input.first().json);\n\n// Calcola HMAC atteso: HMAC-SHA256(timestamp:payload)\nconst expectedSignature = crypto\n .createHmac('sha256', secret)\n .update(`${timestampHeader}:${payload}`)\n .digest('hex');\n\n// Comparazione timing-safe\nconst isValid = crypto.timingSafeEqual(\n Buffer.from(receivedSignature, 'hex'),\n Buffer.from(expectedSignature, 'hex')\n);\n\nif (!isValid) {\n return [{\n json: {\n valid: false,\n error: 'Invalid signature',\n statusCode: 401\n }\n }];\n}\n\n// Validazione HMAC passata - restituisci payload per il prossimo nodo\nreturn [{\n json: {\n valid: true,\n data: $input.first().json\n }\n}];" + }, + "id": "hmac-validation-node", + "name": "HMAC Validation", + "type": "n8n-nodes-base.code", + "typeVersion": 2, + "position": [ + 450, + 300 + ] + }, + { + "parameters": { + "conditions": { + "options": { + "caseSensitive": true, + "leftValue": "", + "typeValidation": "strict" + }, + "conditions": [ + { + "id": "condition-1", + "leftValue": "={{ $json.valid }}", + "rightValue": "true", + "operator": { + "type": "boolean", + "operation": "equals" + } + } + ], + "combinator": "and" + } + }, + "id": "hmac-check-node", + "name": "HMAC Valid?", + "type": "n8n-nodes-base.if", + "typeVersion": 2, + "position": [ + 650, + 300 + ] + }, + { + "parameters": { + "jsCode": "// Data Validation Node\n// Validazione campi obbligatori secondo il Metodo Sacchi: Double Check\n\nconst data = $input.first().json.data;\nconst errors = [];\n\n// Validazione UUID per client_id\nfunction isValidUUID(uuid) {\n const uuidRegex = /^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$/;\n return uuidRegex.test(uuid);\n}\n\n// Validazione campi obbligatori\nif (!data.client_id) {\n errors.push('Missing client_id');\n} else if (!isValidUUID(data.client_id)) {\n errors.push('Invalid client_id format (must be UUID)');\n}\n\nif (!data.raw_log || data.raw_log.trim() === '') {\n errors.push('Missing or empty raw_log');\n}\n\nconst validSeverities = ['low', 'medium', 'critical'];\nif (!data.severity) {\n errors.push('Missing severity');\n} else if (!validSeverities.includes(data.severity.toLowerCase())) {\n errors.push(`Invalid severity: ${data.severity} (must be one of: ${validSeverities.join(', ')})`);\n}\n\nif (errors.length > 0) {\n return [{\n json: {\n valid: false,\n errors: errors,\n received: data\n }\n }];\n}\n\n// Normalizza severity a lowercase\nconst normalizedData = {\n ...data,\n severity: data.severity.toLowerCase()\n};\n\nreturn [{\n json: {\n valid: true,\n data: normalizedData\n }\n}];" + }, + "id": "data-validation-node", + "name": "Data Validation", + "type": "n8n-nodes-base.code", + "typeVersion": 2, + "position": [ + 850, + 200 + ] + }, + { + "parameters": { + "operation": "executeQuery", + "query": "INSERT INTO logs (client_id, hostname, source, severity, timestamp, raw_log, matched_pattern)\nVALUES ($1, $2, $3, $4, $5, $6, $7)\nRETURNING id;", + "options": { + "queryParams": "={{ JSON.stringify([$json.data.client_id, $json.data.hostname, $json.data.source, $json.data.severity, $json.data.timestamp, $json.data.raw_log, $json.data.matched_pattern]) }}" + } + }, + "id": "postgres-insert-node", + "name": "Store Log", + "type": "n8n-nodes-base.postgres", + "typeVersion": 2.2, + "position": [ + 1050, + 200 + ], + "credentials": { + "postgres": { + "id": "postgres-credentials", + "name": "PostgreSQL LogWhisperer" + } + } + }, + { + "parameters": { + "conditions": { + "options": { + "caseSensitive": false, + "leftValue": "", + "typeValidation": "strict" + }, + "conditions": [ + { + "id": "condition-1", + "leftValue": "={{ $json.data.severity }}", + "rightValue": "critical", + "operator": { + "type": "string", + "operation": "equals" + } + } + ], + "combinator": "or" + } + }, + "id": "severity-check-node", + "name": "Critical Severity?", + "type": "n8n-nodes-base.if", + "typeVersion": 2, + "position": [ + 1250, + 100 + ] + }, + { + "parameters": { + "jsCode": "// AI Processing Placeholder\n// Per Sprint 2 Feature 2: prepara i dati per l'elaborazione AI\n// Il nodo vero e proprio sarΓ  implementato nello Sprint 3\n\nconst logData = $input.first().json;\n\n// Log di sicurezza: non esporre raw_log completo nei log\nconsole.log('AI Processing requested for log ID:', logData.id);\nconsole.log('Client:', logData.data.client_id);\nconsole.log('Severity:', logData.data.severity);\nconsole.log('Pattern:', logData.data.matched_pattern);\n\nreturn [{\n json: {\n status: 'queued_for_ai_processing',\n log_id: logData.id,\n client_id: logData.data.client_id,\n severity: logData.data.severity,\n pattern: logData.data.matched_pattern,\n // raw_log escluso per sicurezza\n timestamp: new Date().toISOString()\n }\n}];" + }, + "id": "ai-processing-node", + "name": "AI Processing (Placeholder)", + "type": "n8n-nodes-base.code", + "typeVersion": 2, + "position": [ + 1450, + 50 + ] + }, + { + "parameters": { + "respondWith": "json", + "options": {} + }, + "id": "success-response-node", + "name": "Success Response", + "type": "n8n-nodes-base.respondToWebhook", + "typeVersion": 1.1, + "position": [ + 1450, + 250 + ] + }, + { + "parameters": { + "respondWith": "json", + "responseBody": "={\"error\": \"Unauthorized\", \"message\": \"Invalid HMAC signature\"}", + "options": { + "responseCode": 401 + } + }, + "id": "unauthorized-response-node", + "name": "401 Unauthorized", + "type": "n8n-nodes-base.respondToWebhook", + "typeVersion": 1.1, + "position": [ + 850, + 400 + ] + }, + { + "parameters": { + "respondWith": "json", + "responseBody": "={\"error\": \"Bad Request\", \"message\": $json.errors.join(\", \")}", + "options": { + "responseCode": 400 + } + }, + "id": "validation-error-node", + "name": "400 Validation Error", + "type": "n8n-nodes-base.respondToWebhook", + "typeVersion": 1.1, + "position": [ + 1050, + 400 + ] + }, + { + "parameters": { + "jsCode": "// Ensure Table Exists\n// Crea la tabella logs se non esiste giΓ \n\nreturn [{\n json: {\n sql: `\n CREATE TABLE IF NOT EXISTS logs (\n id SERIAL PRIMARY KEY,\n client_id VARCHAR(36) NOT NULL,\n hostname VARCHAR(255),\n source VARCHAR(500),\n severity VARCHAR(20) CHECK (severity IN ('low', 'medium', 'critical')),\n timestamp TIMESTAMP WITH TIME ZONE,\n raw_log TEXT,\n matched_pattern VARCHAR(100),\n created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n );\n \n CREATE INDEX IF NOT EXISTS idx_logs_client_id ON logs(client_id);\n CREATE INDEX IF NOT EXISTS idx_logs_severity ON logs(severity);\n CREATE INDEX IF NOT EXISTS idx_logs_timestamp ON logs(timestamp);\n `\n }\n}];" + }, + "id": "ensure-table-node", + "name": "Ensure Table SQL", + "type": "n8n-nodes-base.code", + "typeVersion": 2, + "position": [ + 650, + 100 + ] + }, + { + "parameters": { + "operation": "executeQuery", + "query": "={{ $json.sql }}", + "options": {} + }, + "id": "create-table-node", + "name": "Create Table", + "type": "n8n-nodes-base.postgres", + "typeVersion": 2.2, + "position": [ + 850, + 100 + ], + "credentials": { + "postgres": { + "id": "postgres-credentials", + "name": "PostgreSQL LogWhisperer" + } + } + } + ], + "connections": { + "Webhook Trigger": { + "main": [ + [ + { + "node": "HMAC Validation", + "type": "main", + "index": 0 + }, + { + "node": "Ensure Table SQL", + "type": "main", + "index": 0 + } + ] + ] + }, + "HMAC Validation": { + "main": [ + [ + { + "node": "HMAC Valid?", + "type": "main", + "index": 0 + } + ] + ] + }, + "HMAC Valid?": { + "main": [ + [ + { + "node": "Data Validation", + "type": "main", + "index": 0 + } + ], + [ + { + "node": "401 Unauthorized", + "type": "main", + "index": 0 + } + ] + ] + }, + "Data Validation": { + "main": [ + [ + { + "node": "Store Log", + "type": "main", + "index": 0 + } + ], + [ + { + "node": "400 Validation Error", + "type": "main", + "index": 0 + } + ] + ] + }, + "Store Log": { + "main": [ + [ + { + "node": "Critical Severity?", + "type": "main", + "index": 0 + }, + { + "node": "Success Response", + "type": "main", + "index": 0 + } + ] + ] + }, + "Critical Severity?": { + "main": [ + [ + { + "node": "AI Processing (Placeholder)", + "type": "main", + "index": 0 + } + ], + [ + { + "node": "Success Response", + "type": "main", + "index": 0 + } + ] + ] + }, + "Ensure Table SQL": { + "main": [ + [ + { + "node": "Create Table", + "type": "main", + "index": 0 + } + ] + ] + } + }, + "settings": { + "executionOrder": "v1" + }, + "staticData": null, + "tags": [ + { + "name": "logwhisperer", + "id": "tag-logwhisperer", + "createdAt": "2026-04-02T00:00:00.000Z", + "updatedAt": "2026-04-02T00:00:00.000Z" + }, + { + "name": "security", + "id": "tag-security", + "createdAt": "2026-04-02T00:00:00.000Z", + "updatedAt": "2026-04-02T00:00:00.000Z" + }, + "ingestion" + ] +} diff --git a/workflows/test_workflow.sh b/workflows/test_workflow.sh new file mode 100755 index 0000000..bb62ce0 --- /dev/null +++ b/workflows/test_workflow.sh @@ -0,0 +1,250 @@ +#!/bin/bash +# +# LogWhisperer AI - Workflow Test Script +# Verifica che il workflow n8n risponda correttamente +# + +set -euo pipefail + +N8N_URL="${N8N_URL:-http://192.168.254.12:5678}" +WEBHOOK_PATH="/webhook/logwhisperer/ingest" +CLIENT_SECRET="${CLIENT_SECRET:-test-secret-32-chars-long-minimum}" + +# Colori per output +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +NC='\033[0m' # No Color + +log_info() { + echo -e "${GREEN}[INFO]${NC} $1" +} + +log_error() { + echo -e "${RED}[ERROR]${NC} $1" +} + +log_warn() { + echo -e "${YELLOW}[WARN]${NC} $1" +} + +# Funzione per generare HMAC +generate_hmac() { + local payload="$1" + local timestamp="$2" + local secret="$3" + + printf '%s:%s' "$timestamp" "$payload" | \ + openssl dgst -sha256 -hmac "$secret" | \ + sed 's/^.* //' +} + +# Test 1: HMAC Valido +test_valid_hmac() { + log_info "Test 1: Invio log con HMAC valido..." + + local timestamp + timestamp=$(date +%s) + + local payload + payload=$(cat < /dev/null; then + log_error "curl non trovato. Installare curl." + exit 1 + fi + + if ! command -v openssl &> /dev/null; then + log_error "openssl non trovato. Installare openssl." + exit 1 + fi + + local failed=0 + + # Esegui test + test_valid_hmac || failed=$((failed + 1)) + echo "" + + test_invalid_hmac || failed=$((failed + 1)) + echo "" + + test_invalid_data || failed=$((failed + 1)) + echo "" + + test_medium_severity || failed=$((failed + 1)) + echo "" + + # Report finale + echo "==========================================" + if [[ $failed -eq 0 ]]; then + log_info "Tutti i test PASSATI! βœ“" + exit 0 + else + log_error "$failed test FALLITI! βœ—" + exit 1 + fi +} + +main "$@"