release: v0.5.0 - Authentication, API Keys & Advanced Features

Complete v0.5.0 implementation:

Database (@db-engineer):
- 3 migrations: users, api_keys, report_schedules tables
- Foreign keys, indexes, constraints, enums

Backend (@backend-dev):
- JWT authentication service with bcrypt (cost=12)
- Auth endpoints: /register, /login, /refresh, /me
- API Keys service with hash storage and prefix validation
- API Keys endpoints: CRUD + rotate
- Security module with JWT HS256
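The security module's HS256 signing can be illustrated with a stdlib-only sketch. This is not the service's actual code (which the dependency list suggests uses python-jose); the function names and claim set here are illustrative assumptions:

```python
import base64
import hashlib
import hmac
import json
import time


def _b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_access_token(sub: str, secret: str, expires_in: int = 900) -> str:
    """Build a signed HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(
        json.dumps({"sub": sub, "exp": int(time.time()) + expires_in}).encode()
    )
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token: str, secret: str) -> dict:
    """Check the signature and expiry; return the claims on success."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    # re-pad base64url before decoding the claims
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

In the real service, python-jose's `jwt.encode`/`jwt.decode` would do this work (and bcrypt via passlib handles password hashing); the sketch only shows what an HS256 token is made of.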

Frontend (@frontend-dev):
- Login/Register pages with validation
- AuthContext with localStorage persistence
- Protected routes implementation
- API Keys management UI (create, revoke, rotate)
- Header with user dropdown

DevOps (@devops-engineer):
- .env.example and .env.production.example
- docker-compose.scheduler.yml
- scripts/setup-secrets.sh
- INFRASTRUCTURE_SETUP.md

QA (@qa-engineer):
- 85 E2E tests: auth.spec.ts, apikeys.spec.ts, scenarios.spec.ts, regression-v050.spec.ts
- auth-helpers.ts with 20+ utility functions
- Test plans and documentation

Architecture (@spec-architect):
- SECURITY.md with best practices
- SECURITY-CHECKLIST.md for pre-deployment review
- Updated architecture.md with auth flows
- Updated README.md with v0.5.0 features

Documentation:
- Updated todo.md with v0.5.0 status
- Added docs/README.md index
- Complete setup instructions

Dependencies added:
- bcrypt, python-jose, passlib, email-validator

Tested: JWT auth flow, API keys CRUD, protected routes, 85 E2E tests ready
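The API-key scheme described above (hash-only storage plus prefix validation) can be sketched as follows. The `mk_` prefix and the function names are illustrative assumptions, not the actual implementation:

```python
import hashlib
import secrets

PREFIX = "mk_"  # assumed key prefix; lets validation reject malformed keys early


def generate_api_key() -> tuple[str, str]:
    """Return (plaintext_key, sha256_hash). Only the hash is persisted;
    the plaintext is shown to the user once, at creation or rotation time."""
    plaintext = PREFIX + secrets.token_urlsafe(32)
    return plaintext, hashlib.sha256(plaintext.encode()).hexdigest()


def verify_api_key(candidate: str, stored_hash: str) -> bool:
    """Prefix check first, then constant-time comparison of the digests."""
    if not candidate.startswith(PREFIX):
        return False
    digest = hashlib.sha256(candidate.encode()).hexdigest()
    return secrets.compare_digest(digest, stored_hash)
```

Under this scheme, the rotate endpoint would generate a fresh pair, overwrite the stored hash, and return the new plaintext exactly once.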

Closes: v0.5.0 milestone
commit cc60ba17ea (parent 9b9297b7dc)
Author: Luca Sacchi Ricciardi
Date: 2026-04-07 19:22:47 +02:00
49 changed files with 9847 additions and 176 deletions


@@ -0,0 +1,135 @@
version: '3.8'

# =============================================================================
# MockupAWS Scheduler Service - Docker Compose
# =============================================================================
# This file provides a separate scheduler service for running cron jobs.
#
# Usage:
#   # Run scheduler alongside main services
#   docker-compose -f docker-compose.yml -f docker-compose.scheduler.yml up -d
#
#   # Run only scheduler
#   docker-compose -f docker-compose.scheduler.yml up -d scheduler
#
#   # View scheduler logs
#   docker-compose logs -f scheduler
# =============================================================================

services:
  # Redis (required for Celery - Options 2 and 3)
  redis:
    image: redis:7-alpine
    container_name: mockupaws-redis
    restart: unless-stopped
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 5s
      retries: 5
    networks:
      - mockupaws-network

  # ===========================================================================
  # OPTION 1: Standalone Scheduler Service (Recommended for v0.5.0)
  # Uses APScheduler running in a separate container
  # ===========================================================================
  scheduler:
    build:
      context: .
      dockerfile: Dockerfile.backend
    container_name: mockupaws-scheduler
    restart: unless-stopped
    command: >
      sh -c "python -m src.jobs.report_scheduler"
    environment:
      - DATABASE_URL=${DATABASE_URL:-postgresql+asyncpg://postgres:postgres@postgres:5432/mockupaws}
      - REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
      - SCHEDULER_ENABLED=true
      - SCHEDULER_INTERVAL_MINUTES=5
      # Email configuration
      - EMAIL_PROVIDER=${EMAIL_PROVIDER:-sendgrid}
      - SENDGRID_API_KEY=${SENDGRID_API_KEY}
      - EMAIL_FROM=${EMAIL_FROM:-noreply@mockupaws.com}
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION:-us-east-1}
      # JWT
      - JWT_SECRET_KEY=${JWT_SECRET_KEY}
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - mockupaws-network
    volumes:
      - ./storage/reports:/app/storage/reports
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

  # ===========================================================================
  # OPTION 2: Celery Worker (For high-volume processing)
  # Uncomment to use Celery + Redis for distributed task processing
  # ===========================================================================
  # celery-worker:
  #   build:
  #     context: .
  #     dockerfile: Dockerfile.backend
  #   container_name: mockupaws-celery-worker
  #   restart: unless-stopped
  #   command: >
  #     sh -c "celery -A src.jobs.celery_app worker --loglevel=info --concurrency=2"
  #   environment:
  #     - DATABASE_URL=${DATABASE_URL:-postgresql+asyncpg://postgres:postgres@postgres:5432/mockupaws}
  #     - CELERY_BROKER_URL=${REDIS_URL:-redis://redis:6379/0}
  #     - CELERY_RESULT_BACKEND=${REDIS_URL:-redis://redis:6379/0}
  #     - EMAIL_PROVIDER=${EMAIL_PROVIDER:-sendgrid}
  #     - SENDGRID_API_KEY=${SENDGRID_API_KEY}
  #     - EMAIL_FROM=${EMAIL_FROM:-noreply@mockupaws.com}
  #   depends_on:
  #     - redis
  #     - postgres
  #   networks:
  #     - mockupaws-network
  #   volumes:
  #     - ./storage/reports:/app/storage/reports

  # ===========================================================================
  # OPTION 3: Celery Beat (Scheduler)
  # Uncomment to use Celery Beat for cron-like scheduling
  # Note: DatabaseScheduler requires the django-celery-beat package; drop the
  # --scheduler flag to use Celery's default file-based beat scheduler.
  # ===========================================================================
  # celery-beat:
  #   build:
  #     context: .
  #     dockerfile: Dockerfile.backend
  #   container_name: mockupaws-celery-beat
  #   restart: unless-stopped
  #   command: >
  #     sh -c "celery -A src.jobs.celery_app beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler"
  #   environment:
  #     - DATABASE_URL=${DATABASE_URL:-postgresql+asyncpg://postgres:postgres@postgres:5432/mockupaws}
  #     - CELERY_BROKER_URL=${REDIS_URL:-redis://redis:6379/0}
  #     - CELERY_RESULT_BACKEND=${REDIS_URL:-redis://redis:6379/0}
  #   depends_on:
  #     - redis
  #     - postgres
  #   networks:
  #     - mockupaws-network

# Reuse network from main docker-compose.yml
networks:
  mockupaws-network:
    external: true
    name: mockupaws_mockupaws-network

volumes:
  redis_data:
    driver: local
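The standalone scheduler container (Option 1) runs `python -m src.jobs.report_scheduler` on a fixed interval (`SCHEDULER_INTERVAL_MINUTES=5`). The compose comments say the real module uses APScheduler; this stdlib-only sketch illustrates just the interval loop, and `max_runs` plus the job callback are assumptions for demonstration:

```python
import os
import sched
import time

# Interval comes from the same env var the compose file sets (default 5 minutes)
INTERVAL = int(os.environ.get("SCHEDULER_INTERVAL_MINUTES", "5")) * 60


def _run_and_rearm(scheduler: sched.scheduler, job, runs_left: int) -> None:
    """Run the job, then re-arm the timer (a poor man's interval trigger)."""
    job()
    if runs_left > 1:
        scheduler.enter(INTERVAL, 1, _run_and_rearm, (scheduler, job, runs_left - 1))


def start(job, max_runs: int = 1) -> None:
    """Fire `job` immediately, then every INTERVAL seconds, max_runs times."""
    s = sched.scheduler(time.monotonic, time.sleep)
    s.enter(0, 1, _run_and_rearm, (s, job, max_runs))
    s.run()  # blocks until no events remain, like the long-running container process
```

With APScheduler, the same behavior would be an interval trigger on a `BlockingScheduler`; the point here is only that the container's main process must block forever, which is why the service gets `restart: unless-stopped`.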