Engagement Prompt: Dashboard & Statistics (T30-T34)
🎯 MISSION
Implement the Dashboard & Statistics phase of the OpenRouter API Key Monitor project, strictly following TDD.
Tasks to complete: T30, T31, T32, T33, T34
📋 CONTEXT
AGENT: @tdd-developer
Repository: /home/google/Sources/LucaSacchiNet/openrouter-watcher
Current Status:
- ✅ Setup (T01-T05): 59 tests
- ✅ Database & Models (T06-T11): 73 tests
- ✅ Security Services (T12-T16): 70 tests
- ✅ User Authentication (T17-T22): 34 tests
- ✅ API Key Management (T23-T29): 61 tests
- 🎯 Total: 297 tests, ~98% coverage
Services Ready:
- EncryptionService - encryption/decryption
- get_current_user() - authentication
- ApiKey, UsageStats models - data
- get_db() - database session
Documentation:
- PRD: /home/google/Sources/LucaSacchiNet/openrouter-watcher/prd.md
- Architecture: /home/google/Sources/LucaSacchiNet/openrouter-watcher/export/architecture.md (sections 5.2 and 7)
🔧 TASKS TO IMPLEMENT
T30: Create Pydantic Schemas for Statistics
File: src/openrouter_monitor/schemas/stats.py
Requirements:
- UsageStatsCreate: api_key_id, date, model, requests_count, tokens_input, tokens_output, cost
- UsageStatsResponse: id, api_key_id, date, model, requests_count, tokens_input, tokens_output, cost, created_at
- StatsSummary: total_requests, total_cost, total_tokens_input, total_tokens_output, avg_cost_per_request, period_days
- StatsByModel: model, requests_count, cost, percentage_requests, percentage_cost
- StatsByDate: date, requests_count, cost
- StatsFilter: start_date, end_date, api_key_id (optional), model (optional)
- DashboardResponse: summary, by_model (list), by_date (list), top_models
Implementation:
from pydantic import BaseModel, Field
from datetime import date, datetime
from typing import List, Optional
from decimal import Decimal


class UsageStatsCreate(BaseModel):
    api_key_id: int
    date: date
    model: str = Field(..., min_length=1, max_length=100)
    requests_count: int = Field(..., ge=0)
    tokens_input: int = Field(..., ge=0)
    tokens_output: int = Field(..., ge=0)
    cost: Decimal = Field(..., ge=0, decimal_places=6)


class UsageStatsResponse(BaseModel):
    id: int
    api_key_id: int
    date: date
    model: str
    requests_count: int
    tokens_input: int
    tokens_output: int
    cost: Decimal
    created_at: datetime

    class Config:
        from_attributes = True


class StatsSummary(BaseModel):
    total_requests: int
    total_cost: Decimal
    total_tokens_input: int
    total_tokens_output: int
    avg_cost_per_request: Decimal
    period_days: int


class StatsByModel(BaseModel):
    model: str
    requests_count: int
    cost: Decimal
    percentage_requests: float
    percentage_cost: float


class StatsByDate(BaseModel):
    date: date
    requests_count: int
    cost: Decimal


class StatsFilter(BaseModel):
    start_date: date
    end_date: date
    api_key_id: Optional[int] = None
    model: Optional[str] = None


class DashboardResponse(BaseModel):
    summary: StatsSummary
    by_model: List[StatsByModel]
    by_date: List[StatsByDate]
    top_models: List[StatsByModel]
Test: tests/unit/schemas/test_stats_schemas.py (10+ tests)
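As a sketch of what one of the T30 unit tests could verify, the snippet below redefines a minimal stand-in for UsageStatsCreate inline (so it runs without the project package) and checks the cost constraints; it assumes Pydantic v2:

```python
# Hedged sketch: inline stand-in for the UsageStatsCreate schema above,
# not an import of the real project module.
from datetime import date
from decimal import Decimal

from pydantic import BaseModel, Field, ValidationError


class UsageStatsCreate(BaseModel):
    api_key_id: int
    date: date
    model: str = Field(..., min_length=1, max_length=100)
    requests_count: int = Field(..., ge=0)
    tokens_input: int = Field(..., ge=0)
    tokens_output: int = Field(..., ge=0)
    cost: Decimal = Field(..., ge=0, decimal_places=6)


def cost_is_rejected(raw_cost: str) -> bool:
    """Return True when the schema rejects the given cost string."""
    try:
        UsageStatsCreate(
            api_key_id=1,
            date=date(2024, 1, 15),
            model="anthropic/claude-3-opus",
            requests_count=10,
            tokens_input=100,
            tokens_output=50,
            cost=Decimal(raw_cost),
        )
    except ValidationError:
        return True
    return False
```

Both a negative cost and a cost with more than six decimal places should fail validation (e.g. `cost_is_rejected("-0.01")` and `cost_is_rejected("0.1234567")`).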
T31: Implement Statistics Aggregation Service
File: src/openrouter_monitor/services/stats.py
Requirements:
- Functions to aggregate usage_stats data:
  - get_summary(db, user_id, start_date, end_date, api_key_id=None) -> StatsSummary
  - get_by_model(db, user_id, start_date, end_date) -> List[StatsByModel]
  - get_by_date(db, user_id, start_date, end_date) -> List[StatsByDate]
  - get_dashboard_data(db, user_id, days=30) -> DashboardResponse
- SQLAlchemy queries with group_by, sum, avg
- Filter by user_id via a join with ApiKey
- Timezone handling (UTC)
Implementation:
from sqlalchemy.orm import Session
from sqlalchemy import func, desc
from datetime import date, timedelta
from typing import List, Optional
from decimal import Decimal
from openrouter_monitor.models import UsageStats, ApiKey
from openrouter_monitor.schemas import (
    StatsSummary, StatsByModel, StatsByDate, DashboardResponse
)


async def get_summary(
    db: Session,
    user_id: int,
    start_date: date,
    end_date: date,
    api_key_id: Optional[int] = None
) -> StatsSummary:
    """Get summary statistics for user."""
    query = db.query(
        func.sum(UsageStats.requests_count).label('total_requests'),
        func.sum(UsageStats.cost).label('total_cost'),
        func.sum(UsageStats.tokens_input).label('total_tokens_input'),
        func.sum(UsageStats.tokens_output).label('total_tokens_output'),
        func.avg(UsageStats.cost).label('avg_cost')
    ).join(ApiKey).filter(
        ApiKey.user_id == user_id,
        UsageStats.date >= start_date,
        UsageStats.date <= end_date
    )
    if api_key_id:
        query = query.filter(UsageStats.api_key_id == api_key_id)
    result = query.first()
    period_days = (end_date - start_date).days + 1
    return StatsSummary(
        total_requests=result.total_requests or 0,
        total_cost=Decimal(str(result.total_cost or 0)),
        total_tokens_input=result.total_tokens_input or 0,
        total_tokens_output=result.total_tokens_output or 0,
        avg_cost_per_request=Decimal(str(result.avg_cost or 0)),
        period_days=period_days
    )


async def get_by_model(
    db: Session,
    user_id: int,
    start_date: date,
    end_date: date
) -> List[StatsByModel]:
    """Get statistics grouped by model."""
    results = db.query(
        UsageStats.model,
        func.sum(UsageStats.requests_count).label('requests_count'),
        func.sum(UsageStats.cost).label('cost')
    ).join(ApiKey).filter(
        ApiKey.user_id == user_id,
        UsageStats.date >= start_date,
        UsageStats.date <= end_date
    ).group_by(UsageStats.model).order_by(desc('cost')).all()
    # Calculate percentages (guard against division by zero on empty data)
    total_requests = sum(r.requests_count for r in results) or 1
    total_cost = sum(r.cost for r in results) or 1
    return [
        StatsByModel(
            model=r.model,
            requests_count=r.requests_count,
            cost=Decimal(str(r.cost)),
            percentage_requests=(r.requests_count / total_requests) * 100,
            percentage_cost=(r.cost / total_cost) * 100
        )
        for r in results
    ]


async def get_by_date(
    db: Session,
    user_id: int,
    start_date: date,
    end_date: date
) -> List[StatsByDate]:
    """Get statistics grouped by date."""
    results = db.query(
        UsageStats.date,
        func.sum(UsageStats.requests_count).label('requests_count'),
        func.sum(UsageStats.cost).label('cost')
    ).join(ApiKey).filter(
        ApiKey.user_id == user_id,
        UsageStats.date >= start_date,
        UsageStats.date <= end_date
    ).group_by(UsageStats.date).order_by(UsageStats.date).all()
    return [
        StatsByDate(
            date=r.date,
            requests_count=r.requests_count,
            cost=Decimal(str(r.cost))
        )
        for r in results
    ]


async def get_dashboard_data(
    db: Session,
    user_id: int,
    days: int = 30
) -> DashboardResponse:
    """Get complete dashboard data."""
    end_date = date.today()
    start_date = end_date - timedelta(days=days - 1)
    summary = await get_summary(db, user_id, start_date, end_date)
    by_model = await get_by_model(db, user_id, start_date, end_date)
    by_date = await get_by_date(db, user_id, start_date, end_date)
    return DashboardResponse(
        summary=summary,
        by_model=by_model,
        by_date=by_date,
        top_models=by_model[:5]  # Top 5 models by cost
    )
Test: tests/unit/services/test_stats.py (15+ tests)
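The percentage arithmetic inside get_by_model is the part most easily broken by an empty result set; a stdlib-only sketch of the same guard logic (a hypothetical helper, not part of the project API) that a unit test could pin down:

```python
from decimal import Decimal


def model_percentages(rows):
    """rows: (model, requests_count, cost) tuples.

    Mirrors get_by_model's `or 1` fallbacks so an empty result set
    never divides by zero.
    """
    total_requests = sum(r[1] for r in rows) or 1
    total_cost = sum((r[2] for r in rows), Decimal(0)) or Decimal(1)
    return [
        (
            model,
            round(count / total_requests * 100, 1),
            round(float(cost / total_cost * 100), 1),
        )
        for model, count, cost in rows
    ]
```

For example, two models with 75/25 request and 3/1 cost splits should come back as 75.0%/25.0% each, and an empty input should return an empty list rather than raise.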
T32: Implement GET /api/stats/dashboard Endpoint (Dashboard)
File: src/openrouter_monitor/routers/stats.py
Requirements:
- Endpoint: GET /api/stats/dashboard
- Auth: requires current_user
- Query params: days (default 30, max 365)
- Returns: DashboardResponse
- Uses the get_dashboard_data() service
Implementation:
from fastapi import APIRouter, Depends, Query
from sqlalchemy.orm import Session
from datetime import date
from openrouter_monitor.database import get_db
from openrouter_monitor.dependencies import get_current_user
from openrouter_monitor.models import User
from openrouter_monitor.schemas import DashboardResponse
from openrouter_monitor.services.stats import get_dashboard_data

router = APIRouter(prefix="/api/stats", tags=["stats"])


@router.get("/dashboard", response_model=DashboardResponse)
async def get_dashboard(
    days: int = Query(default=30, ge=1, le=365),
    current_user: User = Depends(get_current_user),
    db: Session = Depends(get_db)
):
    """Get dashboard statistics for current user.

    Returns summary, usage by model, usage by date for the specified period.
    """
    return await get_dashboard_data(db, current_user.id, days)
Test:
- Test dashboard with default 30 days
- Test dashboard with custom days
- Test dashboard capped at 365 days
- Test without authentication (401)
T33: Implement GET /api/stats/usage Endpoint (Details)
File: src/openrouter_monitor/routers/stats.py
Requirements:
- Endpoint: GET /api/stats/usage (the /api/stats prefix comes from the router)
- Auth: requires current_user
- Query params:
  - start_date (required)
  - end_date (required)
  - api_key_id (optional)
  - model (optional)
  - skip (default 0)
  - limit (default 100, max 1000)
- Returns: paginated list of UsageStatsResponse
- Ordering: date DESC, then model
Implementation:
# Additional imports for this endpoint (same routers/stats.py module;
# Query, date, Depends, User, get_current_user and get_db are already imported above)
from typing import List, Optional

from openrouter_monitor.models import ApiKey, UsageStats
from openrouter_monitor.schemas import UsageStatsResponse


@router.get("/usage", response_model=List[UsageStatsResponse])
async def get_usage_details(
    start_date: date,
    end_date: date,
    api_key_id: Optional[int] = None,
    model: Optional[str] = None,
    skip: int = Query(default=0, ge=0),
    limit: int = Query(default=100, ge=1, le=1000),
    current_user: User = Depends(get_current_user),
    db: Session = Depends(get_db)
):
    """Get detailed usage statistics with filtering and pagination.

    Returns raw usage data aggregated by date and model.
    """
    query = db.query(UsageStats).join(ApiKey).filter(
        ApiKey.user_id == current_user.id,
        UsageStats.date >= start_date,
        UsageStats.date <= end_date
    )
    if api_key_id:
        query = query.filter(UsageStats.api_key_id == api_key_id)
    if model:
        query = query.filter(UsageStats.model == model)
    usage = query.order_by(
        UsageStats.date.desc(),
        UsageStats.model
    ).offset(skip).limit(limit).all()
    return usage
Test:
- Test date filter
- Test api_key_id filter
- Test model filter
- Test pagination (skip, limit)
- Test combined filters
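The expected ordering and pagination can be pinned down with a stdlib-only reference implementation to compare endpoint output against (a hypothetical helper, not project code):

```python
def order_usage(rows, skip=0, limit=100):
    """Reference ordering for the usage endpoint: date DESC, then model ASC.

    Python's sort is stable, so sorting by the secondary key first and the
    primary key second yields the mixed-direction ordering, then slicing
    applies skip/limit pagination.
    """
    rows = sorted(rows, key=lambda r: r["model"])               # secondary: model ASC
    rows = sorted(rows, key=lambda r: r["date"], reverse=True)  # primary: date DESC
    return rows[skip:skip + limit]
```

ISO date strings sort the same way as dates, so test fixtures can use either.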
T34: Write Tests for Stats Endpoints
File: tests/unit/routers/test_stats.py
Requirements:
- Integration tests for the dashboard and usage endpoints
- Mock usage_stats data for consistent tests
- Test coverage >= 90%
Tests to implement:
Dashboard Tests:
- GET /api/stats/dashboard with default 30 days
- GET /api/stats/dashboard with days param
- GET /api/stats/dashboard returns correct data
- GET /api/stats/dashboard top models
Usage Tests:
- GET /api/stats/usage date filter
- GET /api/stats/usage api_key_id filter
- GET /api/stats/usage model filter
- GET /api/stats/usage pagination
Security Tests:
- User A cannot see user B's usage
- Filtering by another user's api_key_id returns an empty result
- Without authentication (401)
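The join-based isolation is the security-critical piece. The sketch below exercises the same query shape against an in-memory SQLite database with minimal stand-in models (not the project's real models), assuming SQLAlchemy 1.4+:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine, func
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class ApiKey(Base):
    __tablename__ = "api_keys"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, nullable=False)


class UsageStats(Base):
    __tablename__ = "usage_stats"
    id = Column(Integer, primary_key=True)
    api_key_id = Column(Integer, ForeignKey("api_keys.id"), nullable=False)
    requests_count = Column(Integer, nullable=False)


def total_requests_for(db, user_id):
    """Joining through ApiKey means a user can only ever reach their own rows."""
    total = (
        db.query(func.sum(UsageStats.requests_count))
        .join(ApiKey, UsageStats.api_key_id == ApiKey.id)
        .filter(ApiKey.user_id == user_id)
        .scalar()
    )
    return total or 0


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
db = sessionmaker(bind=engine)()
db.add_all([ApiKey(id=1, user_id=1), ApiKey(id=2, user_id=2)])
db.add_all([
    UsageStats(api_key_id=1, requests_count=10),
    UsageStats(api_key_id=2, requests_count=99),
])
db.commit()
```

With this seed data, user 1 must see only their own 10 requests, user 2 only their 99, and an unknown user gets zero, never another user's rows.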
🔄 TDD WORKFLOW
For EVERY task:
- RED: write a failing test (before the code!)
- GREEN: implement the minimum code to pass the test
- REFACTOR: improve the code while the tests stay green
📁 FILE STRUCTURE TO CREATE
src/openrouter_monitor/
├── schemas/
│ ├── __init__.py # Add stats schema exports
│ └── stats.py # T30
├── routers/
│ ├── __init__.py # Add stats router
│ └── stats.py # T32, T33
├── services/
│ ├── __init__.py # Add stats exports
│ └── stats.py # T31
└── main.py # Register stats router
tests/unit/
├── schemas/
│ └── test_stats_schemas.py # T30 + T34
├── services/
│ └── test_stats.py # T31 + T34
└── routers/
└── test_stats.py # T32, T33 + T34
🧪 TEST EXAMPLES
Schema Test
def test_stats_summary_calculates_correctly():
    summary = StatsSummary(
        total_requests=1000,
        total_cost=Decimal("125.50"),
        total_tokens_input=50000,
        total_tokens_output=20000,
        avg_cost_per_request=Decimal("0.1255"),
        period_days=30
    )
    assert summary.total_requests == 1000
    assert summary.total_cost == Decimal("125.50")
Service Test
@pytest.mark.asyncio
async def test_get_summary_returns_correct_totals(db_session, test_user, sample_usage_stats):
    summary = await get_summary(
        db_session,
        test_user.id,
        date(2024, 1, 1),
        date(2024, 1, 31)
    )
    assert summary.total_requests > 0
    assert summary.total_cost > 0
Endpoint Test
def test_dashboard_returns_summary_and_charts(client, auth_token, db_session):
    response = client.get(
        "/api/stats/dashboard",
        headers={"Authorization": f"Bearer {auth_token}"}
    )
    assert response.status_code == 200
    data = response.json()
    assert "summary" in data
    assert "by_model" in data
    assert "by_date" in data
✅ ACCEPTANCE CRITERIA
- T30: Stats schemas with full validation
- T31: Aggregation service with SQLAlchemy queries
- T32: /api/stats/dashboard endpoint with parameters
- T33: /api/stats/usage endpoint with filters and pagination
- T34: Complete tests, coverage >= 90%
- All tests pass: pytest tests/unit/ -v
- Users see only their own statistics
- Correct aggregations (sum, avg, group_by)
- 5 atomic commits following conventional commits
- progress.md updated
📝 COMMIT MESSAGES
feat(schemas): T30 add Pydantic statistics schemas
feat(services): T31 implement statistics aggregation service
feat(stats): T32 implement dashboard endpoint
feat(stats): T33 implement usage details endpoint with filters
test(stats): T34 add comprehensive statistics endpoint tests
🚀 FINAL VERIFICATION
cd /home/google/Sources/LucaSacchiNet/openrouter-watcher
# Test schemas
pytest tests/unit/schemas/test_stats_schemas.py -v
# Test services
pytest tests/unit/services/test_stats.py -v --cov=src/openrouter_monitor/services
# Test routers
pytest tests/unit/routers/test_stats.py -v --cov=src/openrouter_monitor/routers
# Full test suite
pytest tests/unit/ -v --cov=src/openrouter_monitor
📊 API RESPONSE EXAMPLES
Dashboard Response
{
  "summary": {
    "total_requests": 15234,
    "total_cost": "125.50",
    "total_tokens_input": 450000,
    "total_tokens_output": 180000,
    "avg_cost_per_request": "0.0082",
    "period_days": 30
  },
  "by_model": [
    {
      "model": "anthropic/claude-3-opus",
      "requests_count": 5234,
      "cost": "89.30",
      "percentage_requests": 34.3,
      "percentage_cost": 71.2
    }
  ],
  "by_date": [
    {
      "date": "2024-01-15",
      "requests_count": 523,
      "cost": "4.23"
    }
  ],
  "top_models": [...]
}
Usage Response
[
  {
    "id": 1,
    "api_key_id": 1,
    "date": "2024-01-15",
    "model": "anthropic/claude-3-opus",
    "requests_count": 234,
    "tokens_input": 45000,
    "tokens_output": 12000,
    "cost": "8.92",
    "created_at": "2024-01-15T12:00:00Z"
  }
]
📝 IMPORTANT NOTES
- Absolute paths: always use /home/google/Sources/LucaSacchiNet/openrouter-watcher/
- Timezone: use UTC for all dates
- Decimal: use Decimal for costs (6 decimal places)
- Performance: queries backed by indexes (date, api_key_id, model)
- Isolation: users see only their own statistics (user_id filter via ApiKey join)
- Limits: max 365 days for dashboard, max 1000 results for usage
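The six-decimal cost convention above can be enforced at ingestion with a small helper (hypothetical name, stdlib only):

```python
from decimal import ROUND_HALF_UP, Decimal

SIX_PLACES = Decimal("0.000001")


def normalize_cost(value: str) -> Decimal:
    """Parse a raw cost string and quantize it to exactly 6 decimal places."""
    return Decimal(value).quantize(SIX_PLACES, rounding=ROUND_HALF_UP)
```

Quantizing on the way in keeps stored costs and aggregated sums on the same precision, so equality checks in tests don't depend on float rounding.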
AGENT: @tdd-developer
START WITH: T30 - Pydantic statistics schemas
WHEN DONE: Confirm completion, coverage >= 90%, update progress.md