Commit Graph

5 Commits

Author SHA1 Message Date
Luca Sacchi Ricciardi
fbbd3ed50c docs: enhance local LLM providers documentation with network setup
Some checks failed
CI / test (3.10) (push) Has been cancelled
CI / test (3.11) (push) Has been cancelled
CI / test (3.12) (push) Has been cancelled
CI / lint (push) Has been cancelled
Expand SKILL.md LLM Providers section with comprehensive network configuration:

New Sections:
- Custom URL Configuration: table of scenarios (localhost, LAN, VPN, Docker)
- Ollama Network Setup: 3 configuration options (env var, systemd, Docker)
- LM Studio Network Setup: step-by-step instructions with CORS configuration
- Common Architectures: ASCII diagrams for common setups
  * Dedicated AI server
  * Multi-client configuration
  * Remote access via VPN
- Security & Firewall: UFW and iptables rules
- SSH Tunnel: secure alternative to direct exposure
- Network Troubleshooting: common issues and solutions
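The network setup the sections above describe can be sketched roughly as follows. This is a hedged sketch, not the documented commands: `OLLAMA_HOST` is Ollama's real listen-address variable, but the subnet, hostname, and service-override text are placeholders.

```shell
# Expose Ollama on all interfaces instead of loopback only
# (OLLAMA_HOST is Ollama's listen-address environment variable)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve

# Persisting the same setting for the systemd service would look like:
#   sudo systemctl edit ollama
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"

# Restrict access to the local subnet with UFW
# (192.168.1.0/24 is a placeholder for your LAN range)
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp

# Or avoid exposing the port entirely and tunnel over SSH from a client
# ("user@ai-server" is a hypothetical host)
ssh -N -L 11434:localhost:11434 user@ai-server
```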

Security Notes:
- Warning about exposing to public IP
- Firewall rules for LAN-only access
- SSH tunnel as secure alternative

Examples:
- Docker deployment
- WireGuard VPN setup
- Multi-user configurations
- CORS troubleshooting

This documentation enables users to deploy Ollama/LM Studio
on a dedicated server and access it from multiple clients.
2026-04-06 18:31:57 +02:00
Luca Sacchi Ricciardi
0b33cd1619 feat: add support for local LLM providers (Ollama & LM Studio)
Implement local LLM inference support for Ollama and LM Studio:

New Clients:
- OllamaClient: Interface to Ollama API (default: localhost:11434)
- LMStudioClient: Interface to LM Studio API (default: localhost:1234)

Factory Updates:
- Added OLLAMA and LMSTUDIO to LLMProvider enum
- Updated create_client() to instantiate local clients
- Updated list_available_providers() with is_local flag

Configuration:
- Added ollama_base_url and lmstudio_base_url settings
- Local providers report as configured in the API key check (no key required)

Tests:
- Comprehensive test suite (250+ lines)
- Tests for client initialization and invocation
- Factory integration tests

Documentation:
- Added LLM Providers section to SKILL.md
- Documented setup for Ollama and LM Studio
- Added usage examples and configuration guide

Usage:
  provider: ollama, model: llama3.2
  provider: lmstudio, model: local-model
2026-04-06 18:28:21 +02:00
Luca Sacchi Ricciardi
568489cae4 docs: comprehensive documentation for NotebookLM-RAG integration
Update documentation to reflect new integration features:

README.md:
- Add 'NotebookLM + RAG Integration' section after Overview
- Update DocuMente component section with new endpoints
- Add notebooklm_sync.py and notebooklm_indexer.py to architecture
- Add integration API examples
- Add link to docs/integration.md

SKILL.md:
- Add RAG Integration to Capabilities table
- Update Autonomy Rules with new endpoints
- Add RAG Integration section to Quick Reference
- Add Sprint 2 changelog with integration features
- Update Skill Version to 1.2.0

docs/integration.md (NEW):
- Complete integration guide with architecture diagram
- API reference for all sync and query endpoints
- Usage examples and workflows
- Best practices and troubleshooting
- Performance considerations and limitations
- Roadmap for future features

All documentation now accurately reflects the unified
NotebookLM + RAG agent capabilities.
2026-04-06 18:01:50 +02:00
Luca Sacchi Ricciardi
fe88bf2ca1 refactor: fix linting issues and code quality
- Fix import ordering in __init__.py
- Remove unused imports from dependencies.py
- Fix import sorting across multiple files
- Apply ruff auto-fixes

No functional changes
2026-04-06 01:19:38 +02:00
Luca Sacchi Ricciardi
4b7a419a98 feat(api): implement notebook management CRUD endpoints
Implement Sprint 1: Notebook Management CRUD

- Add NotebookService with full CRUD operations
- Add POST /api/v1/notebooks (create notebook)
- Add GET /api/v1/notebooks (list with pagination)
- Add GET /api/v1/notebooks/{id} (get by ID)
- Add PATCH /api/v1/notebooks/{id} (partial update)
- Add DELETE /api/v1/notebooks/{id} (delete)
- Add Pydantic models for requests/responses
- Add custom exceptions (ValidationError, NotFoundError, NotebookLMError)
- Add comprehensive unit tests (31 tests, 97% coverage)
- Add API integration tests (26 tests)
- Fix router prefix duplication
- Fix JSON serialization in error responses
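The service layer behind those endpoints can be sketched as an in-memory CRUD service. This is a hypothetical stand-in, assuming a dict-backed store; the real `NotebookService` and exception names come from the commit, but every method body here is illustrative.

```python
import uuid


class NotebookLMError(Exception):
    """Base error, per the commit's custom exception hierarchy."""


class ValidationError(NotebookLMError):
    pass


class NotFoundError(NotebookLMError):
    pass


class NotebookService:
    """Hypothetical in-memory sketch of the CRUD service."""

    def __init__(self) -> None:
        self._store: dict[str, dict] = {}

    def create(self, title: str) -> dict:
        # POST /api/v1/notebooks
        if not title.strip():
            raise ValidationError("title must not be empty")
        notebook = {"id": uuid.uuid4().hex, "title": title}
        self._store[notebook["id"]] = notebook
        return notebook

    def list(self, offset: int = 0, limit: int = 20) -> list[dict]:
        # GET /api/v1/notebooks — offset/limit pagination over insertion order
        return list(self._store.values())[offset : offset + limit]

    def get(self, notebook_id: str) -> dict:
        # GET /api/v1/notebooks/{id}
        try:
            return self._store[notebook_id]
        except KeyError:
            raise NotFoundError(notebook_id) from None

    def update(self, notebook_id: str, **fields) -> dict:
        # PATCH /api/v1/notebooks/{id} — partial update: only the
        # supplied fields change, matching PATCH semantics
        notebook = self.get(notebook_id)
        notebook.update(fields)
        return notebook

    def delete(self, notebook_id: str) -> None:
        # DELETE /api/v1/notebooks/{id}
        self.get(notebook_id)  # raises NotFoundError for unknown ids
        del self._store[notebook_id]
```

Keeping validation and not-found logic in the service, with routes translating exceptions into HTTP responses, is what makes the 31 service-level unit tests possible independently of the 26 API integration tests.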

BREAKING CHANGE: None
2026-04-06 01:13:13 +02:00