feat: add support for local LLM providers (Ollama & LM Studio)
Implement local LLM inference support for Ollama and LM Studio:

New Clients:
- OllamaClient: Interface to Ollama API (default: localhost:11434)
- LMStudioClient: Interface to LM Studio API (default: localhost:1234)
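
The commit pins down the class names and default ports but not the client API. A minimal sketch of what the two clients might look like, assuming a single invoke() method, Ollama's /api/generate endpoint, and LM Studio's OpenAI-compatible /v1/chat/completions endpoint:

  import requests

  class OllamaClient:
      def __init__(self, base_url: str = "http://localhost:11434"):
          self.base_url = base_url.rstrip("/")

      def invoke(self, model: str, prompt: str) -> str:
          # With streaming disabled, Ollama returns the whole completion at once.
          resp = requests.post(
              f"{self.base_url}/api/generate",
              json={"model": model, "prompt": prompt, "stream": False},
              timeout=120,
          )
          resp.raise_for_status()
          return resp.json()["response"]

  class LMStudioClient:
      def __init__(self, base_url: str = "http://localhost:1234"):
          self.base_url = base_url.rstrip("/")

      def invoke(self, model: str, prompt: str) -> str:
          # LM Studio serves an OpenAI-compatible chat completions API.
          resp = requests.post(
              f"{self.base_url}/v1/chat/completions",
              json={"model": model,
                    "messages": [{"role": "user", "content": prompt}]},
              timeout=120,
          )
          resp.raise_for_status()
          return resp.json()["choices"][0]["message"]["content"]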

Factory Updates:
- Added OLLAMA and LMSTUDIO to LLMProvider enum
- Updated create_client() to instantiate local clients
- Updated list_available_providers() with is_local flag
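
A sketch of how the factory pieces might fit together, reusing the client classes above (enum values match the commit; signatures and dict keys are assumptions, not taken from the diff):

  from enum import Enum

  class LLMProvider(Enum):
      # Existing remote providers omitted; the commit adds these two values.
      OLLAMA = "ollama"
      LMSTUDIO = "lmstudio"

  def create_client(provider: LLMProvider, base_url: str | None = None):
      # Local clients are constructed from a base URL rather than an API key.
      if provider is LLMProvider.OLLAMA:
          return OllamaClient(base_url or "http://localhost:11434")
      if provider is LLMProvider.LMSTUDIO:
          return LMStudioClient(base_url or "http://localhost:1234")
      raise ValueError(f"unsupported provider: {provider}")

  def list_available_providers() -> list[dict]:
      # The is_local flag lets callers skip API-key checks for these entries.
      return [
          {"name": "ollama", "is_local": True},
          {"name": "lmstudio", "is_local": True},
      ]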

Configuration:
- Added ollama_base_url and lmstudio_base_url settings
- Local providers always report as configured in the API-key check, since no key is required
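
One plausible shape for those settings and the configured check; only the two base-URL setting names come from the commit, the env-var names and everything else here are assumptions:

  import os
  from dataclasses import dataclass

  @dataclass
  class Settings:
      # Defaults match the servers' standard local ports.
      ollama_base_url: str = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
      lmstudio_base_url: str = os.getenv("LMSTUDIO_BASE_URL", "http://localhost:1234")

  LOCAL_PROVIDERS = {"ollama", "lmstudio"}

  def is_configured(provider_name: str, api_keys: dict[str, str]) -> bool:
      # Local providers need no API key, so they always count as configured.
      if provider_name in LOCAL_PROVIDERS:
          return True
      return bool(api_keys.get(provider_name))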

Tests:
- Comprehensive test suite (250+ lines)
- Tests for client initialization and invocation
- Factory integration tests
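
The suite itself isn't included in this message; a sketch of the style of test it likely contains, mocking the HTTP layer so no local server is needed (assumes the OllamaClient sketch above):

  from unittest.mock import MagicMock, patch

  def test_ollama_invoke_calls_generate_endpoint():
      fake = MagicMock()
      fake.json.return_value = {"response": "hello"}
      with patch("requests.post", return_value=fake) as post:
          client = OllamaClient()
          # The client should return the completion and hit the default URL.
          assert client.invoke("llama3.2", "hi") == "hello"
          assert post.call_args.args[0] == "http://localhost:11434/api/generate"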

Documentation:
- Added LLM Providers section to SKILL.md
- Documented setup for Ollama and LM Studio
- Added usage examples and configuration guide

Usage:
  provider: ollama, model: llama3.2
  provider: lmstudio, model: local-model
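
In code, that usage might look like the following, assuming the factory and client sketches above:

  client = create_client(LLMProvider.OLLAMA)
  print(client.invoke("llama3.2", "Say hello in one sentence."))

  client = create_client(LLMProvider.LMSTUDIO)
  print(client.invoke("local-model", "Say hello in one sentence."))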