Ollama
About Ollama
Run large language models (LLMs) locally. Compatible with Llama 2, Mistral, Gemma, and more. Your own private AI without depending on external services.
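
For example, once Ollama is running (it listens on http://localhost:11434 by default) and a model has been pulled, you can generate text with a single HTTP request. A minimal sketch in Python, assuming the `llama2` model was downloaded beforehand with `ollama pull llama2` and the `requests` package is installed:

```python
import requests

# Ask the locally running Ollama server for a completion.
# Assumes Ollama is on its default port and `ollama pull llama2` was run first.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": "Explain in one sentence what Ollama does.",
        "stream": False,  # return the full answer as a single JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```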
Key features
- Open source models
- OpenAI-compatible API (see the sketch after this list)
- No usage limits
- Complete privacy
- Multiple models available
- Web interface included
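
Because the API is OpenAI-compatible, existing OpenAI client code can be pointed at the local server just by changing the base URL. A minimal sketch, assuming the official `openai` Python package is installed and the `llama2` model has been pulled (the API key can be any placeholder string, since Ollama does not check it):

```python
from openai import OpenAI

# Point the standard OpenAI client at the local Ollama server.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # required by the client, ignored by Ollama
)

chat = client.chat.completions.create(
    model="llama2",
    messages=[{"role": "user", "content": "Say hello in five words."}],
)
print(chat.choices[0].message.content)
```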
System requirements
| Resource | Minimum | Recommended |
|---|---|---|
| RAM | 8 GB | 16 GB |
| CPU | 4 vCPU | 8 vCPU |
| Storage | 20 GB | 50 GB |