Embeddable AI Runtime · stable
LocalAI
Drop-in OpenAI API replacement running locally
28.0K stars
100 contributors
Since 2023
License
MIT
Min RAM
2 GB
Min CPUs
1 core
Scaling
single node
Complexity
intermediate
Performance
medium
Self-hostable
✓
K8s native
✕
Offline
✓
Pricing
fully free
Docs quality
good
Vendor lock-in
none
Use cases
- ✓ Drop-in OpenAI API replacement running locally
- ✓ Run multiple AI models (LLM+TTS+STT+Image)
- ✓ Privacy-preserving AI API endpoint
- ✓ Development without API costs
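Because LocalAI mirrors the OpenAI REST routes, an OpenAI-style chat request can simply target the local endpoint. A minimal stdlib-only sketch, assuming a server on `localhost:8080` and a model alias of `gpt-4` (both are illustrative assumptions; actual ports and model names depend on your configuration):

```python
# Sketch: building an OpenAI-compatible chat request aimed at a local
# endpoint instead of api.openai.com. Base URL and model name are
# assumptions for illustration.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the OpenAI-style /v1/chat/completions route."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Swapping base_url between a local server and the hosted API is the
# entire "drop-in" migration; the payload shape is unchanged.
req = build_chat_request("http://localhost:8080", "gpt-4", "Hello!")
print(req.full_url)
```

Sending the request (e.g. via `urllib.request.urlopen(req)`) requires a running LocalAI instance, which is why only the request construction is shown here.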
Anti-patterns / when NOT to use
- ✕ Slower than vLLM for pure LLM serving
- ✕ Model compatibility varies
- ✕ Configuration can be complex
Technical specs
Language
Go, C++
API type
SDK
Protocols
HTTP, gRPC
Deployment
Docker, binary
SDKs
Python, JavaScript, Go
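The Docker deployment path listed above can be sketched as follows. The image tag and port mapping are assumptions taken from common LocalAI usage; consult the project's documentation for current tags and model setup:

```shell
# Sketch: launching LocalAI via Docker (deployment option listed above).
# Tag "latest-aio-cpu" and port 8080 are assumptions for illustration.
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

# Once the container is up, the server exposes OpenAI-style routes,
# e.g. listing available models:
curl http://localhost:8080/v1/models
```

The binary deployment option avoids Docker entirely: a single self-contained executable started on the host serves the same HTTP API.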
Community
GitHub stars 28.0K
Contributors 100
Commit frequency weekly
Plugin ecosystem none
Backing Mudler
Funding community
Release
Latest version
— Last release —
Since 2023
Best fit
Team size
solo, small, medium
Industries
general