LLMs & AI Infra · Open LLM · stable
DeepSeek V3
Frontier open-source reasoning LLM with MoE architecture
15.0K stars
100 contributors
Since 2023
Open-weight MoE LLM with sparse attention and RL-based training; MIT-licensed, trained on 85k+ agentic tasks, with math benchmark results reported near GPT-5 level
License
MIT
Min RAM
16 GB
Min CPUs
4 cores
Scaling
distributed
Complexity
advanced
Performance
enterprise-grade
Self-hostable
✓
K8s native
✓
Offline
✓
Pricing
fully free
Docs quality
good
Vendor lock-in
none
Use cases
- ✓ Self-hosted AI chatbot with full data privacy
- ✓ Fine-tune for domain-specific tasks
- ✓ Code generation and review assistant
- ✓ Document analysis and summarization
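The chatbot and document-analysis use cases above typically run against a self-hosted inference server (e.g. vLLM or SGLang) that exposes an OpenAI-compatible HTTP API. A minimal sketch of building such a request body with only the standard library; the endpoint URL and model identifier are assumptions, not part of this listing:

```python
import json

# Hypothetical self-hosted endpoint; servers such as vLLM expose an
# OpenAI-compatible API at a path like this. URL is an assumption.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "deepseek-ai/DeepSeek-V3") -> str:
    """Serialize an OpenAI-compatible chat-completion request body."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload)

body = build_chat_request("Summarize this contract clause in plain English.")
print(body)
```

The serialized `body` would be POSTed to `BASE_URL` with a `Content-Type: application/json` header; since all data stays on your own hardware, this is what "full data privacy" means in practice.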
Anti-patterns / when NOT to use
- ✕ Requires multi-GPU for large models
- ✕ Self-hosting needs significant DevOps
- ✕ Smaller models trade quality for speed
- ✕ Not as capable as frontier closed models for hardest tasks
Technical specs
Language
Python
API type
SDK
Protocols
HTTP
Deployment
docker, binary
SDKs
Python
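Since deployment is listed as Docker or binary, one common pattern is serving the model through a containerized OpenAI-compatible server. A sketch only: the image name, flags, and GPU count are assumptions, and the full model requires multi-GPU hardware (see anti-patterns above):

```shell
# Assumed image and flags; adjust --tensor-parallel-size to your GPU count.
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model deepseek-ai/DeepSeek-V3 \
  --tensor-parallel-size 8
```

Once running, the server answers HTTP requests on port 8000, matching the HTTP protocol and Python SDK entries above.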
Community
GitHub stars 15.0K
Contributors 100
Commit frequency monthly
Plugin ecosystem none
Backing DeepSeek
Funding corporate
Release
Latest version
— Last release —
Since 2023
Best fit
Team size
small, medium, enterprise
Industries
general, SaaS, healthcare, fintech, legal, education