DeepSeek
Cheapest frontier LLM API
★ 4.3
Ultra-low-cost LLM API with OpenAI-compatible interface. DeepSeek-V3 models offer state-of-the-art performance at a fraction of competitor pricing. MIT-licensed models available for self-hosting.
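Because the API follows the OpenAI Chat Completions format, existing OpenAI SDK code can usually be pointed at DeepSeek by changing only the base URL, API key, and model name. A minimal sketch, assuming the official `openai` Python package and a `DEEPSEEK_API_KEY` environment variable (the env var name is this example's choice):

```python
import os
from openai import OpenAI

# Point the standard OpenAI client at DeepSeek's OpenAI-compatible endpoint.
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # assumed env var name
    base_url="https://api.deepseek.com",
)

# deepseek-chat = non-thinking mode; deepseek-reasoner = deep thinking mode.
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the MIT license in one sentence."},
    ],
)
print(response.choices[0].message.content)
```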
Features
✓ DeepSeek-Chat: Non-thinking mode, 128K context
✓ DeepSeek-Reasoner: Deep thinking mode, 128K context
✓ OpenAI-compatible API format
✓ JSON output mode (sketched below)
✓ Function/tool calling (sketched below)
✓ Prompt-cache hit pricing (cached input tokens ~90% cheaper)
✓ MIT-licensed models for self-hosting
✓ Multi-round conversations
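JSON output and tool calling use the same OpenAI-style parameters (`response_format` and `tools`), and multi-round conversations are handled by replaying the message history. A hedged sketch, assuming DeepSeek mirrors OpenAI's Chat Completions semantics for these fields; `get_weather` is a hypothetical local tool defined only for this example:

```python
import json
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"], base_url="https://api.deepseek.com")

# --- JSON output mode: request a strict JSON object back. ---
json_resp = client.chat.completions.create(
    model="deepseek-chat",
    response_format={"type": "json_object"},
    messages=[{"role": "user", "content": 'Return {"capital": ...} for France as JSON.'}],
)
print(json.loads(json_resp.choices[0].message.content))

# --- Function/tool calling in a multi-round conversation. ---
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical local tool, not part of the API
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
first = client.chat.completions.create(model="deepseek-chat", messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]  # assumes the model chose to call the tool

# Run the tool locally (stubbed here), then feed the result back for round two.
messages.append(first.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": call.id,
    "content": json.dumps({"city": "Paris", "temp_c": 18}),
})
second = client.chat.completions.create(model="deepseek-chat", messages=messages, tools=tools)
print(second.choices[0].message.content)
```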
Pros
+ 10-50x cheaper than OpenAI/Claude
+ OpenAI-compatible API (easy migration)
+ MIT-licensed models for self-hosting
+ Strong reasoning capabilities
+ 128K context window
Cons
− Newer provider, less established
− Based in China (data residency and compliance considerations)
− Smaller ecosystem and community
− Limited model selection compared with OpenAI