DeepSeek

Cheapest frontier LLM API

Ultra-low-cost LLM API with OpenAI-compatible interface. DeepSeek-V3 models offer state-of-the-art performance at a fraction of competitor pricing. MIT-licensed models available for self-hosting.
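
Because the interface is OpenAI-compatible, an existing OpenAI SDK integration can typically be pointed at DeepSeek by swapping only the base URL and API key. Below is a minimal sketch, assuming the https://api.deepseek.com endpoint and the deepseek-chat model name from DeepSeek's documentation; the DEEPSEEK_API_KEY environment variable is a placeholder for your own key.

```python
import os

from openai import OpenAI

# Reuse the official OpenAI SDK; only the base URL and key change.
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # placeholder env var
    base_url="https://api.deepseek.com",     # DeepSeek's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",  # non-thinking mode, 128K context
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain in one sentence what an OpenAI-compatible API is."},
    ],
)

print(response.choices[0].message.content)
```

Because the request and response shapes match the OpenAI Chat Completions API, migrating an existing integration is usually a configuration change rather than a rewrite.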

Features

DeepSeek-Chat: Non-thinking mode, 128K context
DeepSeek-Reasoner: Deep thinking mode, 128K context
OpenAI-compatible API format
JSON output mode (see the sketch after this list)
Function/tool calling
Cache-hit pricing (cached input tokens 90% cheaper)
MIT-licensed models for self-hosting
Multi-round conversations
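
A sketch of the JSON output mode referenced above, again through the OpenAI SDK. The response_format parameter follows the OpenAI convention; the prompt_cache_hit_tokens and prompt_cache_miss_tokens usage fields are DeepSeek-specific additions whose names are taken from DeepSeek's documentation, so the code reads them defensively.

```python
import json
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # placeholder env var
    base_url="https://api.deepseek.com",
)

response = client.chat.completions.create(
    model="deepseek-chat",
    # JSON output mode: the model is constrained to return a valid JSON object.
    response_format={"type": "json_object"},
    messages=[
        {"role": "system", "content": "Reply with a JSON object containing 'answer' and 'confidence'."},
        {"role": "user", "content": "Is a 128K-token context enough for most retrieval pipelines?"},
    ],
)

data = json.loads(response.choices[0].message.content)
print(data)

# Cache-hit pricing: DeepSeek reports cached vs. uncached prompt tokens in the
# usage block. The field names are an assumption, so read them defensively.
usage = response.usage
print("cache hit tokens:", getattr(usage, "prompt_cache_hit_tokens", "n/a"))
print("cache miss tokens:", getattr(usage, "prompt_cache_miss_tokens", "n/a"))
```

The discounted cache-hit pricing applies to the cached portion of the prompt, so logging these two counters is a cheap way to verify that repeated prompt prefixes are actually being served from cache.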

Pros

  • 10-50x cheaper than OpenAI/Claude
  • OpenAI-compatible API (easy migration)
  • MIT-licensed models for self-hosting
  • Strong reasoning capabilities (see the sketch after this list)
  • 128K context window
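
The reasoning point above maps to the deepseek-reasoner model from the feature list. Below is a sketch assuming DeepSeek's documented behavior of returning the thinking trace in a separate reasoning_content field alongside the final answer; the field name is treated as an assumption and read defensively.

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # placeholder env var
    base_url="https://api.deepseek.com",
)

response = client.chat.completions.create(
    model="deepseek-reasoner",  # deep thinking mode, 128K context
    messages=[
        {"role": "user", "content": "A train travels 120 km in 90 minutes. What is its average speed in km/h?"},
    ],
)

message = response.choices[0].message
# DeepSeek documents a separate field carrying the model's reasoning trace;
# the attribute name is an assumption, so it is accessed defensively.
print("reasoning:", getattr(message, "reasoning_content", "n/a"))
print("answer:", message.content)
```

Per DeepSeek's documentation, the reasoning trace is informational only; in multi-round conversations, only the final content should be appended to the message history for follow-up requests.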

Cons

  • Newer provider, less established
  • Based in China (data considerations)
  • Smaller ecosystem/community
  • Limited model selection vs OpenAI