
DeepSeek V3.2

DeepSeek V3.2 is a model from DeepSeek, released in September 2025. It costs $0.28 / $0.42 per 1M tokens (input / output), has a 128k-token context window, and is best suited to cheap reasoning, cheap production workloads, and open-weights deployment. Last verified 2026-04-19.

Spec sheet

Pricing

Input: $0.28 / 1M tokens
Output: $0.42 / 1M tokens
Cached input: $0.028 / 1M tokens
Free tier: via OpenRouter

Context & speed

Context window: 128k tokens
Max output: 64k tokens
Throughput: ~85 tok/s
Time to first token: ~700 ms
Speed tier: balanced

Capabilities

Tool use: Yes
Structured output: Yes
Prompt caching: Yes
Extended thinking: Yes
Vision input: No
Audio in / out: No
Fine-tuning: Yes

Deployment

Open weights: Yes
On-prem: Yes
HIPAA eligible: No
Zero retention: No
Regions: APAC, US

Estimated monthly cost

Assumes typical token shape: 2k input, 600 output per call. Prompt caching is excluded from these figures.

10k calls/mo: $8.12
100k calls/mo: $81.20
1M calls/mo: $812.00
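The figures above follow from simple arithmetic. A minimal sketch, using the spec-sheet prices and the stated token shape (2k input, 600 output per call), with caching excluded as noted:

```python
# Estimate monthly DeepSeek V3.2 API cost from call volume.
# Prices are $ per 1M tokens, taken from the spec sheet above.
INPUT_PRICE = 0.28
OUTPUT_PRICE = 0.42

def monthly_cost(calls_per_month, input_tokens=2_000, output_tokens=600):
    """Cost in USD, assuming the typical token shape and no prompt caching."""
    per_call = (input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE) / 1_000_000
    return calls_per_month * per_call

for calls in (10_000, 100_000, 1_000_000):
    print(f"{calls:>9,} calls/mo: ${monthly_cost(calls):,.2f}")
```

At this token shape each call costs $0.000812, so cost scales linearly with volume; swap in your own token counts to re-estimate.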

When to use DeepSeek V3.2

Sweet spot

  • cheap reasoning
  • cheap production
  • open weights

Known trade-offs

  • data is routed through China on the hosted API
  • ranks below the top 20 on arena leaderboards

Works with

DeepSeek SDK · OpenAI-compatible API · OpenRouter · Ollama · vLLM · LangChain
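Because the hosted API is OpenAI-compatible, most tooling only needs a base-URL swap. A minimal stdlib-only sketch; the base URL, model name, and `DEEPSEEK_API_KEY` environment variable are assumptions to check against DeepSeek's docs:

```python
# Minimal chat-completion request against DeepSeek's OpenAI-compatible API.
# Base URL, model name, and env var are assumptions; verify in DeepSeek's docs.
import json
import os
import urllib.request

BASE_URL = "https://api.deepseek.com/v1"  # assumed OpenAI-compatible endpoint

def build_request(prompt, model="deepseek-chat"):
    """Build the JSON payload an OpenAI-compatible chat endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 600,
    }

def send(payload):
    """POST the payload and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['DEEPSEEK_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_request("Summarize prompt caching in one sentence.")
# send(payload)  # requires a valid API key; not executed here
```

The same payload works unchanged through OpenRouter or a self-hosted vLLM server by pointing `BASE_URL` at that endpoint instead.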

FAQ — DeepSeek V3.2

How much does DeepSeek V3.2 cost?

DeepSeek V3.2 costs $0.28 (input) / $0.42 (output) per 1M tokens on the DeepSeek API. Cached input reads cost $0.028 per 1M, cutting the input bill by roughly 90% on repeated system prompts.
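The 90% figure follows directly from the two input prices. A quick sanity check:

```python
# Cached-input discount: fraction of the input bill caching removes.
CACHE_MISS = 0.28   # $ per 1M input tokens, uncached
CACHE_HIT = 0.028   # $ per 1M input tokens, served from cache

savings = 1 - CACHE_HIT / CACHE_MISS
print(f"{savings:.0%}")  # fraction of input cost saved on cached reads
```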

What is the context window of DeepSeek V3.2?

DeepSeek V3.2 has a 128k-token context window with up to 64k tokens of output. That's enough for typical chat and short-document tasks.
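A practical consequence: the prompt plus the requested output must fit in the window together (an assumption here; whether the window covers input and output jointly varies by provider). A small budget check:

```python
# Rough check that a request fits DeepSeek V3.2's context limits.
CONTEXT_WINDOW = 128_000  # tokens, from the spec sheet
MAX_OUTPUT = 64_000       # tokens, from the spec sheet

def fits(prompt_tokens, output_tokens=600):
    """True if the prompt plus requested output fit in the context window."""
    if output_tokens > MAX_OUTPUT:
        return False
    return prompt_tokens + output_tokens <= CONTEXT_WINDOW

print(fits(2_000))    # typical chat call
print(fits(130_000))  # oversized document: does not fit
```

For longer documents, the usual workaround is chunking the input and summarizing per chunk rather than sending one oversized prompt.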

Does DeepSeek V3.2 have a free tier?

Yes. DeepSeek V3.2 is often available for free via OpenRouter, and the official API is very cheap ($0.28 per 1M input tokens on a cache miss, $0.028 per 1M cached). Start at https://openrouter.ai.

Is DeepSeek V3.2 HIPAA / EU / on-prem friendly?

DeepSeek V3.2 is not HIPAA-eligible, not available in an EU region, and offers open weights for self-hosting. Zero data retention is not available.

What is DeepSeek V3.2 best for?

DeepSeek V3.2 is best for cheap reasoning, cheap production workloads, and open-weights self-hosting. Trade-offs to be aware of: data is routed through China on the hosted API, and the model ranks below the top 20 on arena leaderboards.

Which tools and SDKs work with DeepSeek V3.2?

DeepSeek V3.2 integrates with DeepSeek SDK, OpenAI-compatible API, OpenRouter, Ollama, vLLM, LangChain. Most major AI frameworks support it either natively or through OpenAI-compatible endpoints.