DeepSeek open-sources V4 models, scaling to 1.6 trillion parameters


Odaily Planet Daily News: DeepSeek has released a preview of its V4 series of open-source models under the MIT license, with weights available on Hugging Face and ModelScope.

The series includes two MoE models: V4-Pro, with roughly 1.6 trillion total parameters and 49 billion activated per token, and V4-Flash, with 284 billion total parameters and 13 billion activated per token. Both support a context window of about 1 million tokens. DeepSeek states that, compared with V3.2, the new models significantly reduce memory usage and computational overhead in long-context reasoning.
