DeepSeek Unveils V4 Model with 1.6 Trillion Parameters, Advancing AI Capabilities
DeepSeek, a Chinese AI lab, has introduced its latest large language models, DeepSeek V4 Flash and V4 Pro, which feature significant architectural improvements over previous versions. The V4 Pro, with 1.6 trillion parameters, is currently the largest open-weights model in the world, surpassing competitors such as Moonshot AI's Kimi K2.6. Both models support a context window of up to 1 million tokens, allowing users to work with large volumes of documents and code in a single session. The V4 Pro Max variant reportedly outperforms leading systems such as OpenAI's GPT-5.2 on reasoning tasks, although it still trails U.S. models in knowledge-based evaluations.