
China’s DeepSeek introduced the V3.1 model two weeks after GPT-5 was released. Among its advantages, the company highlights a hybrid thinking mode and smarter tool calling.
DeepSeek V3.1 was quietly announced in a post in one of the company’s WeChat groups, and the model was published on the Hugging Face platform. The next step in improving the V3 line, V3.1 uses 685 billion parameters, which makes it one of the world’s largest artificial intelligence systems. However, the context window is only 128 thousand tokens. DeepSeek uses a mixture-of-experts design, activating only the parts of the model needed for each query. This lowers computational costs and attracts developers looking for a combination of power and efficiency.
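To make the “activating only the necessary parts” idea concrete, here is a minimal sketch of top-k mixture-of-experts routing. It is an illustration of the general technique, not DeepSeek’s actual code; the expert count, dimensions and `top_k` value are arbitrary assumptions for the example.

```python
# Illustrative mixture-of-experts routing: only the top_k highest-scoring
# experts are executed for a given token, so most of the network stays idle.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(token, experts, router_weights, top_k=2):
    """Route a token through only the top_k highest-scoring experts."""
    scores = softmax(token @ router_weights)   # one score per expert
    chosen = np.argsort(scores)[-top_k:]       # indices of the selected experts
    output = np.zeros_like(token)
    for i in chosen:                           # only the selected experts run
        output += scores[i] * experts[i](token)
    return output

rng = np.random.default_rng(0)
d_model, n_experts = 16, 8
# Each "expert" is a small stand-in feed-forward block.
experts = [
    (lambda W: (lambda x: np.tanh(x @ W)))(rng.standard_normal((d_model, d_model)))
    for _ in range(n_experts)
]
router_weights = rng.standard_normal((d_model, n_experts))
token = rng.standard_normal(d_model)
print(moe_forward(token, experts, router_weights).shape)  # (16,)
```

With `top_k=2` out of 8 experts, only a quarter of the expert parameters are touched per token, which is the source of the efficiency the article describes, scaled down to toy size here.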
The new model combines rapid response capabilities with advanced reasoning, a technical step forward that makes it more versatile than many open-source alternatives. The hybrid architecture is the biggest feature that sets V3.1 apart from previous iterations and other models. The company emphasizes three key advantages:
- Hybrid thinking mode — one model supports both thinking and non-thinking modes, toggled by changing the chat template (see the sketch after this list)
- Smarter tool invocation — thanks to post-training optimization, the model’s performance in tool use and agent tasks has improved significantly
- Better thinking performance — DeepSeek-V3.1-Think achieves response quality comparable to DeepSeek-R1-0528 while responding faster
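The chat-template toggle can be sketched with the Hugging Face `transformers` API. This is a hedged example: passing extra keyword arguments through `apply_chat_template` is standard, but the `thinking` variable name and the `deepseek-ai/DeepSeek-V3.1` repository id are assumptions based on the company’s published description rather than a verified walkthrough of the official template.

```python
# Minimal sketch: the same checkpoint serves both modes, and only the
# rendered prompt changes depending on the template variable.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-V3.1")  # assumed repo id
messages = [{"role": "user", "content": "Summarize mixture-of-experts in one sentence."}]

# Thinking mode: the template inserts the reasoning scaffold before the answer.
thinking_prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True, thinking=True
)

# Non-thinking mode: the same messages render to a direct-answer prompt.
fast_prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True, thinking=False
)

print(thinking_prompt != fast_prompt)  # the two modes produce different prompts
```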
Many AI developers, particularly in the US, are increasingly building custom applications on top of the earlier DeepSeek R1 model. This is happening despite fears that the model could disseminate Chinese narratives and collect user data.


According to TechSpot, industry experts noted that while the latest DeepSeek release is not as big as the R1 release earlier this year, it is a major achievement. William Falcon, founder and CEO of the Lightning AI platform, called DeepSeek’s steady progress exceptional, pointing to the potential challenge it poses for OpenAI if its own open-source offerings fail to keep pace with the Chinese competitor.