In what has become a heavyweight fight, major AI research labs are constantly striving to one-up each other's models. The latest blow comes from Chinese research lab DeepSeek, which has repeatedly shaken up the industry with launches of high-performance, low-cost models. Its newest release is DeepSeek-V3.2.
According to the company’s technical report, the new release introduces an innovative architecture designed to radically improve efficiency while maintaining top-tier reasoning capabilities. On some benchmarks, it even outperforms GPT-5.
To understand whether DeepSeek is ready to disrupt the market once again, I discussed the new release with Paul Roetzer, Founder and CEO of SmarterX and the Marketing AI Institute, on Episode 183 of The Artificial Intelligence Show.
Competition With China
DeepSeek-V3.2 introduces a new mechanism called “DeepSeek Sparse Attention” (DSA).
DSA allows the model to process long streams of information at significantly lower computational cost than traditional attention. The result is a system that balances high efficiency with deep reasoning capabilities, especially in “long context” scenarios where other models might struggle or become prohibitively expensive.
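To make the intuition concrete, here is a minimal, illustrative sketch of top-k sparse attention, the general idea behind mechanisms like DSA: each query attends only to a small subset of tokens rather than the full sequence, which is what cuts the cost of long-context processing. The function name, the top_k parameter, and the toy dimensions are assumptions for illustration only; this is not DeepSeek's actual DSA implementation, which is detailed in the technical report.

```python
import numpy as np

def topk_sparse_attention(q, k, v, top_k=64):
    """Toy top-k sparse attention: each query attends only to its
    top_k highest-scoring keys instead of the full sequence.
    Illustrative sketch only, not DeepSeek's actual DSA."""
    scores = q @ k.T / np.sqrt(q.shape[-1])          # (n_q, n_k) attention logits
    if top_k < scores.shape[-1]:
        # Keep only the top_k logits per query; mask out the rest.
        kth = np.partition(scores, -top_k, axis=-1)[:, -top_k][:, None]
        scores = np.where(scores >= kth, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the kept tokens
    return weights @ v                               # (n_q, d_v)

# Example: 8 queries over a 1,024-token context, each attending to just 64 tokens.
rng = np.random.default_rng(0)
q = rng.standard_normal((8, 32))
k = rng.standard_normal((1024, 32))
v = rng.standard_normal((1024, 32))
print(topk_sparse_attention(q, k, v, top_k=64).shape)  # (8, 32)
```

In this toy version each query still scores every key before masking; the efficiency gains of a production sparse-attention design come from avoiding most of that work in the first place.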
For Roetzer, the release confirms that the battle for AI dominance is global.
“Obviously DeepSeek is a major player in this and could be a disruptive force for what the American AI labs are doing with their models,” Roetzer says.
A Gold Medal Showing
The release includes two variants: the standard DeepSeek-V3.2 and a high-compute version, DeepSeek-V3.2-Special.
The capabilities outlined in the technical paper are eye-opening:
- Agentic Thinking: The model integrates a “thinking process” directly into tool use, allowing it to reason through complex tasks that call external software or code.
- GPT-5 Level Reasoning: The standard V3.2 model outperforms “GPT-5-High” on several reasoning benchmarks.
- Gold Medal Performance: V3.2-Special outperformed GPT-5 and Google’s Gemini-3.0-Pro on several benchmarks and won gold medals at both the 2025 International Mathematical Olympiad and the International Olympiad in Informatics.
“One-Upping” Meta?
DeepSeek’s aggressive open-source strategy is also having an impact on other major players, especially Meta.
Meta CEO Mark Zuckerberg has spent billions of dollars establishing the company as a champion of open-source AI, aiming to commoditize the model market. But DeepSeek continues to release models that rival or beat the best Western open-source alternatives, often with greater efficiency.
“That’s probably the biggest danger: DeepSeek is surpassing Zuckerberg in doing what he wanted to do, which was to commoditize the model market with open source models,” Roetzer says. “And they’ve already beaten them many times.”
The Bottom Line
The gap between open models and proprietary frontier models may be narrowing faster than expected.
“Definitely noteworthy,” Roetzer says. “Usually when DeepSeek does something, it has an immediate impact.”
