China's DeepSeek has launched a new experimental model, DeepSeek-V3.2-Exp, with faster performance. Built on V3.1-Terminus, the model debuts DeepSeek Sparse Attention (DSA) for faster, more efficient training and inference on long context, the company said. DeepSeek-V3.2-Exp performs on par with DeepSeek-V3.1-Terminus while cutting compute costs, and it is available as open source on Hugging Face and GitHub.
DeepSeek-V3.2-Exp Released With Faster Performance
🚀 Introducing DeepSeek-V3.2-Exp — our latest experimental model!
✨ Built on V3.1-Terminus, it debuts DeepSeek Sparse Attention (DSA) for faster, more efficient training & inference on long context.
👉 Now live on App, Web, and API.
💰 API prices cut by 50%+!
1/n
— DeepSeek (@deepseek_ai) September 29, 2025












