China's DeepSeek has launched a new experimental model, DeepSeek-V3.2-Exp, with faster performance. Built on V3.1-Terminus, the model debuts DeepSeek Sparse Attention (DSA) for faster, more efficient training and inference on long context, the company said. DeepSeek-V3.2-Exp performs on par with DeepSeek-V3.1-Terminus at reduced compute costs and is available as open source on Hugging Face and GitHub.

DeepSeek-V3.2-Exp Released With Faster Performance

Rating: 5

TruLY Score 5 – Trustworthy | On a trust scale of 0-5, this article has scored 5 on LatestLY. It is verified through official sources (DeepSeek X account). The information is thoroughly cross-checked and confirmed. You can confidently share this article with your friends and family, knowing it is trustworthy and reliable.

(SocialLY brings you all the latest breaking news, fact checks and information from the social media world, including Twitter (X), Instagram and YouTube. The above post contains publicly available embedded media, taken directly from the user's social media account, and the views appearing in the social media post do not reflect the opinions of LatestLY.)