Alibaba’s Qwen has introduced its latest AI innovation, the Qwen3-VL model, described by the company as “the most powerful vision-language model in the Qwen family to date.” In a blog post, the Qwen team highlighted major upgrades in this generation, ranging from improved text and image understanding to reasoning about dynamic video content. Qwen said, "Whether it’s understanding and generating text, perceiving and reasoning about visual content, supporting longer context lengths, understanding spatial relationships and dynamic videos, or interacting with AI agents — Qwen3-VL shows clear and significant progress in every area."

The flagship model, Qwen3-VL-235B-A22B, is open-sourced in two versions: Instruct and Thinking. Qwen claimed, "The Instruct version matches or even exceeds Gemini 2.5 Pro in major visual perception benchmarks. The Thinking version achieves state-of-the-art results across many multimodal reasoning benchmarks."

Qwen3-VL also introduces key capabilities such as visual agents that can operate GUIs on PCs and phones, and visual coding, which converts screenshots into HTML, CSS, or JavaScript. The model supports advanced spatial reasoning as well, and its Thinking mode delivers leading performance on STEM and math benchmarks.

Qwen3-VL AI Model
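
To make the visual coding claim concrete, here is a minimal sketch of how a screenshot-to-HTML prompt might look against the open-sourced Instruct checkpoint. It assumes the weights are published on Hugging Face under the repo id Qwen/Qwen3-VL-235B-A22B-Instruct and load through the generic AutoModelForImageTextToText class in the transformers library; the repo id, loading class, and chat format are assumptions for illustration, not details confirmed in Qwen's announcement.

```python
# Illustrative sketch: prompting Qwen3-VL-235B-A22B-Instruct to convert a UI
# screenshot into HTML/CSS. Repo id and loading class are assumptions; the
# actual integration may differ from what ships in transformers.
import torch
from PIL import Image
from transformers import AutoModelForImageTextToText, AutoProcessor

MODEL_ID = "Qwen/Qwen3-VL-235B-A22B-Instruct"  # assumed Hugging Face repo id

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageTextToText.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# "screenshot.png" is a placeholder path to a UI screenshot.
image = Image.open("screenshot.png")
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Convert this screenshot into HTML and CSS."},
        ],
    }
]

# Build the chat prompt, bundle it with the image, and generate the markup.
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(text=prompt, images=[image], return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=1024)

# Decode only the newly generated tokens (skip the echoed prompt).
generated = output_ids[0][inputs["input_ids"].shape[1]:]
print(processor.decode(generated, skip_special_tokens=True))
```

In practice, a 235B-parameter mixture-of-experts checkpoint of this size would typically be served across multiple GPUs or accessed through a hosted API rather than loaded on a single machine.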

