New Delhi, October 9: The Samsung Advanced Institute of Technology (SAIT) has reportedly unveiled a new AI model called the Tiny Recursion Model (TRM). The Samsung TRM model is said to be extremely small, containing just 7 million parameters, yet it reportedly performs comparably with some of the largest language models (LLMs) available. Despite its compact size, TRM is said to compete with models roughly 10,000 times larger.
Samsung TRM was developed under the guidance of Alexia Jolicoeur-Martineau, a senior AI researcher at the Samsung Advanced Institute of Technology. As per reports, Samsung's new open reasoning model, TRM, performs better than models 10,000 times its size. The model is said to outperform major language models, including DeepSeek R1, OpenAI's o3-mini and Google's Gemini 2.5 Pro, on challenging reasoning tests.
Samsung Tiny Recursive Model (TRM)
My brain broke when I read this paper.
A tiny 7 Million parameter model just beat DeepSeek-R1, Gemini 2.5 pro, and o3-mini at reasoning on both ARC-AGI 1 and ARC-AGI 2.
It's called Tiny Recursive Model (TRM) from Samsung.
How can a model 10,000x smaller be smarter?
Here's how… pic.twitter.com/MD2ZWYI1AQ
— Jackson Atkins (@JacksonAtkinsX) October 7, 2025
Samsung TRM Outscored Several Models, Including DeepSeek R1, Gemini 2.5 Pro, OpenAI o3 Mini
A 7 million parameter network from Samsung, the Tiny Recursive Model (TRM), just outscored several headline models (DeepSeek R1, Gemini 2.5 Pro, o3 mini) on the ARC AGI reasoning tests. With roughly 1,000 training examples, it hits ~44.6% on ARC 1 and ~7.8% on ARC 2, using less than… pic.twitter.com/L706xL1Rf8
— Dustin (@r0ck3t23) October 8, 2025
Samsung TRM With 7 Million Parameters
Honestly, this is the most exciting paper I've read this month,
a 7M parameter model just crushed DeepSeek-R1, Gemini 2.5 Pro, and o3-mini on ARC-AGI benchmarks.
Samsung's Tiny Recursive Model (TRM) is 10,000x smaller but smarter.
It doesn't just generate text, it thinks… pic.twitter.com/f3mZDzxd0N
— Augustin 🐓 (@augustinabele) October 8, 2025
What is Samsung TRM?
Samsung's senior AI researcher has introduced the TRM, which uses a simpler recursive reasoning method. TRM is said to achieve better generalisation than the Hierarchical Reasoning Model (HRM) while operating with a single tiny network with two layers. The HRMl is described as using two compact neural networks that recurse at different frequencies. This approach outperforms large language models on tasks like Sudoku, Maze, and ARC-AGI, even when trained with small models of 27 million parameters. While HRM shows solving complex problems with small networks, it is reportedly not fully understood and may not be fully optimised. Grok Imagine V0.9 Released: Elon Musk Asks Grok Users To Generate Videos From Still Images Using Newly Improved Version.
As per a paper submitted to arXiv.org, the TRM model with 7 million parameters achieves 45% test accuracy on ARC-AGI-1 and 8% on ARC-AGI-2. The performance is said to surpass many large language models, including DeepSeek R1, OpenAI o3-mini, and Google Gemini 2.5 Pro, despite TRM having less than 0.01% of their parameter count. The arXiv paper noted, "Contrary to HRM, TRM requires no complex mathematical theorem, hierarchy, nor biological arguments."
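To put the "less than 0.01%" figure in perspective, the quick back-of-the-envelope check below compares 7 million parameters with a 70 billion parameter model; the 70B figure is an assumed example for illustration, not a number from the paper.

```python
# Back-of-the-envelope check of the scale claim: 7M parameters versus an
# assumed 70B-parameter comparison model works out to 0.01%, i.e. ~10,000x smaller.
trm_params = 7_000_000            # reported TRM size
llm_params = 70_000_000_000       # assumed size of a large comparison model (hypothetical)
ratio = trm_params / llm_params
print(f"TRM is {ratio:.4%} of the LLM's size ({llm_params / trm_params:,.0f}x smaller)")
# -> TRM is 0.0100% of the LLM's size (10,000x smaller)
```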















