Google SensorLM: Google Research Introduces AI Model To Connect Multimodal Wearable Sensor Signals to Natural Language for Deeper Understanding of Health and Activities
Google Research has unveiled SensorLM, a new sensor-language model trained on 59.7 million hours of data from over 103,000 individuals, designed to translate wearable sensor data into natural language. Google said, 'We evaluated SensorLM on a wide range of real-world tasks in human activity recognition and healthcare.'
Google Research shared a post on X (formerly Twitter) on July 29, 2025, announcing SensorLM, a new family of sensor-language foundation models. These models let wearable data “speak” for itself by making it understandable in natural language. SensorLM was trained on around 59.7 million hours of multimodal sensor data collected from over 103,000 individuals. Google said, “We evaluated SensorLM on a wide range of real-world tasks in human activity recognition and healthcare. The results demonstrate significant advances over previous state-of-the-art models.” By generating human-readable descriptions of sensor data, SensorLM sets a new benchmark in sensor data interpretation.
Google SensorLM
Let your wearable data "speak" for itself! Introducing SensorLM, a family of sensor-language foundation models trained on ~60 million hours of data, enabling robust wearable data understanding with natural language. → https://t.co/1vL6df5pMa pic.twitter.com/NxqQ58f1Bl
— Google Research (@GoogleResearch) July 28, 2025
(The above story first appeared on LatestLY on Jul 29, 2025 05:07 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).