Google SensorLM: Google Research Introduces AI Model To Connect Multimodal Wearable Sensor Signals to Natural Language for Deeper Understanding of Health and Activities

Google Research has unveiled SensorLM, a new sensor-language model trained on 59.7 million hours of data from over 103,000 individuals and designed to translate wearable sensor data into natural language. Google said, 'We evaluated SensorLM on a wide range of real-world tasks in human activity recognition and healthcare.'

Google Research shared a post on X (formerly Twitter) on July 29, 2025, announcing SensorLM, a new family of sensor-language foundation models. These models let wearable data “speak” for itself by making it understandable in natural language. SensorLM was trained on around 59.7 million hours of multimodal sensor data collected from over 103,000 individuals. Google said, “We evaluated SensorLM on a wide range of real-world tasks in human activity recognition and healthcare. The results demonstrate significant advances over previous state-of-the-art models.” SensorLM sets a new benchmark in sensor-data interpretation, generating human-readable descriptions from raw sensor signals.

(The above story first appeared on LatestLY on Jul 29, 2025 05:07 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).