Mumbai, February 26: Law enforcement agencies have raised a high-level alert regarding the "AI-based biometric scam," a sophisticated evolution in cybercrime where attackers prioritize stealing a person's biometric identity over their immediate financial assets. Unlike traditional phishing, this method uses generative artificial intelligence to harvest facial and vocal data, allowing criminals to create "digital clones" that can bypass advanced security systems.

The Hyderabad Cyber Crimes Police recently detailed a common operational pattern where fraudsters frequent crowded public hubs like metro stations and shopping malls. These individuals often pose as vulnerable citizens—such as the elderly or those unfamiliar with technology—and approach victims with requests for minor assistance, such as checking a pension status or fixing a mobile app.

AI-Based Biometric Scam: The Mechanics of Identity Harvesting

The primary danger in these interactions lies in the state of the scammer's mobile device. While the victim is trying to help, the phone is typically already engaged in a high-definition video call or a screen-recording session with hidden permissions enabled. Within seconds, the victim's facial movements, expressions, and voice patterns are recorded at a quality high enough for AI processing.

Once captured, this biometric data is fed into generative AI models to produce deepfakes. These synthetic recreations are then used for "social engineering" attacks, where the criminal impersonates the victim to trick their family members into sending money, or for "liveness detection" fraud, where the AI clone attempts to fool banking apps that require a face-match for transaction authorisation.

Rising Threat of Synthetic Impersonation

Industry experts note that 2026 has become a "pivotal year" for biometric security. While tools like Face ID and fingerprints are designed to be more secure than passwords, the ease with which AI can now replicate human traits has created a new vulnerability. This shift has forced financial institutions to move toward "multi-modal" authentication, which requires multiple types of biometric markers simultaneously to verify a user.

Despite the high-tech nature of the eventual fraud, the initial theft often relies on simple human empathy. By targeting people in a "quick help" scenario, scammers bypass the natural suspicion many users have toward unknown links or emails. Police emphasize that once biometric data is compromised, it cannot be "reset" like a password, making the long-term risk to the victim significantly higher.

Precautionary Measures and Reporting

To counter this emerging trend, the police have issued a strict set of guidelines for the public. Citizens are advised never to handle or operate mobile phones belonging to strangers and to be wary of looking directly into the camera of an unknown person's device. Maintaining a "digital distance" in public spaces is now considered an essential safety practice.

If an individual suspects their image or voice has been surreptitiously recorded, they are urged to act within the "golden hour" to prevent further damage. Victims should immediately contact the national cybercrime helpline at 1930 or log a formal complaint at the official portal, cybercrime.gov.in. Authorities also recommend enabling app-based authenticators rather than relying solely on SMS-based verification, as SMS codes are more susceptible to interception.

(The above story first appeared on LatestLY on Feb 26, 2026 03:41 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).