Instagram Nudity Protection Feature: Meta-Owned Photo and Video Sharing App To Protect Users From Sextortion With Nude Image Filter for Direct Messages; Check Details

Instagram is working to target teen sextortion scammers with its new nude image filter for direct messages. Check for further details here.

Technology Team, LatestLY
Instagram Logo (Photo Credits: Wikimedia Commons)

New Delhi, April 12: Instagram is testing new features aimed at safeguarding users from intimate image abuse and sextortion. The Meta-owned platform is taking serious steps to keep its service safe, especially for younger users, and is working on a nudity protection feature that will filter out inappropriate content in Instagram Direct Messages (DMs).

This means that when someone tries to send a nude image via Instagram DM, the filter is expected to block or blur it. The move is particularly aimed at combating teen sextortion by preventing the sharing of intimate images, and it marks a significant step in protecting individuals from sextortion and other privacy violations within Instagram DMs.

As per a report by Forbes, Instagram is targeting teen sextortion scammers with its new nude image filter for direct messages. This is part of Meta's plan to test new tools dedicated to safeguarding young people on its platforms from sextortion and other forms of intimate image abuse. The introduction of such features underscores the platform's responsibility to offer a secure environment for all users, and it demonstrates proactive measures to address the complex issues that social media users, particularly teens, can face online.

What Is the Nudity Protection Feature and How Will It Work on Instagram?

As per a report by the Times of India, the nudity protection feature on Instagram is a safety tool designed to detect and blur images that contain nudity. The feature will be turned on by default globally for accounts belonging to users below eighteen years of age, while adults will be prompted to switch it on. It is also meant to make senders think twice before sharing such content.

The feature, introduced in January, is called the "nudity protection system". It uses artificial intelligence (AI) technology to identify nude images being sent through direct messages (DMs) and gives recipients the option to view, block or report them as appropriate. The main purpose of the feature is to prevent unwanted exposure to nudity and to protect against sextortion scammers by showing a warning that can be acted upon, such as viewing the image anyway, blocking the sender or reporting the chat.
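In broad strokes, the flow described above — an AI classifier scores an incoming image, flagged images arrive blurred with a warning, and the recipient chooses to view, block or report — could be sketched as follows. This is a purely illustrative mock-up: the function names, the `nudity_score` field and the threshold are assumptions for the sketch, not Instagram's actual API or model.

```python
from dataclasses import dataclass

# Hypothetical sketch of the blur-and-warn flow; all names and values
# here are illustrative assumptions, not Instagram's real implementation.

@dataclass
class DirectMessage:
    sender: str
    image_id: str
    nudity_score: float  # assumed output of an on-device AI classifier, 0.0-1.0

NUDITY_THRESHOLD = 0.8  # assumed cutoff for flagging an image

def screen_dm(msg: DirectMessage) -> dict:
    """Deliver flagged images blurred, with a warning and recipient options."""
    if msg.nudity_score >= NUDITY_THRESHOLD:
        return {
            "delivered_blurred": True,
            "warning": "This image may contain nudity.",
            "options": ["view", "block_sender", "report_chat"],
        }
    # Images below the threshold pass through unmodified.
    return {"delivered_blurred": False, "options": []}
```

For example, a message scored at 0.95 would be delivered blurred with the three options attached, while one scored at 0.1 would pass through untouched.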

(The above story first appeared on LatestLY on Apr 12, 2024 11:34 AM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).
