
Instagram is preparing to test a feature that would cover images containing nudity in Direct Messages, protecting users from unwanted exposure to explicit content.
The early news about Instagram's "nudity protection" feature was shared by Alessandro Paluzzi, a developer well known for reverse engineering apps.
The feature would rely on nudity-detection technology in iOS, which filters messages on a user's device to flag potential nudes in attached images.
If the feature is adopted, Instagram will automatically blur an image in Direct Messages when the app detects nudity. The recipient will be notified that they have received an image that may contain nudity and given the option to view the blurred image.
According to the screenshot shared by the developer, nudity protection is an option that can be turned on and off in iOS settings.
In the screenshot shared by Paluzzi on Twitter, Instagram tries to reassure users that the company "can't access the photos" and that it is merely "technology on your device that blurs obscene images."
This message indicates that on-device iOS technology will scan messages and filter obscene content, while Instagram itself will not examine images in Direct Messages.
Apple, for its part, has assured users that it does not download the images; the filtering and matching are performed by on-device artificial intelligence (AI), which does not track the particulars of a user's online activity.
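For readers curious how this kind of on-device screening can work, below is a minimal sketch in Swift using Apple's SensitiveContentAnalysis framework. This is an illustrative assumption only: Instagram has not said which API or model powers its feature, and the framework shown here is a later, general-purpose iOS mechanism for detecting sensitive images locally.

```swift
import SensitiveContentAnalysis

/// Illustrative sketch: analyze an incoming image entirely on-device and
/// decide whether to blur it before showing it in a message thread.
/// Assumes the user has enabled a sensitive-content setting in iOS.
func shouldBlurIncomingImage(at fileURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not opted in, the policy is .disabled and no analysis runs.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The image never leaves the device; analysis happens locally.
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        return analysis.isSensitive
    } catch {
        // On analysis failure this sketch shows the image unblurred;
        // a real app might choose the more cautious default instead.
        return false
    }
}
```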
Nonetheless, the nudity protection feature is a welcome step by Instagram's parent company, Meta, demonstrating its commitment to better protecting younger users.
Meta has been under pressure to keep younger audiences safe on its platforms, and it has paid a heavy price for lapses: the company was fined $402 million for allowing teenagers to create Instagram accounts that displayed their phone numbers and email addresses publicly.
