Instagram to Test Nudity-Blurring Features to Boost Teen Safety

In an attempt to allay concerns about harmful content on its apps, Instagram’s parent company Meta announced on Thursday that the app will test features that blur messages containing nudity, aiming to protect teenagers and block potential scammers from reaching them.
The social media giant is facing increasing pressure in the US and Europe over claims that its applications are addictive and contribute to youth mental health problems.
According to Meta, the safety function for direct messaging on Instagram will utilize machine learning on the device to determine whether an image transmitted over the service contains nudity.
The feature will be enabled by default for users under the age of eighteen, and Meta will notify adults to encourage them to turn it on as well.
“Because the images are analyzed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won’t have access to these images – unless someone chooses to report them to us,” the company said.
Instagram’s direct messages are not encrypted, in contrast to Meta’s Messenger and WhatsApp apps, although the firm has stated that it intends to provide encryption for the platform.
Additionally, Meta said it was testing new pop-up messages for users who may have interacted with accounts suspected of involvement in sextortion schemes, and that it was developing technology to help identify such accounts.
The social media behemoth announced in January that it would block more content from minors on Facebook and Instagram. The purpose of this move was to make it harder for teenagers to encounter sensitive content, such as images of eating disorders, suicide, and self-harm.
The firm was sued in October by the attorneys general of 33 US states, including New York and California, who said the company had consistently misled the public about the risks associated with its platforms.
The European Commission has also sought information from Meta about its efforts to shield minors from harmful and illegal content in Europe.