Meta will auto-blur nudity in Instagram DMs in latest teen safety step

Meta has announced it’s testing new features on Instagram intended to help safeguard young people from unwanted nudity or sextortion scams. This includes a feature called Nudity Protection in DMs, which automatically blurs images detected as containing nudity.

The tech giant will also nudge teens to protect themselves by serving a warning encouraging them to think twice about sharing intimate imagery. Meta says it hopes this will boost protection against scammers who may send nude images to trick people into sending their own images in return.

It’s also making changes it says will make it harder for potential scammers and criminals to find and interact with teens. Meta says it’s developing new technology to identify accounts that are “potentially” involved in sextortion scams and is applying limits to how these suspect accounts can interact with other users.

In another step announced Thursday, Meta said it’s increased the data it’s sharing with the cross-platform online child safety program Lantern — to include more “sextortion-specific signals”.

The social networking giant has long-standing policies banning the sending of unwanted nudes and attempts to coerce other users into sending intimate images. However, that doesn’t stop these problems from being rife online, causing misery for scores of teens and young people, sometimes with extremely tragic results.

We’ve rounded up the latest crop of changes in more detail below.

Nudity screens

Nudity Protection in DMs aims to protect teen Instagram users from cyberflashing by putting nude images behind a safety screen. Recipients can then choose whether or not to view the image.

“We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat,” said Meta. 

The nudity safety-screen will be turned on by default for under 18s globally. Older users will see a notification encouraging them to turn it on.

“When nudity protection is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they’ve changed their mind,” it added.

Anyone trying to forward a nude image will see the same warning encouraging them to reconsider.

The feature is powered by on-device machine learning, so Meta said it will also work within end-to-end encrypted chats, since the image analysis is carried out on the user’s own device.
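Meta hasn’t published implementation details, but the description implies a pipeline along these lines: a classifier bundled with the app scores each incoming image locally, and the client blurs anything flagged before display, so the unencrypted image never needs to leave the conversation. The sketch below is purely illustrative; the classifier stub and function names are assumptions, not Meta’s code.

```python
# Illustrative sketch only; Meta has not published its implementation.
# The point is architectural: classification and blurring both happen on
# the recipient's device, so an end-to-end encrypted chat never has to
# expose the unblurred image to a server.
from PIL import Image, ImageFilter


def contains_nudity(image: Image.Image) -> bool:
    """Placeholder for a small on-device classifier shipped with the app.
    This stub always returns False; a real client would run a local model."""
    return False


def prepare_for_display(path: str) -> Image.Image:
    """Return a blurred copy if the local classifier flags the image,
    otherwise the original. The recipient can still choose to reveal it."""
    image = Image.open(path)
    if contains_nudity(image):
        return image.filter(ImageFilter.GaussianBlur(radius=40))
    return image
```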

Safety tips

In another safeguarding measure, Instagram users sending or receiving nudes will be directed to safety tips — with information about the potential risks involved — which Meta said have been developed with guidance from experts.

“These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship to the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are,” it wrote. “They also link to a range of resources, including Meta’s Safety Center, support helplines, StopNCII.org for those over 18, and Take It Down for those under 18.”

It’s also testing pop-up messages for people who may have interacted with an account Meta has removed for sextortion; these pop-ups will direct them to relevant expert resources.

“We’re also adding new child safety helplines from around the world into our in-app reporting flows. This means when teens report relevant issues — such as nudity, threats to share private images or sexual exploitation or solicitation — we’ll direct them to local child safety helplines where available,” it added.

Tech to spot sextortionists  

While Meta says it removes the accounts of sextortionists when it becomes aware of them, it first needs to spot bad actors to shut them down. So Meta is trying to go further: It says it’s “developing technology to help identify where accounts may potentially be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior”.

“While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps to help prevent these accounts from finding and interacting with teen accounts,” it goes on, adding: “This builds on the work we already do to prevent other potentially suspicious accounts from finding and interacting with teens.”

It’s not clear exactly what technology Meta is using for this, nor which signals might denote a potential sextortionist (we’ve asked for more) — but, presumably, it may analyze patterns of communication to try to detect bad actors.
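Purely as an illustration of what signal-based flagging can look like (the signals, weights and threshold below are invented, not Meta’s), such systems typically combine several weak behavioral indicators into a risk score and restrict accounts above a threshold rather than acting on any single signal:

```python
# Hypothetical example only: none of these signals or numbers come from
# Meta. The general pattern is to combine weak behavioral indicators into
# a score and apply interaction limits above a threshold.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    is_new_account: bool             # recently created account
    message_requests_to_teens: int   # outbound DM requests to minors
    reported_by_recipients: int      # prior reports from message recipients


def sextortion_risk_score(s: AccountSignals) -> float:
    score = 0.0
    if s.is_new_account:
        score += 0.2
    score += min(s.message_requests_to_teens * 0.1, 0.5)
    score += min(s.reported_by_recipients * 0.3, 0.9)
    return score


def should_restrict(s: AccountSignals, threshold: float = 0.7) -> bool:
    """Above the threshold, an account would face the limits described
    below: hidden message requests, removal from teens' lists and search."""
    return sextortion_risk_score(s) >= threshold
```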

Accounts that get flagged by Meta as potential sextortionists will face restrictions on how they can message or interact with other users.

“[A]ny message requests potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it,” it wrote.

Users who are already chatting with potential scam or sextortion accounts will not have their chats shut down, but will be shown Safety Notices “encouraging them to report any threats to share their private images, and reminding them that they can say no to anything that makes them feel uncomfortable”, per Meta.

Teen users are already protected from receiving DMs from adults they aren’t connected to on Instagram (and, in some cases, from other teens too). But Meta is taking the further step of not showing the “Message” button on a teen’s profile to potential sextortion accounts, even if the two accounts are connected.

“We’re also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in Search results,” it added.

It’s worth noting the company is under rising scrutiny in Europe over child safety risks on Instagram, with enforcers asking questions about its approach since the bloc’s Digital Services Act (DSA) came into force last summer.

A long, slow creep towards safety

Meta has announced measures to combat sextortion before — most recently in February when it expanded access to Take It Down.

The third party tool lets people generate a hash of an intimate image locally on their own device and share it with the National Center for Missing and Exploited Children — creating a repository of non-consensual image hashes that companies can use to search for and remove revenge porn.
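As a rough sketch of that client-side step (not NCMEC’s or Meta’s actual code), only a fingerprint of the image leaves the device. A plain cryptographic hash is used below for brevity; systems of this kind typically rely on a perceptual hash, such as Meta’s open-source PDQ, so that re-encoded or slightly altered copies still match.

```python
# Simplified sketch of local hashing: the image itself never leaves the
# device, only its fingerprint does. SHA-256 is used here for brevity;
# real deployments typically use a perceptual hash (Meta has open-sourced
# one called PDQ) so near-duplicate copies also match.
import hashlib


def hash_image_locally(path: str) -> str:
    """Compute a fingerprint of an image file on the user's own device."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


# It is this hash string, not the photo, that gets submitted to the
# Take It Down repository; participating platforms can then match and
# remove uploads whose hashes appear in the list.
```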

Previous approaches by Meta had been criticized because they required young people to upload their nudes. In the absence of hard laws regulating how social networks must protect children, Meta was left to self-regulate for years, with patchy results.

However, with requirements landing on platforms in recent years (such as the UK’s Children’s Code, which came into force in 2021, and, more recently, the EU’s DSA), tech giants like Meta are finally having to pay more attention to protecting minors.

For example, in July 2021 Meta switched to defaulting young people’s Instagram accounts to private just ahead of the UK compliance deadline. Even tighter privacy settings for teens on Instagram and Facebook followed in November 2022.

This January, Meta also announced it would default teens on Facebook and Instagram into even stricter message settings, with limits on teens messaging other teens they’re not already connected to, shortly before the DSA’s full compliance deadline kicked in in February.

Meta’s slow and iterative feature creep when it comes to protective measures for young users raises questions about what took it so long to apply stronger safeguards. It suggests the company has opted for a cynical minimum in safeguarding, in a bid to manage the impact on usage and prioritize engagement over safety. (Which is exactly what Meta whistleblower Frances Haugen repeatedly denounced her former employer for.)

Asked why it’s not also rolling out the latest protections it’s announced for Instagram users to Facebook, a spokeswoman for Meta told TechCrunch: “We want to respond to where we see the biggest need and relevance — which, when it comes to unwanted nudity and educating teens on the risks of sharing sensitive images — we think is on Instagram DMs, so that’s where we’re focusing first.”


