Instagram is trying to stop unwanted nudes from sliding into your DMs

Cyberflashing has been a problem for years, and the tech giant is finally turning its attention to it

Instagram is finally doing something to protect teenage girls from cyberflashing / Carl Court/Staff/Getty
23 September 2022
Instagram is developing a feature that will let users block unwanted nudes in their direct messages.

Meta, Instagram’s parent company, confirmed to The Verge that the feature was in development, after a leaked screenshot was posted on Twitter.

The “nudity protection” toggle will be an optional control that users can activate in-app, similar to the Hidden Words feature introduced last year, which filters out DMs containing offensive words.

The company said it won’t view or store any of the images; instead, they will be processed on the phone itself.
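Meta has not published implementation details, but on-device filtering of this kind typically runs a small classifier locally and hides flagged media behind a blurred, tap-to-reveal cover, so neither the image nor its score ever leaves the handset. A hypothetical sketch in Python of that flow – the classifier here is a trivial stand-in stub, not Instagram’s actual model:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class IncomingImage:
    sender: str
    pixels: bytes  # raw image data, kept on the device

def classify_on_device(image: IncomingImage) -> float:
    """Stand-in for a local model returning a nudity score in [0, 1].

    A real app would bundle a small neural network with the client;
    this stub simply treats everything as safe, for illustration only.
    """
    return 0.0

def deliver(image: IncomingImage,
            nudity_protection: bool,
            classifier: Callable[[IncomingImage], float] = classify_on_device,
            threshold: float = 0.8) -> str:
    """Decide how to present a DM image. All processing stays on the phone."""
    if nudity_protection and classifier(image) >= threshold:
        return "hidden"  # shown behind a blurred tap-to-reveal cover
    return "shown"
```

The privacy property the company is claiming hinges on `classifier` running entirely on the handset: the server only ever relays the encrypted or opaque image payload, never the verdict.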

The company didn’t say why it is working on this feature only now. Research released last year by the Pew Research Center in the US showed that the share of women who report being sexually harassed online has doubled since 2017, with 81 per cent saying they experienced harassment on social-media sites.

The UK is also set to make cyberflashing, or sending unsolicited sexual images to strangers, a crime, should Parliament pass the Online Safety Bill. Three-quarters of girls aged 12 to 18 have received unsolicited nude images from boys or men, according to recent research.

“Some will come forward and say [cyberflashing] is harmless. You can’t rank sexual offences like that [...] different forms of offending can have the same impact on different people,” Durham law professor Clare McGlynn, an expert in image-based sexual abuse, said in an interview with HuffPost.

The move comes as Meta yet again finds itself in hot water with regulators, this time after a $400 million fine from Ireland’s Data Protection Commission for failing to protect children’s information on Instagram.

The watchdog said Instagram set children’s accounts to “public” by default and allowed them to operate business accounts that could expose their phone numbers and emails.

But, more worryingly, public accounts can receive unsolicited DMs from anyone on Instagram. This means adults could message teens who don’t follow them.

Meta said it has fixed these settings, and under-18s now have their profiles set to “private” automatically when they join.
