Instagram is trialling a feature that will stop explicit images appearing in direct message requests, after research showed trolls using unsolicited DMs to bombard high-profile women with abusive content.
Under the trial, people will be able to send only one DM request to someone who does not follow them, and that request will be restricted to text. Users will be able to send images or videos via DM only once the recipient has accepted the request.
“In practice, this means people will no longer be able to receive unsolicited images or videos from people they don’t follow,” said Meta, Instagram’s owner, in a statement.
The move follows research that showed the TV presenter Rachel Riley, the women’s safety campaigner Jamie Klingler and the magazine editor Sharan Dhaliwal were subjected to misogynistic, hateful and graphically violent messages in DM requests.
The report, published last year by the Centre for Countering Digital Hate (CCDH), found that of 8,720 requests sent to the trio, more than 6% contained content that violated Instagram’s content guidelines. Most of those violating messages were image- or video-based, the CCDH said; in one case, the same man sent Riley more than two dozen explicit videos.
Last year it was reported that Instagram was testing ways to filter out unsolicited nude pictures sent via direct messages. Instagram introduced a feature in 2021 that filters out DM requests containing offensive words, phrases and emojis. At the time Instagram said DM request abuse was a particular problem for “people with larger followings”.
Instagram expects to roll out the new feature to its more than 1 billion users worldwide once it has completed the trial.
Other changes include encouraging teenagers, after they have blocked someone, to add their parents as supervisors of their accounts “as an extra layer of support”. Instagram is also rolling out its quiet mode feature worldwide; introduced in the UK, US and Australia in January, it turns off notifications and sends auto-replies to DMs. The company said it was also considering prompting teens to close the app if they are using Reels, its short-video feature, late at night.
Meta added that teenagers who have spent 20 minutes on Facebook will receive a prompt to take time away from the app and set daily limits.
Also on Tuesday, Meta announced a series of safety changes for its Messenger service. Parents will be able to see how much time their child spends on Messenger and view their contact list. However, the new parental supervision tools will not allow parents to read their child’s messages.