Facebook wants users to upload nude pictures of themselves to Messenger.
The company believes the best way to combat revenge porn could be to post intimate pictures of yourself online before anyone else manages to.
It’s a highly unusual measure, and one likely to split opinion.
The social network has developed an anti-revenge porn system that uses artificial intelligence to recognise and block specific images, and is testing it in the UK, US, Canada and Australia.
“The safety and well-being of the Facebook community is our top priority,” said Antigone Davis, Facebook’s head of global safety.
“As part of our continued efforts to better detect and remove content that violates our community standards, we’re using image matching technology to prevent non-consensual intimate images from being shared on Facebook, Instagram, Facebook Groups and Messenger.”
Facebook will create a digital fingerprint of any nude picture you flag to it through Messenger, and automatically block anyone from uploading the same image to the site at a later date.
The company says it won’t store the pictures themselves, and that only Facebook’s AI is supposed to access them, but the system still demands an enormous amount of trust from users.
Also, if you’re worried about more than one explicit picture of you being posted to the site, you’d have to upload all of them to Messenger.
Furthermore, the system will only protect you from revenge porn on Facebook. People would still be able to post the images elsewhere.
“It would be like sending yourself your image in email, but obviously this is a much safer, secure end-to-end way of sending the image without sending it through the ether,” Australian e-Safety Commissioner Julie Inman Grant told ABC.
“They’re not storing the image, they’re storing the link and using artificial intelligence and other photo-matching technologies. So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
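Facebook has not published the details of its matching technology, so as a rough illustration only, the idea of a perceptual “digital fingerprint” can be sketched with a simple average hash: the image is reduced to a tiny grayscale grid, and each pixel is recorded as brighter or darker than the average. Slightly altered copies of the same image then produce the same (or a very similar) bit string, so they can be blocked without the original picture ever being stored. The function names and the toy 4×4 “images” below are invented for the example.

```python
# Illustrative sketch only: Facebook's actual algorithm is not public.
# A perceptual "average hash" turns an image into a short bit string that
# survives small changes (compression noise, slight brightness shifts).

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string:
    '1' where a pixel is brighter than the image's mean, else '0'."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count of differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Two toy 4x4 "images": the second is the first with slight pixel noise,
# as might happen when a picture is re-saved or re-compressed.
original = [[10, 200, 30, 220], [15, 210, 25, 230],
            [12, 205, 35, 215], [18, 195, 28, 225]]
reuploaded = [[12, 198, 33, 222], [14, 212, 27, 228],
              [11, 207, 34, 217], [20, 193, 30, 223]]

h1 = average_hash(original)
h2 = average_hash(reuploaded)
print(hamming_distance(h1, h2))  # 0: the noisy copy still matches
```

In a real system the hash, not the image, is what gets stored and compared, which is what Inman Grant means by “storing the link” rather than the picture; production systems also use far more robust hashing (such as Microsoft’s PhotoDNA) than this toy example.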