
Blaze Media

Facebook wants some users to send in their nude photos. But it's not what you're thinking.

Facebook pilots new technology to help combat revenge porn in Australia. (Justin Sullivan/Getty Images)

If you're in Australia, Facebook wants your nude photographs, but it's not what you may be thinking.

What the heck? Why?

The social media giant is testing new technology in Australia that may help combat revenge porn.

The Australian government agency e-Safety is partnering with Facebook in an effort to prevent people from sharing intimate images without consent, according to the Australian Broadcasting Corporation.

How does it work?

Australian Facebook and Instagram users who are worried they could become victims of revenge porn are asked to contact the e-Safety Commissioner, who may then instruct them to send the nude photo (or photos) to themselves using Messenger.

Yes, you read that correctly.

After you send the photos to yourself, Facebook will use its hashing technology to create a digital fingerprint of each image, which will prevent others from uploading the same image to the platform.
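Facebook has not published the exact algorithm it uses here, so the following is only an illustrative sketch of the general idea behind perceptual image hashing: a compact "fingerprint" is computed from the image, and later uploads are compared against stored fingerprints rather than stored photos. This toy version uses a simple "average hash" over an 8x8 grid of grayscale values (a real system would first decode and downscale the actual image file, and would use a far more robust hash).

```python
def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grid of 0-255 gray values."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the average,
    # so small recompression artifacts rarely change the hash much.
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means a near-duplicate image."""
    return bin(h1 ^ h2).count("1")

# A reported image is hashed once; only the hash needs to be stored,
# not the photo itself. (Synthetic pixel data for illustration.)
reported = [[10 * (r + c) % 256 for c in range(8)] for r in range(8)]

# An upload with a tiny change (e.g. recompression noise) still matches.
upload = [row[:] for row in reported]
upload[0][0] += 5

distance = hamming_distance(average_hash(reported), average_hash(upload))
blocked = distance <= 5  # hypothetical similarity threshold
```

The key property this sketches is why "they're not storing the image": matching happens on fingerprints, so the service can recognize a previously reported photo without keeping a copy of it.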

Is Facebook storing my image?


"They're not storing the image, they're storing the link and using artificial intelligence and other photo-matching technologies," e-Safety Commissioner Julie Inman Grant told ABC.

Why is this being tested in Australia?

According to the e-Safety Commissioner, one in five Australian women between the ages of 18 and 45, and one in four Indigenous Australians, are victims of revenge porn.

"We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly," Inman Grant said.

Australia is one of four countries taking part in the pilot of this "industry-first" technology. It's not clear where else the method is being tested.

Will Facebook in the U.S. get this technology?

It's not clear if the U.S. will get the same technology being tested in Australia.

However, Facebook implemented new photo-matching technology in April to help address the problem in the U.S., TechCrunch reported.

The technology prevents photos that have been previously reported or tagged as porn from being re-shared.

In other words, if an image has been taken down and someone tries to re-post or re-share it on Facebook, Instagram, or Messenger, the person will see a pop-up that says the photo violates Facebook's policies, according to TechCrunch. Facebook will then prevent that particular photo from being shared.

One in 25 Americans is a victim of "nonconsensual image sharing," according to a 2016 report from the Data & Society Research Institute.
