
Facebook Users Encouraged to 'Inoculate' Themselves From Revenge Porn

An estimated 20 percent of Australians have experienced some form of image-based abuse.

Facebook users have been encouraged to participate in a pilot program that uses image-matching technology to prevent the sharing of non-consensual intimate images online.

Individuals concerned that intimate images they have sent may be shared can file an online report with the Office of the eSafety Commissioner, which will then notify Facebook as part of a new initiative between the government agency and the social networking giant.

Facebook's Director of Policy Mia Garlick told HuffPost Australia that, as part of the process, users will then be prompted to send the image to themselves in a private message on Messenger.

"We will then go and check the account to create the hash from the image in Messenger and we can then use that hash again, run [it] against the photos uploaded on the site to prevent the image even being shared in the first place," she said.

The 'hash' -- a unique digital fingerprint generated from the image itself -- means that Facebook won't be retaining any of the pictures users express concern about, but rather a string of identifying numbers.
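Facebook's matching system is proprietary, so purely as an illustration of the principle described above, the sketch below (in Python, with hypothetical file names) shows how a service might store only the hashes of reported images and check new uploads against them. A real system would use a perceptual hash so that resized or re-encoded copies still match; the plain SHA-256 used here for simplicity only catches byte-identical copies.

```python
# Illustrative sketch only -- not Facebook's implementation.
# It demonstrates the idea the article describes: storing a
# fingerprint (hash) of a reported image rather than the image
# itself, then checking new uploads against those fingerprints.

import hashlib
from pathlib import Path

# Fingerprints of reported images; the images themselves are never stored.
blocked_hashes: set[str] = set()

def hash_image(path: Path) -> str:
    """Return a hex digest identifying the image's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def report_image(path: Path) -> None:
    """Record the fingerprint of a reported image, then discard the image."""
    blocked_hashes.add(hash_image(path))

def is_blocked(path: Path) -> bool:
    """Check an attempted upload against the stored fingerprints."""
    return hash_image(path) in blocked_hashes

# Hypothetical usage:
# report_image(Path("reported_photo.jpg"))
# is_blocked(Path("attempted_upload.jpg"))  # True only for an exact copy
```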

The eSafety Commissioner said a partnership with Facebook had the "potential to disable the control and power" perpetrators hold over victims. (Image: Getty Images)

"We don't keep the image," Garlick said.

The image-matching technology, which was unveiled earlier this year, was described by Facebook's Head of Global Safety Antigone Davis as "one example of the potential technology has to help keep people safe".

According to research, an estimated 20 percent of Australians have experienced some form of image-based abuse, with women aged 18-24 the most frequently targeted.

eSafety Commissioner Julie Inman Grant said that the partnership with Facebook "gives Australians a unique opportunity to proactively inoculate themselves from future image-based abuse" by using the reporting tool.

"This pilot has the potential to disable the control and power perpetrators hold over victims, particularly in cases of ex-partner retribution and sextortion, and the subsequent harm that could come to them," she said.

An 'intimate' image or video is one that shows you:

  • Engaged in sexual activity;
  • In a manner or context that is sexual;
  • Nude, including while showering or bathing;
  • With your breasts or genitals visible; or
  • With the image focused on your genital, anal or breast region, even where covered by underwear, such as in 'up-skirting' and 'down-blousing'.

This includes real pictures, pictures that have been digitally altered and pictures that have been drawn.