Facebook, as a social media platform, has long maintained community standards about decency that, while somewhat vague, have been mostly consistent. Nudity, especially of a sexual nature, is a no-go. So it was a bit of a surprise to learn recently that Facebook was creating a pilot program to collect nude photos of its users.
The idea behind the program is that the platform can use image-matching technology to find and stop anyone else from sharing nude photos of a person online. The goal is to end the practice of "revenge postings" of nude photos by jilted former significant others.
Australia’s Office of eSafety to Work with Facebook
The effort is beginning to gain steam, especially after Australia’s Office of eSafety announced that it would be working with Facebook to develop and implement this program, hoping to stop people from posting images of others against their will.
The program works in an interesting way. If a user is worried that a specific picture of themselves may be about to appear on Facebook, they can fill out a form, then send that same image to themselves using Facebook Messenger. That action triggers a notification to Facebook, which will then use its image-matching tech to search for the picture on another page or timeline. If found, the offending image will be deleted.
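Facebook has not published the details of its image-matching technology, but one common approach to this kind of problem is perceptual hashing: the platform stores a compact fingerprint of the reported photo and compares future uploads against it, rather than keeping the photo itself. The sketch below uses a simple "average hash" over a toy grayscale grid purely for illustration; the function names and the 4x4 example data are invented here and are not Facebook's actual system.

```python
# Illustrative sketch of perceptual ("average") hashing, one common
# image-matching technique. This is NOT Facebook's actual algorithm,
# which is proprietary; it only shows the general idea of matching
# a re-uploaded image against a stored fingerprint.

def average_hash(pixels):
    """Hash a grayscale image (2-D list of 0-255 values) into a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each pixel brighter than the mean becomes a 1-bit; darker, a 0-bit.
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 "images": the second is a slightly brightened copy of the first,
# as might happen when a photo is re-saved or re-compressed.
reported = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 35, 225],
            [18, 195, 28, 230]]
reupload = [[12, 202, 32, 222],
            [17, 212, 27, 217],
            [14, 207, 37, 227],
            [20, 197, 30, 232]]

h1, h2 = average_hash(reported), average_hash(reupload)
print(hamming_distance(h1, h2))  # → 0: the fingerprints still match
```

Because the hash depends only on each pixel's brightness relative to the average, small uniform changes (brightening, mild compression artifacts) leave the fingerprint unchanged, which is what lets a platform catch re-uploads of a known image.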
The approach is being called an "image-abuse vaccine," and the Australian government says the partnership with Facebook will "give Australians a unique opportunity to proactively inoculate themselves from future image-based abuse by coming to our portal and reporting tool…"
The Program is Available in the US, UK, and Canada
This may be the first time you’ve heard of the program, but it’s already available in the United States, the United Kingdom, and Canada. It stems from the pilot program, which launched earlier this year. During the initial release, Facebook’s head of global safety said, “These tools, developed in partnership with global safety experts, are one example of how we’re using new technology to keep people safe and prevent harm…”
The problem is apparently especially prevalent in Australia, where experts report that up to 20 percent of all Australians have been affected by “image-based” abuse. In other words, an intimate photo of them has been published online without their consent.
Supporters say the program is intended to “disable the control and power of perpetrators” especially in the case of “ex-partner retribution and sextortion…”
Twitter and other social media platforms have simply banned the practice, but Facebook says it wants to prevent the abuse rather than punish it after the fact. It should be interesting to see where this security effort goes.