Facebook explains why it wants your nude photos

Earlier this week, Facebook sent out a strange request to some of its users: send us your nude photos. While that may sound a little alarming, the call for naughty snaps is all part of a pilot program the social giant is running in Australia that it hopes will stop revenge porn from being spread across its main site, Messenger, and Instagram.

Initial reports from Australia attempted to explain why Facebook actually needs users’ nude photos for the trial. Apparently any nude photos or videos uploaded to Facebook as part of the scheme are converted into a digital “hash” which turns the image into numerical data.

This hash can then be checked against future uploads using the same kind of image-matching technology Facebook’s AI uses to identify faces in photos. If someone were to obtain your sensitive files and attempt to upload them without your consent, Facebook’s systems would recognise the match and block the upload.
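In outline, the scheme hashes a submitted image into a fingerprint, stores only that fingerprint, and compares every later upload against the stored set. The sketch below illustrates that flow; note that Facebook has not published its actual algorithm, and a real system would use a perceptual hash that survives resizing and re-encoding, whereas the cryptographic hash here only matches byte-identical files.

```python
import hashlib

# Set of fingerprints of images that must not be re-uploaded.
# In a real deployment this would live server-side; the images
# themselves can be discarded once hashed.
blocked_hashes: set[str] = set()

def hash_image(image_bytes: bytes) -> str:
    """Convert an image into a human-unreadable numerical fingerprint.

    SHA-256 is used here purely for illustration; it only detects
    exact copies, not edited or re-compressed versions of an image.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def register_blocked_image(image_bytes: bytes) -> None:
    # Store only the hash, not the image.
    blocked_hashes.add(hash_image(image_bytes))

def is_upload_blocked(image_bytes: bytes) -> bool:
    # Hash the incoming upload the same way and look for a match.
    return hash_image(image_bytes) in blocked_hashes
```

For example, after `register_blocked_image(photo)` is called once, any later call to `is_upload_blocked(photo)` with the same bytes returns `True`, while unrelated images pass through.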

Unsurprisingly, and despite suggestions that the images would be safe and sound during the entire process, Facebook’s noble attempt to preempt the spread of revenge porn still raised a few eyebrows. After all, surely someone at Facebook will be looking at these photos during the conversion process?

Now, Facebook’s global head of safety Antigone Davis has attempted to clear up those concerns. In a blog post, Davis emphasised that the pilot, which is being run in partnership with Australia’s eSafety Commissioner’s Office, is completely voluntary and is “a protective measure that can help prevent a much worse scenario where an image is shared more widely.”

She did admit, though, that yes, a “specially trained representative from our Community Operations team” will look at and review each image before hashing it, thereby creating a “human-unreadable, numerical fingerprint of it.”

“We are not asking random people to submit their nude photos.”

Facebook security chief Alex Stamos later took to Twitter to acknowledge that, even though the company will only keep the images on its servers for a limited time, the scheme carries inherent privacy risks.

Nevertheless, Stamos explained that “it’s a risk we are trying to balance against the serious, real-world harm that occurs every day when people (mostly women) can’t stop [non-consensual intimate images] from being posted.”

“To prevent adversarial reporting, at this time we need to have humans review the images in a controlled, secure environment,” Stamos continued. “We are not asking random people to submit their nude photos. This is a test to provide some option to victims to take back control. The test will help us figure out how to best protect people on our products and elsewhere.”

The blog post also stresses that the pilot’s methodology is intended as an emergency option. Facebook has yet to confirm whether the pilot will be rolled out to other regions.