That Facebook thinks it can tackle the problem this way shows just how powerful its ability to identify people with AI has become.

The news: Facebook announced today that it will use machine learning to detect and block nude or near-nude images and videos that have been shared without permission—before they have even been reported. Revenge porn (the sharing of sexual images or videos of someone, usually an ex-partner, without their consent) has become a serious problem, with devastating consequences for victims. Facebook also says it will overhaul the process by which victims can report such images.

Face time: Facebook didn't say exactly what sort of machine learning it was going to use, but it has an almost unparalleled ability to identify people in images, thanks to a vast corpus of labeled training data supplied by its own users.

Training sets: Although many users are unaware of it, social-media photos are widely used to train machine-learning algorithms. State-of-the-art programs are now often better than humans at recognizing people in snaps.
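To make the training-set idea concrete, here is a minimal sketch of the general technique—recognizing labeled people in photos—using scikit-learn and the public Labeled Faces in the Wild dataset. This is an illustration only: Facebook hasn't detailed its system, which almost certainly uses deep learning on a far larger proprietary corpus rather than the classic eigenfaces-plus-SVM approach shown here.

```python
# A minimal sketch of training a face-recognition model on labeled photos.
# Uses the public LFW dataset as a stand-in for a platform's tagged-photo corpus.
from sklearn.datasets import fetch_lfw_people
from sklearn.decomposition import PCA
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load faces of people who have at least 70 labeled photos each.
faces = fetch_lfw_people(min_faces_per_person=70, resize=0.4)
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=42
)

# Compress each image into 150 "eigenface" components, then classify.
pca = PCA(n_components=150, whiten=True).fit(X_train)
clf = SVC(kernel="rbf", class_weight="balanced")
clf.fit(pca.transform(X_train), y_train)

# The more labeled photos per person, the better the model gets—
# which is why a user-tagged photo corpus is such valuable training data.
print(classification_report(
    y_test, clf.predict(pca.transform(X_test)),
    target_names=faces.target_names,
))
```

Even this toy pipeline identifies well-photographed public figures with decent accuracy; the key ingredient is not the algorithm but the volume of labeled examples per person.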

Coming threat: New detection technology could become especially important as it becomes ever easier to generate convincing-looking fake video with AI. The rise of easy-to-use face-swapping software has already led to a proliferation of fake celebrity porn and other weird video mashups.

Silver bullets: The Facebook effort is a worthwhile use of machine learning, but AI is no silver bullet for dealing with harassment, abuse, or fake news on social media (regardless of what Mark Zuckerberg might tell Congress). Humans will always find ways to outwit the best algorithms. Besides that, the problem sadly extends far beyond the walls of Facebook.
