
Facebook is in hot water over autocomplete.

Facebook issued an apology on Friday after offensive terms appeared in the social network’s search predictions late Thursday.

When users typed “videos of” into the search bar, Facebook prompted them to search phrases including “videos of sexuals,” “videos of girl sucking dick under water” and, perhaps most disturbingly, “video of little girl giving oral.”

Shocked users reported the problem on Twitter, posting screenshots of the search terms, which also included multiple suggestions relating to the school shooting in Florida last month. The social network appeared to have fixed the problem by Friday morning.

“We’re very sorry this happened,” said a spokesman for the company in a statement. “As soon as we became aware of these offensive predictions we removed them.”

He added that Facebook is looking into why the phrases appeared and working to improve the quality of search suggestions.

“Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform,” he said. “We do not allow sexually explicit imagery, and we are committed to keeping such content off of our site.”

Search features typically rely on algorithms that pick up on what is trending or generally popular among people using the service and generate suggestions accordingly. Facebook isn't the only company to have suffered autocomplete problems. In 2016, Google had its own issues with prompting users to search for offensive questions about Jews and women.

Facebook remains insistent that it's not a platform for sexual content, but it also came under fire a few weeks ago for a survey sent to some users asking whether they thought it was appropriate for adult men to use Messenger to ask children for explicit photos of themselves. The company apologized and said that it had never allowed, and would never allow, child grooming on the site.
