
“Every time things like these unbelievable crimes are happening, or there is a terrorist attack, it’s very easy to say we have to be strong and we have to restrict rights,” said Birgit Sippel, a German member of the European Parliament. “We have to be very careful.”

Of the more than 52 million photos, videos and other materials related to online child sexual abuse reported between January and September this year, over 2.3 million came from the European Union, according to the U.S. federal clearinghouse for the imagery.

If the regulation took effect, the rate of reports from Europe would drop precipitously, because automated scanning is responsible for nearly all of them. Photo- and video-scanning software uses algorithms to compare users’ content with previously identified abuse imagery. Other software, targeted at grooming, searches for key words and phrases known to be used by predators.

Facebook, the most prolific reporter of child sexual abuse imagery worldwide, said it would stop proactive scanning entirely in the E.U. if the regulation took effect. In an email, Antigone Davis, Facebook’s global head of safety, said the company was “concerned that the new rules as written today would limit our ability to prevent, detect and respond to harm,” but said it was “committed to complying with the updated privacy laws.”
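
In rough outline, the matching described above works by comparing a fingerprint of a user’s upload against a database of fingerprints of previously identified material, and the grooming detection by checking messages against known key words and phrases. The sketch below is purely illustrative of that general idea, not the software used by Facebook or any other provider: the hash function, the hash database and the phrase list are stand-in assumptions (real systems rely on perceptual hashes such as PhotoDNA rather than exact cryptographic hashes, and on far more sophisticated language models and review pipelines).

    import hashlib

    # Illustrative stand-ins only: placeholder fingerprints and phrases,
    # not real data, and an exact hash where real systems use perceptual hashing.
    KNOWN_ABUSE_FINGERPRINTS = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }
    GROOMING_PHRASES = {"placeholder phrase one", "placeholder phrase two"}

    def image_matches_known_material(image_bytes: bytes) -> bool:
        """Compare an upload's fingerprint against previously identified imagery."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_ABUSE_FINGERPRINTS

    def message_flags_grooming(text: str) -> bool:
        """Flag messages containing key words or phrases associated with grooming."""
        lowered = text.lower()
        return any(phrase in lowered for phrase in GROOMING_PHRASES)

    if image_matches_known_material(b"raw image bytes here"):
        print("Match against known imagery: file a report with the clearinghouse.")
    if message_flags_grooming("example message text"):
        print("Possible grooming: escalate for human review.")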