War crimes are among the most grisly and difficult-to-prosecute crimes; and yet, ironically, the criminals have made prosecutors' jobs easier by uploading videos celebrating their atrocities to Big Tech platforms like Facebook and YouTube, where they can act as recruiting tools for terrorists and extremists.
It’s these very videos that human rights activists, atrocity survivors and armchair sleuths turn to in order to perform “open source intelligence” analysis on the perpetrators, effectively doxing them and handing overworked, under-resourced prosecutors the evidence they need to bring war criminals to justice.
Against this trend, though, is Big Tech's zeal to remove "terrorist content," an overreaction to years of indifference to complaints about bad content that violated the platforms' own guidelines. The newly self-deputized platforms are playing content police and taking down "terrorist content" as fast as they can find it, using algorithmic dragnets that catch plenty of dolphins along with the tuna they're trawling for. To make things worse, Big Tech invents its own definitions of "terrorism" that barely overlap with internationally recognized definitions.
It’s about to get much worse: in the wake of the Christchurch white terror attacks, the Australian government rushed through legislation requiring platforms to remove “terror” content within an hour (a deadline so short that it guarantees that there will be no serious checks undertaken before content is removed) and now both the EU and the UK are poised to follow suit.
And there’s plentiful evidence that terror cops are incredibly sloppy when they wield the censor’s pen: last month, a French police agency gave the Internet Archive 24 hours to remove “terrorist” content, targeting the Archive’s collection of 15,000,000 text files, its copy of Project Gutenberg, and its entire archive of Grateful Dead recordings.
Human rights advocates are sounding the alarm, but no one is listening. It’s a rerun of SESTA/FOSTA, the US anti-sex-trafficking bill that sex workers vigorously opposed, saying it would make them less safe — but which passed anyway, and now sex workers are much less safe.
Designed to identify and take down content posted by “extremists”—“extremists” as defined by software engineers—machine-learning software has become a potent catch-and-kill tool to keep the world’s largest social networks remarkably more sanitized places than they were just a year ago. Google and Facebook break out the numbers in their quarterly transparency reports. YouTube pulled 33 million videos off its network in 2018—roughly 90,000 a day. Of the videos removed after automated systems flagged them, 73 percent were removed so fast that no community members ever saw them. Meanwhile, Facebook removed 15 million pieces of content it deemed “terrorist propaganda” from October 2017 to September 2018. In the third quarter of 2018, machines performed 99.5 percent of Facebook’s “terrorist content” takedowns. Just 0.5 percent of the purged material was reported by users first.
Those statistics are deeply troubling to open-source investigators, who complain that the machine-learning tools are black boxes. Few people, if any, in the human-rights world know how they’re programmed. Are these AI-powered vacuum cleaners able to discern that a video from Syria, Yemen, or Libya might be a valuable piece of evidence, something someone risked his or her life to post, and therefore worth preserving? YouTube, for one, says it’s working with human-rights experts to fine-tune its take-down procedures. But deeper discussions about the technology involved are rare.
“Companies are very loath to let civil society talk directly to engineers,” says Dia Kayyali, a technology-advocacy program manager at Witness, a human-rights organization that works with Khatib and the Syrian Archive. “It’s something that I’ve pushed for. A lot.”
Tech Companies Are Deleting Evidence of War Crimes [Bernhard Warner/The Atlantic]