Social Media's Stepped-Up Crackdown on Terrorists Still Falls Short

25.09.2018

Cyber Hacker - Credit: Bill Hinton Getty Images

YouTube, Facebook and other sites are working together to find and delete extremist propaganda and recruiting videos, but a new study says they can do better

Online video has long been a crucial recruitment and propaganda tool for the Islamic State in Iraq and Syria (ISIS) as well as al Qaeda and other terrorist organizations. The videos are inexpensive to make and easy to distribute via YouTube, Facebook, Twitter and other social networks. Facing sharp criticism over the situation, these companies claimed last year to be stepping up efforts to use both human employees and artificially intelligent software to find and delete videos promoting violence.

A study released Tuesday by a nongovernmental organization monitoring terrorist groups and their supporters indicates YouTube, in particular, has made progress in stemming the flood of uploaded extremist content. But the Counter Extremism Project (CEP) also notes terrorists are still finding a big audience on Google’s video-sharing platform. ISIS members and supporters uploaded 1,348 YouTube videos garnering 163,391 views between March and June, according to CEP. “That’s a lot of eyes on those videos,” says Hany Farid, a senior CEP adviser and the study’s lead researcher. Twenty-four percent of those videos remained on YouTube for at least two hours—long enough to be copied and disseminated on Facebook, Twitter and other popular social platforms even if YouTube later found and deleted the originals.

The videos CEP tracked were posted by 278 different accounts, 60 percent of which were allowed to remain on the platform even after having videos deleted by YouTube. “It’s discouraging that accounts caught posting terrorist material are allowed to continue uploading videos even after they’ve had their videos removed for violating YouTube’s terms of service,” says Farid, who is also a Dartmouth College computer science professor. “We know these videos are being created for propaganda purposes to incite and encourage violence, and I find those videos dangerous in a very real way.”

For its study, CEP searched YouTube using 183 keywords often associated with ISIS. These included the Arabic words for “crusader” and “jihad,” as well as the names of ISIS-controlled provinces, media outlets and prominent propagandists. CEP’s software searched YouTube every 20 minutes over the study’s three-month period, then used CEP’s video identification system—called eGLYPH—to compare the search results with 229 known terrorist videos in CEP’s own database. The eGLYPH algorithm “boils a video down to its essence,” analyzing when there is a significant change between frames—a new person enters the picture, say, or the camera pans to focus on something new, Farid says. From that analysis eGLYPH creates a unique signature—called a “hash”—to identify either an entire video or specific scenes within it. Farid designed the algorithm to find a particular video even if it has been edited, copied or otherwise altered as it is shared.
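eGLYPH’s full design has not been published, but the behavior Farid describes (segment a video at significant scene changes, then fingerprint each segment with a hash that survives re-encoding) can be sketched in ordinary Python. Everything below is a simplified illustration, not CEP’s code: the flat grayscale frame format, the scene-change threshold, the average-hash scheme and the matching rule are all assumptions made for this sketch.

```python
# A toy sketch of robust video hashing in the spirit of eGLYPH.
# All parameters and formats here are illustrative assumptions.

Frame = list[int]  # flat grayscale image, one 0-255 value per pixel


def frame_diff(a: Frame, b: Frame) -> float:
    """Mean absolute brightness difference between two equal-sized frames."""
    return sum(abs(p - q) for p, q in zip(a, b)) / len(a)


def scene_keyframes(frames: list[Frame], threshold: float = 30.0) -> list[Frame]:
    """Keep the first frame plus every frame that differs sharply from its
    predecessor -- a crude stand-in for detecting that a new person has
    entered the picture or the camera has panned to something new."""
    if not frames:
        return []
    keys = [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        if frame_diff(prev, cur) > threshold:
            keys.append(cur)
    return keys


def ahash(frame: Frame, width: int, height: int, grid: int = 8) -> int:
    """Average-hash a keyframe: shrink it to a grid of mean brightnesses,
    then emit one bit per cell (brighter than the frame mean or not).
    Small edits such as re-encoding or watermarking flip few bits."""
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            ys = range(gy * height // grid, (gy + 1) * height // grid)
            xs = range(gx * width // grid, (gx + 1) * width // grid)
            block = [frame[y * width + x] for y in ys for x in xs]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum(1 << i for i, c in enumerate(cells) if c >= mean)


def video_signature(frames: list[Frame], width: int, height: int) -> list[int]:
    """A video's signature is the list of its keyframe hashes."""
    return [ahash(f, width, height) for f in scene_keyframes(frames)]


def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


def matches(candidate: list[int], reference: list[int], max_dist: int = 10) -> bool:
    """Flag a match if most candidate keyframes sit near some reference
    keyframe, so trimmed or re-edited copies can still be caught."""
    hits = sum(1 for h in candidate
               if any(hamming(h, r) <= max_dist for r in reference))
    return hits >= max(1, len(candidate) // 2)


if __name__ == "__main__":
    w = h = 8
    dark, light = [10] * (w * h), [200] * (w * h)
    sig = video_signature([dark, dark, light], w, h)  # one scene change
    print(len(sig), matches(sig, sig))                # 2 True
```

Because matching tolerates a few flipped bits per keyframe and requires only a majority of keyframes to agree, a copy that has been trimmed, watermarked or re-compressed can still be flagged, which is the resilience to editing and copying that Farid describes.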

CEP has for years criticized social media companies for not doing more to keep extremist content—including videos of beheadings, bombings and calls to violence—off their platforms. Still, Farid says the CEP study shows the companies are making progress. YouTube’s ability to take down the majority of the extremist videos within two hours means “clearly Google is doing something about the amount of terrorism-related content posted to its platform,” Farid says. “I’m encouraged because they’ve gone from not acknowledging the problem to actively pursuing it.”

A number of social media companies—including Google, Facebook, Microsoft and Twitter—adopted a “fingerprinting” approach similar to eGLYPH as part of the Global Internet Forum to Counter Terrorism (GIFCT), which they launched in June 2017. Instagram, LinkedIn, Snap and others have since joined the coalition, which now numbers 13 companies. The members have created a shared database of terrorist video hashes that any GIFCT member can access and compare with hashes of videos posted to its own site. GIFCT claims the database will include 100,000 hashes by the end of 2018. When a company finds a match to a shared hash, it applies its own policies and definitions of terrorist material in deciding whether to remove the content, according to a Facebook spokesperson. Facebook claims it finds and removes 99 percent of ISIS- and al Qaeda–related content before users report it, thanks to a combination of photo- and video-matching software and human monitors. Google asserts on the GIFCT Web site that 98 percent of the YouTube videos it removes for violent extremist content are identified by the company’s AI software, but the company did not respond to Scientific American’s interview requests.
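GIFCT has not published its database schema or member APIs, so the following is a hypothetical sketch of the arrangement described above: one shared pool of video fingerprints, with each member company applying its own removal policy when an upload matches. The Platform class, contribute function and example hash are inventions for illustration only.

```python
# Hypothetical sketch of a shared hash database consulted by members,
# each applying its own policy on a match. Not GIFCT's actual design.

from dataclasses import dataclass
from typing import Callable

# One shared pool of fingerprints, contributed by member companies.
shared_hashes: set[int] = set()


def contribute(video_hash: int) -> None:
    """A member adds the fingerprint of content it has confirmed as terrorist material."""
    shared_hashes.add(video_hash)


@dataclass
class Platform:
    name: str
    # Each member applies its own policies and definitions on a match.
    removal_policy: Callable[[int], bool]

    def review_upload(self, video_hash: int) -> str:
        # Real systems would use near-duplicate matching (e.g. Hamming
        # distance on perceptual hashes) rather than exact set membership.
        if video_hash in shared_hashes and self.removal_policy(video_hash):
            return "remove"
        return "allow"


# One platform removes on any match; another might route matches to review.
strict = Platform("strict-example", removal_policy=lambda h: True)
print(strict.review_upload(0xD1CE))  # "allow" -- nothing contributed yet
contribute(0xD1CE)
print(strict.review_upload(0xD1CE))  # "remove"
```

The design mirrors what the Facebook spokesperson describes: the database only supplies the match, while the removal decision stays with each company and could just as easily trigger human review as automatic deletion.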

A lot of video still manages to slip through the cracks and onto social media for several reasons: the massive volume of uploaded content; terrorists’ ability to disguise the nature of their posts; and more recent efforts to disseminate video footage via links to largely unmonitored tools such as Google Drive. YouTube—the world’s second-most visited Web site, behind Google—receives 300 hours of uploaded video every minute. And terrorist groups often upload YouTube videos as “unlisted,” meaning the videos do not appear in search results and can be accessed only by viewers who are given the link, according to Rita Katz, executive director of SITE Intelligence Group, a Washington-based nongovernmental organization that tracks global terror networks.

Despite the emergence of new apps and sites for sharing video content, YouTube, Facebook, Twitter and other large social media platforms are still the most important to monitor because that is where aspiring terrorists get their “first taste of ISIS propaganda, ideology and narrative,” says Seamus Hughes, deputy director of The George Washington University’s Program on Extremism. Those companies are improving their ability to find new content as it is posted, but they still have a lot of old video to clean up after taking a largely hands-off approach for the past decade, Hughes says.
