Until now, YouTube had resisted calls to take down videos posted on its site that were seen as promoting terrorism.
But this weekend, the world’s biggest video-sharing website changed tack. From now on, users can flag content which, in their eyes, counts as terrorist propaganda.
“Promotes terrorism” joins a list of already-flaggable categories, including nudity, sexual activity, acts harmful or dangerous to others, and hatred or violence against a “protected group” (ethnicity, gender, disability, sexual orientation, etc.).
Prior to the move, Google (YouTube’s parent company) had faced a crescendo of criticism, with many, especially in the US, saying it had become a mouthpiece for radical imams.
In early November, YouTube removed hundreds of videos of US-born Yemeni imam Anwar al-Awlaki.
Among others, US Congressman Anthony Weiner wrote to YouTube demanding that such “hateful” videos, which posed “a clear and present danger to American citizens”, be removed.
According to the British government, it was online footage of Awlaki preaching that inspired 21-year-old student Roshonara Choudhry to stab MP Stephen Timms, who had voted in favour of the Iraq war in 2003, earlier this year.
Freedom of speech
No surprise then that YouTube’s decision was welcomed by a number of US politicians.
Independent Senator Joe Lieberman called it a “good first step” but added that he would prefer to see YouTube do the screening itself rather than leave it to web users.
YouTube has firmly ruled out that option. The company, which for a long time refused to censor content in the name of freedom of speech, says that screening videos before they are published is not viable: 24 hours of content is uploaded to YouTube every minute.
But relying on the vigilance of web users may not be the best solution either. Nudity or animal cruelty can be identified objectively onscreen; judging what “promotes terrorism” is a far less obvious task.
In an interview with the Los Angeles Times, law professor Jeffrey Rosen described YouTube’s move as “potentially troubling”.
Unlike the other criteria, he explained, the terrorism category is “more subject to interpretation than the longstanding language in the YouTube guidelines”.