In an August 1 blog post, YouTube announced an update to their efforts to stop extremist propaganda videos, including ISIS recruitment content, on the platform. The company is implementing machine learning systems to detect and remove offensive content faster and more efficiently in the following ways:


1. Speed and Efficiency


The machine learning system is faster than any team of humans could ever be. According to YouTube, of the videos they've taken down in the last month, over 75 percent were removed before receiving a single human flag.
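
YouTube hasn't published how this pipeline works internally, but the basic idea of classifier-first moderation can be sketched. The following is a minimal toy, not YouTube's system: the thresholds, keyword scoring, and function names are all invented for illustration.

```python
# Hypothetical sketch of classifier-first moderation: score each upload on
# arrival and act on high-confidence detections before any human flag exists.

REMOVE_THRESHOLD = 0.8   # assumed: auto-remove only very confident detections
REVIEW_THRESHOLD = 0.4   # assumed: uncertain cases go to human reviewers

def score_video(metadata: str) -> float:
    """Toy stand-in for a trained classifier: pseudo-probability of violation."""
    banned = ("recruitment video", "join the caliphate", "martyrdom operation")
    hits = sum(term in metadata.lower() for term in banned)
    return min(1.0, 0.5 * hits)

def triage(metadata: str) -> str:
    p = score_video(metadata)
    if p >= REMOVE_THRESHOLD:
        return "removed"        # taken down with zero human flags
    if p >= REVIEW_THRESHOLD:
        return "human_review"   # queued for a human reviewer
    return "published"          # stays up; human flagging still applies

print(triage("Recruitment video: join the caliphate"))  # -> removed
```

The point of the two thresholds is that automation acts alone only when the model is confident; everything in the gray zone still goes to humans.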


2. Accuracy


While YouTube acknowledges that the accuracy of the systems is currently far from perfect, they claim that “in many cases” the system has proved more accurate than people manually flagging offending videos for removal.
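
"More accurate" here plausibly means something like flag precision: the share of flagged videos that reviewers ultimately confirm as violations. The numbers below are invented purely to illustrate the metric; they are not YouTube's figures.

```python
# Toy illustration of flag precision: the share of flagged videos that
# reviewers confirm as actual violations. All numbers are invented.

def precision(confirmed: int, flagged: int) -> float:
    return confirmed / flagged

machine = precision(confirmed=900, flagged=1_000)  # 0.90
humans = precision(confirmed=650, flagged=1_000)   # 0.65
print(f"machine flags: {machine:.0%} upheld; human flags: {humans:.0%} upheld")
```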


3. Scale


Given that 400 hours of content is uploaded to YouTube every minute, it is impractical to expect manual detection and removal of all offending content. But according to YouTube, machine learning has enabled them to double both the number of videos taken down for violent or inappropriate content and the speed at which those takedowns happen.
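
The arithmetic on YouTube's own upload figure makes the point concrete; the eight-hour reviewer shift below is my assumption.

```python
# Back-of-envelope: why manual review can't keep up with uploads.

hours_uploaded_per_minute = 400                                # YouTube's figure
hours_uploaded_per_day = hours_uploaded_per_minute * 60 * 24   # 576,000 hours

reviewer_shift_hours = 8                                       # assumed workday
reviewers_needed = hours_uploaded_per_day / reviewer_shift_hours

print(f"{hours_uploaded_per_day:,} hours/day -> {reviewers_needed:,.0f} "
      "full-time reviewers just to watch everything once")     # 72,000 reviewers
```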


Trusted Flagger


Through their Trusted Flagger program, YouTube is working with a number of NGOs and institutions to “bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists.”


Tougher Standards


The statement also describes tougher treatment of video content that isn't illegal but has still been flagged as violating YouTube policy. This includes hate speech and extremist, religious, or otherwise controversial content. Under the new policies, such videos will be placed in a limited state: they will remain available, but behind an interstitial, and they will not be recommended or monetized, nor will they support comments, suggested videos, or likes.
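
YouTube hasn't said how the limited state is represented internally, but the restrictions listed above map naturally onto a set of feature toggles. The sketch below is a hypothetical model; every field name is invented.

```python
# Hypothetical model of the "limited state": the video stays watchable behind
# an interstitial, but distribution and engagement features are switched off.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class VideoState:
    available: bool = True
    behind_interstitial: bool = False
    recommendable: bool = True
    monetized: bool = True
    comments_enabled: bool = True
    suggestions_enabled: bool = True
    likes_enabled: bool = True

def limit(state: VideoState) -> VideoState:
    """Apply the limited state described in YouTube's announcement."""
    return replace(
        state,
        behind_interstitial=True,  # viewer must click through a warning
        recommendable=False,       # excluded from recommendations
        monetized=False,           # no ads, no revenue
        comments_enabled=False,
        suggestions_enabled=False,
        likes_enabled=False,
    )

print(limit(VideoState()))  # available=True, everything else restricted
```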


Jigsaw Redirect Method


YouTube is also experimenting with early-intervention and counter-extremism work. Under this approach, someone searching for sensitive content on YouTube will be redirected to videos that present opposing views. Ideally, an impressionable teen searching for extremist propaganda will instead find content that debunks and argues against extremist messaging.
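
Jigsaw's Redirect Method has been described publicly as targeting search queries with curated counter-narrative content. A crude sketch of that matching step follows; the term list and video IDs are invented placeholders, not the real curation system.

```python
# Crude sketch of the Redirect Method's core idea: when a search query matches
# known extremist-recruitment terms, lead results with counter-narrative videos.

SENSITIVE_TERMS = {"join isis", "caliphate recruitment", "become a mujahid"}
COUNTER_NARRATIVE = ["defector_testimonies", "imams_against_extremism"]

def rank_results(query: str, organic_results: list[str]) -> list[str]:
    if any(term in query.lower() for term in SENSITIVE_TERMS):
        # Redirect: surface counter-speech ahead of whatever ranked organically.
        return COUNTER_NARRATIVE + organic_results
    return organic_results

print(rank_results("how to join isis", ["video_a", "video_b"]))
```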

While YouTube is making efforts to stop extremist propaganda, the one troubling thing about all this is the possibility of legitimate YouTubers having their videos demonetized and placed behind the interstitial simply because their views are unpopular or controversial. This has potential implications for free speech, and YouTube could see a strong backlash if they don't roll these policies out carefully.