By Heather Brown

MINNEAPOLIS (WCCO) — More than 400 hours of new video is posted to YouTube every minute.

A small but significant slice of it has terrorist content.

In a blog post published Tuesday, YouTube gave an update on its commitment to fighting terrorism online.

So, how do the big companies monitor and remove offensive content?

“Terrorist [video] is a slice of it,” said Mike Johnson, director of the security technologies program at the University of Minnesota. “A lot of the content is suicide videos and murders and other bad actions.”

Internet companies used to flag offensive video primarily by having users tag it. Much has changed in the past few years as the big companies have individually and together worked on removing terrorist video.


In December, Microsoft, Facebook, YouTube (owned by Google) and Twitter announced a shared database of digital fingerprints for terrorist imagery.

The companies agreed to share fingerprints of the egregious content they have removed from their sites so other networks could more easily spot it.
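The idea behind fingerprint sharing can be sketched in a few lines of code. This is a simplified illustration, not how any of these companies actually implement it: real systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive re-encoding and cropping, while this sketch uses an ordinary cryptographic hash that only matches byte-identical files.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Digest identifying this exact file. (A real system would use a
    # perceptual hash that tolerates re-encoding; SHA-256 is a
    # simplification that matches only identical bytes.)
    return hashlib.sha256(data).hexdigest()

# Shared database: fingerprints of content one company already removed.
# The sample bytes below are placeholders for illustration.
shared_database = {fingerprint(b"removed-extremist-clip")}

def should_flag(upload: bytes) -> bool:
    # Another network checks each new upload against the shared list.
    return fingerprint(upload) in shared_database

print(should_flag(b"removed-extremist-clip"))  # True: known content
print(should_flag(b"unrelated-home-video"))    # False: no match
```

The point of sharing hashes rather than the videos themselves is that companies can flag known material without redistributing it.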

In a post issued in June, Facebook published information about its behind-the-scenes work on fighting online terrorism. It says it uses artificial intelligence — for example, matching images and text or detecting fake accounts. It also uses human expertise, including reports from users and terrorism specialists.

Facebook also says it partners with other industry leaders, governmental organizations and anti-terrorist groups to magnify those voices.

In Tuesday’s blog post, YouTube says it has added more human experts, and it claims success from tougher standards and counter-terrorism work.

It said machine learning has also helped the site remove 75 percent of terrorist videos before a single human flagged them.

Still, some leaders say these efforts are not enough. British Prime Minister Theresa May addressed the situation in a speech following a London terrorist attack in June.

“We need to work with allied democratic governments to reach international agreements to regulate cyberspace to prevent the spread of extremist and terrorism planning,” May said.
