MINNEAPOLIS (WCCO) — Facebook is facing criticism because it took the social media site more than two hours to take down video of a shooting death.
On Sunday afternoon, Steve Stephens posted a video of himself killing Robert Godwin.
About ten minutes later, Stephens confessed to the shooting on Facebook Live.
At Tuesday’s Facebook Developers’ Conference, Facebook CEO Mark Zuckerberg said, “We have a lot of work and we will keep doing all we can to prevent tragedies like this from happening.”
So, how does Facebook monitor its content? Daniel Danker, product director for Facebook Live, explained their policy to the Washington Post in March.
“We largely rely on the community to flag live moments that are unsafe or otherwise fall outside of our community standards,” Danker said.
Facebook says that when a post is reported by a user, a team of real people reviews it, 24 hours a day, seven days a week.
The vast majority of reports are reviewed within 24 hours, and most within just hours. The reviewers are located across the globe and speak more than 40 languages. They are native speakers, so they can understand the true meaning of the words they review.
When it comes to Facebook Live, the same review policies are in place, but Facebook says it also monitors all live videos once they reach a certain level of popularity — flagged or not.
“It’s particularly challenging of course for [Facebook Live] because there’s no time to react, it’s happening as you see it,” Danker said.
Facebook already uses algorithms and artificial intelligence elsewhere on its platform, and says it is working to apply that technology to monitoring its content.
Last year, Facebook briefly took down the iconic image of a naked girl running from napalm bombs during the Vietnam War, saying it violated the site's community standards. When the image was republished, Facebook said it recognized its history and global importance.
According to Ravi Bapna, a professor at the University of Minnesota's Carlson School of Management, creating algorithms for text and images is easier than creating them for video.
He believes the tech giants will be able to master it, but says they aren't quite there yet.
“There’s kind of a balance between the company policing itself and the crowd policing itself,” Bapna said. “The way Facebook runs is centralized, and I think they will have to be more decentralized, where the crowd gets more power in policing itself.”