Facebook (NASDAQ:FB) is reviewing how it handles offensive content after an American man posted a video of himself allegedly committing a murder on the site.
On Sunday, 37-year-old Steven Stephens posted a video declaring his intention to kill someone and, later, a second video of the alleged murder.
Around 10 minutes later he broadcast a video using the Facebook Live function, in which he discussed what he had done.
Facebook removed Stephens' Facebook page around 20 minutes after being notified and around two hours after the videos were uploaded.
Facebook's vice president of global operations, Justin Osofsky, said in a blog post yesterday that the company's content review process was flawed and that steps were being taken to improve it.
"As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible," she wrote.
Facebook has a network of thousands of contractors across the world who review content that users have flagged as violating Facebook standards.
Live-video reports are handled by a small team of contractors in San Francisco who are reportedly notified to review a video when it reaches a certain number of concurrent views.
Experts say finding a technological solution is near impossible because software still cannot recognise what is happening in a given video.
Sarah T. Roberts, an assistant professor of information studies at the University of California, told the Wall Street Journal a similar tragedy could be broadcast in the future.
"Because these processes cannot be easily and reliably automated—particularly those videos that are running as live streams—there is no reason to think that people will not continue to find terrible ways to use the platform,” she said.
"The question that I have is why these consequences were not adequately weighed before the rollout of the Facebook Live tool"