James Delahunty
24 Oct 2013 17:02
Facebook has changed its policies on violent video content once again, this time amid a row over a video showing a woman being decapitated.
Facebook had ignited a storm when it refused to remove a video showing a masked individual decapitating a living woman, despite multiple complaints about the nature of the content. One of the complainants was the Australian police force.
As the story spread, Facebook added a content warning to the video and said it would work to make users more aware of the nature of questionable violent content.
However, that wasn't enough to satisfy critics, including British Prime Minister David Cameron.
"It's irresponsible of Facebook to post beheading videos, especially without a warning. They must explain their actions to worried parents," Cameron tweeted. Of course, Facebook didn't "post" the content itself as the tweet suggests, but when made aware of it being available on its site, it certainly was Facebook's problem then.
Other critics questioned why it's acceptable to post an extreme video showing a vile act of murder in the real world, and yet a picture of a woman with fully exposed breasts - non-violent imagery of body parts possessed by roughly half the human race - would be removed upon receipt of a complaint.
In the end, Facebook removed the video and pledged to take a more comprehensive look at the context of violent videos posted in the future.
"First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence. Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience. Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it."
Still, Facebook does make a good point about some of the graphic content found on its service, such as videos documenting terrible human rights abuses, acts of terrorism, and so on. These videos are posted to raise awareness of real-world events and may be of genuine public concern. Posters of such material typically condemn the content shown.
Facebook's challenge now is to identify why such material is being shared: if it is posted for sadistic pleasure or to celebrate violence, it will be removed.
See Facebook's full press release at: newsroom.fb.com/Fact-Check