Facebook tightens rules for live video service after Christchurch killings

#3 of 84 articles from the Special Report: Democracy and Integrity Reporting Project
Facebook is rolling out a new “one strike” rule for its live video service following the New Zealand mosque killings, banning users for a period of time after their first violation of company policy.

Guy Rosen, vice president of integrity, announced the change to Facebook Live in a post on the company’s website on May 14.

Facebook has been reeling since March 15, when a white supremacist used the platform to livestream his massacre of 51 Muslims at two mosques in Christchurch.

“Today we are tightening the rules that apply specifically to Live. We will now apply a ‘one strike’ policy to Live in connection with a broader range of offenses,” Rosen wrote.

“From now on, anyone who violates our most serious policies will be restricted from using Live for set periods of time — for example 30 days — starting on their first offense.”

The change means that, in theory, the company will be less tolerant of Facebook Live users who violate its Community Standards. Before the switch, Facebook's practice was to take down specific posts for first offenders, only blocking a user if they kept posting problematic content, Rosen explained.

Rosen's post does not mention additional restrictions on the offending user's other Facebook activity or comments. “Our goal is to minimize risk of abuse on Live while enabling people to use Live in a positive way every day," he wrote.

The move comes on the heels of Facebook’s ban of white nationalism and white separatism. The company said March 27 it was adding those to an internal blacklist it calls “Dangerous Individuals and Organizations."

National Observer has documented how far-right influencers have been able to evade Facebook’s ban by repackaging content under a different brand.

Facebook’s announcement also comes as several international leaders from governments and tech companies gather in Paris to participate in a summit convened to stamp out sources of terrorism and violent extremism online.

[Embedded tweet: http://twitter.com/JustinTrudeau/status/1126239475628171265]

Prime Minister Justin Trudeau will be at this summit alongside other leaders, including French President Emmanuel Macron and New Zealand Prime Minister Jacinda Ardern.

The Christchurch shooter’s livestream was pulled down relatively quickly, but the content was copied and posted elsewhere online. Facebook removed over a million different versions of it over the 24 hours following the shooting.

Rosen acknowledged that the “proliferation of many different variants of the video of the attack” was a challenge for the company, particularly since people were sharing edited versions “not always intentionally.”

As a result, Facebook is partnering with the University of Maryland, Cornell University and the University of California, Berkeley, he wrote, in order to “research new techniques” to detect "manipulated media" and “distinguish between unwitting posters and adversaries.”
