Police share 'shooting' video with Facebook to help identify live-streamed attacks

Facebook contacted the Metropolitan Police in the wake of the Christchurch mosque terror attack in March.


Police have carried out the first in a series of training exercises designed to help Facebook build technology to detect mass shootings on its website.

The exercise took place yesterday in Kent, with a counter terrorism officer playing the role of a gunman.

The training was filmed from the gunman's point of view, before the footage was sent to Facebook.

It will be used to train artificial intelligence systems that Facebook says will be able to detect and automatically remove live-streamed firearms attacks.

Image: Fifty-one people died in the Christchurch mosque terror attack

Commander Richard Smith, head of the Met's Counter Terrorism Command, said Facebook got in touch with the force after the Christchurch mosque shooting in March.

"Facebook reached out to the Met as we have worked with them on numerous occasions before to remove online terrorist propaganda.

"The live-streaming of terrorist attacks is an incredibly distressing method of spreading toxic propaganda, so I am encouraged by Facebook's efforts to prevent such broadcasts.

"Stopping this kind of material being published will potentially prevent the radicalisation of some vulnerable adults and children."

The force will give its footage to the Home Office, so that it can be shared with other tech companies to develop similar technology.

Erin Saltman, counter-terrorism policy manager at Facebook, described the challenges of dealing with this kind of content.

"I look at terrorism content every single day and I can tell you that the Christchurch attack went more viral, more quickly than any one piece propaganda I've ever seen," she said.

"We saw over 800 different variations of that video being shared aggressively, where you saw teams of people trying to manipulate the video and re-upload.

"With that sort of aggressive tactic, we're having to look at a myriad of different tools and expertise and partnership, like with the Met Police, in order to get ahead of this type of threat."

The move is the latest measure from Facebook to regulate the use of its live-streaming feature in the wake of the Christchurch attacks, in which 51 people died.

Image: Brenton Tarrant is charged with murder in relation to the Christchurch mosque attacks

In the aftermath of the shooting, Facebook faced heavy criticism for its lack of response to New Zealand officials, who demanded strong action to prevent a repeat of the incident.

New Zealand's privacy commissioner said the firm's silence was "an insult to our grief".

In the last two years, Facebook claims to have removed more than 26 million pieces of content related to terrorist groups such as Islamic State and al Qaeda.

It has since expanded the use of those techniques to what it calls "a wider range of dangerous organisations", including white supremacist groups, banning more than 200 of them.
