On Sunday, Prime Minister Scott Morrison urged social media companies to ensure their platforms could not be used by terrorists to share these videos, warning that once they “get out there” it was difficult to stop them spreading.
“Some real discussions have to be had about how these capabilities can continue to be offered,” Mr Morrison said.
He said social media companies had been cooperating with authorities but had been limited “on the technology side”.
Opposition Leader Bill Shorten said in a press conference, livestreamed on Facebook, that social media companies had an “obligation to better monitor and prevent hate speech”.
“Their whole business model is to tell their customers [advertisers] that they know everything about their social media users. Well, if that’s your business model, fair enough, but you can’t go missing and not know what they’re saying when it comes to hate speech,” Mr Shorten said.
He said social media businesses were trading on a “commercial dynamic” of liberty and couldn’t go missing when hate speech perverted that liberty into violence.
“We wouldn’t allow television or print media to put some of the filth and rubbish and violence and perversion that gets put out on social media, so we can’t just have one standard for old technology and give a leave pass to new technology,” he said.
New Zealand Prime Minister Jacinda Ardern told a press conference that she would look at “discussing” with Facebook whether livestreaming should be stopped.
UK Home Secretary Sajid Javid told the Daily Express that online platforms had a “responsibility not to do the terrorists’ work for them” and said tech companies had to do more or face legal ramifications.
A Facebook spokesman was unable to guarantee to this masthead that there were no copies of the video still on the platform. Users continue to try to upload edited and altered versions of the footage.
Facebook removed 1.5 million videos of the attack in the first 24 hours. Most were blocked at the point of upload, but about 300,000 were taken down only after being posted.
Facebook New Zealand director of policy Mia Garlick said in a statement that the platform would “work around the clock to remove violating content … using a combination of technology and people” and all edited versions of the video would also be removed.
The digital giants are already under scrutiny as part of a world-first inquiry into their impact on media companies and advertising revenues.
It is unclear whether there will be another inquiry into the social media giants on this issue. Australian Competition and Consumer Commission chairman Rod Sims said the existing inquiry’s terms were very broad, but the “appalling livestream is under a different heading”.
David Vaile, co-convenor of UNSW’s Cyberspace Law and Policy Community and Australian Privacy Foundation chairman, said this was a “tragic and inevitable result” of social media giants not being held to account in the past.
“If the idea of being famous among the hard right racist supremacist mob is the goal, then nothing comes close to livestreaming on Facebook for an instant non-editorial non-checkable method,” he said.
“It is tragic it has taken something of this scale to get people to notice,” he said. “We’re finally getting a wake-up call.”
Traditional television companies have also come under scrutiny over their handling of the footage, with the regulator receiving several complaints. Sky News Australia chief executive Paul Whittaker said the local live news feed for the New Zealand broadcast was withdrawn and switched to sports coverage, but not because of the distressing video (which did not include any shootings or victims).
He said the decision was made because the coverage could compromise investigations, and that the network stopped running the vision locally on Saturday morning, “unlike other networks”.
Jennifer Duke is a media and telecommunications journalist for The Sydney Morning Herald and The Age.