17 Minutes on Facebook, Forever in People’s Minds
(Commentary by Johnathen Tan)
In a haven for civility like New Zealand, the mass shooting at two Christchurch mosques came as a shock to us all. It struck me when I was least prepared for it, just casually app-hopping on my phone. Undeniably, social media and technology played a conspicuous and disturbing role in publicising this act of terrorism.
The world is more social media-oriented than ever: we rekindle long-distance friendships, discover job opportunities, buy and sell stuff. I could go on and on. But what is really special about it is that social media gives everyone a voice. We can post and share anything we want.
But this matter transcends any ordinary offensive or derogatory post on your feed. It involved the gruesome taking of innocent human lives, fuelled by a hate-filled ideology. The white supremacist streamed himself on Facebook Live for seventeen whole minutes as he carried out the attack on the mosques in Christchurch.
Facebook deleted the video an hour after it finished streaming. But even after the deletion, snippets of the video kept spreading like wildfire across other social media platforms, particularly YouTube and Twitter.
On YouTube, the video was accessible by searching the obvious keywords, like “New Zealand” and “shooting”. Even as I was scrolling through Twitter for the latest news updates on the shooting, I couldn't escape posts containing snippets of the Facebook livestream, which seemed to appear in every third tweet.
Facebook and YouTube are content-sharing platforms known for their automated moderation of harmful and extreme content, using algorithms to combat the spread of copyrighted material, pornographic content and anything else that violates the sites' policies. But are these automated systems powerful, precise or fast enough to catch inappropriate videos, especially ones that are livestreamed?
The social media giants have stated that they have concentrated their efforts on curbing explicit and terrorist videos by deploying AI algorithms and collectively hiring thousands of employees.
However, these algorithms and methods have proven to be flawed, as this incident and multiple past instances make apparent. Videos that have been tweaked only slightly, such as by changing the resolution, adding a watermark or distorting the audio, have slipped past the detection systems.
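To see why such minor tweaks are enough, here is a minimal sketch (illustrative Python only, not the actual systems Facebook or YouTube run): if uploads were matched against a blocklist of exact file fingerprints, then a re-encode, watermark or audio distortion, which changes the underlying bytes, would produce a completely different fingerprint and the copy would go undetected.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return an exact cryptographic fingerprint of the raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the original banned video, reduced to raw bytes.
original_video = b"frame-data-of-the-banned-video" * 1000

# A re-upload that has been "tweaked a little bit": one extra byte here
# stands in for a re-encode, watermark or audio distortion.
tweaked_video = original_video + b"\x00"

blocklist = {fingerprint(original_video)}

print(fingerprint(original_video) in blocklist)  # True  -> exact copy is caught
print(fingerprint(tweaked_video) in blocklist)   # False -> tweaked copy slips through
```

Platforms reportedly rely on perceptual hashes and machine-learning classifiers rather than exact fingerprints, which tolerate small changes better, but as this incident showed, determined re-uploaders can still edit their way past them.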
Anyhow, here's my take on this. If you are going to be the organisation that runs the most heavily populated social networking website in the world, earning big bucks from advertising revenue and wielding that much influence, you have to be capable of taking full control of your platform and acting on where the regulatory lines lie.
For further reading, explore the news article by Newshub below.
___________________________________________________________
Christchurch Terror Attack: Helen Clark Slams Facebook Livestream 'As Bad as it Gets'
Helen Clark. Photo credit: The AM Show
The former NZ PM says the global policy boss for the online behemoth has contacted her saying he wants to visit NZ, following an angry backlash against the platform over its livestream of a mass terrorist murder at a Christchurch mosque. Toby Manhire reports.
Helen Clark has joined the chorus condemning Facebook and other online platforms after the terrorist attack that took 50 lives in Christchurch. Facebook had turned into a "monster", said Clark.
Asked for her response to the Facebook livestream of the murders in two Christchurch mosques, Clark was aghast. "Seventeen minutes of a killer doing his business. A mass murder. This is unthinkable. Is no one watching anything? I mean, really, unbelievable. Unbelievable."
Facebook had demonstrated an inability to self-regulate, she said, and following the atrocity in Christchurch, New Zealand could emerge as a trailblazer.
"I think countries are going to want governments to act. Ideally you'd have action at a global level, but negotiating treaties and conventions takes a very long time. It is an area that perhaps New Zealand could innovate in. New Zealand's got the world's attention right now for the wrong reasons. And for the good reason that the response has been appropriate.
"I think we'll be looked at with great interest about what we do in gun reform and around this issue in dealing with the publishers of outrageous material."
When Clark was in charge of New Zealand's security services, white-supremacist terrorism "wasn't an issue", she said.
"I think it's gained traction with social media. It's one of those monstrous elements that wouldn't have been seen. These people have been around, right? But they didn't really have a way to propagate their views… It's facilitated their interaction and linkages with each other. Just as it has for Isis and others."
Clark is no knee-jerk social media cynic. She is a voracious user of just about every platform imaginable.
"I love it. I see the potential to connect, to put ideas out there, make a comment, use your thought leadership position. It's got huge potential for good. But we'd like it to be eliminating the harm," she said.
The silence to date of Mark Zuckerberg, founder, chairman and CEO of the multibillion-dollar company, was hardly surprising, Clark said.
"He's very slow to say anything, whenever any of these issues arise. In the scheme of things, this is as bad as it gets. But there have been other issues such as Cambridge Analytica, and various other outrages which he was silent on for a long time."
Mark Zuckerberg. Photo credit: Reuters
Clark told The Spinoff that she had, however, heard directly from Nick Clegg, a former UK deputy Prime Minister and now head of global policy and communications at Facebook.
"He wants to come. My advice to him was not to come in the grief phase. We're all in the grief phase. But he is rational and can be helpful. So a visit from him is anticipated," she said.
Clegg has been approached for comment.
Speaking to The Spinoff following the launch of The Helen Clark Foundation, a new think-tank based at AUT for which the former PM will act as patron, she said: "When I sit back objectively and look at it, social media, like Facebook, which started as a way of people connecting with their friends and family, quickly became a monster. No one understood the potential of the technology for harm. Now it's like trying to shut a door after the horse has bolted. Trying to put a software patch on things where some quite fundamental reform is needed."
Clark pointed to the effect on US democracy, in which Facebook has become a tool to distribute misinformation in the interests of foreign actors.
"It is tragic and deeply ironic that an open society like the US, that can spawn these innovative platforms, ends up getting them used against it, by Russian troll factories, by the potential for fake news advertising that subverts democracy and trades off a lack of public information… So I think there is a bit of a monster there," she said.
"The quick approach would be to look at how responsible regulation can occur. We have regulation of other media. But we're not seeing social media step up on self-regulation. And I'm really gobsmacked that platforms which can quickly pick up that you have a preference for this or that or the other and can then follow you relentlessly with advertising, cannot pick up that people are broadcasting hate, violence, calls to action to do extreme things. I think the regulators have to move in here, and New Zealand wouldn't be alone in wanting that."
Facebook was culpable, too, said Clark, in spreading false information about, for example, vaccination.
"There are so many people getting their news off the unmoderated platforms these days. We have a measles outbreak in New Zealand. Well, thanks. Thanks, fake news, for making this possible."
In her speech to parliament on Tuesday, Prime Minister Jacinda Ardern addressed the role of Facebook and other social media directly. "There is no question that ideas and language of division and hate have existed for decades, but their form of distribution, the tools of organisation, they are new," she said.
"We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher. Not just the postman. There cannot be a case of all-profit, no-responsibility."
The backlash in New Zealand has grown through the week, with New Zealand's top telecommunications company executives urging the bosses of Facebook, Twitter and Google to take action, advertisers cutting ties, and a KiwiSaver fund manager dumping shares in Facebook.
Facebook has said it will review its livestreaming service.
As of early Friday New Zealand time, a week after the racist massacre that killed 50 Muslims at worship, streamed by the murderer live on Facebook, Mark Zuckerberg was yet to say anything about what happened.
___________________________________________________________
Image Source: Facebook