‘A perfect platform’: Internet’s Abyss Has Become A Far-Right Breeding Pit

Studies have shown, time and again, that the algorithms employed by YouTube and Facebook push conspiracy theories and far-right propaganda into the feeds of users.

By Christopher Knaus
The Guardian (3/18/19)

No depth goes unplumbed on the far-right forum 8chan. Its threads reveal a seething, toxic mass of rabid antisemitism, neo-Nazism, Islamophobia, gratuitous violence, coded inside jokes and conspiratorial ravings published by anonymous users.

Nothing has changed in the days since the alleged Australian gunman Brenton Tarrant, 28, came to 8chan boasting of the imminent massacre in Christchurch. Posts have since praised Tarrant as a “hero” and called for copycat attacks, or, alternatively, denounced him as a pawn in a false flag conspiracy.

8chan, which describes itself as the “darkest reaches of the internet”, is just one of a series of online forums populated by the extremist right. Studies in the UK suggest far-right forums are growing in number, giving a bigger platform to violent, racist messaging.

It raises the question: why are users flocking to the internet’s abyss? And what role do such forums play in radicalising the far right, alongside newspapers, broadcasters, YouTube, Twitter and Facebook?

There are no simple answers to such questions. The Griffith University terrorism expert Geoff Dean has spent a large chunk of his career attempting to understand these forums, their role in radicalisation and the ways security agencies can identify high-risk individuals – those most likely to move from rhetoric to action. “It’s about all I think about,” he told Guardian Australia.

The starting point for coming to far-right online groups, whether fringe forums or those on Facebook, is an attraction to extreme views, he says. A tendency to view the world in black and white, and an inability to consider opposing views. …

Read the Rest

(Commoner Call cartoon by Mark L. Taylor, 2019. Open source and free for non-derivative use with link to www.thecommonercall.org)

*****

Facebook: A Digital Gangster Destroying Democracy

The report accuses Facebook’s chief technology officer, Mike Schroepfer, of giving a statement to parliament about Russian interference that “we now know … was simply not true”.

By Carole Cadwalladr
The Guardian (2/17/19)

Facebook is an out-of-control train wreck that is destroying democracy and must be brought under control. The final report of parliament’s inquiry into fake news and disinformation does not use this language, precisely, but it is, nonetheless, the report’s central message. And the language it does use is no less damning.

Facebook behaves like a “digital gangster”. It considers itself to be “ahead of and beyond the law”. It “misled” parliament. It gave statements that were “not true”. Its CEO, Mark Zuckerberg, has treated British lawmakers with “contempt”. It has pursued a “deliberate” strategy to deceive parliament.

In terms of how lawmakers across the globe need to think about Silicon Valley, the report is a landmark. It is the first really comprehensive attempt by a major legislative body to peer into the dark heart of a dark economy of data manipulation and voter influence. And to come up with a set of recommendations that its chair, the Conservative MP for Bournemouth, Damian Collins, says must involve “a radical shift in the balance of power between the platforms and the people”.

Need for immediate action

The scale of the report – it drew from 170 written submissions and evidence from 73 witnesses who were asked more than 4,350 questions – is without precedent. And that is what makes its conclusions so damning: that the government must now act. That Facebook must be regulated. That Britain’s electoral laws must be rewritten from the bottom up; the report is unequivocal that they are not “fit for purpose”. And that the government must now open an independent investigation into foreign interference in all British elections since 2014.

Cambridge Analytica was already on the committee’s radar when the scandal broke in March last year. But, over the ensuing weeks and months, it interviewed an extraordinary cast of characters to drill down into the underlying machinery of the new political power structures. And the result – a doorstopper of a report covering multiple interconnected issues – damns Facebook not just once or twice but time and time again. …

Read the Rest

  • Mark Zuckerberg, Four Days On, Your Silence On Christchurch Is Deafening — In New Zealand we’re waiting to see if the all-powerful Facebook boss means what he says about ‘moral responsibility’… Read the Rest

*****

Online Hate Threatens Us All. Platforms Can & Must Do More To Eradicate It

By Dan Hett
The Guardian (3/20/19)

Like so many, I was shocked to the core by the recent killing of 50 Muslim worshippers in New Zealand. As I absorbed the news, my thoughts – for reasons I will shortly explain – turned to the technology that is so closely linked to the atrocity. And let me say this clearly: major platforms such as YouTube and Facebook are a primary and active component in the radicalisation of, mostly, young men.

As a software engineer, I know extremist content can be curbed. After Christchurch, it’s more urgent than ever.

These organisations counter that they aim to take down content that violates their rules swiftly, and are increasing resources for efforts to identify and remove dangerous material before it causes harm. But clearly this isn’t enough. And by not doing enough to police their platforms, they risk being complicit in innocent lives being violently cut short. It is within their power to remove extremist content and users from their platforms, and they’re failing to do so in any meaningful way. Crucially, this is not caused by insurmountable technical problems.

I write these words with a high degree of confidence, speaking as an experienced software engineer who has spent much of his career writing similar code for dozens of large companies. More importantly, though, I write them with a heartfelt and burning concern, as the brother of Martyn Hett, who was killed in the 2017 attack on the Manchester Arena at the hands of a young man who was radicalised in part by the content and people he connected to online. …

Read the Rest

*****

‘First Dog On The Moon’ Cartoon: Why Can Twitter & Facebook Block Nazi Content In Germany But Not In The Rest Of The World?

They are choosing to actively shelter Nazi hate on the internet.

Link to Investigative Cartoon