Evolution of a Cyber Cascade

Cyber Cascade: “A massive snowball-effect of information that gets spread around without knowing anything about the truthfulness of the information.”

I have been back in the States this time since September 2017. As a bicultural citizen of both Austria and the USA, I have to admit that before I made the jump across the Atlantic, what seems like an eternity ago but was only roughly 2.5 years, I was not keeping current on stateside issues at all. Nope, I was busy picking up some pieces over there, and I was actually a bit in disbelief that in Austria such a seemingly unqualified person as Sebastian Kurz, who lost a no-confidence vote last year amid a corruption scandal, became Federal Chancellor.

Change channels to the States: it is arguable that at this moment in time – early March 2020 – opinions have never been shared publicly online in such a polarized way in the history of the internet. Through the medium of Facebook, for example, sharing one’s own opinion, be it correct and fact-based or not, is simply a matter of gaining access, typing text and posting.

The problem is that people are becoming quite good at “making a story sound believable”, even when what they are posting is speculation or flat-out incorrect. Unsubstantiated posts number in the millions, cognitive dissonance, the holding of two or more conflicting beliefs at once, seems to be becoming routine, and people are believing such stories at an all-time rate even though facts may be available to counter them.

This is even an accepted norm in some groups.

Taking Facebook as an example, it seems that highly opinionated groups have mastered the art of creating a single group voice and have grown accustomed to having no tolerance for dissension or for newcomers with differing opinions. There are groups and there are pages, and both offer the option of barring members or subscribers, respectively, from starting a post, allowing them only to add on to an existing one, while posts that run strongly contrary to the prevailing beliefs of the group are often deleted.

Going back to the very act of collecting what we like to assume is “credible knowledge”: we go to Google, enter our search string, and expect that whatever comes out as the result is credible. We even trust Google to the point of assuming that the items on the first page, and especially those at the top of that page, have arrived there neutrally and at random, without any outside influence, for our benefit.

This assumption – that Google is handing me an empirical, unfiltered result – has been shown to be grossly wrong. There is indeed an algorithm, alive and dynamic, watching my past search history and reading the cookies on my computer, and it makes decisions about what it thinks I “want to see.” Based on rules built into it, it offers me links – or rather, directs me to links – which may or may not actually serve the end goal of being informed with all of the information available online.
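To make this concrete, here is a minimal sketch, in Python, of how a history-based boost can reorder search results. This is not Google’s actual algorithm, whose details are not public; the page names, topics and weights below are invented purely for illustration.

```python
# A minimal sketch (not Google's real ranking system) of personalized search:
# each candidate page gets a base relevance score plus a boost when its topic
# matches something in the user's past search history.

from dataclasses import dataclass

@dataclass
class Page:
    url: str
    relevance: float   # how well the page matches the current query (0..1)
    topic: str         # a crude one-word topic label

def personalized_rank(pages, history_topics, history_weight=0.5):
    """Order pages by base relevance plus a boost for topics the user
    has searched before; a larger history_weight means a tighter bubble."""
    def score(page):
        boost = history_weight if page.topic in history_topics else 0.0
        return page.relevance + boost
    return sorted(pages, key=score, reverse=True)

pages = [
    Page("neutral-crime-statistics.example", 0.80, "statistics"),
    Page("extremist-blog.example", 0.55, "extremism"),
]

# With no history the neutral page ranks first; after a few visits to
# similar sites, the extremist page floats to the top.
print([p.url for p in personalized_rank(pages, set())])
print([p.url for p in personalized_rank(pages, {"extremism"})])
```

Even in this toy version, the same query produces a different first result for different users, which is exactly the point: the ranking reflects who is asking, not only what is asked.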

No, it is often NOT the case that I will get a nice variety of links based on my search string.

After watching the video about Dylann Roof that was linked under a lesson in this course, I became more aware of this.

Cass Sunstein, in “The Daily We”, states: “With a dramatic increase in options, and a greater power to customize, comes an increase in the range of actual choices. Those choices are likely, in many cases, to mean that people will try to find material that makes them feel comfortable, or that is created by and for people like themselves.”

The case of Dylann Roof, who murdered nine African Americans at a prayer meeting in South Carolina in 2015, shines a harsh light on how a cyber cascade can polarize a user. One theory is that at the time of Roof’s search there had already been many searches for white supremacist sites, so a query like “blacks murdering whites” would logically and statistically return a flood of white supremacist hits, reinforcing his weak prejudice and hence polarizing his perspective.
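That “snowball” dynamic can be illustrated with a toy simulation, again only a sketch: the three links, the click counts and the 80% “click the top result” assumption are all invented, and no real search engine ranks this simply. The point is only that when prior popularity feeds back into ranking, a small early head start compounds.

```python
# A toy cyber-cascade simulation: links are ranked by prior clicks, most users
# click whatever sits on top, and each click further entrenches that ranking.

import random

def simulate_cascade(n_users=1000, top_bias=0.8, seed_clicks=(5, 0, 0)):
    """Each simulated user sees three links ordered by prior click counts.
    With probability top_bias they click the top link; otherwise they pick
    one of the lower-ranked links at random."""
    clicks = list(seed_clicks)   # link 0 starts with a small head start
    for _ in range(n_users):
        ranking = sorted(range(len(clicks)), key=lambda i: clicks[i], reverse=True)
        choice = ranking[0] if random.random() < top_bias else random.choice(ranking[1:])
        clicks[choice] += 1
    return clicks

random.seed(0)
print(simulate_cascade())   # the link with the early head start ends up dominating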

Roof started with a weak but definite prejudice when he began researching the topic of interracial murder. He was trying to educate himself about the 2012 slaying of 17-year-old African American Trayvon Martin by George Zimmerman, a Caucasian, and, in his own words, to “see what the big deal was” about the slaying.

As reported: “Roof’s radicalization began, as he later wrote in an online manifesto, when he typed the words ‘black on White crime’ into Google and found what he described as ‘pages upon pages of these brutal black on White murders.’ The first web pages he found were produced by the Council of Conservative Citizens, a crudely racist group that once called black people a ‘retrograde species of humanity.’” Roof wrote that he had “never been the same since that day.”

Were these sites clearly and intentionally composed to incite rage and vitriol in the reader who might already have a hateful attitude toward African Americans?

Roof ended up clicking on the links Google provided him: the harshest flagship white supremacist sites, all at the top of his search page.

It is arguable that Google’s algorithm was simply doing its job for Roof, taking his past browsing history and delivering links related to it, but the Southern Poverty Law Center begs to differ.

The problem here is that the combination of Google-user Roof and the links he was browsing altered his view of the full picture of interracial murder, strengthening the roots of his hatred and his contemplation of violence against African Americans, until he eventually decided to take violent action.

This story is not a theory but fact, drawn from Roof’s own testimony about the development of his hate, in which he became so emotionally inflamed that he reached the point of wanting to “start a race war.”

The Southern Poverty Law Center surmises that Roof likely entered into his research without necessarily having his mind made up, and that despite a possibly open mind, which actually seems doubtful, the huge dose of anti-black links he was served may very well have driven him into his rage. The question may have been one of timing: Roof received these results at a moment when searches for such sites were already swelling.

What Roof, with a fragile, impressionable mind, failed to do was engage in critical thinking: to mentally “step back” from the picture these sites were creating in his mind, to consider that the published sources might not be legitimate, or at least to call them into question and recognize that such sites and text were structured precisely to incite the reader, with a crafty and intentional call to racial action.

Upon being contacted, Google claimed its algorithm takes into account how trustworthy, reputable or authoritative a source is. In Roof’s case, it clearly did not.

Let this be a lesson to users of Google or any other search engine: highly polarizing swells of information – cyber cascades – do occur, and what sits at the top of one’s search page may not be the most neutral or reliable source of information, as one would naturally like to assume.

Bibliography:

1) Article: “Polarization and Cybercascades”

Mediated Subjectivity – Politics and Subjectivity in the Networked Public Sphere

By Mollie Ableman 

Posted September 25, 2011

https://mediasubjectivity.wordpress.com/2011/09/26/polarization-and-cybercascades/

2) Video: “The Miseducation of Dylann Roof – How did Dylann Roof go from being someone who was not raised in a racist home to someone so steeped in white supremacist propaganda that he murdered nine African Americans during a Bible study?”

By Southern Poverty Law Center

3) Excerpt: Cass Sunstein on Group Polarization and Cyber-Cascades

From “The Daily We”, which appeared in the Boston Review, Summer 2001

Sunstein_on_Group_Polarization_and_Cyber-Cascades.pdf