The erosion of the American relationship with facts by the ‘attention economy’

Temple University Japan Campus — Thesis Project

David Cortez
Jun 21, 2018

April 27th, 2018

What information consumes is rather obvious: it consumes the attention of its recipients. — Herbert A. Simon[1]

— — — — — — — — — — — — — — — — — — — — — — — — — —

The voices of experts who fear what the internet and mass media are doing to our human psyche and to our democracies have been ramping up over recent decades. Their concern is that the constant siphoning of our attention into an online world — a world cluttered with endless streams of contradicting information — can disrupt the critical thinking faculties necessary for a healthy society.

I argue in this piece that this thesis is not only correct, but that it is clearly and presently manifest in our society today.

Following the 2016 election of Donald Trump, the wider media seemed to be tiptoeing toward a similar conclusion. Headlines reading, “Facebook Says Social Media Can Be Negative For Democracy,”[2] or “Social Media is Making Us Dumber,”[3] were regularly being churned out on almost every newsroom floor. Article after article discussing social media’s role in the presidential election was published ad nauseam, suggesting that the aforementioned fears may have actually materialized.

This ad nauseam nature of the online press is the very thing I will be indicting here — but only insofar as the online press operates under the same business model as any web-based business. It’s the incentive structure between money and information technology, and the effect it has on media, that I want to explore in this piece. Specifically, I want to address the downstream effects this relationship might have on humanity. I will argue that “human attention” is the resource that online businesses compete for, and that this competition has deleterious effects on society’s ability to parse information reliably.

If money is the motivator for companies to compete for our attention, it’s no leap of the imagination that the expansion of information access since the rise of the internet would be exploited to make us more willing to give our attention away. My contention is that this exploitation is a Pandora’s box that is wounding the American relationship with facts.

A necessary detour: the problem with bullshit

The Oxford English Dictionary entry for Truth is as follows:

· TRUTH — That which is true or in accordance with fact or reality.[4]

This appears perfectly reasonable, albeit circular, as it uses a cognate of the very word it is trying to define. But it also demands two further definitions: Fact and Reality.

The Oxford entries for these two terms are as follows:

· FACT — A thing that is known or proved to be true.[5]

· REALITY — The state of things as they actually exist, as opposed to an idealistic or notional idea of them.[6]

The definition of Fact gives us some insight into Truth because something cannot be a fact unless its truth is proven. In other words, the provability of something is closely tied to its truth. But this does little to help us understand what Truth actually is, tangibly. It only tells us that facts, in order to be facts, are contingent on Truth.

In the definition of Reality, however, we really have something. It describes an opposition to idealism or mere notions of what is real. With this definition, one can begin to shape a better characterization of what Truth is. For something to be true, it seems, it must operate within the constraints of reality, and not within notions or ideals. Reality is the arbiter of Truth.

Philosophers have long understood that if you want to stand in opposition to idealism and mere notion (i.e. get closer to Truth), you need a rubric by which to establish what is real — and by extension what is not. Thus, the scientific method slowly grew to become humanity’s best attempt at finding what is real and what is, for lack of a better word, “bullshit.”

In his book, “On Bullshit,” philosopher Harry G. Frankfurt defines the term in the following way:

Describing a certain state of affairs without a genuine submission to the constraints which the endeavor to provide an accurate representation of reality imposes. [7]

In simpler terms, bullshit is produced when something is stated as fact without caring if indeed it is accurate at all. Frankfurt also goes on to point out an important distinction between outright lies and what he means by bullshit. He explains that individuals or even groups can express something that is not on its face a genuine lie, but is far more sinister:

The concept most central to the distinctive nature of a lie, is that of falsity. The liar is essentially someone who deliberately promulgates a falsehood. … This is what accounts for its nearness to bullshit. For the essence of bullshit is not that it is false, but that it is phony.[7]

From this it is understood that bullshit does not have to be simply false to be harmful to those seeking truth; it has to be phony. Frankfurt says that a phony reality is not necessarily inferior to a genuine one — except in its authenticity — and that something not genuine is not thereby defective (as in the case of an exact copy). Because the phony can so closely resemble the genuine, bullshit can be more easily confused with genuine Truth than an outright lie typically can.

The consequences of being caught delivering a slice of bullshit are much less severe than those of being caught in an outright lie, simply because any plausible connection to reality their phony statements have can give the accused an “out” (i.e. plausible deniability). This is what purveyors of bullshit see as its utility, and it is why so much bullshit abounds throughout human societies. People can get away with not “endeavoring to provide an accurate representation of reality” because they can claim some tenuous connection to reality, softening the blow and performing a bit of sleight of hand regarding their motivations.

But what does this have to do with a discussion about the internet and mass media, you might ask?

The answer: everything.

When gauging the health of a particular society, one measure would surely be how much bullshit its citizens are exposed to — or, conversely, how little Truth they find themselves grounded in.

I contend that parsing out what is bullshit is at the heart of the problems facing the American democracy today.

Bullshit runs the market

The desire to maximize user attention on the web has fostered an arms race in incredibly sophisticated marketing techniques that are powerfully designed to keep us coming back to the web.

This arms race for our attention is what technology ethicist Tristan Harris calls the “attention economy.” Businesses such as Facebook, Twitter and Amazon — as well as the news media — are all competing for our attention inside a “virtual city.” In much the same way as businesses operate in a physical city — replete with billboards, posters and ads designed to drive city-goers into making certain purchases or spending time in certain areas — these online businesses use classic persuasion techniques, well understood for centuries, to win your attention inside the virtual city.

These techniques are the ones you have come to know and love: flashing lights, dopamine drips, habit-forming actions, gamed incentives, keeping you in your comfort zone, Pavlovian conditioning, gambler’s-fallacy hooks and so on…

What this tells us is that businesses in the virtual age no longer need to rely on potential customers randomly passing by a swath of town where they have placed an enticing ad in order to get their attention. Now, they can reach them right in their own pockets, and once they have you, they can deploy methods to keep your attention with military precision.

The growing immediacy and addictive quality of web-related products and devices is not an accident of technology. It is technology designed with the express purpose of maximizing your usage. Your attention has become the market’s most prized possession, and the technology is the pickpocket that takes it away.

Harris goes on to explain the insidious nature of this relationship between technology, attention and online businesses:

People don’t realize that technology is not neutral. The best way to get attention is to know how people’s minds work. You can push some buttons so you can get them to not just come [to your platform], but to stay as long as possible. Because of the link that more of your attention or more of your time equals more money, [businesses] have an infinite appetite for getting more of it. Time on site is the currency of the tech industry. The only other industry that measures users this way is drug dealing.[8]

The malignant part of all this is that in this marketplace, social media and big data companies have pushed the persuasion game to a whole new level. According to Tristan Harris, companies like Facebook, Twitter and YouTube are using technology that is running an auction for your eyeballs 100,000,000 times a second — constantly asking itself, “what is it that you want?”

What completes this auction is merely an algorithm — a simple fragment of mathematics that is not inherently evil, but is incredibly powerful and is the entire basis for the economy that exists online today. The ability of code to learn what you want, serve you more of it, and thereby keep you on a specific site or application is truly the paradigm shift of our age.
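To make the incentive concrete, here is a minimal toy sketch of such an algorithm. Every name, weight, and data item below is invented for illustration — no platform’s real code looks like this — but the core incentive is the same: the ranking function scores content by predicted engagement, and truthfulness simply never enters the calculation.

```python
# Toy illustration of an engagement-maximizing feed ranker.
# All scoring weights and items are hypothetical; the point is only
# that the objective function contains no "is this true?" term.

def predicted_engagement(item, user_interests):
    """Score an item by how likely it is to hold this user's attention."""
    topical_match = len(item["topics"] & user_interests)  # confirms existing interests
    outrage_bonus = 2.0 if item["provocative"] else 0.0   # dramatic content scores higher
    return topical_match + outrage_bonus                  # note: item["true"] is never consulted

def rank_feed(items, user_interests):
    """Order the feed by predicted engagement, highest first."""
    return sorted(items, key=lambda it: predicted_engagement(it, user_interests), reverse=True)

feed = [
    {"id": "sober-report", "topics": {"politics"}, "provocative": False, "true": True},
    {"id": "viral-rumor",  "topics": {"politics"}, "provocative": True,  "true": False},
]
ranked = rank_feed(feed, user_interests={"politics"})
print([it["id"] for it in ranked])  # → ['viral-rumor', 'sober-report']
```

The false-but-provocative item wins the auction purely because it is more engaging — which is the dynamic the rest of this piece is concerned with.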

Given this, I contend that handing your attention over to an algorithm fosters a cognitive environment that erodes incredulity and skepticism.

Relying on algorithms to give us “more of what we want” and not “more of what is true” is why — for example — there are portions of the American population that think survivors of the Stoneman Douglas High School shooting are actually so-called paid crisis actors. This is because the auction for your attention is more likely to be won by providing you with information that confirms your political biases and comforts your emotional needs — regardless of whether it is planted squarely in the realm of bullshit. My great claim in this piece is that the algorithms designed in the attention economy will mindlessly work to give users more of the sort of content that leads to beliefs about children being paid as crisis actors in the aftermath of a mass shooting than of the sort that is plain old true.

Our problem is that bullshit runs the marketplace.

After reading this far, one might ask, why blame the algorithm and not the creators of the dubious or misleading content? My answer is that, in the past, fringe or implausible beliefs had less access to large platforms and had to spread memetically with far more effort. But today, such beliefs can be presented to users whenever an algorithm deems them useful for keeping those users engaged with an online product, and the memetic spread can be amplified to a massive degree by exponential and lightning-fast sharing of the content.

It’s more than the invisible hand of the market; it’s something new. It is tech designed to grip you, with the end result of rapidly proliferating echo chambers filled with bullshit. I cannot absolve the bullshitter, but he or she is made vastly more harmful by the attention economy’s pet algorithms.

Dismantling the counterargument

One objection here might be that science and education — as they always have — should act as a protection against bullshit, artificially maximized or not.

While this might be true at face value, my response to this is to argue that the efficacy of science and education are perilously undermined in the current technological environment for the following reason:

There is a critical mass — an information overload — making it hard for citizens to know what is truth and what is phony. Tribalistic thinking tends to be the default in such scenarios. Scientific or logic-based messaging simply cannot keep up.

Because the sheer volume of information on the web — be it fact or phony — often keeps reliability and objectivity just out of arm’s reach, many citizens find themselves unable to tell what online information is true and what is false. As a result, these citizens tend to select their own truths along moral or tribal lines rather than on the basis of objectivity — an obvious problem for creating a healthy society.

When explaining this instinct towards moral or tribal judgment calls, American psychologist Jonathan Haidt consistently reminds readers in his book “The Righteous Mind” that “intuitions come first, strategic reasoning second.”[9]

Given that judgments are themselves produced by the non-conscious cognitive machinery in the brain, sometimes correctly and sometimes not, human beings produce rationales they believe account for their judgments. But the rationales are only ex post rationalizations.[9]

This is to say that our judgments about the truth of a claim are first run through the mammalian “wetware” in our brains, and on that journey they do not always stand up to logical scrutiny. We construct rationalizations for these judgments, but only after they have come into contact with our evolutionarily sculpted intuitions and moral baggage. When faced with uncertainty, our innate moral frameworks guide our decision making far more often than logic does.

The critical point for my thesis is that if we are maximizing uncertainty in society, we are incentivizing tribalism.

Haidt goes on to explain that evolution has imbued us with social pressures that compel us to craft rationalizations for our judgement in order to preserve our reputation, which in the modern world is made breathtakingly easy by the internet:

Now that we all have access to search engines on our cell phones, we can call up a team of supportive scientists for almost any conclusion twenty-four hours a day. Whatever you want to believe about the causes of global warming or whether a fetus can feel pain, just Google your belief. You’ll find partisan websites summarizing and sometimes distorting relevant scientific studies. Science is a smorgasbord, and Google will guide you to the study that’s right for you. [9]

In this age of information saturation, science does not have the voice of authority and certainty it needs to stamp out our tribal thinking.

The problem compounds once the proverbial well of scientific objectivity is poisoned. In this environment, those with malicious purpose can use the muddy waters to disseminate unscrupulous information for their own gain, and the internet helps them do this with ease.

In an article written for Project Syndicate in February 2018, former U.N. Secretary-General Kofi Annan wrote, “The internet and social media provide another battlefield, it seems, for the surreptitious manipulation of public opinion.”[10] This manipulation can happen because the ability to educate one’s self — or follow any tried-and-true “anti-bullshit” method — becomes drastically diminished when the same space we put our trust in to retrieve facts also works to actively spread misinformation, ultimately allowing for confusion between the two.

This is the core of what I mean when I claim that there has been an erosion of our relationship with facts. Consider my claims of information overload and its malicious exploitation as two separate waves moving across the sea. When the crests of these two waves meet, the resulting amplitude thrusts society into an environment where anything from the melting of the polar ice caps to how many individuals attended the inauguration of President Trump seem to have perfectly contestable answers.

There is in fact a correct answer to each of these questions, one that, referring to our working definition of truth, “operates within the constraints of reality.” And yet, this is meaningless to those in their tribal echo chambers. Science and education will always struggle to overcome tribal thinking, but the prospect seems all the more daunting when tribalism is backed by the power of the attention economy: the internet and its unthinking algorithms.

‘Fake News’ and social media

What of the ethical journalist working to ensure that accurate information can be reliably accessed by citizens? Is journalism not the self-appointed fourth member of the ring of checks and balances in a modern democracy, making sure that malicious actors are exposed?

The answers to these questions reveal yet another victim in all this. The attention economy has damaged our vital mechanisms for spreading verified facts.

The concept of pushing false or misleading news stories is nothing new, but the ubiquity of these stories, and the ease with which they can be convincingly faked and disseminated, are by-products of the attention economy and its fight for content virality — the most coveted achievement in the new marketplace. Axiomatically speaking, a story cannot go viral unless the media is involved, and the news business (fake or otherwise) is also subject to this fact.

What kind of news goes viral?

Author Ray Williams in Psychology Today explains that, “our brains evolved in a hunter-gatherer environment where anything novel or dramatic had to be attended to immediately for survival. Many studies have shown that we care more about the threat of bad things than we do about the prospect of good things. Our negative brain tripwires are far more sensitive than our positive triggers.”[11] This means that whether a story is true or false, it simply needs to be sufficiently provocative to spark our animal brain’s capacity to key in on the dramatic. And so it bears repeating, bullshit is often indistinguishable from genuine truth, and particularly dramatic bullshit has the right ingredients to go viral.

But this virality cannot come simply across the airwaves or the printed page; it needs the infrastructure of the attention economy. In other words, it’s not traditional media that is to blame when something false rides the viral train. It is social media’s use of the dreaded algorithm to spread fake news that “quality journalism” finds so difficult to stamp out completely.

On top of this, deliberately or unknowingly, the attention economy has created a situation where bullshit is not only more readily spread, but it can be more readily believed. By taking advantage of what Jonathan Haidt explained as our “ex post rationalization” tendency, as well as what Ray Williams demonstrates as our penchant for reacting to dramatic news, those crafting malicious fake news stories can reliably achieve virality via social media’s algorithms as throngs of users vulnerable to unwarranted credulity spread the information.

To put it bluntly, the New York Times, LA Times, Washington Post and dozens of other newspapers did in fact accurately publish the results of the 2016 presidential election, yet nearly half of Republican voters believe that Donald Trump won the popular vote.[12] This kind of credulity can only come from citizens thinking tribally about false information fed to them by social media algorithms — information that then goes on to live an endless life of sharing, liking and upvoting. Needless to say, relying on ethical journalists to curb any and all public misunderstanding in this information technology age is more or less a pipe dream.

With science and education weakened in their ability to sharpen our critical thinking skills, we now have a situation where journalism, too, is less effective than ever in reliably equipping our population with a positive relationship with facts.

Political cause for concern

To abandon facts is to abandon freedom. If nothing is true, then no one can criticize power, because there is no basis upon which to do so. If nothing is true, then all is spectacle. The biggest wallet pays for the most blinding lights. — Timothy Snyder[13]

Dr. Timothy Snyder has written a timely book called “On Tyranny: Twenty Lessons from the Twentieth Century.” In it, he details how authoritarian regime changes follow a pattern that starts with subtly picking off important institutions in order to slowly take power out from under the noses of the people. Not surprisingly, journalism is one of those important institutions targeted.

Snyder argues that a weakened state of journalism is “good news” — pun intended — for the authoritarian-minded because it is an institution that traditionally functions as a check on their power. So, with the attention economy severely hindering journalism’s ability to educate the masses, its ability to keep the powerful accountable is equally hindered.

According to Reporters Without Borders, the global free press is in “bad shape.”[14] More and more countries are ranking lower and lower on information freedom indices. Taking a cue from Snyder, I contend that it is easier than ever for power-seekers to distort facts and clamp down on good journalism. They can capitalize on the masses of citizens who have been anesthetized by the attention economy’s robotic propagation of misleading and outright fake information. When President Trump thrust his pointer finger toward CNN’s Jim Acosta and bloviated the phrase, “you are fake news,” it represented exactly what Timothy Snyder is saying about the targeting of important institutions by those in power.

For Snyder, there is a direct connection between our current situation and the authoritarian regimes of the mid 20th century:

Fascists despised the small truths of daily existence, loved slogans that resonated like a new religion, and preferred creative myths to history or journalism. They used new media, which at the time was radio, to create a drumbeat of propaganda that aroused feelings before people had time to ascertain facts. And now, as then, many people confused faith in a hugely flawed leader with the truth about the world we all share. Post-truth is pre-fascism.[13]

“Post-truth” is the academic way of referring to a degraded relationship with facts, a relationship that has historically been made weaker by the exploitation of new and not-completely-understood technologies by power-seekers. Just as radio was a tool for disseminating fake news and propaganda in the past, the internet now is playing a commanding role in this current era.

Echo Chambers, radicalization and the ‘Daily Me’

As obviously dangerous as propaganda is for the health of a society, it is not the only political concern that arises from the unintended consequences of the attention economy. Here we must bring echo chambers fully into the discussion, but instead of focusing on their connection to bullshit, we must focus on their politically destructive consequences.

In 1995, MIT technology specialist Nicholas Negroponte prophesied the emergence of “the Daily Me.” With the Daily Me, he suggested, you would not rely on the local newspaper to curate what you saw, and you could bypass the television networks. Instead, you could design a communications package just for you, with each component fully chosen in advance. [15]

This is the opening passage of the first chapter of Harvard Law professor Cass Sunstein’s recent book “#Republic.” What Sunstein aims to show in the book is that Nicholas Negroponte’s prophecy has come true, and with devastating effect. The book outlines how algorithm-driven social media sites aid us in curating only the information we want, resulting in those aforementioned echo chambers. This capacity to unknowingly exist in a “Daily Me” echo chamber essentially takes our ex post rationalization tendency to its politically destructive zenith.

This political destruction is seen most fiercely in how echo chambers undermine the quality of debate.

Jonathan Haidt says, “We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.”[9] We use ex post rationalization to conjure up persuasive rhetoric to get people on our side.

In most everyday cases, arguments with fallacious reasoning can simply be classified as “healthy debate,” and where they lead should be rather harmless. It’s not an existential issue to be wrong about something and chat about it with your peers. Likely your peers will try to point out any flaws in your reasoning.

But, according to Sunstein, when we are in an echo chamber, we are surrounded by people that do not require us to produce adequate reasons for our judgment, and this creates a very dangerous situation. We are not having effective debate when we are in an echo chamber, meaning that fallacious reasoning is less likely to be pointed out and we are less likely to be exposed to arguments that go against our comfortable, tribal thinking.

Sunstein’s conclusion is that the ultimate by-product of such echo chambers is radicalization, which can and often does lead to violent extremism. When people are only exposed to certain ideas rather than the full gamut of human thought, radicalization often follows.

Sunstein says that “in a well-functioning democracy, people do not live in echo chambers or information cocoons,” and by implication, in an unhealthy democracy this sort of situation becomes rampant.[15] We now know that echo chambers, also referred to by information technologists as “closed networks,” are ubiquitous online — existing on the dark web, in 4chan chat rooms, or on other types of private forums. But echo chambers can exist on “open networks” as well, such as Facebook, YouTube, and Reddit, tucked away in pockets of these platforms and populated by individuals led there by an algorithm simply programmed to help them create their “Daily Me.”

This echo chamber aspect of social media was famously exploited by Russian agents during the 2016 election to “subvert the 2016 election and support Donald J. Trump’s presidential campaign.”[16] They bought ads and posted internet memes “focused on issues like religion and immigration to collect online followers and eventually organize pro-Trump rallies.”[17] We know too that Islamic extremists and white supremacists both employ internet echo chambers for their radicalization/recruitment needs.[18] There is simply no getting around the fact that echo chambers foster radicalization, and there is also no getting around the fact that these echo chambers are being fueled by the attention economy that drives our attention to the web by any means necessary.

If one’s tribe or one’s ex post rationalization leads them to specific content, the algorithm will show them more of what they want and will slowly silo them into an echo chamber.
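This siloing dynamic can be illustrated with a toy simulation — a simple self-reinforcing sampling loop (a Pólya-urn-style model). Everything here is hypothetical: the topics, the reinforcement rule, and the numbers are invented purely to show the mechanism, not to model any real platform. Each click makes the clicked topic more likely to be served again, and exposure narrows over time.

```python
# Toy simulation of the "Daily Me" siloing feedback loop.
# The recommender serves topics in proportion to past engagement,
# and each served item reinforces itself. Purely illustrative.
import random

random.seed(0)  # fixed seed so the run is reproducible

topics = ["A", "B", "C"]
exposure = {t: 1.0 for t in topics}  # start with a perfectly balanced feed

for step in range(200):
    # Sample the next recommendation proportionally to past engagement.
    total = sum(exposure.values())
    pick = random.choices(topics, weights=[exposure[t] / total for t in topics])[0]
    exposure[pick] += 1.0  # the served (and clicked) topic reinforces itself

shares = {t: exposure[t] / sum(exposure.values()) for t in topics}
dominant = max(shares, key=shares.get)
print(dominant, round(shares[dominant], 2))
```

After 200 steps the initially uniform feed has drifted: one topic ends up with a disproportionate share of the user’s exposure, even though no one "chose" to build an echo chamber — the feedback loop did it on its own.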

If you think Timothy Snyder’s three-way link between fascism, President Trump and fake news seems a little hyperbolic, Sunstein’s cautioning about echo chambers may not be hyperbolic enough. If we take these thinkers’ concerns seriously, the true damage that an eroded relationship with facts can cause ends up staring you right in the face.

Admittedly, when I say on one end that information overload contributes to the blurring of facts, and on the other that people are not being exposed to enough information due to time spent in echo chambers, it can sound a bit contradictory. But it should be thought of, rather, as a candle burning at both ends — with huge chunks of the population finding objectivity difficult to determine while simultaneously being funneled into spaces where they can comfortably confirm their tribal instincts and ex post rationalizations. This is perhaps the most dire example of the diminished American relationship with facts, one that tears at the very fabric of our democracy and our ability to coexist.

When it comes to the attention economy, this pressure cooker of bullshit is actually what’s for sale.

Conclusion

American author Kurt Andersen spends the entirety of his book “Fantasyland: How America Went Haywire: A 500-Year History” tracking the long history of the American tendency toward outlandish belief.

He documents the Puritans seeking a place to practice religious extremism, the prevalence of homeopathic cure-alls, new-age religions, postmodern academic nonsense, as well as anti-vaxxers and modern flat earth proponents. Andersen’s exhaustive research leaves the reader feeling that unwarranted credulity is in the very DNA of America. [19]

If this is indeed the case, it is hard not to feel a sense of astonishment at the fact that the United States has made it this far without its own 20th century-esque authoritarian takeover. So, given the current weakened state of journalism, the proliferation of echo chambers, and the degraded American relationship with facts, it is wise to be concerned about the state of our democracy going forward.

“We should never have believed Silicon Valley’s promise that if everybody was connected everything would be awesome,” states British historian Niall Ferguson. [20] I am inclined to agree. I recall the optimism that people had in social media and the internet during the 2010 Arab Spring, yet after all of my research, I am no longer so optimistic.

Of course, it should not be said that the internet is merely an all-powerful tool for social mayhem, and that is indeed not my argument. The internet has an incredible capacity for good. However, its connection to our current political predicament is undeniable, as is what a degraded relationship with facts is doing to the progress of our society.

As we march ever forward towards even more murky technological spaces, with advanced A.I. and robotic technology on the horizon, how can we make sure that the world we create doesn’t further lull us into even more unbounded credulity that threatens our societal foundations?

At the very least, we should examine and understand what a Pandora’s box the adoption of the attention economy’s business model has been for society, and in particular the American democracy. We must do this in order to both encourage the creation of a better informed citizenry and ensure that future technology has our best interests in mind.

References:

[1] H. A. Simon, (1971), “Designing Organizations for an Information-Rich World,” in Martin Greenberger, Computers, Communication, and the Public Interest, Baltimore, MD: The Johns Hopkins Press, pp. 40–41.

[2] Kennedy, Merrit. “Facebook Says Social Media Can Be Negative For Democracy.” NPR, NPR, 22 Jan. 2018

[3] “Opinion | Social Media Is Making Us Dumber. Here’s Exhibit A.” The New York Times, The New York Times, 20 Jan. 2018

[4] “truth, n.1.” OED Online, Oxford University Press, March 2018.

[5] “fact, n.1.” OED Online, Oxford University Press, March 2018.

[6] “reality, n.1.” OED Online, Oxford University Press, March 2018.

[7] Frankfurt, Harry G. On Bullshit. Princeton University Press, 2005.

[8] Harris, Tristan. “Waking Up With Sam Harris #71 — What Is Technology Doing to Us?” Interview. Podcast. YouTube, 19 Apr. 2017. Web.

[9] Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion (pp. 36, 50, 99). Knopf Doubleday Publishing Group. Kindle Edition.

[10] Annan, Kofi A. “How IT Threatens Democracy by Kofi A. Annan.” Project Syndicate, 16 Feb. 2018.

[11] Williams, Ray. “Why We Love Bad News.” Psychology Today. Sussex Publishers, 01 Nov. 2014.

[12] Marcin, Tim. “Trump Voters Believe the President’s False Claims That He Won the Popular Vote.” Newsweek, 26 July 2017.

[13] Snyder, Timothy. On Tyranny: Twenty Lessons from the Twentieth Century (pp. 65, 71, 74). Crown/Archetype.

[14] Erickson, Amanda. “The Free Press Is in Really Bad Shape around the World. A New Report Says Populism Is to Blame.” The Washington Post, WP Company, 26 Apr. 2017.

[15] Sunstein, Cass R. #Republic: Divided Democracy in the Age of Social Media. Princeton University Press. Kindle Edition.

[16] Frenkel, Sheera, and Katie Benner. “To Stir Discord in 2016, Russians Turned Most Often to Facebook.” The New York Times, The New York Times, 17 Feb. 2018.

[17] Shane, Scott. “These Are the Ads Russia Bought on Facebook in 2016.” The New York Times, The New York Times, 1 Nov. 2017.

[18] Manjoo, Farhad. “A Hunt for Ways to Combat Online Radicalization.” The New York Times, The New York Times, 23 Aug. 2017.

[19] Andersen, Kurt. Fantasyland: How America Went Haywire: A 500-Year History (p. 82). Random House Publishing Group.

[20] Ferguson, Niall. “Waking Up With Sam Harris #117 — Networks, Power, and Chaos.” Interview. Podcast. YouTube, 28 Feb. 2018. Web.
