Recently, I posted a blog about “alternative facts.” Since then, I have come upon two more articles discussing why alternative facts are, indeed, believed to be true. It all comes down to who is in charge, to sociability, and/or to our cognitive biases.

In an article entitled “Seeking truth among ‘alternative facts’” by Peter Neal Peregrine, posted on The Conversation website on February 23, 2017, the author, an archaeologist, notes that in science, one looks for the facts before deciding if something is true. But science is not the only perspective one uses to judge whether a fact is true. We also use the “argument from authority.” (Id. at 2.) That is,

“[w]hatever the people in power said was true. That an individual saw or thought or reasoned something different did not matter….” (Id.) Therefore, empirical data carries very little weight when one is arguing against authority. (Id.)

We have all seen this occur. How many times has a parent answered a child’s question “Why?” with “Because I said so”? Isn’t this an argument from authority in its purest form?

Or, to take litigation: experts, who speak from authority, tend to play a key role in whether one wins or loses at trial. Even at mediations, parties will reference what their experts have said or opined in order to convince the other parties that their “alternative facts” are, indeed, the true ones. Sometimes, the experts attend the mediation to lend the most support possible to the set of “alternative facts” that their clients are advocating.

In short, “alternative facts” can stem from authority: because so-and-so said so, it is true. But to another author, “alternative facts” have a more psychological or cognitive underpinning. In “Why Facts Don’t Change Our Minds” (in the February 27, 2017 issue of The New Yorker), Elizabeth Kolbert discusses three books and various experiments that show how irrational our “rational” thinking can be. One example she mentions is “confirmation bias,” or “… a tendency people have to embrace information that supports their beliefs and reject information that contradicts them.” (Id. at 4.) She notes that two researchers, Hugo Mercier and Dan Sperber, call this bias “myside bias,” pointing out that when “…presented with someone else’s argument, [humans] are quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.” (Id. at 6.)

Their research indicated that the ability to reason evolved to allow humans to cooperate:

Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” … Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view proved shrewd when seen from a social “interactionist” perspective. (Id. at 4.)

Referencing another book, written by two cognitive scientists, Steven Sloman and Philip Fernbach, the author notes that, again, “sociability is the key [to] how the human mind functions, or, perhaps more pertinently, malfunctions.” (Id. at 7.) The example used is asking someone to explain how a toilet works. While we all use them every day, do we really and truly understand exactly how one works (assuming one is not a plumber)? When students were initially asked how a toilet worked, each professed to have an understanding. It was only after they were asked to write down, step by step, how one worked that each came to recognize his or her own ignorance. Their own assessment of their degree of knowledge then dropped. (Id. at 8.)

Sloman and Fernbach then relate this to everyday life. On any given topic, each of us tends to believe that we know more than we do. Unless and until we are questioned extensively on the topic, we will convey this belief to others, so that they, too, will tend to think that we (along with them) know more than we do. An experiment conducted after Russia’s annexation of Crimea demonstrated that the farther off base people were in locating Ukraine on a map vis-à-vis Russia, the more they favored military intervention. That is, “As a rule, strong feelings about issues do not emerge from deep understanding…. [O]ur dependence on other minds reinforces the problem.” (Id. at 9.)

In sum, due to our psychological makeup, we tend to believe “alternative facts” are true, whether because someone in authority is stating them, because of sociability, or because of our “myside” (or other cognitive) biases. And the more we try to convince someone that those alternative facts are wrong, the less convincing we will be. “Providing people with accurate information doesn’t seem to help; they simply discount it.” (Id. at 11.)

“Alternative facts” are emotional facts. They are not logical, reasonable, or rational. They are emotional. And, if we respond to them on that level, we just may have better success at overcoming such alternative truths.

Lawsuits and disputes are all about “alternative facts” and “alternative truths.” Simply put, they are emotional … no matter how much the parties protest to the contrary.

…   Just something to think about.

-------------------------------------


Copyright 2021 Phyllis G. Pollack and www.pgpmediation.com. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Phyllis G. Pollack and www.pgpmediation.com with appropriate and specific direction to the original content.