Are we rational? Not!


Links - I have a lot in this post because I feel they are really worth watching and reading ... Fair warning :-)

So much of what we do in life is predicated on Homo sapiens being ‘rational’. Trial by jury, elections, how we buy, etc. All of these mechanisms presume that, when presented with all the known facts, people, being rational, will make the right decision or choice. And ‘rational’ is the key word here. In fact, our whole world is built around this premise.

Yet the media, politicians, and marketing all take explicit advantage of the fact that most people are irrational. And it works, because without conscious awareness, I believe we are all irrational. We are all subject to an unfortunate plethora of cognitive biases. We seem to be innately wired to accept false beliefs. (Please click on the link below, an undercover investigation by a reputable news channel in the U.K., to see an egregious example of this explained by senior executives of Cambridge Analytica.) Note the senior executive when he says it is ‘all about the emotion’ when it comes to influencing people in elections.

https://www.youtube.com/watch?v=mpbeOCKZFfQ

As a species we are prey to a whole host of cognitive biases or illusions – i.e. ways of thinking that convince us we are right when we are actually wrong.

Definition of Cognitive Bias (Wikipedia): A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. ... Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

So, what causes us to be irrational? Although it may be obvious, it bears enumerating what these factors are.

•    Psychological factors
•    Sociological factors
•    Cultural factors

I will reiterate this several times, but it takes a good deal of discipline, self-examination, self-honesty, and effort to rise above these factors to be, in fact, rational.

Some common cognitive biases/illusions: There are many, as you can see from this link.

https://en.wikipedia.org/wiki/List_of_cognitive_biases

By Jm3 - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=57942404


List of 25 Common Cognitive Biases from Charlie Munger – I liked this list as the opinion of Charlie Munger (Warren Buffett’s long-time business partner) on some of the most common cognitive biases.

https://www.allencheng.com/25-cognitive-biases-charlie-munger/

Here are some cognitive biases that I thought particularly germane.

Confirmation bias – Always looking for a way to confirm what you believe (regardless of how much evidence there is against it) – commonly found in areas such as politics, religion, global warming, etc., especially when the beliefs are part of your world view or social acculturation. The more deeply the issue in question is rooted in your world view, the more desperately you will scrabble for any data point that somehow supports what you want to believe, no matter how ludicrously isolated that data point may be.

Irrational escalation – Judging a course of action as less risky, and continuing to commit to it, simply because we have already invested a lot of time or money in it (the sunk-cost effect).

Halo effect – People are more prone to believe someone if they are good-looking or successful, regardless of whether they actually know anything about the subject under discussion.

Positive outcome bias – Always thinking we are more likely to win or succeed than the odds warrant (irrational exuberance, lotteries).

Illusory superiority – Irrational belief in our own ability or intelligence, which seemingly tends to be amplified the more ignorant one is (known as the Dunning-Kruger effect). Please click the YouTube link to see a clip by John Cleese on this subject 😊

https://www.youtube.com/watch?v=wvVPdyYeaQU

All of these effects have been verified in multiple studies by different researchers … and … there is, of course, some amount of controversy about the results of particular studies and biases. But, based on the research I have skimmed in Wikipedia and other sources, there is a vast body of research supporting the idea that these cognitive biases do indeed exist. And I’m sure that you, the reader, will recognize many of them.

https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds - I liked this article on the subject.

In this modern day and age, even fonts can subconsciously affect our judgement.

https://medium.com/design-ibm/how-fonts-influence-what-users-think-of-your-product-238874c593d7

https://opinionator.blogs.nytimes.com/2012/08/08/hear-all-ye-people-hearken-o-earth/?utm_source=slashdot&utm_medium=slashdot&utm_campaign=slashdot

Groupthink … sacrificing our own capacity for critical thought in order to adopt a consensus belief. We don’t simply act as if we believe it; we actually DO believe it. The logical fallacy of ad populum (appeal to popular opinion ... well, everybody else is doing it ...) is intricately intertwined with this.

http://www.psysr.org/about/pubs_resources/groupthink%20overview.htm

So, what can we do about this predilection to irrationality?

As irrational as we may be, logic and critical thinking are the tools that allow us to rigorously analyze what we are thinking. Regardless of our most deeply held beliefs, we owe it to ourselves not to shy away from analyzing them. We owe it to ourselves, those close to us, and the world we live in.

Logical analysis is not innate. But it can be trained. And the first step, as always, is to admit that there is an issue.

And why is it so important? Because, unless you are a complete Luddite and/or a recluse, you are subjected to a barrage of claims for you to believe (or buy) this, that, or the other … every day. Politicians, the media, religious figures, your friends, entities trying to sell you something; the list goes on. You owe it to yourself and the people close to you to be aware of how to evaluate what is right and what is not.

That is not an excuse to act like a jerk to other people. This is a self-defense tool. I have met a lot of people who hold irrational and (to my mind) obviously wrong beliefs. I have pretty much stopped trying to change people’s minds or rub their noses in it. What I will do, on occasion, is try to ask them questions about the validity of the data. That may not seem very different, but I’m not trying to change their minds; I’m trying to get them to look at the world rationally. And attacking someone’s beliefs is a tactic almost guaranteed to fail. So, I try to turn the discussion (not argument) onto ground where we discuss data (not emotions). Like I said, limited success 😊.

I’m far from perfect in this. Taking these courses has made me realize that I have been lazier than I should have been. I have fallen prey to many of these cognitive biases because it was the path of least resistance for me. I hope that I can find the self-discipline to stop being lazy about this, because I think it is not only critical for survival, but also my responsibility as a, hopefully, rational human being.

Common logical fallacies:

I wanted to summarize four common logical fallacies that can be easily seen in the everyday world around us.

•    Circular arguments
•    Begging the question
•    Equivocation
•    Distinction without a difference
 

Circular arguments: Where the conclusion is the same as your premise. You could say that something is A because A. Very commonly used by people who have deeply held beliefs, in politics, etc. For example, in the 2004 United States presidential election, candidate John Edwards said that the states, not the federal government, should decide on the validity of gay marriage. When asked why, his answer was essentially “because that’s the kind of thing I think the states should decide.”
 

Begging the question: This is where the connotative power of language is used to frame a question or concept unfairly, with the intent of leading the listener to choose one side or the other based on the power of the language rather than on evidence. There are so many examples of this that I will name just a few.

Budget Movers – The name appeals to people who want to move and not pay a lot, but there is no evidence that the company in question actually is more economical than other moving companies.

Speedy Oil – The name appeals to people who are short on time and need their car serviced quickly. But again, where is the evidence that they work substantially faster than any other service company?

Abortion – The use of the labels ‘Pro-choice’ and ‘Pro-life’. These are highly emotionally laden labels. After all, who would want to say that they were ‘Anti-life’ or ‘Anti-choice’? The mere use of the label is designed to polarize people.

I’m sure you can think of hundreds of other examples.

Equivocation: This is where the meaning of a critical word or words in the argument changes between the premises. This can be an extremely subtle technique and sometimes hard to identify, particularly if one is not well versed in the English language. For example:

1.    We have a right to vote (Premise)
2.    One should always do what is right (Premise)
3.    Therefore, one should always vote (Conclusion)

In the first premise, ‘right’ refers to a legal right that the person has, whereas in the second premise ‘right’ refers to doing what is morally correct. Concluding that one should therefore always vote is erroneous, because the word ‘right’ does not mean the same thing in both premises.

Distinction without a difference: This is saying two things are different when, really, they are the same. Like saying “it’s not A, it’s A.”


Examples:
•    “I am not a slob, I just don’t clean up as often as I should.”
•    “I’m not a racist, I just think that people from this <group ... fill in the blank> aren’t as intelligent as everybody else”.

So, hopefully, food for thought. Whatever you listen to and watch, just as an experiment, see how many of these fallacies you can spot in everyday life. And fight to rise above them.

 

 
