Tribal Philosophy


I apologize in advance that this may not be the most positive post I’ve ever written, but I believe it is one of the most important.

One of my subscribers sent me a recommendation for a podcast based on the article that I wrote about being rational. It was for the podcast You Are Not So Smart. I only listened to three episodes, but I found it extraordinarily interesting. And very germane to the discussion from last week about being rational.

The podcast author is a gentleman by the name of David McRaney. The podcast episode that I listened to is called Tribal Psychology. I found it both interesting and depressing.

He was interviewing two academics, Lilliana Mason and Dan Kahan.

I am putting in the link to the podcast. It is about an hour long but well worth listening to. It is concise, the people are articulate and easy to understand, and they give example after example of studies and situations which delineate the problem.


I will try however to summarize the message of the podcast.

In the 1970s a psychologist named Henri Tajfel developed Social Identity Theory. This theory states that when people define themselves, a large part of that self-definition is based on loyalty to a group or groups that we belong to. His experiments showed that:

·        Humans will organize themselves into groups extremely quickly, sometimes within seconds

·        They will organize themselves into groups over almost anything

·        Once in those groups they will start indulging in ‘us’ versus ‘them’ thinking

·        There is no minimal baseline for what will start instigating prejudice and discrimination

Even the most minute difference between one’s own group and another is enough to stimulate tribal psychology behavior, i.e., believing that your group is better than the other group, and treating people in the other group badly for no reason other than that they are not part of your group.

Quote from David McRaney:
“After many experiments built on these foundations, social psychologists found that there is simply no salient, shared quality around which opposing groups will not form. And once they do form, people in those groups immediately begin exhibiting tribal favoritism, tribal signaling, tribal bias, and so on. And that’s why this is called the Minimal Group Paradigm. Humans not only instinctively form groups, they will form them over anything — no matter how arbitrary or minimal or meaningless.”

In addition, technology and social media have given us the tools (for lack of a better word) that allow populations to polarize themselves faster than ever before, i.e., to initiate tribal signaling and form groups.

I found this to be an extremely depressing finding. In this day and age, a huge number of people have one of the most powerful tools for education and communication ever created (smart devices + the Internet), and what do we use it for? For clarity of communication and education? No. We use it to play games, create egregious quantities of white noise under the guise of ‘connecting’, and to spread hatred and discontent. Oh ... and to allow entities like Cambridge Analytica to manipulate us effortlessly.

The sad corollary to this finding is that facts seem to be totally irrelevant to groups that are engaged in tribal thinking. If the facts disagree with the outlook of one’s group, then the people in that group will simply interpret the facts in the way that best seems to support their group’s thinking. Or reject them altogether.

The primary example in the podcast was the testimony of James Comey of the FBI in June 2017, about a month after he was fired by Trump. The testimony was regarding Comey’s investigation of possible links between the Trump campaign and attempts by the Russians to manipulate U.S. election results. After many hours of testimony, which was intensely scrutinized and rebroadcast on almost every social media network possible, the outcome was unbelievable.

Again, a quote from David McRaney in the podcast:
“A few days out, Reddit user, “it’s not the network,” summed up the sentiment of everyone watching all this coverage. And this user wrote:
“I learned three things from watching all the TV coverage, Twitter coverage, Facebook pages and Reddit coverage.
One — the Republicans think Comey’s testimony exonerates the president.
Two — the Democrats think Comey’s testimony condemns the president.
And three — politics makes me want to shoot myself in the face with a bazooka.”
Everyone heard what Comey said. Everyone had the same facts, and yet everyone heard him say something different.”

This podcast has articulated, very clearly, what I have been thinking myself for quite some years now: people don’t listen to facts. When everyone is fixated on group loyalty, what room is there for compromise, rational discussion, and factual data?

As I discovered, I have been indulging in a mindset called “naïve realism”. This is the model where people like me, i.e., scientists and fact-based professionals, think that to fix problems the only thing necessary is to make sure that people understand the facts. Give them enough solid information and people will make the right decision. Of course, this never seems to have worked in the past, so I don’t know why I thought that eventually it would. And in fact, this was the source of my inchoate dread: I realized somehow it wouldn’t work. The last few years have beaten my naïve realism into a sodden pulp on a scorching hot pavement. But there is no doubt that is the mindset I used to have. I just couldn’t figure out why it was wrong.

And I always ask myself, and other people, why don’t people listen to facts?

This podcast gives you a very good answer as to exactly why that is: allegiance to one’s ‘tribe’ is more important than anything else. And I, with a subdued but ever-present sense of angst, find that a sad comment on the human race.

I do understand that we have evolutionary reasons for this type of thinking. It is more beneficial to be part of a group than to stand alone, and that still has some bearing in the modern world. But to indulge in it mindlessly? Can’t we raise ourselves above that?

I have a few concluding thoughts and/or comments.

I do indulge in quite a bit of self-examination, not from ego, but because I am genuinely interested in why I don’t seem to be affected to any great extent by this tribal psychology. Make no mistake, I do indulge in tribal thinking to a certain degree. However, I believe that I am in general aware of it when I do and fight it when I can (if required). This is one of the reasons that I had such a mindset of naïve realism. I think; I listen to facts; if I am genuinely interested in something I will go out and try to find the rationale behind it and make my decision accordingly, etc. Why can’t others do that? Now I know why.

And, in professions that rely upon facts, such as medicine and science, people do in fact engage in fact-based thinking. Thankfully so, or otherwise we wouldn’t have made the scientific advancements that we have.

But as soon as one ventures outside of these professions into the arenas of politics and (as far as I’m concerned) journalism, all fact-based thinking gets supplanted by tribal signaling and tribal psychology. [Tribal signaling is where people signal to each other their adherence to the group. Can you say Twitter or Facebook, neighbor?] The podcast makes the point that once an issue becomes politicized, thinking goes straight down the tribal path, i.e., rational thought goes out the window.

So, how can we fight this? I am condensing from what Lilliana Mason and Dan Kahan say:

·        You must fight to make yourself aware of what you are doing

·        We should exploit scientific curiosity as a remedy to tribal thinking

·        We need to try to restore an environment in which scientific communication is the norm and not the exception.


How do you do this with politicians? No idea. After all, they instinctively and/or purposely rely on this tribal thinking to gain and maintain power. Why would they want to go the way of scientific discourse? They rely on tribal signaling with sound bites and rabble rousing to STOP people from thinking. When people hear this drivel that politicians spew … you know that tool I mentioned above? They could fact-check this stuff in a heartbeat. Do they? Evidently not. Or they see the facts and then interpret them in a way that suits them.

Mark Twain - "Get your facts first, then you can distort them as you please."

And by the way I should mention, as did Dan Kahan, that most people simply don’t know that much about science. And how can you, in this day and age? The amount of knowledge and science in the world today is beyond comprehension. But it is not a matter of being an expert in every area of science. It is a matter of being willing to acknowledge that, whether one is an expert or not, the scientific method is the way to look at the things in question. Disciplined observation and scientific rigor of thought. Questioning should always triumph over blind adherence to groupthink.

The Scientific Method

As I said in the last article, it isn’t a matter of trying to change people’s minds on an issue, but rather trying to spread the word on a better way to think. It sounds patronizing and absurd when I write it that way, but I can’t think of another way to put it.

A small addendum from my Formal Logic class. I think these are quite relevant when trying to combat Tribal Behavior. This addresses the logical fallacy of irrelevance. There is a class of logical fallacies termed diversionary fallacies – where instead of attacking the proposition and arguments, one attacks the arguer. The general term is ‘ad hominem’ – Latin for ‘to the man’. When I lay these out, I think you will see that these are common when the tribal thinking mode of thought is dominant.

Arguments should be evaluated strictly according to the strengths or weaknesses of the premises, regardless of who presents the arguments. Attacking the source of the argument is to commit an ‘ad hominem’ fallacy:

You’re a Jerk version – Discarding the argument because the person presenting it is … well … a jerk from your point of view. Not relevant to the argument itself.

Guilt by Association – Discarding the argument because you don’t like the group the arguer is associated with (sound familiar in the context of tribal thinking?)

Tu quoque (but you do it too) [a variation of guilt by association] – Best shown by example: ignoring the argument to stop smoking because the person telling you this smokes 4 packs a day. The fact that they do has no relevance to the strength of the argument that smoking is bad for you.

Focusing on the motivations of the speaker – It is possible that the person putting forth the argument may profit in some way if the argument is true. But again, that has no relevance to whether the argument is true or not.

Post Script: Tribal behavior, i.e., having a small group of people that provide mutual support, isn’t a bad thing. It is in fact a very good thing. It provides a lot of benefits. I am part of a small tribe myself and I would be bereft without it, loner though I tend to be. But I would like to hope that I have the discipline not to just go along for the ride because my tribe says ‘That’s the way we think.’ … then again, I have an extremely high opinion of my tribe, and they are all highly functioning individuals who take nothing for granted. I can’t think of a higher compliment in this context.