Ludwig Wittgenstein famously described language as an interconnected assemblage of language games that make up a world-picture. A world-picture is the set of assumptions, norms, and grounds that a community holds as certain, and against that background certain propositions in the language games the community employs will be either true or false. While I somewhat disagree with Wittgenstein’s conclusion that the truth criterion of a proposition is its proper usage within a language game, rather than the proposition’s correspondence with reality, I think his analysis gives a good framework for examining the epistemic disunity in the culture of the west.
By epistemic disunity I mean the loss of agreed-upon certainties. In the past, certain things were agreed upon by the majority of people within a community and a civilization: in the west, nearly everyone believed in an orthodoxy centered on Christianity, while the Middle East had Islam and the Far East had Buddhism. This gave everyone within a civilization a shared metaphysics and a shared overall ethics. Then, within smaller communities, there were more localized certainties – taboos, superstitions, socioeconomic realities, and so on.
In the west, the Protestant Reformation and the scientific revolution shattered this epistemic unity. The Enlightenment was a secular project to bring things back under control under the banner of reason. It worked, for the most part, in giving people certainties on which to base other beliefs – if two people accepted some things as certain (e.g. the earth is a sphere that revolves around the sun; all people are capable of reason, and reason is capable of mediating disputes; knowledge of the external world is possible; and so on), then other disagreements were only matters of opinion, not disputes about the very nature of reality.
In our modern times, this epistemic unity has begun to crumble. Different granular bubble cultures have caused people living within close geographic proximity to diverge in their certainties. Civilizations are composed of communities that hate each other. This hatred is largely a result of the different communities taking different things to be certain: the radical left, for instance, takes Postmodernist Critical Theory as axiomatic, while the radical right buys into things like QAnon and believes that Trump is a purveyor of truth.
When our certainties – the grounds of our beliefs – are challenged by external arguments or evidence, they must be either assimilated, accommodated, or rejected. In Piaget’s theory of cognitive development, a new bit of information either has to be assimilated into one’s mental model – the new information is altered so as to conform to what one already believes – or accommodated – one’s mental model is altered so as to fit the new information. A third option, obviously, is to reject the new information altogether; when the rejection actually strengthens the original belief, this is sometimes called the backfire effect.
Radical differences in certainties seem to result, more often than not, in the backfire effect rather than in assimilation or accommodation of new information. For instance, when a proponent of critical race theory and a proponent of white nationalism are both confronted with evidence that racial minorities are over-represented in cases of police use of force, the former will have their biases confirmed while the latter will, without justification, reject the statistics (perhaps by assuming bias in the statisticians). On the other hand, when both are confronted with statistics showing that black people are disproportionately both the perpetrators and the victims of violent crime – a disparity that would itself predict more confrontations between police and black people – the roles are likely reversed. The point is that people take different things to be certain – to be those facts that are presupposed and on which all other knowledge is based – and therefore come to wildly different conclusions when faced with the same arguments or evidence.
One wonders whether there may even be distinct sets of rules present in different minds. Do two people within different world-pictures have the same truth conditions or rules of inference? Do they share the same referents for object-words? Saul Kripke, in his reading of Wittgenstein’s rule-following paradox, imagined a civilization in which there were no agreed-upon rules for language. In this civilization, it may be by pure chance that, to an outside observer, there have been no disputes, even though the rules present in people’s heads are completely different. When person A asks person B what x + y equals, the concept of plus may be different in person A’s head than in person B’s head.
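Kripke’s point can be made concrete with a small sketch. The threshold of 57 is Kripke’s own example (his “quus” function); the function names and the Python framing are mine:

```python
# Two rules that agree on every case either speaker has ever
# encountered, yet are genuinely different rules.

def plus(x, y):
    # Person A's rule: ordinary addition.
    return x + y

def quus(x, y):
    # Person B's rule (Kripke's "quus"): addition, except when
    # either argument reaches a value neither speaker has ever
    # computed with before, in which case the answer is 5.
    if x < 57 and y < 57:
        return x + y
    return 5

# Within the shared history of small numbers, the two rules are
# indistinguishable from the outside:
assert all(plus(x, y) == quus(x, y)
           for x in range(57) for y in range(57))

# The first novel case reveals the divergence:
print(plus(68, 57))  # 125
print(quus(68, 57))  # 5
```

Until the novel case arises, no dispute ever surfaces, which is exactly the “pure chance” agreement the thought experiment describes.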
Or consider a person who takes an umbrella when going for a walk: if you ask why they are taking an umbrella, they will justify it by saying “it might rain soon.” Unstated in this are the assumptions that it is undesirable to get wet and that umbrellas protect a person from getting wet in the rain. But if these unstated assumptions differ between people – if person A and person B understand the world by means of different unstated assumptions – then will justificatory statements mean the same thing to different people? If person B operates under the assumption that getting wet is always desirable, then person A’s justification for taking an umbrella – “because it might rain soon” – will seem absurd to person B. This sort of difference in unspoken assumptions may be present in our political discourse.
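The same structure can be sketched as a toy model (the predicate and premise names here are my own illustrative choices, not anything from the philosophical literature): the stated reason is identical, but whether it counts as a justification depends on an unstated background premise.

```python
# A stated reason only justifies an action relative to unstated
# background assumptions. Here the background is a single premise
# about whether getting wet is undesirable.

def justifies(stated_reason, background):
    # "It might rain soon" justifies taking an umbrella only for
    # someone who also holds the unstated premise that getting
    # wet is undesirable.
    return (stated_reason == "it might rain soon"
            and background["getting_wet_is_undesirable"])

person_a = {"getting_wet_is_undesirable": True}
person_b = {"getting_wet_is_undesirable": False}

print(justifies("it might rain soon", person_a))  # True
print(justifies("it might rain soon", person_b))  # False
```

Both people hear the very same sentence; only the background premise, which is never spoken aloud, determines whether the justification succeeds or seems absurd.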
And so, is it possible that, within the west, our epistemic disunity is causing the rules of conceptual understanding to diverge, such that, even when two people utter the same words on the surface, vastly different rules in their heads inexorably alter each person’s understanding of the world? If so, are ideological opponents simply talking past each other?