Working with local history organisations has got me thinking about the crossover between group psychology and epistemology. If we are terribly honest with ourselves, historians – amateur and professional alike – are, like most human beings, petty empire-builders. It’s not that we are tyrants; most of us are not, and we still value and act upon other things – particularly, for historians, scholarship. But, attitudinally, most of us have a patch of scholarship, or some other work or play, that is our own, and we find ourselves threatened by what others are doing in the neighbourhood. Sometimes we lash out at, or snub, things and persons we don’t understand.
There are a number of psychological theories which might help explain why we, or maybe just I, don’t seek further understanding. It seems to me, observing many persons, that the intellectual short-sightedness of any particular person is often obvious to the persons around them. However, those other people do not always arrive at that understanding either – sometimes what the particular person has missed escapes their attention as well. I wonder what I have missed. Among the psychological theories are cognitive dissonance and self-perception. A number of other theories may also shed light on behaviour in this context, such as cost-benefit analysis and the free-choice paradigm. What strikes me in the debates are two observations:
- People actively avoid situations and information likely to increase cognitive dissonance – the discomfort of holding contradictory beliefs, ideas, or values, or of dealing with new information that conflicts with those existing beliefs, ideas, or values.
- People do not think much about their attitudes, let alone whether those attitudes are in conflict. They can come to conclusions as observers with little (or no) emotional or intellectual (cognitive) reflection.
Nevertheless, in spite of an over-emphasis on rational process or agency, there are basic elements of reasoning and choice which ought not to be ignored. Ideas on action and motivation may also provide some illumination. We continue to make ethical judgements. We can identify a person’s motivation as malice, intellectual laziness, or just plain ignorance. These are what Bernard Williams called thick concepts – terms in which we combine, at the same time, our valuations and the facts of the matter in language. We see it in others because we think and feel in the same way. It is an inescapable part of our socialization. It is also a fact about how the brain is “wired”. The big mistake of the ancients, and one we continue in modernity, is to emphasize the difference between emotion and reason.
Neuro-psychology is also a very important tool, but in describing brain function the explanation never escapes language, and language, like any perception or psychological state, operates only from the position of the conscious being, and hence from inside the norms of psychology. We don’t observe the brain from outside a brain. Although we believe that we have a better, scientific understanding of our attitudinal states, we cannot be reductive in the way we live our lives. Ordinary and common social psychology and ethical judgements still come into play. What is needed is a fuller or more substantial consideration of our emotive judgements, including the cognitive state involved in seeking objectivity.
So far the view here has been towards a wider consideration of our attitudes. Something else is required. Unless we are global skeptics, there is a special attitude at work when we say we know something. It has become fashionable, following post-modernist reasoning, to abandon all endeavours in epistemology – the effort to show that what we claim as knowledge has a special status among beliefs. However, from the fact that radical post-modernity is forced, in the end, to reason itself into existence, we can see a plain and vicious contradiction, one it cannot leap out of through faith. After being burnt once by hoodwinking liars and fools, people learn to evaluate reasoning. Our brain is wired to learn the difference between good and bad reasoning, and socially this becomes the process of learning critical thinking strategies – knowing the difference between logical arguments (good reasoning) and fallacies (bad reasoning). What the radical post-modernists have not noticed is that even in understanding the role of passion, or the non-rational elements in cognition, reasoning is inescapable. The true global skeptic is forced to admit that he or she has no understanding.
To understand, the conscious being has to make judgements, and inevitably a series of judgements that flow from one conclusion to another. In living our lives, most of the time we do not reflect on that cognitive process. Reflection, though, enables us to identify the knowledge we believe we have. Showing that we actually have knowledge is the project of epistemology. What this comes to, for historians, is an ongoing applied epistemology in our work. It is using the skeptic’s doubt as a tool to refine our conclusions, or to overthrow them if they come up short. It is true that what I have explained here is far too simple if we were to push deeply into the epistemic field. Nevertheless, this bootstrapped epistemic grounding is still sufficient for what we do.
Many thanks to psychologist Dr Kelly Dixon and philosopher Daniel Halverson for reviewing this piece. Any omissions or errors are solely the responsibility of the author.
Neville Buch