Then begins a journey in my head – reflections on religious belief and delusion

This article originally appeared in The Skeptic, Volume 22, Issue 3, from 2012.

We might wonder, if there is a god, why bother? With the ultimate father figure ‘up there’, someone so much bigger than us, so much more in control, who cares for us and knows what’s best, why not lie back and enjoy the moment?

In an ancient book, a character called Jesus recommends we take no thought for the morrow. Is that wise, let alone possible? A healthy human mind looks forward and back, as well as attending to the present. We may be anxious or excited about the future, regretful about or content with our past. Whatever our feelings, without them and without this sense of time stretching out before and behind us, our lives would collapse into mere sentience.

We are conscious and self-conscious animals, the one species lucky to have evolved frontal lobes large enough to liberate us from the present and enrich our personalities. We can make plans and make our own meanings, and carry within a sense of permanence that anchors us against an ever-changing world.

It would be a shame to let all that go to waste by not being bothered. There are plenty of slings and arrows and tamping bars of outrageous fortune that could make life harder, even without receiving a hole in the head like the unfortunate Phineas Gage. An injury that should have killed him remarkably left much of his brain function intact. And yet, although still able to walk and talk, his personality changed. There was ‘a new spirit animating’ his body, according to Antonio Damasio. The idea of something beyond the material realm has great appeal, and not just to the religious. In Paul Bloom’s memorable phrase, we are all ‘natural-born dualists’ who find it very easy to think of our bodies as material shells guided by an immaterial spirit, thought to be an eternal soul in some religions.

However, if this spirit cannot always survive bodily injury, what hope is there that it can survive bodily death? The case of Phineas Gage challenges our intuitions, and modern science seems to confirm our materiality.

As we live longer and have less contact with high explosives, it is more likely to be a disease of the mind rather than an iron bar that will change who we are. Unlike trauma, a disease like Alzheimer’s acts slowly, barely noticeable at first, its effects ignored or rationalized away. Its progress is regress, an unravelling of personality. A healthy body, so long essential for a happy life, now becomes the stage on which is played out one final tragedy. The actor who once strode on and spoke such lines to make the world shake now shuffles to the wings, unrecognizable even to his fans, forgetful of his lines and unaware of his next move. Horizons close in. There is no curtain call, no final bow. For the audience left behind, it’s too late for applause, too soon to mourn.

At the other end of life, we are also dependent, but a child’s horizons are expanding like the early universe. We begin as self-centred creatures, although there’s not much self to be centred around. We’re explorers in a vast world full of objects and other people, who are their own centres of consciousness and sources of endless fascination. As we grow we naturally look beyond ourselves for something bigger than ourselves, but why go to the extremes that religion so often demands? There are plenty of bigger purposes without inventing gods to worship. Indeed, if all you want to do is please God and save your soul, then everyone else may just be means to those selfish ends. You do good not because of the intrinsic value of doing good but because it achieves a goal you desire.

That we have an eternal soul to save is, I believe, one of the enduring delusions of religion. Its origin and fate – the stain of original sin and the judgment waiting at the end of time – have distracted and terrified far too many for far too long. Humanism celebrates our rise and has no time for the miserable doctrine of the fall of man. This is not a secular happy-clappy optimism, however. We were never perfect and utopia is not just round the corner. What is hopeful about humanism, compared with the individualism of religion, is its cooperative nature, founded on a fundamental respect for all humans.

Humanism acknowledges human weakness without rubbing our faces in it. The physicist Robert Park describes science as a process that “transcends the human failings of individual scientists”, and while individual minds discover great truths about the natural laws that govern the universe, these are never accepted on authority and are always subject to reasonable scrutiny by a wider community.

Compare how Christians have made use of, for example, the (unhistorical) foundational story of Eve and her blunder. For many centuries, this flimsiest of tales was used to justify an astonishing prejudice against half of the human race. According to St. Clement of Alexandria,

[e]very woman should be filled with shame by the thought that she is a woman.

He was not alone in reaching such unwarranted moral judgments. A humanist thanks goodness there isn’t a god, no magic man in the sky, no heaven waiting no matter that we trash this earth, and no hell either, no second coming to judge the quick and the dead. It’s all down to us. At our best, being bothered is what we’re good at and, given the problems we face, we’d better be bothered.

Other minds and make-believe

Gilbert Ryle pointed out that “one person cannot in principle visit another person’s mind as he can visit signal-boxes”. This may be obvious, but it is still important, because other minds feature so prominently in our lives. The idea that there are other perspectives on the same objective reality is itself a remarkable cognitive step. A child becomes a mind reader, not the hokum end-of-the-pier kind but an astute interpreter of things like facial expressions and the sounds emerging from a parent’s mouth. In this way most human babies soon become social beings, acquiring language and all the cognitive tools, including a moral sense, that both develop the sense of self and integrate the individual into a larger group.

The imagination might seem to be little more than either child’s play, literally, or the preserve of ad men and arts professionals. It is, of course, a far more ordinary presence in our lives, although still an extraordinary part of our biology, as its absence or impairment shows. Autistic children, for example, have difficulties in communication and social interaction, in recognizing other points of view, and may in extreme cases regard other people as nothing more than objects. Living in a world depopulated of minds is an often terrifying and distressing experience, and psychologists have coined the term ‘mindblindness’ to describe this serious condition.

In contrast is the mind hypersensitive to signs of agency, which occasionally regards inanimate objects as having mental states. Steven Mithen argues that such anthropomorphic thinking was an important acquisition in the prehistory of the mind and a sign of cognitive fluidity. One of the consequences of this evolutionary move was that the world became overpopulated with minds and spirits, some of which eventually graduated into gods with a more independent existence. And then, in some cultures, the pantheon was whittled back down to a single god, a supreme creator, the architect of the universe.

The many arguments from design, for example, all exploit our mind-reading fluency. In the same way we infer a neighbour’s intentions when we see a shed being built, we think we can infer the cosmic intentions of the builder of the universe. One goal of many religious people is, after all, to read the mind of God, to not be ‘mindblind’ with respect to God.

Given the pronouncements of certain Christians in America regarding the inadvisability of, say, stem cell research and gay bishops, it must seem to a non-sceptic that they have been remarkably successful at this cognitive task. Be that as it may, now consider four scenes illustrating various minds in action:

  1. An actor takes the part of Aeneas and is moved to tears by his telling of Priam’s death, of how Hecuba watches as Pyrrhus butchers Priam, “mincing with his sword her husband’s limbs”. Hamlet looks on, amazed, and says of the actor: “What’s Hecuba to him, or he to Hecuba, that he should weep for her?”
  2. A doctor asks his patient about a car accident, who responds by launching into a long-winded story about how he opened the fridge and discovered they were running out of milk, which would annoy his wife because she always had to have milk on her cereal for breakfast, and so on until he gets to the part the doctor is interested in, the accident on the way to the supermarket.
  3. A priest stands at an altar and raises a wafer, intoning ancient Latin phrases including the words “hoc est corpus” (which some believe to be the origin of hocus pocus), and supposedly effects a supernatural transformation of the wafer’s rather ordinary ingredients that even Heston Blumenthal could not manage in his culinary laboratory.
  4. A little girl watches as two dolls called Sally and Ann act out a little scene. Sally has a ball. She places the ball under a cushion and then leaves the room. In her absence, Ann takes the ball out from under the cushion, and hides it in the toy box. Later, Sally returns. The little girl is asked, “Where does Sally think her ball is?”

Between Hamlet and this last little drama is a huge theatrical gulf, and between the doctor’s office and the church there is also a world of difference. In each, however, a human mind either succeeds or fails to take account of another mind, itself either real or imagined. If the little girl is younger than or around four years of age, she will probably think that Sally shares her own knowledge of the world and conclude that Sally believes the ball is in the toy box. Above this age she is beginning to learn about other points of view, and knows that another individual, even a doll with an imaginary mind, can have a belief about the world that is different from hers, a belief that may well be untrue. The child’s ability to attribute mental states to an inanimate doll should astonish us as much as the actor’s performance astonishes Hamlet.

In the grown-up play, the actor is actually a character, and so we have a real actor playing the part of a fictional actor playing the part of an ancient warrior (possibly mythical) who is himself telling a story about the death of a king (to him real, to us again possibly mythical), a story which includes the queen’s distraught reaction to the slaughter of her husband, an emotion rendered apparently authentically on stage by the actor’s tears. Shakespeare intended that his audience should realize that an actor is pretending to play Hamlet, who wants the traveling actor to pretend he is playing the part of Aeneas, who in the story wants Dido to imagine the grief of Hecuba. This appears to be seventh-order intentionality, a very demanding cognitive load.

While Shakespeare’s appetite for operating at these high orders was exceptional, Daniel Dennett gives a humble playground game as an example of fifth-order intentionality: “You be the sheriff, and ask me which way the robbers went!”

In contrast, the patient is having less success reading the mind of his doctor, since he recounts far too much irrelevant detail. Many of us will have been on the receiving end of similar splurges, and perhaps even caught ourselves delivering a little more information than strictly necessary.

The priest, like Hamlet, is engaged in a performance. There is a text to be followed, a stage and stage directions, a space for the audience, costumes to be worn and an impressive set. There are also similarities in the minds of those involved, in the levels of intentionality needed to follow the proceedings.

As Robin Dunbar argues, religion and storytelling seem to be the only human activities that require such advanced cognitive capacities. What has make-believe to do with scepticism or credulity? At first glance, if you think pretend games are childish and children rather gullible, then indulging in make-believe can only encourage credulity. On the other hand, the false-belief task reveals nascent cognitive powers that could mature into scepticism. If others have false beliefs, maybe we have them too? And if there are false beliefs in the world, we’d better tread carefully. For sceptics and scientists, this means relying on reason and evidence, and only on those authorities who have themselves relied on reason and evidence and are open to scrutiny. For the religious, authority trumps everything; secrecy is a strategy, and reason and evidence are reduced to cheerleading dogma.

When Othello says over and over that Iago is an honest man, we are willing him for one moment to doubt the truth of Iago’s tales. When a believer listens to a priest describe exactly what God wants us to do, we are witnessing a similar credulity, only more entrenched. While Othello’s faith in Iago could be disabused by a simple handkerchief, the believer prides herself on the strength of her faith to withstand whatever counter-evidence comes her way. Being able to imagine objects that do not actually exist, to think about people who are not actually there, to make accurate inferences about a mind hidden in another body, to see the world from another’s point of view, to think reflexively about beliefs and desires – these are all cognitively demanding but entirely commonplace mental activities for humans; it’s no surprise that religion piggybacks on these cognitive abilities.

The false-belief task requires that we hold in our mind two conflicting pictures of the world: the world as it really is and the world as seen by someone else. And what is reality? Setting aside abstruse metaphysical speculation, objective reality is simply that which doesn’t depend on anyone’s point of view: it isn’t a personal perspective but the “view from nowhere”.

Less abstract is to think of it as a god’s-eye view; after all, we can imagine people who aren’t there, why not a god who isn’t there? However, as Gilbert Ryle pointed out, a “person picturing Helvellyn is not really seeing Helvellyn”, and one reason why religion is replete with delusion is this forgetfulness of the power of the imagination.

Despite the cognitive similarities between religion and story-telling, the crucial difference between Shakespeare and scripture is that in Shakespeare you will find a character like Fabian who says: “If this were played upon a stage now, I could condemn it as an improbable fiction.” Religious beliefs often come with adjectives like ‘genuine’ and ‘sincere’ attached. It’s not enough for a religious belief to be true: it must also be deeply held. Why is this? If beliefs aim at objective reality, why does their ‘depth’ matter? Because it both distracts the believer from enquiring too closely into their truth and focuses on the social dimension, in which what you say you believe matters. As Dennett notes, what is commonly referred to as religious belief might less misleadingly be called religious professing.

On the whole, we are impressed by a person’s strong avowal of belief, unless that person happens to be Tony Blair at the Chilcot Inquiry or a cardinal defending the Pope’s track record on child abuse, where we have good reason to be sceptical of what is being said. If someone protests a little too strongly that their beliefs are genuine and sincere, we should wonder whether the truth content of their beliefs is proportionately a little too thin on the ground.

Pain and suffering: Is God bothered?

There is a price to pay for our ability to conjure up other worlds, and it is heavier for some than for others. Just as the concept of make-believe is of a higher order than that of belief, so too is the subjective feeling of suffering of a higher order than sensory experience alone. Pain is momentary, felt by a single sentient creature. Its evolutionary value is to deter living things from spoiling their chances of passing on their genes. Even simple creatures recoil from threats. Suffering can describe intense pain, but it usually also involves the imagination. A broken bone is painful, but a footballer whose career has ended will also wonder what might have been. More sympathy is due to a mother who loses a child but keeps him alive in memory for a lifetime. She too will wonder what might have been, although her suffering is neither caused nor limited by direct physical injury.

We can discriminate between good and bad reasons for inflicting pain: a vaccination jab versus torturing someone for pleasure. If there is a Christian god, then at least one theodicy is true, that is, there is a ‘good’ reason why this god did not prevent, for example, the Haitian earthquake. The problem for Christians, quite apart from convincing the rest of us both that their god exists and that the theodicy is true, is that from our perspective God’s ‘good’ reason must be a ‘bad’ reason precisely because of all the pain and suffering that resulted from that earthquake. To think otherwise is to engage in Orwellian double-think.

T. H. Huxley acknowledged that many people seek comfort in religion, but what about its many discomforts, such as the knowledge of a benevolent god letting people suffer? This is hardly consoling. Much suffering has no explanation other than chance. You may have counted yourself lucky to catch the plane given the traffic jam on the way to the airport, and then wondered about that same luck as the plane fell out of the sky.

Darwin had a deep appreciation of the beauty of life, but he also recognized the clumsy, wasteful and horridly cruel works of nature, the industrial quantities of death needed to fuel natural selection’s algorithmic grind. Religious belief has as much room for chance as it has for doubt. Everything happens for a purpose, and some Christians regard suffering as the road to salvation.

More troubling is the temptation to purposely inflict suffering if it’s good for the soul. While the religious must ask why their god allows suffering, humanists square up to randomness and get on with the job of minimizing suffering as best they can. There is often no reason, but that is no reason not to act.

Our evolutionary journey

Homo sapiens sapiens has come a long way. We are the animals that not only know things about the world but also know that we know them. This rich layering of representations, this reflexivity, this cognitive fluidity was made possible by the particular evolutionary path taken by our species. Compared with our primate cousins, we have bigger brains, but this is not like replacing the computer on your desk with a more powerful model.

Our brains don’t just contain more of the same grey matter. As well as all the thalamic gubbins of mammalian brains we have the fanciest frontal lobes available. What difference has this made? Early humans spent a million years failing to improve on the hand axe. Then, this thin cortical layer got big enough to trigger the big bang of human culture. So began our love affair with innovation of all kinds. So began all those imaginative journeys in our head. No longer were we compelled to live in the moment, to make do with the world as it was: we could remake it, literally. The tool making species got to work.

The prefrontal cortex is the bit of the neo-cortex that seems to be hooked up to the rest of the brain and to contain a map of the whole cortex. As well as being linked to purposeful behaviour and the so-called executive functions, Elkhonon Goldberg suggests that this is the source of our ‘inner perception’. If our perception of the outside world sometimes lets us down, it should come as no surprise that perception of perception occasionally generates delusions. With our frontal lobes teeming with rich sensory images of other people and with memories and desires and beliefs, it’s no wonder that in those early days of gloaming consciousness we now and then mistook our own thought processes for spirits outside ourselves. The wonder is, of course, why we continue to do so.

Our minds entertain all kinds of weird and wonderful ideas about the world. The wackiest are kept in check since any inability to distinguish fact from fancy would have faced strong selective pressures. Even those parts of the brain hooked up to the outside world, however, do not simply map objective reality. The visual cortex provides an interpretation we call visual perception.

Beau Lotto argues that optical illusions are not failures of the visual sense, because we don’t see the world as it is in any case. We always see what has proved useful in the past, over evolutionary history. Similarly, perhaps delusions – false beliefs about the way the world is – are not always aberrations of the mind but part of its robust normal working? If so, this may be linked to why religion has been useful in the past and why we can’t just switch it off now that we know better. As Bruce Hood argues, our supersense is too strong.

Implications for ethics

If not from religion, where do moral values come from? To many, not believing in god means you believe in anything. Atheists eat babies for breakfast. I can’t guarantee that Fox News ran this story, but sometimes it feels as though humanists are the wickedest people on the planet. Those who advocate a ‘value’ agenda like to put scientists in their place by reminding them that we can’t get moral values from facts.

The more sophisticated might even quote one of the ‘gods’ of atheism: David Hume, who said that we can’t get an ‘ought’ from an ‘is’. What he actually said, however, was that ‘ought’ cannot be derived exclusively from ‘is’. We need an ethical premise as well as and not instead of a factual premise. No one can reach a moral conclusion without reference to the real world and we’re not likely to reach true conclusions if we rely on faulty data. You might have the finest moral sentiments known to man but mix them with delusions about how the world works and the result won’t be a pretty sight.

False beliefs can arise in the simplest of situations, and it’s not surprising that true beliefs about complex situations are hard to come by. The difficulty of doing science bears this out. Religion understood as storytelling is less constrained by the facts, and its taste for metaphor only obscures exactly what those facts are.

The journey ends

As early humans emerged into consciousness and acquired a unique existential intuition, questions were bound to follow. Where did we come from and where were we going? Who were we? All too easy to extrapolate from our own lives to the ends of eternity and put god in charge. To tell creation stories and tales about the end of things. Our brains, for good evolutionary reasons, hate uncertainty.

Over the past few thousand years, however, our appetite for certainty has been exploited by faith, a popular route to religious belief that bypasses evidence. Forgivable, perhaps, when facts were hard to come by; inexcusable when faith denies the facts, as creationism denies evolution. We sideline scepticism and a scientific worldview at our peril in our search for the good life.

One day each one of us will complete our physical journey on this earth. Most of us will live on, for a while, in the memories of those who have known us. Some will live on in their work, but no one will continue the journey as an independent and embodied subject. There is no evidence for an afterlife, which is one more reason to make the most of this life.

The sad fact is that some of us, as a result of brain disorders like Alzheimer’s, will complete the journey in our head prematurely, before our body dies. We’re not born with all our mental powers and we may lose some of them before we die, but, despite their tendency to generate all kinds of delusions, we would not be without them.

The Skeptic is made possible thanks to support from our readers. If you enjoyed this article, please consider taking out a voluntary monthly subscription on Patreon.
