“But is it Art?!” Family resemblance concepts (Wittgenstein) explained simply (from my The Philosophy Gym)

9. But is it Art?

From my book The Philosophy Gym: 25 Short Adventures in Thinking.
I mean they’d gone and fucking installed the work without me even being here. That’s just not on. This is my bed. If someone else installs it, it’s just dirty linen. If I do it, it’s art. Tracey Emin (artist), Evening Standard, 12/9/00.
Today it seems almost anything can be classified as a work of art: Damien Hirst’s pickled shark or Tracey Emin’s unmade bed, for example. But what is art, exactly? What is it that Macbeth, a piece of tribal sculpture, The Nutcracker Suite, the ceiling of the Sistine Chapel and Emin’s bed all have in common? What is the common denominator that makes each one of these things art? This is an extremely difficult question to answer. This chapter explains one of the leading theories, taking in one of Wittgenstein’s (1889–1951) most important insights along the way.

What is a work of art?

The scene: an art gallery. Fox, an artist, is peering intently at a Rothko. O’Corky tries to engage him in conversation.

Secular Humanism: why it’s a strategic mistake to define as requiring naturalism

What does secular humanism (or, as we say in the UK, humanism) involve? In Humanism: A Very Short Introduction (OUP 2011) I suggest that most of those who sign up to secular humanism sign up to the following:

Erik Wielenberg’s argument re sceptical theism defended and developed – forthcoming in Religious Studies

(revised 9 April 2014)

Sceptical theism and a lying God – Wielenberg’s argument defended and developed


Stephen Law

Department of Philosophy, Heythrop College, University of London, Kensington Square, London W5 8HX UK



Abstract: Sceptical theists attempt to block the evidential argument from evil by arguing that a key premise of that argument – that gratuitous evil exists – cannot reasonably be maintained. They argue that, for all we know, our knowledge of reasons God may have to permit such evil is radically incomplete. Thus the fact that we cannot identify reasons for God to permit the evil we observe does not allow us reasonably to conclude that no such reasons exist. In response, Erik Wielenberg has pointed out what appears to be, for many sceptical theists, an unfortunate further consequence of their position. According to Wielenberg, if sceptical theism is correct, then, similarly, the fact that we cannot identify reasons why God would lie to us does not allow us reasonably to conclude no such reasons exist. But then, for all we know, God’s word constitutes not a divine revelation but a divine lie. This paper examines sceptical theist responses to Wielenberg’s argument to date (from Segal, and McBrayer and Swenson) and develops two new Wielenberg-style arguments for the same conclusion.


Playing The Mystery Card (incl. McGrath vs Dawkins) from my book Believing Bullshit



Suppose critics point out that not only do you have little in the way of argument to support your particular belief system, there also seems to be powerful evidence against it. If you want, nevertheless, to convince both yourself and others that your beliefs are not nearly as ridiculous as your critics suggest, what might you do?


Perhaps Play The Mystery Card. As we will see, this sort of strategy is particularly popular when it comes to defending beliefs in the supernatural – beliefs in ghosts, angels, psychic powers and gods, and so on. By far the most popular version of the strategy – the version on which I focus here – is to say, “Ah, but of course this is beyond the ability of science/reason to decide. We must acknowledge that science and reason have their limits. It is sheer arrogance to suppose they can explain everything.” Some things may indeed be beyond the ability of science and reason to decide. However, as we’ll see, those who say “But it’s beyond the ability of science/reason to decide” in order to try to immunize what they believe against rational criticism are often erecting little more than a smokescreen.


“But it’s beyond science/reason to decide”



The view that science can ultimately explain everything – can answer every legitimate question – is called scientism. Actually, even most scientists consider scientism a dubious doctrine. Many of them accept that there may be questions science cannot answer.


Philosophy and religion in Schools

Religion and philosophy in schools

(from Hand and Winstanley, Philosophy in Schools, Continuum 2008)


Is philosophy in schools a good idea? The extent to which early exposure to a little philosophical thinking is of educational benefit is, of course, largely an empirical question, and that sort of empirical study is not my area of expertise.

But of course there is also a philosophical dimension to this question, and conceptual clarification and the analysis of the logic of the arguments on either side certainly are a philosopher’s field. That is where I hope to make a small contribution here.

This chapter is in two parts. In the first, I look at two popular religious objections to the suggestion that all children ought to be encouraged to think independently and critically about moral and religious issues. In the second part, I explain a well-known philosophical distinction – that between reasons and causes – and give a couple of examples of how this conceptual distinction might help illuminate this debate.

PART ONE: Two popular religious objections

Philosophy in the classroom involves children thinking critically and independently about the big questions. These questions include questions about morality and the origin and purpose of human existence. Examples are: “Why is there anything at all?”, “What makes things right or wrong?” and “What happens to us when we die?” These questions are also addressed by religion. The subject matter of philosophy and religion significantly overlaps. And where there is overlap, there is the possibility of disputed territory. Proponents of philosophy in the classroom may find themselves coming into conflict with at least some of the faithful. While many religious people are enthusiastic about philosophy in the classroom, there are also many who are either totally opposed to it, or else want severely to restrict its scope. Some Christians, Muslims and Jews consider the introduction of philosophy an unwelcome intrusion into those parts of the curriculum that have traditionally been deemed theirs. They have developed a whole range of objections.

TV interview in Tehran

I was at a philosophy of religion conference in Tehran, Iran last week – invited as an atheist to speak to and engage with assembled philosophers, clerics, etc.
I appeared briefly on TV – unfortunately the bit of the interview they chose to broadcast was misleading as they cut the “but”…
Go here.

What is humanism?


What is Humanism?


“Humanism” is a word that has had and continues to have a number of meanings. The focus here is on the kind of atheistic world-view espoused by those who organize and campaign under that banner in the UK and abroad.


We should acknowledge that there remain other uses of the term. In one of the loosest senses of the expression, a “Humanist” is someone whose world-view gives special importance to human concerns, values and dignity. If that is what a Humanist is, then of course most of us qualify as Humanists, including many religious theists. But the fact remains that, around the world, those who organize under the label “Humanism” tend to sign up to a narrower, atheistic view.


What does Humanism, understood in this narrower way, involve? The boundaries of the concept remain somewhat vague and ambiguous. However, most of those who organize under the banner of Humanism would accept the following minimal seven-point characterization of their world-view.


Wielenberg’s Divine Lies, and McBrayer and Swenson’s response – my comments for feedback


Skeptical Theism and Divine Deception: The McBrayer/Swenson response to Wielenberg


1. Skeptical Theism


Evidential arguments from evil often[i] take something like the following form:


If God exists, gratuitous evil does not exist.

Gratuitous evil exists.

Therefore, God does not exist.


Gratuitous evil is evil for which there is no God-justifying reason. Why suppose gratuitous evil exists? Well, we observe great evils for which we can identify no God-justifying reason. Thus, it is suggested, it’s reasonable to believe gratuitous evil exists.


Pressing Your Buttons (from my book Believing Bullshit)



One way in which we can shape the beliefs of others is by rational persuasion. Suppose, for example, that I want someone to believe that Buckingham Palace is in London (which it is). I could provide them with a great deal of evidence to support that belief. I could also just take them to London so they can see with their own eyes that that’s where Buckingham Palace is located.

But what if these kinds of method aren’t available? Suppose I have little or no evidence to support the belief I nevertheless want people to accept. Suppose I can’t just show them that it’s true. How else might I get them to believe?

I might try to dupe them, of course. I could produce fraudulent evidence and bogus arguments. But what if I suspect this won’t be enough? What if I think my deceit is likely to be detected? Another option is to drop even the pretence of rational persuasion and to adopt what I call Pressing your Buttons.

Belief-shaping mechanisms

All sorts of causal mechanisms can be used to shape belief. For example, our beliefs are shaped by social and psychological mechanisms such as peer pressure and a desire to conform. Finding ourselves believing something of which our community disapproves is a deeply uncomfortable experience, an experience that may lead us unconsciously to tailor what we believe so that we remain in step with them. We’re far more susceptible to such social pressures than we like to believe (as several famous psychological studies have shown[i]).

Belief can also be shaped through the use of reward and punishment. A grandmother may influence the beliefs of her grandson by giving him a sweet whenever he expresses the kind of beliefs of which she approves, and ignoring or smacking him when he expresses the “wrong” sort of belief. Over time, this may change not just the kind of beliefs her grandson expresses, but also the kinds of belief he holds.

Perhaps beliefs might also be directly implanted in us. Some suppose God has implanted certain beliefs in at least some of us. Our evolutionary history may also produce certain beliefs, or at least certain predispositions to belief. For example, there’s growing evidence that a disposition towards religious belief is part of our evolutionary heritage, bestowed on us by natural selection. But even if neither God, nor evolution, has implanted beliefs in us, perhaps we’ll one day be able to implant beliefs ourselves using technology. Perhaps we’ll be able to strap a brain-state-altering helmet on to an unwitting victim while they sleep, dial in the required belief, press the red button and “Bing!”, our victim wakes up with the belief we’ve programmed them to hold. That would be a rather cruel trick. Some hypnotists claim a similar ability to, as it were, directly “inject” beliefs into people’s minds.

Obviously, these kinds of causal mechanism can operate on us without our realizing what’s going on. I might think I condemn racism because I have good grounds for supposing racism is morally wrong, but the truth is I have merely caved in to peer pressure and my desire not to be ostracised by my liberal family and friends. If a belief has been implanted in me by, say, natural selection, or by some brain-state-altering device then, again, I may not be aware that this is the reason why I believe. Suppose, for example, that some prankster uses the belief-inducing helmet described above to programme me to believe I have been abducted by aliens. I wake up one morning and find, as a result, that I now very strongly believe I was taken aboard a flying saucer during the night. I have no awareness of the real reason why I now hold that belief – of the mechanism that actually produced the belief in me. If asked how I know I was abducted, I will probably say “I Just Know!”

Isolation, control, uncertainty, repetition, emotion

I’m going to focus here on five important belief-shaping mechanisms: isolation, control, uncertainty, repetition and emotion.

(i) isolation. Isolation is a useful belief-shaping tool. An isolated individual is more vulnerable to various forms of psychological manipulation. If you want someone to believe something that runs contrary to what their friends and family believe, it’s a good idea to have them spend some time at a retreat or remote training camp where their attachment to other ideas can more easily be undermined. Cults often isolate their members in this way. The cult leader Jim Jones physically moved both himself and all his followers to the Guyanese jungle (where they all eventually committed suicide). Isolation is also recommended by some within more mainstream religions. In the UK, hermetically sealed-off religious schools are not uncommon. Students at the Tarbiyah Academy in Dewsbury, for example, are allegedly taught that


‘the enemies of Allah’ have schemed to poison the thinking and minds of [Muslim] youth and to plant the spirit of unsteadiness and moral depravity in their lives. Parents are told that they betray their children if they allow them to befriend non-Muslims.[ii]

A related mechanism is:


(ii) control. If you want people to accept your belief system, it’s unwise to expose them to alternative systems of belief. Gain control over the kind of ideas to which they have access and to which they are exposed. Censor beliefs and ideas that threaten to undermine your own. This kind of control is often justified on the grounds that people will otherwise be corrupted or confused. Totalitarian regimes will often remove “unhealthy” books from their libraries if the books contradict the regime. All sorts of media are restricted on the grounds that they will only “mislead” people. Schools under totalitarian regimes will sometimes justify preventing children from discovering or exploring other points of view on the grounds they will only succeed in “muddling” children. Take a leaf out of the manuals of such regimes and restrict your followers’ field of vision so that everything is interpreted through a single ideological lens – your own.

(iii) uncertainty. If you want people to abandon their former beliefs and embrace your own, or if you want to be sure they won’t reject your beliefs in favour of others, it helps to raise as much doubt and uncertainty as possible about those rival beliefs. Uncertainty is a potent source of stress, so the more you associate alternative beliefs with uncertainty, the better. Ideally, offer a simple set of concrete, easily formulated and remembered certainties designed to give meaning to and cover every aspect of life. By constantly harping on the vagaries, uncertainties and meaninglessness of life outside your belief system, the simple, concrete certainties you offer may begin to seem increasingly attractive to your audience.

(iv) repetition. Encourage repetition. Get people to recite what you want them to believe over and over again in a mantra-like way. Make the beliefs trip unthinkingly off their tongues. It doesn’t matter whether your subjects accept what they are saying, or even fully understand it, to begin with. There’s still a fair chance that belief will eventually take hold. Mindless repetition works especially well when applied in situations in which your subjects feel powerful pressure to conform. Lining pupils up in playgrounds for a daily, mantra-like recitation of your key tenets, for example, combines repetition with a situation in which any deviation by an individual will immediately result in a hundred pairs of eyes turned in their direction.

(v) emotion. Emotion can be harnessed to shape belief. Fear is particularly useful. In George Orwell’s novel Nineteen Eighty-Four, the regime seeks control not just over people’s behaviour, but, even more importantly, what they think and feel. When the hapless rebel Winston is finally captured, his ”educators” make it clear that what ultimately concerns them are his thoughts:


“And why do you imagine that we bring people to this place?”

“To make them confess.”

“No, that is not the reason. Try again.”

“To punish them.”

“No!” exclaimed O’Brien. His voice had changed extraordinarily, and his face had suddenly become both stern and animated. “No! Not merely to extract your confession, not to punish you. Shall I tell you why we have brought you here? To cure you! To make you sane! Will you understand, Winston, that no one whom we bring to this place ever leaves our hands uncured? We are not interested in those stupid crimes that you have committed. The Party is not interested in the overt act: the thought is all we care about.”[iii]

The terrifying contents of Room 101 eventually cause Winston to succumb. He ends up genuinely believing that if Big Brother says that two plus two equals five, then two plus two does equal five. Many real regimes have been prepared to employ similarly brutal methods to control what is going on in people’s minds. However, emotional manipulation can take much milder forms yet still be effective. For example, you might harness the emotional power of iconic music and imagery. Ensure people are regularly confronted by portraits of Our Leader accompanied by smiling children and sunbeams emanating from his head (those Baghdad murals of Saddam Hussein spring to mind). Ensure your opponents and critics are always portrayed accompanied by images of catastrophe and suffering, or even Hieronymus-Bosch-like visions of hell. Make people emotionally dependent on your own belief system. Ensure that what self-esteem and sense of meaning, purpose and belonging they have is derived as far as possible from their belonging to your system of belief. Make sure they recognise that abandoning that belief system will involve the loss of things about which they care deeply.

It goes without saying that these five mechanisms of thought-control are popular with various totalitarian regimes. They are also a staple of many extreme religious cults.

Applied determinedly and systematically, these mechanisms can be highly effective in shaping belief and suppressing “unacceptable” lines of thought. They are particularly potent when applied to children and young adults, whose critical defences are weak, and who have a sponge-like tendency to accept whatever they are told.

Note that traditional mainstream religious education has sometimes also involved heavy reliance on many, sometimes all, of these five mechanisms. I was struck by a story a colleague once told me: as a teenage pupil at a rather strict Catholic school in the 1960s, she once put her hand up in class to ask why contraception was wrong. She was immediately sent to the headmaster, who asked her why she was obsessed with sex. Interestingly, my colleague added that, even before she asked the question, she knew she shouldn’t. While never explicitly saying so, her school and wider Catholic community had managed to convey to her that asking such a question was unacceptable. Her role was not to think and question, but to passively accept. My colleague added that, even today, nearly half a century later, despite the fact that she no longer has any religious conviction, she finds herself feeling guilty if she dares to question a Catholic belief. So effective was her religious upbringing in straitjacketing her thinking that she still feels instinctively that to do so is to commit a thought-crime.

Of course, religious education doesn’t have to be like this, and often it isn’t. An open, questioning attitude can be encouraged rather than suppressed. Still, it’s clear that some mainstream religions have historically been very reliant upon such techniques so far as the transmission of the faith from one generation to the next is concerned. In some places, they still are.



Applied in a consistent and systematic fashion, these various techniques add up to what many would call “brainwashing”. Kathleen Taylor, a research scientist in physiology at the University of Oxford, upon whose work I am partly drawing here, has published a book on brainwashing. In an associated newspaper article, Taylor writes that:


One striking fact about brainwashing is its consistency. Whether the context is a prisoner of war camp, a cult’s headquarters or a radical mosque, five core techniques keep cropping up: isolation, control, uncertainty, repetition and emotional manipulation.[iv]

Taylor adds in her book that within the discipline of psychology, “brainwashing” is an increasingly superfluous word. It can be a misleading term, associated as it is with Manchurian-Candidate-type stories of seemingly ordinary members of the public transformed into presidential assassins on hearing a trigger phrase. As Taylor says, that kind of brainwashing is a myth. Case studies suggest there is

no “magic” process called “brainwashing”, though many (including the U.S. government) have spent time and money looking for such a process. Rather the studies suggest that brainwashing… is best regarded as a collective noun for various, increasingly well-understood techniques of non-consensual mind-change.

The unwitting and well-intentioned brainwasher

Often, those who use such techniques are despicable people with the evil aim of enslaving minds. Edward Hunter, the CIA operative who coined the phrase back in 1950, characterized brainwashing in emotive terms:

The intent is to change a mind radically so that its owner becomes a living puppet – a human robot – without the atrocity being visible from the outside. The aim is to create a mechanism in flesh and blood, with new beliefs and new thought processes inserted into a captive body. What that amounts to is the search for a slave race that, unlike the slaves of olden times, can be trusted never to revolt, always to be amenable to orders, like an insect to its instincts.

Perhaps this very often was the intent so far as the regimes of which Hunter had experience were concerned. However, surely the intent to produce mental slaves is not required for brainwashing. Sometimes those who apply these techniques genuinely believe themselves to be doing good. Their intention is not to enslave but to free their victims from evil and illusion. Yet, despite the absence of any evil intent, heavy reliance on such techniques still adds up to brainwashing. Brainwashers can be good people with little or no awareness that what they are engaged in is brainwashing.

The consenting victim

In the second Taylor quotation above, Taylor says that brainwashing involves various techniques of non-consensual mind-change. That cannot be quite right. Of course, prisoners-of-war don’t usually consent to being brainwashed. But people can in principle consent. In one well-known thriller, the trained assassin at the heart of the story turns out to have agreed to be brainwashed. The fact that he consented to have such techniques applied to him doesn’t entail that he wasn’t brainwashed.

People sometimes willingly submit themselves to brainwashing. They sign up to be brainwashed at a cult’s training camp, say. Admittedly, they will not usually describe what they have signed up to as “brainwashing”. As they see it, even while they are fully aware that the above techniques will be applied to them, they nevertheless suppose they are merely being “educated” – being put through a process that will open up their minds and allow them to see the truth.

Also notice that people are sometimes forcibly confronted with the truth. I might be forced to look at compelling evidence that someone I love has done some terrible deed, evidence that does convince me that they’re guilty. So not only is not all brainwashing non-consensual, not all non-consensual mind-change is brainwashing.

Reason vs. brainwashing

So what is brainwashing, then? What marks it out from other belief-shaping mechanisms? At this point, some readers might be wondering whether what I am calling “brainwashing” is really any different to any other educational method. Isn’t the application of reason to persuade really just another form of thought-control? Just another way of wielding power over the minds of others? So why shouldn’t we favour brainwashing over reason? Particularly if no one is actually being coerced, threatened or harmed?

In fact, there’s at least one very obvious and important difference between the use of reason and the use of these kinds of belief-shaping techniques. Reason is truth-sensitive. It favours true beliefs over false beliefs. Try making a rational case for believing that New Jersey is populated with ant-people or that the Earth’s core is made of yoghurt. Because these beliefs are false, you’re not going to find it easy.

Reason functions, in effect, as a filter on false beliefs. It’s not one hundred percent reliable of course – false beliefs can still get through. But it does tend to weed out false beliefs. There are innumerable beliefs out there that might end up lodging in your head, from the belief that Paris is the capital of France to the belief that the Earth is ruled by alien lizard-people. Apply your filter of reason, and only those with a fair chance of being true will get through. Turn your filter off, and your head will soon fill up with nonsense.

And yet many belief systems do demand that we turn our filters off, at least when it comes to their own particular beliefs. In fact, those who turn their filters off – those whose minds have become entirely passive receptacles of the faith – are often held up by such belief-systems as a shining example to others. Mindless, uncritical acceptance (or, as they would see it, a simple, trusting faith in the pronouncements of Big Brother) is paraded as a badge of honour.

Reason is a double-edged sword. It does not favour the beliefs of the “educator” over those of the “pupil”. It favours those beliefs that are true. This means that if you try to use reason to try to bring others round to your way of thinking, you run the risk that they may be able to demonstrate that it is actually you that’s mistaken. That’s a risk that some “educators” aren’t prepared to take.

The contrast between the use of reason to persuade, and the use of the kind of belief-shaping mechanisms outlined above, is obvious. You can use emotional manipulation, peer pressure, censorship and so on to induce beliefs that happen to be true. But they can be just as effectively used to induce the belief that Big Brother loves you, that there are fairies at the bottom of the garden and that the Earth’s core is made of yoghurt. Such techniques do indeed favour the beliefs of the “educator” over those of the “pupil”. Which is precisely why those “educators” who suspect they may end up losing the argument tend to favour them.

I call the application of such non-truth-sensitive belief-inducing techniques – techniques that don’t require even the pretence of rational persuasion – Pressing Your Buttons. Brainwashing involves the systematic and dedicated application of such button-pressing techniques.

Of course, to some extent, we can’t avoid pressing the buttons of others. Nor can we entirely avoid having our own buttons pressed. The fact is, we all have our beliefs shaped by such non-truth-sensitive mechanisms. No doubt we flatter ourselves about just how “rational” we really are. And, like it or not, you will inevitably influence the beliefs of others by non-truth-sensitive means.

For example, my own children’s beliefs are undoubtedly shaped by the kind of peer group to which I introduce them, by their desire to please (or perhaps annoy) me, by the range of different beliefs to which I have given them access at home, and so on. But of course that’s not yet to say I’m guilty of brainwashing my children. The extent to which we shape the beliefs of others by pressing their buttons, rather than relying on rational means, is a matter of degree. There’s a sliding scale of reliance on non-truth-sensitive mechanisms, with brainwashing located at the far end of the scale. There’s clearly a world of difference between, on the one hand, the parent who tries to give their child access to a wide range of religious and political points of view, encourages their child to think, question, and value reason, and allows their child to befriend children with different beliefs and, on the other hand, the parent who deliberately isolates their child, ensures their child has access only to ideas of which the parent approves, demands formal recitation of certain beliefs, allows their child to befriend only children who share the same beliefs, and so on.

The dehumanizing effect of button-pressing

So one key difference between relying on reason to influence the beliefs of others and relying on button pressing is that only the former is sensitive to truth. Button pressing can as easily be used to induce false or even downright ridiculous beliefs as it can true beliefs.

There is also a second important difference worth noting. As the philosopher Kant noted, when you rely on reason to try to influence the beliefs of others, you respect their freedom to make (or fail to make) a rational decision. When you resort to pressing their buttons on the other hand, you are, in effect, stripping them of that freedom. Your subject might think they’ve made a free and rational decision, but the truth is they’re your puppet – you’re pulling their strings. By resorting to button-pressing  – peer pressure, emotional manipulation, repetition, and so on – you are, in effect, treating them as just one more bit of the causally-manipulatable natural order – as mere things. The button-pressing approach is, in essence, a dehumanizing approach.



Clearly, a cult that employs full-blown brainwashing at a training camp is a cause for concern. If the beliefs it induces are pernicious – if, for example, followers are being lured into terrorism – then obviously we should be alarmed. However, even if the beliefs induced happen to be benign, there’s still cause for concern.

One reason we should be concerned is the potential hazard such mindless and uncritical followers pose. They may as well have cotton wool in their ears so far as the ideas and arguments of non-believers are concerned. They are immune to reason. Trapped inside an Intellectual Black Hole, they are now largely at the mercy of those who control the ideas at its core. The dangers are obvious.

Such extreme examples of brainwashing are comparatively rare. Still, even if not engaged in full-blown brainwashing, if the promoters of a belief system come increasingly to rely on button-pressing to shape the beliefs of others, that too is a cause for concern. The more we rely on button-pressing, the less sensitive to reason and truth our beliefs become.

[i] Solomon Asch’s conformity experiments revealed people are prone to denying the evidence of their own eyes if it brings them into disagreement with others (though admittedly this is not quite the same thing as changing what one believes in order to conform). See Asch, S. E. “Effects Of Group Pressure Upon The Modification And Distortion Of Judgment” in H. Guetzkow (ed.) Groups, Leadership And Men (Pittsburgh, PA: Carnegie Press, 1951).

[ii] The Times, 20th July 2005, p. 25.

[iii] George Orwell, Nineteen Eighty-Four (Harmondsworth: Penguin, 1954), p. 265.

[iv] Kathleen Taylor, “Thought Crime” The Guardian, 8th October 2005, p. 23.

Response to Randal Rauser’s response to my response to his shoddy review…

Randal Rauser has responded to my suggestion that his review of my book Believing Bullshit was pretty shoddy (though not as shoddy as Martin Cohen’s in the THES). Go here.
Understandable, I suppose. By combining selective quotation, misdirection and quite a lot of bluster, Rauser is quite successful at generating the impression I have been unfair to him.
A preliminary point re not responding to Rauser’s entire review. After disclaimers about what follows being nothing personal, Rauser moans that I only respond to 10% of his review. Sure I did. Because it is, to use Rauser’s own description of it, “bloated”.
I didn’t cherry-pick which bit to respond to. I just started at the beginning of the review and kept going till I felt I had expended enough effort in terms of hours and word count. Given the way Rauser packs in the muddles, misrepresentations, bad arguments, etc. it took me 2,500 words to unpack what was wrong with just the first 10% of Rauser’s review. I stopped at that point. I thought that pretty reasonable and am sorry if Rauser thinks otherwise.
In addition, here are another 2,750+ words dealing with Rauser’s defence of just that first 10% of his review. So that’s 5,250 words I have now written (and we know what’s coming next, of course). Looks like a full response to Rauser’s review will probably require I write at least 50k words. More than my entire doctoral thesis.
So to business. It seems to me Rauser makes three main points re my response, which I have attempted to gloss below (numbered, in bold).
1. Rauser claims he didn’t misrepresent me, and did deal with my main argument, re Wykstra-type appeals to mystery (if not in his actual review). Continue reading “Response to Randal Rauser’s response to my response to his shoddy review…”