Swinburne’s Case for God – Part 4

Swinburne makes use of Bayes’ Theorem in presenting most of the a posteriori arguments for and against God in The Existence of God (EOG), and he makes significant use of it in summing up his case. Although his argument can be presented without using Bayes’ Theorem, I want to stick closely to Swinburne’s presentation in EOG, so I expect to take a look at his use of the theorem as part of presenting and explaining his case for God.


This theorem looks a bit daunting initially, but it is not as complicated as it first appears. If you are unfamiliar with Bayes’ Theorem, as I was last year, think of it like learning a new game. Reading the directions to a new game often makes the game seem very complicated, but once you have played it a few times, most of the rules seem quite natural, and you only need to refer to the directions for a few fine points every now and again. Similarly, if you are unfamiliar with Bayes’ Theorem, you can expect to be much less perplexed by it after walking through a few examples of how Swinburne uses it in his case for God.

The general mathematical form of Bayes’ Theorem (as opposed to the precise formula) is not that complicated:

X = (A x B)/C

By the symmetry of equality we can infer this equation:

(A x B)/C = X

If you took algebra in high school, this shouldn’t look too scary. In fact, with a bit more manipulation, we get a rather familiar looking equation:

A x (B/C) = X

This is the form of unit-conversion problems. For example, take the conversion of feet into yards:

12 feet x (1 yard/3 feet) = X yards

So, if you can handle unit-conversion math, then you should be able to handle Bayes’ Theorem, at least to the extent required to follow Swinburne’s case for God.
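The parallel can be made concrete with a short calculation. This is just an illustration of the X = A x (B/C) pattern using the feet-to-yards example above; the variable names are mine, not Swinburne’s.

```python
# The general form X = A x (B/C), instantiated with the
# feet-to-yards conversion above (illustrative values only).
A = 12           # 12 feet to convert
B = 1            # 1 yard ...
C = 3            # ... per 3 feet
X = A * (B / C)  # number of yards
print(X)         # 4.0
```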

Bayes’ Theorem looks a bit more complicated than the equations above, because the variables are replaced by expressions for conditional probabilities.

Logical probabilities are always given relative to a specific body of evidence or information. As information changes, so do probabilities. If you shuffle a standard deck of cards well, and then pick a card at random without looking at its face, the chance that you have selected the Ace of Hearts is one in 52. The probability that you have selected the Ace of Hearts = 1/52. But if you then turn the card over and clearly see that it is the Ace of Hearts, then the chance that you have selected the Ace of Hearts becomes one chance in one, and the probability now = 1. The additional information raised the probability from 1/52 to 1 (certainty).

The conditional probability in this case could be expressed like this:

P(h|e)

Read this as: “The probability of the hypothesis being true, given the specified evidence.”

h: I will draw the Ace of Hearts
e: I will draw a card at random from a standard deck of cards that has just been well-shuffled.

In this case the conditional probability above means: “The probability that I will draw the Ace of Hearts, given the evidence that I will draw a card at random from a standard deck of cards that has just been well-shuffled.” So, in this case, we can agree with the following statement:

P(h|e) = 1/52

Read this as: “The probability that I will draw the Ace of Hearts, given that I select a card at random from a standard deck of cards that has just been well-shuffled, is equal to 1/52.”
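To make the card example concrete, here is a short Python sketch (not from Swinburne’s text) that estimates this conditional probability by simulating many random draws from a well-shuffled deck:

```python
import random

# Simulate drawing one card at random from a well-shuffled 52-card
# deck, many times, and estimate the probability of drawing the
# Ace of Hearts (a sketch; the trial count is arbitrary).
suits = ["Hearts", "Diamonds", "Clubs", "Spades"]
ranks = ["Ace"] + [str(n) for n in range(2, 11)] + ["Jack", "Queen", "King"]
deck = [(rank, suit) for suit in suits for rank in ranks]

trials = 100_000
hits = sum(random.choice(deck) == ("Ace", "Hearts") for _ in range(trials))
print(hits / trials)  # close to 1/52, i.e. roughly 0.019
```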

One more complication, and then we can spell out Bayes’ Theorem. In confirmation theory, evidence is often divided into two categories: (1) specific evidence used to confirm or disconfirm a hypothesis, and (2) general background knowledge. The letter e represents the former, and k represents the latter. So, what we are usually interested in figuring out is this:

P(h|e & k)

Read this as: “The probability that the hypothesis is true, given both the specific evidence cited and our general background knowledge.”

Now if we use some specific conditional probability expressions to replace the variables in the first simple equation presented above, we can construct Bayes’ Theorem:

P(h|e & k) = [P(e|h & k) x P(h|k)] / P(e|k)

Bayes’ Theorem has the same general form as this:

X = (A x B)/C

By the symmetry of equality we can restate Bayes’ Theorem with the “answer” on the right hand side of the equation:

[P(e|h & k) x P(h|k)] / P(e|k) = P(h|e & k)
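As a preview of how the theorem gets applied, here is a small Python sketch. The function simply encodes the formula; the numbers in the example are invented for illustration and have nothing to do with Swinburne’s argument.

```python
# Bayes' Theorem: P(h | e & k) = [P(e | h & k) x P(h | k)] / P(e | k)
def bayes(likelihood, prior, p_evidence):
    """Return P(h | e & k) given P(e | h & k), P(h | k), and P(e | k)."""
    return (likelihood * prior) / p_evidence

# Illustrative numbers (not Swinburne's): a hypothesis h with a prior
# of 0.01, and evidence e that is 90% likely if h is true and 10%
# likely if h is false.  P(e | k) comes from total probability.
prior = 0.01
likelihood = 0.9
p_evidence = 0.9 * 0.01 + 0.1 * 0.99   # = 0.108
posterior = bayes(likelihood, prior, p_evidence)
print(round(posterior, 4))  # 0.0833
```

Note how the evidence raises the probability of the hypothesis from its prior of 0.01 to roughly 0.083: this is the sense in which evidence “confirms” a hypothesis in Bayesian confirmation theory.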

To be continued…