Intelligent Design - Introduction

Last updated Fri Sep 20 16:44:15 EDT 2002

Administrivia

This course will last 10 weeks. Your instructor is Stuart D. Gathman, who can be emailed at stuart@gathman.org. In addition to the book Intelligent Design, by William Dembski, Stuart will be providing supplementary material on the web at http://gathman.org/class/design .

Copies of web material will be available at each class.

Because that which may be known of God is manifest in them; for God hath shewed it unto them. Romans 1:19 KJV
Information and entropy are two sides of the same coin. Dembski sticks to the concept of "specified complexity", which I and others call "information", because it is in many ways more intuitive than entropy. Entropy is the absence of information.

Complexity

A Boggle game was used at the beginning of class to illustrate intelligent design. The 16 Boggle cubes displayed the letters GODFIRSTLOVEDYOU in that order. Is it possible that this combination came up from a random throw of the cubes? Or are you pretty certain that someone arranged them?

The Boggle game has about 10^24 combinations. (6^16 × 16! permutations. There are letters duplicated 2 or 3 times in each permutation, so divide by 2!⋅3! or more.) For a continuous system like ideal gas molecules, the collection of all possible states is called "phase space". Information theory measures complexity in bits. If a system has N states, the complexity in bits is log2(N). For the Boggle game, 10^24 ≅ 2^80, so 80 bits, or 10 bytes, suffice to remember a particular state. (A byte is 8 bits.)
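
A quick sanity check of these figures, as a Python sketch (the 2!⋅3! divisor simply follows the rough duplicate-letter correction above, so the result is approximate):

    from math import factorial, log2

    faces = 6 ** 16                           # 6 faces on each of 16 cubes
    orderings = factorial(16)                 # ways to arrange the 16 cubes in the grid
    duplicates = factorial(2) * factorial(3)  # rough correction for repeated letters

    combinations = faces * orderings // duplicates
    print(f"combinations ~ {combinations:.2e}")             # about 5e24, i.e. ~10^24
    print(f"complexity   ~ {log2(combinations):.0f} bits")  # about 82 bits, roughly 10 bytes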

Entropy

How many of these combinations form intelligible sentences? If we restrict ourselves to row ordering, we can take the number of intelligible 16-letter sentences and multiply by 4 for rotations. To estimate the number of intelligible 16-letter sentences, consider that English text is generally compressible by about 50%. 16 letters × 5 bits per letter = 80 bits ≅ 40 bits compressed. Divide by the total number of combinations to get a measure of entropy. For a continuous system, this measure would be the fraction of phase space. In the case of Boggle, 2^40 × 4 rotations ÷ 2^80 yields an entropy for intelligible Boggle throws of ½^38.
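
The same arithmetic as a small Python sketch, assuming the rough figures above (5 bits per letter, 50% compressibility, 4 rotations):

    from math import log2

    total_bits = 16 * 5                    # 16 letters at 5 bits per letter = 80 bits
    intelligible = 2 ** (total_bits // 2)  # ~50% compressible, so ~2^40 intelligible sentences
    rotations = 4                          # the same grid read in 4 orientations

    fraction = intelligible * rotations / 2 ** total_bits
    print(f"entropy of intelligible throws ~ 2^{log2(fraction):.0f}")  # prints 2^-38, i.e. ½^38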

It is important to notice that the criterion of "intelligible" for Boggle combinations is "a priori" - defined beforehand. Taking whatever was thrown as a sign of intelligence simply because repeating it is improbable is a mistake, akin to an archer painting a target around wherever his arrow happens to land. This is the mistake commonly made by proponents of The Bible Code. As Boggle players know, finding words in text arranged as a grid is not all that difficult. For a Clue By Four to smack Bible Code fanatics with, you can find hidden messages in any reasonably large piece of literature, for example, Moby Dick.
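
To see how easy such "hidden messages" are to manufacture, here is a toy Python sketch of the equidistant-letter trick behind the Bible Code; the sample passage (the opening lines of Moby Dick, abridged) and the target word are just illustrative choices:

    def find_equidistant(text, word, max_skip=50):
        """Find `word` spelled out by equally spaced letters of `text`."""
        letters = [c for c in text.upper() if c.isalpha()]
        word = word.upper()
        hits = []
        for skip in range(1, max_skip + 1):
            for start in range(len(letters) - (len(word) - 1) * skip):
                if all(letters[start + i * skip] == word[i] for i in range(len(word))):
                    hits.append((start, skip))
        return hits

    sample = ("Call me Ishmael. Some years ago, never mind how long precisely, "
              "having little or no money in my purse, and nothing particular "
              "to interest me on shore, I thought I would sail about a little.")
    print(find_equidistant(sample, "SEA"))  # even this short passage yields matches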

Choice

What makes certain Boggle combinations "intelligible"? What makes the special states of a continuous system used to measure entropy special? The word intelligent literally means "to choose between". It comes from the Latin inter "between" + legere "to choose". When certain possibilities for a system are chosen as "special", these chosen possibilities are a fraction of all possibilities. This fraction is the entropy of the system. The more precise and specific the choosing, the fewer possibilities are chosen, and the lower the entropy. Also, the more total possibilities, the lower the entropy.

In The Emperor's New Mind, Roger Penrose does a back-of-the-envelope estimate of the entropy of the Big Bang based on what we know of the choices made in the formation of our universe that make human life possible. He comes up with ½^(10^80). His conclusion: "Even God couldn't be that precise."

Entropy and Disorder

Engineers don't normally compute total entropy. An accurate figure for total entropy requires enumerating all possible states of a system as well as enumerating chosen states. We usually don't know enough about a system to compute total entropy. Instead, engineering deals with change in entropy, which is directly related to heat flow.

Imagine that we select a Boggle cube at random, and turn it to a random face. The chosen message will degrade. If we keep doing this, it will eventually be no longer recognizable. Each random turn increases entropy, because after N turns, the special set now includes all combinations that are reachable from our initial choice in N random moves. As the number of random moves increases, all Boggle combinations become equally likely - a condition of maximum entropy.
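
A small Python simulation of this degradation; as a simplifying assumption, each random turn can show any letter A-Z, since the particular face letters on real Boggle cubes don't matter to the argument:

    import random
    import string

    message = list("GODFIRSTLOVEDYOU")
    state = message[:]                      # start from the chosen arrangement
    random.seed(1)                          # fixed seed so the run is repeatable

    for turn in range(1, 41):
        cube = random.randrange(len(state))                   # pick a cube at random
        state[cube] = random.choice(string.ascii_uppercase)   # turn it to a random face
        if turn % 10 == 0:
            intact = sum(a == b for a, b in zip(state, message))
            print(f"after {turn:2d} turns: {''.join(state)}  ({intact}/16 letters intact)")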

While most people equate entropy with disorder, we now see that it measures how far a system has departed from its chosen condition.

Information

We can calculate the number of computer bits needed to record which possibilities are chosen. If our choices select C out of N possibilities, the entropy is C/N, and the number of bits needed to specify our choice is -log2(C/N) = log2(N/C).

The more total possibilities there are, and the more specific the choices are, the more bits are needed to specify the choices. The information needed to specify choices is what Dembski calls complex specified information.
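
A minimal sketch of this bit count in Python, reusing the Boggle figures from above (roughly 2^42 chosen arrangements counting rotations, out of 2^80 total):

    from math import log2

    def specified_bits(chosen, total):
        """Bits needed to single out `chosen` possibilities from `total`."""
        return log2(total / chosen)

    print(specified_bits(2 ** 42, 2 ** 80))  # 38.0 bits for the intelligible Boggle throws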

Thermodynamics

The Laws of Thermodynamics are mathematical laws that govern any dynamic system in any universe where certain configurations are chosen as "special". However, they are poorly understood by most people when stated in terms of entropy. They are easier to understand when stated in terms of information - complex specified information, that is.

In a closed (no outside interference) system,

  1. Total entropy can never decrease; total information can never increase.
  2. Any "internally irreversible" change increases entropy and destroys information. Memory is an "internally irreversible" change, so if you remember the event, it is irreversible.
  3. Change is inevitable (or at least, no one would be able to remember it stopping). Therefore, entropy will inevitably increase to its maximum, and all information will eventually be obliterated.