Copies of web material will be available at each class.

Information and entropy are two sides of the same coin. Dembski sticks to the concept of "specified complexity", which I and others call "information", because it is in many ways more intuitive than entropy. Entropy is the absence of information.

> Because that which may be known of God is manifest in them; for God hath shewed it unto them. — Romans 1:19 KJV

The Boggle game has about 10^{24} combinations.
(There are 6^{16}×16! permutations; some letters appear on 2 or 3
of the cubes, so divide by 2!⋅3! or more to remove duplicate
arrangements.) For a continuous system like ideal gas molecules,
the collection of all possible states is called "phase space". Information
theory measures complexity in bits: if a system has N states, the complexity
in bits is log_{2}N. For the Boggle game,
10^{24}≅2^{80}, so 80 bits, or 10 bytes, suffice
to remember a particular state. (A byte is 8 bits.)
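The arithmetic above is easy to check in a few lines. A minimal sketch, using the text's rough 2!⋅3! correction for duplicated letters:

```python
import math

faces = 6 ** 16                  # each of 16 cubes shows one of 6 faces
orders = math.factorial(16)      # ways to arrange the 16 cubes in the grid
dups = math.factorial(2) * math.factorial(3)  # the text's rough duplicate correction
distinct = faces * orders // dups

print(f"combinations ~ {distinct:.3e}")             # on the order of 10^24
print(f"bits = log2(N) ~ {math.log2(distinct):.1f}")  # in the neighborhood of 80
```

The exact duplicate correction depends on the actual letter distribution of the cubes; the point is only that log_{2}N lands near 80 bits.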

It is important to notice that the criterion of "intelligible" for Boggle
combinations is "a priori" - defined beforehand. Taking whatever happens to be
thrown as a sign of intelligence, simply because repeating it is improbable,
is a mistake
akin to an archer painting a target around wherever his arrow happens to land.
This is the mistake commonly made by proponents of
*The Bible Code*. As
Boggle players know, finding words in text arranged as a grid is not
all that difficult. As a Clue By Four to smack Bible Code fanatics with:
you can find hidden messages in any reasonably large piece of literature,
for example,
Moby Dick.
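The kind of search behind such "codes" is trivial to implement. Here is a minimal sketch of an equidistant-letter-sequence (ELS) search, the technique *The Bible Code* relies on; the text used here is just the opening sentence of Moby Dick:

```python
def find_els(text: str, word: str, max_skip: int = 50):
    """Find `word` hidden as an equidistant letter sequence: its letters
    occurring in `text` a fixed skip apart. Returns (start, skip) pairs."""
    s = "".join(c for c in text.lower() if c.isalpha())
    word = word.lower()
    hits = []
    for skip in range(1, max_skip + 1):
        # last valid starting index for this skip
        limit = len(s) - skip * (len(word) - 1)
        for start in range(max(limit, 0)):
            if all(s[start + i * skip] == word[i] for i in range(len(word))):
                hits.append((start, skip))
    return hits

opening = ("Call me Ishmael. Some years ago, never mind how long precisely, "
           "having little or no money in my purse, and nothing particular "
           "to interest me on shore, I thought I would sail about a little.")
print(find_els(opening, "sea"))
```

With enough skips and starting positions to try, short "messages" turn up in almost any text - which is exactly the painted-target mistake described above.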

In
*The Emperor's New Mind*, Roger Penrose does a back-of-the-envelope
estimate of the entropy of the Big Bang, based on what we know
of the choices made in the formation of our universe that make human
life possible. He comes up
with ½^{10^{80}}. His conclusion: "Even God couldn't
be **that** precise."

Imagine that we select a Boggle cube at random, and turn it to a random face. The chosen message will degrade. If we keep doing this, it will eventually be no longer recognizable. Each random turn increases entropy, because after N turns, the special set now includes all combinations that are reachable from our initial choice in N random moves. As the number of random moves increases, all Boggle combinations become equally likely - a condition of maximum entropy.
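This random-turn experiment is easy to simulate. A minimal sketch, using random faces rather than the real Boggle letter distribution:

```python
import random

random.seed(0)
ALPHA = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
# 16 "cubes", each with 6 random faces (a stand-in for the real cubes)
cubes = [[random.choice(ALPHA) for _ in range(6)] for _ in range(16)]
message = [cube[0] for cube in cubes]   # the chosen combination
state = message[:]

for turn in range(1, 201):
    i = random.randrange(16)            # select a cube at random...
    state[i] = random.choice(cubes[i])  # ...and turn it to a random face
    if turn in (1, 10, 50, 200):
        matches = sum(a == b for a, b in zip(state, message))
        print(f"after {turn:3d} turns: {matches}/16 faces still match")
```

After enough turns the state retains essentially no memory of the chosen message: every reachable combination is about equally likely, the maximum-entropy condition described above.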

While most people equate entropy with disorder, we now see that it measures how far a system has departed from its chosen condition.

The more total possibilities there are, and the more specific the choices are, the more bits are needed to specify the choices. The information needed to specify choices is what Dembski calls complex specified information.
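One way to put numbers on that: the information in a specification is log_{2} of (total states ÷ states matching the specification). A worked example using the text's 2^{80} figure for Boggle:

```python
import math

N = 2 ** 80      # total Boggle combinations (~10^24, per the text)

# An exact specification - one particular combination - costs the full 80 bits.
M = 1
print(math.log2(N / M))   # 80.0

# A looser specification matched by ~a million (2^20) combinations costs less.
M = 2 ** 20
print(math.log2(N / M))   # 60.0
```

Fewer matching states (a more specific choice) or more total states both drive the bit count up, which is the relationship the paragraph above describes.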

In a closed (no outside interference) system:

- Total entropy can never decrease, total information can never increase.
- Any "internally irreversible" change increases entropy, and destroys information. Memory is an "internally irreversible" change, so if you remember the event, it is irreversible.
- Change is inevitable (or at least, no one would be able to remember it stopping). Therefore, entropy will inevitably increase to its maximum, and all information will be eventually obliterated.