Intelligent Design - Information Theory
Last updated: Fri Nov 1 17:11:39 EST 2002
Information and the Law of Entropy
This chapter presents an informal demonstration of the law
of information - which is the dual of the law of entropy. The
basic principle is: in a closed system, information can be destroyed, but
not created. By "information" in this context, we mean "complex
specified information", or CSI. All CSI in a closed universe had to
be present at the beginning. If CSI has been added since
the beginning, then the universe is not closed.
Measuring Information
Information is the reduction in uncertainty associated with an event.
The probability of a realized event is one measure of information, but an
inconvenient one. Smaller probabilities signify more information, and
probabilities are multiplicative: the probability of two independent
events A and B both occurring is P(A) × P(B).
For this reason, information is measured in bits. For an event of
probability P: bits = -log₂(P).
The reason for a base of 2 is that the smallest unit of information
is considered to be a bit - the knowledge that one of two possibilities of
equal probability was realized.
This measure of information is a measure of complexity. Information
can also be specified.
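As a concrete illustration, here is a minimal Python sketch of the
measure above (the probabilities are illustrative choices, not from the
text):

    import math

    def bits(p: float) -> float:
        """Information content, in bits, of an event with probability p."""
        return -math.log2(p)

    # A fair coin flip carries exactly one bit.
    print(bits(0.5))              # 1.0

    # Probabilities of independent events multiply...
    p_a, p_b = 0.5, 0.25
    print(bits(p_a * p_b))        # 3.0
    # ...so their information contents add: the log turns products into sums.
    print(bits(p_a) + bits(p_b))  # 1.0 + 2.0 = 3.0

The logarithm is what makes information additive where probabilities are
multiplicative.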
Where does CSI come from?
Algorithms?
Manfred Eigen claims that CSI comes from algorithms and natural laws.
While this is mathematically impossible (not merely in conflict with
observation), let's look at the touted source of this information: Chaos.
Chaos refers to the fact that simple deterministic functions in mathematics,
and simple deterministic systems in physics, can behave chaotically.
Chaotic behaviour is deterministic, but unpredictable in the long run.
It is not even predictable in principle, because computing its behaviour
requires infinite precision in the initial conditions.
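A small Python sketch of this point, using the logistic map as the
chaotic system (the parameter, starting values, and perturbation size are
illustrative choices):

    # Logistic map: x -> r*x*(1-x); at r = 4 it is a textbook chaotic system.
    # Two trajectories starting a hair apart diverge completely:
    # deterministic, yet unpredictable without infinite precision.
    r = 4.0
    x, y = 0.4, 0.4 + 1e-12

    for step in range(1, 51):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if step % 10 == 0:
            print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")

Within about 40 steps the two trajectories are as far apart as two
unrelated ones, even though they began differing only in the twelfth
decimal place.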
Does chaos produce CSI? It certainly produces complexity. However,
the specification for all that complexity is, and by the basic principle
of chaos itself can only be, exactly the deterministic function or
system which produces it. So chaos can produce complex information,
but not CSI. It plays a role indistinguishable from Chance.
In general, laws and algorithms can preserve existing CSI, or destroy it.
Chaotic laws and algorithms can generate complexity, but not CSI.
Chance?
This horse has been beaten to death, but monkeys at typewriters do
not produce sonnets. There is a vanishingly small chance that they might,
but then there is a vanishingly small chance that my glass of water might
spontaneously separate into ice and steam. Emile Borel proposed 10⁻⁵⁰
as a lower bound on probability. Anything less likely than this can be
considered impossible for all practical purposes. Dembski proposes a
stricter bound of 10⁻¹⁵⁰ based on the number of elementary
particles in the observable universe, its duration, and the
Planck time.
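To put rough numbers on this, a back-of-the-envelope Python sketch (the
sonnet length, the alphabet size, and the factors in Dembski's bound are
stated assumptions):

    import math

    # Chance of monkeys typing one specific sonnet, assuming ~600
    # characters drawn from a 27-symbol alphabet (26 letters + space).
    chars, alphabet = 600, 27
    log10_p = -chars * math.log10(alphabet)
    print(f"P(sonnet) ~ 10^{log10_p:.0f}")   # ~ 10^-859

    # Dembski's bound: 10^80 particles x 10^45 Planck times per second
    # x 10^25 seconds of duration = 10^150 possible elementary events.
    log10_bound = 80 + 45 + 25
    print(f"bound ~ 10^-{log10_bound}")      # 10^-150

The sonnet falls short of the bound by some 700 orders of magnitude.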
Chance can generate complexity, but not CSI. In fact, Chance tends
to destroy CSI.
Chance and Necessity?
Chance and Necessity together form what is called a stochastic process.
Stochastic processes cannot generate CSI. The gist of the proof is that
any stochastic process can be decomposed into a deterministic function
applied to random variables. The random variables cannot generate CSI, as
we have already seen, and the deterministic function operating on a sample
of the random variables cannot generate CSI either.
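A minimal Python sketch of such a decomposition (the particular update
function and noise distribution are illustrative choices):

    import random

    def f(state: float, noise: float) -> float:
        """Deterministic part: damp the state and add the random input."""
        return 0.9 * state + noise

    # A stochastic process as a deterministic function driven by a
    # random variable: Necessity (f) applied to Chance (noise).
    state = 0.0
    for _ in range(5):
        noise = random.gauss(0.0, 1.0)   # the Chance component
        state = f(state, noise)          # the Necessity component
        print(f"{state:+.3f}")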
Mutation and Selection?
Darwin's mutation-selection mechanism is a stochastic process and cannot
generate CSI. Genetic algorithms cannot generate CSI. Random mutations can
only destroy CSI, and combining them with selection cannot produce CSI
either.
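For concreteness, here is a toy genetic algorithm of the kind referred to
above (the target string, mutation rate, and population size are
illustrative choices):

    import random

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def score(s: str) -> int:
        """Number of positions matching the target."""
        return sum(a == b for a, b in zip(s, TARGET))

    def mutate(s: str, rate: float = 0.05) -> str:
        """Randomly replace each character with probability rate."""
        return "".join(random.choice(ALPHABET) if random.random() < rate
                       else c for c in s)

    best = "".join(random.choice(ALPHABET) for _ in TARGET)
    gen = 0
    while score(best) < len(TARGET):
        gen += 1
        offspring = [mutate(best) for _ in range(100)]   # Chance
        best = max(offspring + [best], key=score)        # Necessity
    print(f"matched target after {gen} generations")

Note that the specification (the target string, and the fitness function
built around it) is supplied in advance by the programmer.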
The law of information is as completely general as the law of entropy. It
doesn't matter what the specific random or deterministic processes are.
The Law of Conservation of Information
The upshot of all this is that just as entropy can only increase,
so CSI can only decrease (in a closed system).
In light of this knowledge, you can still be a Naturalist or Deist. You simply
posit that the universe sprang forth (or was created) with all the CSI
needed for its evolution already present. What does not fly is the current
Evolutionary Dogma claiming that CSI is created out of nothing by
the mutual action of Chance and Necessity. This is the sort of tired
optimism from which perpetual motion and vacuum energy inventions are born.
Information Theory and Evolution
While positing that all CSI came with the Big Bang is consistent, it is
scientifically unsatisfying. We would like to know how the CSI
was transferred into things like living cells. The ways that we
know about include Inheritance with modification, Selection, and Infusion.
(Contrary to Dogma, Mutation adds no CSI.) Inheritance passes on a
randomly selected mix of CSI from both parents. Selection conveys CSI
from the environment, killing off ill-adapted genetic configurations.
Infusion is the addition
of CSI from outside the organism. For instance, bacteria exchange
plasmids to acquire antibiotic resistance, and genetically engineered
plants and animals acquire new CSI from researchers.
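A toy illustration of the first mechanism, inheritance, as single-point
recombination (the parent genomes and cut point are illustrative):

    import random

    # The child inherits a randomly selected mix from both parents.
    mother = "AAAAAAAAAA"
    father = "BBBBBBBBBB"
    cut = random.randint(1, len(mother) - 1)
    child = mother[:cut] + father[cut:]
    print(cut, child)   # e.g. 4 AAAABBBBBB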
Evolutionary theory needs to be reformulated in light of the science
of information.