If you read creationist literature or get to know its followers, it doesn't take long before one of them will tell you with absolute confidence that 'natural selection/mutation cannot increase genetic information'. It's an intimidating-sounding claim. You probably don't know the definition of 'information', and the creationist isn't about to offer one, most likely because s/he doesn't have the faintest clue what it would be. But it's actually an easy claim to counter, because it's like saying arithmetic can only ever reduce the magnitude of a number.
This flows from the elementary topological concept of a metric space.

A set is a metric space if there exists a function on pairs of its elements, usually called a distance, such that (the conditions are restated in symbols just below):
- The distance from any element A to itself is zero, and the distance between two distinct elements is always positive
- The distance from element A to element B is the same as the distance from B back to A; it has the same magnitude in both directions
- The distance from A to B plus the distance from B to C must be greater than or equal to the distance from A to C. In Euclidean spaces, planes and solids for example, this is the familiar rule that the shortest distance between two points is a straight line
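Writing d(A, B) for that distance function, the three conditions above can be restated compactly as:

```latex
% The metric (distance) axioms for elements A, B, C of the set
\begin{align*}
  d(A, A) &= 0, \quad d(A, B) > 0 \text{ for } A \neq B   && \text{(identity)} \\
  d(A, B) &= d(B, A)                                       && \text{(symmetry)} \\
  d(A, B) + d(B, C) &\geq d(A, C)                          && \text{(triangle inequality)}
\end{align*}
```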
Simply put, if a mutation decreases genetic information, then reversing that mutation, a so-called back mutation, will increase the information by the same amount. This flatly falsifies the claim that all mutations decrease genetic information, without anyone having to so much as define what genetic information means. It's true for any metric of genetic information that follows well-defined rules. The trivial proof is sketched below.
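Here is that sketch, assuming only that 'genetic information' is some well-defined quantity I(G) assigned to a genome G; the symbols I, G and G' are placeholders for the argument, not anything from the genetics literature:

```latex
% A mutation takes genome G to G'; the back mutation takes G' back to G.
% Whatever measure I you pick, the two changes are equal and opposite:
\Delta_{\text{forward}} = I(G') - I(G)
\qquad\qquad
\Delta_{\text{back}} = I(G) - I(G') = -\Delta_{\text{forward}}
% So if the forward mutation is a loss, the back mutation is a gain of
% exactly the same size. Equivalently, for any distance-style metric,
% d(G, G') = d(G', G) by the symmetry condition above.
```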
It does bring up an interesting question, though: what is information? How does one get an analytical handle on such an intuitive yet elusive idea? As far as what humans mean when we say the word information, there isn't any single answer; the term is too broad. But analytically, in both communications and discrete mathematics, there are at least two useful approaches. The first is called Shannon Information Theory and the other is a type of algorithmic approach we'll represent with the tongue-twisting, polysyllabic Kolmogorov-Chaitin Theory, or K-C for short.
Claude E. Shannon was an engineer at Bell Labs over fifty years ago, and he was understandably interested in the technical challenges of transmitting data on old-fashioned phone lines. Shannon developed a way to measure the difference in uncertainty between the moment before a signal was sent and the moment after it was received. You might think the uncertainty after a signal had been received would be zero, but because of glitches, noise in the line, etc., this is not the case. And that's exactly what Shannon was interested in analyzing. Shannon Info Theory doesn't really measure information; it measures a change between two states. Now that's somewhat oversimplified, and I'm sure some of the tech guys in here are wincing. So for those of you who are deeply interested and solid in calculus and infinite series, the mathematics of Shannon Theory is pretty nifty.
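As a toy illustration (not Shannon's full channel analysis, just its basic building block), the standard Shannon entropy formula measures the average uncertainty of a message in bits. A quick Python sketch:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average uncertainty per symbol, in bits: H = sum(p * log2(1/p))."""
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

print(shannon_entropy("aaaaaaaa"))              # 0.0 bits -- a source with no surprises
print(shannon_entropy("she sells sea shells"))  # ~2.4 bits -- few distinct letters, heavily reused
print(shannon_entropy("abcdefghijklmnop"))      # 4.0 bits -- 16 symbols, all equally likely
```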
K-C Theory measures information directly as a function of data compressibility. To be more precise, Dr. Wesley Elsberry of the National Center for Science Education described it as:
The fundamental concept is that a string's information content is the same as the length of the shortest universal Turing machine program and input data required to produce the string as output.
A long random string of alphanumeric characters would probably be less compressible than an essay or book of the same length, because in the latter you'd have words and letter combinations used over and over, and that repetition is exactly what makes text compressible. The longer both strings are, the more likely this inequality will hold. And it's interesting to point out that sometimes even really complicated, infinitely dense, non-repeating sets can be represented by a simple algorithm, the Mandelbrot set for example, a type of fractal.
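True Kolmogorov-Chaitin complexity isn't computable in general, but an off-the-shelf compressor makes a crude stand-in for the idea. Here's a rough Python sketch comparing a repetitive string with random characters of the same length:

```python
import random
import string
import zlib

random.seed(0)  # make the "random" junk reproducible

# A structured string with lots of repetition, and random characters of equal length.
structured = "she sells sea shells down by the sea shore " * 50
random_junk = "".join(random.choices(string.ascii_letters + string.digits, k=len(structured)))

for label, text in [("structured", structured), ("random", random_junk)]:
    compressed = zlib.compress(text.encode("utf-8"))
    # The repetitive text shrinks dramatically; the random junk barely shrinks at all.
    print(f"{label:>10}: {len(text)} bytes -> {len(compressed)} bytes after compression")
```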
We deal with information conceptually and intuitively. It's one of those things we can do without knowing how we do it. Tissues, organs, cells, genes, and words make up larger structures the way components make up a circuit. It's not just the presence of the components, but how they're wired together, that makes the device operate as it does. Good luck measuring any of that with a metric of any kind.
But all of this analysis has very little to do with what we mean by information conceptually. If there is a metric to measure and formally compare 'She sells sea shells down by the sea shore' with 'The whole is pervaded by a strong smell of turpentine', I think I'd like to see it!
Creationism has become much slicker in the last few years, and part of that makeover has to do with how they now couch the ancient information argument. We'll look at that tomorrow on Know Your Creationists!