Tim O’Reilly forwarded the following Slashdot posting to the ORA editors’ mailing list:
“To distinguish images derived from living vs. non-living sources, USC and NASA JPL researchers report today using the standard gzip compression utility. As a measure of overall pattern complexity, they find that the inherent pixel content of biologically generated fossils
produces higher image compression ratios [more data redundancy], compared to their non-biological counterparts. The more the file shrinks, the more likely it is
that a living process was involved.”
This experiment, and the others like it described in the Slashdot posting, makes a very deep and intuitive sort of sense. Data compression works by finding redundancy in a dataset and re-encoding it more compactly, so that the compressed output carries more entropy per bit; entropy here means the
apparent degree of randomness in a given set of information. The term "entropy" is used in information theory to mirror the analogous concept from thermodynamics, and the two are described by equations of the same form.
From the standpoint of information theory, life itself might be viewed as a kind of spontaneous ordering
force in the universe, a sort of anti-entropy. Nearly all living things
embody some form of survival-promoting symmetry or redundancy. So it
amuses me intensely to see that intuitive notion of life as an ongoing pattern-generating pattern, life as negative entropy, made
manifest in such a direct and compellingly mundane fashion as the
comparison of image compression ratios.
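The idea is easy to try at home. Here is a minimal sketch in Python, using the standard-library zlib module (the same DEFLATE algorithm that gzip uses); the sample "patterned" and "noisy" inputs are my own stand-ins for biological and non-biological signals, not the researchers' actual data:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Ratio of original size to DEFLATE-compressed size.

    Higher values mean more redundancy in the input, i.e. more
    repeated structure or 'pattern' for the compressor to exploit.
    """
    return len(data) / len(zlib.compress(data, 9))

# A highly patterned signal: a short motif repeated many times.
patterned = b"ATCG" * 2500

# A pseudo-random signal of the same length, with almost no
# redundancy for the compressor to find.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))

print(compression_ratio(patterned))  # much greater than 1
print(compression_ratio(noisy))      # near 1: incompressible
```

The patterned input shrinks dramatically while the random input barely shrinks at all, which is exactly the distinction the fossil-imaging experiment leans on: redundancy, made visible as a compression ratio.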
What other useful intuitions about living things can we describe or discuss from an information-theoretic viewpoint?