
Logical Depth

The time necessary to generate an object, given a concise description of it, is a factor that intuitively plays an important role when we assess the value of this object. We are inclined to consider an object valuable if we believe that a long sequence of rather difficult actions has led to its existence. It is, thus, a promising idea to use the length of the process by which an object has been generated as an indicator of its complexity, and to base our assessment of the object’s value on this indicator. The measure of logical (or algorithmic) depth brings this reasoning into a mathematical form. It was introduced in the 1980s by Charles H. Bennett, a physicist and computer scientist doing research at IBM. Along with being one of the founders of quantum computing, Bennett is also among the most important contributors to the structural science of complexity.

Bennett’s definition of logical depth is based on central ideas of algorithmic information theory. The algorithmic information content of a string of binary digits (bits) is, broadly speaking, equal to the bit length of the shortest computer program that, if run on a certain type of computing machine (a universal Turing machine), outputs this string and then halts. Such a program is the most concise algorithmic description of the bit string. If this string can, in turn, be interpreted as the digital representation of an object (e.g., as containing a detailed account of the parameters of a physical system at a certain time), then the shortest program outputting the string encodes the algorithmically most economical hypothesis about the causal history of the digitally represented object.
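
In the standard notation of algorithmic information theory (a sketch; the symbols U, p, and \ell(p) are conventional and do not appear in the entry itself), this quantity, usually called the Kolmogorov complexity of a bit string x, can be written as

\[ K(x) = \min \{\, \ell(p) : U(p) = x \,\}, \]

where U is a fixed universal Turing machine, U(p) = x means that U, run on program p, outputs x and then halts, and \ell(p) is the bit length of p.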

Bennett defines, in its simplest version, the logical depth of an object represented by a binary string as the run time of the shortest program that outputs the binary string and then stops; this run time is measured in execution steps of the program on a universal Turing machine. A long time elapsed in the real-world generation of an object does not necessarily correspond to great logical depth: If, in a long period of real time, few events occurred that contribute to the generation of the object, this stretch can be summarized by a few steps in the simulated causal history of the object. Thus, a logically deep object is one whose simulation on a computer proceeds slowly when the shortest program outputting its digital representation is run. In contrast, logically shallow objects are either very regular objects that can be simulated by short and fast programs or very irregular objects that can be simulated by incompressibly long and fast programs. Under the general constraint, stated by the second law of thermodynamics, on the possible growth of order in physical systems, a fast increase of order, as measured by logical depth, is very improbable.
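
With the same conventional notation (again a sketch, not a formula quoted from Bennett), this simplest version of logical depth can be written as

\[ D(x) = T(p^{*}), \qquad p^{*} = \operatorname{arg\,min} \{\, \ell(p) : U(p) = x \,\}, \]

where T(p) counts the execution steps U needs to run p to completion; D(x) is thus the run time of the shortest program p^{*} that outputs x and then halts.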

From an information-theoretical perspective, the logical depth of an object is not equal to the Shannon entropy of its digital representation, given a probability distribution of the binary digits. It is equal to the number of operations necessary to generate the object starting from its most concise description. These operations can, at least in principle, be reconstructed if we know the digital representation of the object and search for the most concise program that outputs this representation and then halts. Logical depth is, thus, correlated with the redundancy in the causal history of the object, that is, with the difference in the number of bits between an explicit digital representation of the generation of the object and the shortest program that outputs the digital representation of this object and then stops.
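
One illustrative way to make this correlation explicit (the symbols h(x) and R(x) are hypothetical notation introduced here, not Bennett’s): if h(x) denotes the bit length of an explicit, step-by-step digital record of the generation of the object x, then the redundancy in its causal history is roughly

\[ R(x) \approx h(x) - K(x), \]

the number of bits saved by passing from the explicit record to the most concise program.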

The simple definition of logical depth given above is, however, not robust enough. If, for example, a program that generates the digital representation of an object much faster than the shortest program is just a few bits longer than the latter, the exclusive focus of logical depth on the minimum-length program leads us astray. The simplest solution to this problem is to define a mean logical depth of an object on the basis of a rate of exchange between run time and program size. Such a rate of exchange must guarantee that short and fast programs outputting the digital representation of the object contribute most to its logical depth, whereas long and slow programs contribute least. In calculating the average run time of all programs that output the digital representation of an object, the shorter and the faster a program that generates the desired output, the more heavily its run time must be weighted.
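
Bennett’s own refinement proceeds along these lines by introducing a significance parameter. In one common formulation (a sketch using the notation above; s is the significance level in bits), the depth of x at significance level s is

\[ D_{s}(x) = \min \{\, T(p) : U(p) = x, \ \ell(p) \le K(x) + s \,\}, \]

that is, the least run time of any program that outputs x and is at most s bits longer than the minimal program. A weighted-average variant in the spirit of the exchange rate described above might, for instance, weight each program p by 2^{-\ell(p)}, so that near-minimal programs dominate the average while incompressibly long ones contribute negligibly.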

Stefan Artmann

See also Entropy; Information; Maxwell’s Demon; Time, Asymmetry of; Time, Measurements of

Further Readings

Bennett, C. H. (1988). Logical depth and physical complexity. In R. Herken (Ed.), The universal Turing machine (pp. 227-257). New York: Oxford University Press.

Bennett, C. H. (1994). Complexity in the universe. In J. J. Halliwell, J. Perez-Mercader, & W. H. Zurek (Eds.), Physical origins of time asymmetry (pp. 33-46). Cambridge, UK: Cambridge University Press.

Gell-Mann, M. (1994). The quark and the jaguar: Adventures in the simple and the complex. New York: Freeman.

