# Algorithmic complexity

Algorithmic complexity is a formal measure of the minimum amount of information required to specify some message, binary string, computer program, classification rule, etc. The algorithmic complexity (or Kolmogorov complexity) of a message is the number of 1s and 0s required to specify a Turing machine that reproduces the message.

(The relevant Wikipedia article is filed under Kolmogorov complexity. Not to be confused with "computational complexity", which isn't the same thing at all.)

A string of a million 1s has around as much K-complexity as the number 1,000,000, since getting a Turing machine to print out exactly a million 1s and then stop is mostly a matter of encoding the number 1,000,000. The number 1,000,000 can itself potentially be compressed further, since it's an unusually simple number of that magnitude: it's 10 to the 6th power, so if we already have a concept of 'exponentiation', or can encode it simply, we just need to encode the numbers 10 and 6, which are quite small.
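As a rough sketch (an ordinary program rather than a literal Turing machine), the point is that the description below is a couple of dozen bytes, dominated by the encoding of the number 10**6, while the output is a million bytes:

```python
# A tiny description of a very long string: the program's length is
# dominated by encoding the number 1,000,000 (written here as 10**6).
def million_ones() -> str:
    return "1" * 10**6

s = million_ones()
print(len(s))  # 1000000
```

The same gap between description length and output length is what Kolmogorov complexity measures formally.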

When we say that a message has high Kolmogorov complexity, we mean that it can't be compressed beyond a certain point (unless the 'compression algorithm' is itself large and contains much of the key information baked in). Things have high Kolmogorov complexity when they're made up of many independent facts that can't be predicted from knowing the previous facts.
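A practical compressor like zlib is only a crude stand-in for Kolmogorov complexity, but it illustrates the direction of the effect: data made of independent random facts resists compression, and a generic algorithm achieves essentially no savings on it:

```python
import os
import zlib

# Random bytes are many independent facts: each byte is unpredictable
# from the previous ones, so a generic compressor cannot shrink them.
random_bytes = os.urandom(100_000)
compressed = zlib.compress(random_bytes, 9)
print(len(compressed) >= len(random_bytes))  # typically True: no savings
```

(zlib's format overhead usually makes the "compressed" random data slightly *larger* than the input.)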

Shakespeare's Romeo and Juliet will compress by a lot using simple algorithms, because there are many words used more than once, and some words are much more frequent than others. But we are exceedingly unlikely to find, anywhere in the first trillion Turing machines, any Turing machine that prints out the exact text of this play. 2^40 is greater than a trillion, so if we consider the set of all 40-bit binary strings, it's clear we can't print out all of them exactly using the first trillion Turing machines. Finding a 40-bit Turing machine that printed out the exact text of Romeo and Juliet would be vastly improbable.
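By contrast with random data, redundant English-like text compresses dramatically. A small sketch (using a repeated line as a stand-in for a text with many recurring words):

```python
import zlib

# Text with heavy repetition compresses well under a generic algorithm,
# even though no tiny Turing machine prints it out exactly.
text = ("But soft, what light through yonder window breaks? " * 200).encode()
packed = zlib.compress(text, 9)
print(len(packed) < len(text) // 10)  # True: huge savings from repetition
```

Compressing well under zlib and having low Kolmogorov complexity are different claims, but both stem from the same underlying redundancy.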

On the other hand, it wouldn't be particularly surprising to find a small Turing machine that printed out $$3\uparrow\uparrow\uparrow3$$ 1s and then stopped, because the algorithm for this enormous number seems simple, and easy to encode as a computer program or Turing machine.
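A short recursive rule suffices to *define* Knuth's up-arrow notation, even though a number like $$3\uparrow\uparrow\uparrow3$$ is far too large to ever compute or print. A sketch (only tiny inputs are actually evaluable):

```python
# Knuth up-arrow: a ↑^n b. n=1 is exponentiation; each higher level
# iterates the level below it b-1 times. The *rule* is a few lines,
# regardless of how astronomically large its values get.
def up_arrow(a: int, n: int, b: int) -> int:
    if n == 1:
        return a ** b
    result = a
    for _ in range(b - 1):
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(2, 2, 3))  # 2↑↑3 = 2^(2^2) = 16
print(up_arrow(3, 2, 2))  # 3↑↑2 = 3^3 = 27
```

Calling `up_arrow(3, 3, 3)` would never terminate in practice; the point is only that the generating rule is short.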

The algorithmic complexity of a system shouldn't be confused with the total number of visible, complicated-looking details. The Mandelbrot set looks very complicated visually—you can keep zooming in and finding more and more detail—but there's a very simple rule that generates it, so we say the algorithmic complexity is very low.
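The generating rule really is this short. A minimal membership test (the usual escape-time approximation, with an arbitrary iteration cutoff):

```python
# The whole Mandelbrot set comes from one short rule: iterate
# z -> z**2 + c and ask whether |z| stays bounded.
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # escaped: c is outside the set
            return False
    return True          # stayed bounded so far: treated as inside

print(in_mandelbrot(0j))      # True
print(in_mandelbrot(1 + 0j))  # False: iterates escape quickly
```

Unbounded visual detail, a few lines of description: that gap is exactly what the article is pointing at.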

As a corollary, a piece of a big-looking object can easily have more Kolmogorov complexity than the whole. If you zoom far down into the Mandelbrot set and isolate a particular piece of the fractal, the information needed to specify that image now includes both the Mandelbrot-generating rule and the exact location of that particular piece.

Similarly, the Earth is much more algorithmically complex than the laws of physics, and if there's a multiverse that developed deterministically out of the laws of physics, the Earth would be much more complex than that multiverse. To print out the whole multiverse, you'd only need to start from the laws of physics and work forward; to print out Earth in particular you'd need a huge number of additional bits to locate Earth inside the multiverse.

Or more simply, we can observe that a program that prints out all possible books in order is much simpler than a program that prints out only Romeo and Juliet. To put it another way, Borges's "Library of Babel" containing every possible book has far lower algorithmic complexity than an Earth library containing only some books. The moral is that the amount of visible stuff and its seeming surface complexity should not be confused with the notion of algorithmic complexity or theoretical compressibility.
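A sketch of the "all possible books" program, enumerating every string over an alphabet in shortlex order (a two-letter alphabet here, purely for illustration). The enumerator is a few lines; a program for one particular book must instead encode all of that book's bits:

```python
from itertools import count, product

# Enumerate every possible text, shortest first: a tiny program
# whose output eventually includes any book you care to name.
def all_texts(alphabet: str = "ab"):
    for length in count(0):
        for chars in product(alphabet, repeat=length):
            yield "".join(chars)

gen = all_texts()
first_seven = [next(gen) for _ in range(7)]
print(first_seven)  # ['', 'a', 'b', 'aa', 'ab', 'ba', 'bb']
```

Printing everything requires almost no information; singling one thing out is where the bits go.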

Children:

• Most complex things are not very compressible

We can't prove it's impossible, but it would be extremely surprising to discover a 500-state Turing machine that output the exact text of "Romeo and Juliet".

Parents:

• It is not clear to me that:

Or more simply, we can observe that a program that prints out all possible books in order is much simpler than a program that prints out only Romeo and Juliet. To put it another way, Borges's "Library of Babel" containing every possible book has far lower algorithmic complexity than an Earth library containing only some books.

Is true. As far as I can tell, the above statement requires some of the following to be true:

1. The set of all possible books is something more like the set of all possible strings above a certain length (that the content of those strings be coherent, and constitute legitimate books, is not required).

2. The shortest program encoding Romeo and Juliet in our reference UTM does so by explicitly locating it within the set of all possible books.

I see no reason to assume that 2 is true, and I feel that if 1 is assumed, then it should be stated as such or it would be misleading.