Matt's Blog

The Game of Life, Metacircular interpreters and Hashlife

Sun Nov 12 09:33:31 EST 2006

J.H. Conway described the "Life" cellular automaton in 1970. It consists of cells that may be alive or dead, and a transition rule: a cell stays in its current state if it has two live neighbours, becomes (or stays) alive if it has three, and dies otherwise. The cells are arranged in a regular array of squares, with the neighbourhood of a given cell being the surrounding eight cells.
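For concreteness, the transition rule can be sketched in a few lines of Python, using a sparse set of live cells (this representation is just for illustration; the macro-cell representation Hashlife uses is quite different):

```python
from collections import Counter

def life_step(live):
    """One generation of Life. `live` is a set of (x, y) live cells."""
    # Count the live neighbours of every cell adjacent to a live cell
    # (the Moore neighbourhood: the surrounding eight cells).
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Three neighbours: alive next step; two: unchanged; otherwise dead.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

For example, a horizontal "blinker" of three cells turns vertical after one step and returns to its original shape after two.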

• This simple setup is Turing complete
• see "The Recursive Universe" by William Poundstone, or "Winning Ways for Your Mathematical Plays" by Berlekamp, Conway and Guy.
• This means that it is possible to construct a Life pattern that acts as an interpreter, i.e. one that takes other programs translated (compiled) into a Life pattern and evaluates the program described by this input pattern. In particular, the pattern of the evaluator itself can be run by the evaluator - this self-applicability is the hallmark of a Turing-complete system.
• Any Turing complete language can be implemented in any other Turing complete language.
• So we could construct the Life pattern corresponding to a Lisp interpreter for example.
• The Hashlife algorithm speeds up the simulation of cellular automata, exponentially so on sufficiently regular patterns.
• See "Exploiting Regularities in Large Cellular Spaces", R. Wm. Gosper, Physica 10D, p75-84 (1984) for the original paper; there was also an article in Dr. Dobb's Journal in mid-2006 (I don't have the reference at hand).
• The two key ideas acting as the basis of the algorithm are:
• a hash mechanism to store the output configuration of a previously encountered input configuration
• a macro-cell data structure to compress the spacetime behaviour of configurations.

Here is the idea: we can write a compiler that translates any program into the equivalent Life pattern, to be run on the Life Lisp machine. We can then use the Hashlife algorithm to quickly simulate the evolution of the pattern, and finally translate the output pattern into a usable output format. Compilation to or from a Life pattern seems intuitively to be an easy (i.e. sub-exponential complexity) problem, although I could be wrong here. The Life evolution step seems to gain an exponential speedup from the Hashlife algorithm (although again the speedup may only be polynomial). Setting aside these reservations, it seems that we can construct a computer that is exponentially faster than a normal computer.

Possible flaws:

• Translation to or from Life patterns could be a hard problem.
• The Hashlife algorithm may not give an exponential speedup, or may slow down or require large amounts of memory on non-repetitive computational patterns.
• The size of the Life Lisp machine pattern may be too big to fit in the Universe.

Addressing the last point, Poundstone estimates the size of a self-reproducing Life pattern to be somewhere in the region of 10^13 pixels, i.e. 10 trillion pixels, or approximately 1TB of storage at one bit per cell. This is feasible with present technology (virtual memory across a few hard disks, or a networked cluster). Alternatively, one could choose a cellular automaton with more states, in which it is easier to create a general Turing machine (although the trade-off between the number of states and the total number of pixels is not obvious).
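As a quick sanity check on that figure, assuming a bare one-bit-per-cell encoding:

```python
cells = 10 ** 13               # Poundstone's estimate: ~10 trillion pixels
terabytes = cells / 8 / 1e12   # one bit per cell, 10^12 bytes per TB
print(terabytes)               # 1.25, i.e. on the order of 1TB
```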

[code] [ideas]