It was Pi day recently, and one of the interesting facts about Pi is that it is believed to be normal. Pi has been proved to be irrational and transcendental. A real number x is *normal in base b* if, in its base-b representation, all digits occur equally often in an asymptotic sense. Proving that Pi is normal has eluded mathematicians so far.

However, if we take this conjecture for granted, then any finite string of digits can be found somewhere in Pi. Which means that, theoretically, given infinite computing power, you could compress any amount of information into two numbers: the first stores the length of the information you want to compress, and the second stores the starting point in Pi where that string can be found. We might want some more metadata. For example, if a particular piece of information appears very far into the Pi sequence, the number representing its position might be larger than the data itself, in which case we might store a pointer to that number, and so on. Ultimately this is futile, because there may be situations where storing the first pointer, its length, and the depth to which we need to follow the chain of pointers is still longer than the data we need to store. At that point it might be worth gzipping the data and calling it a day. Of course, if you are committed, you might consider breaking the data into partitions, finding each subsequence in Pi, and storing those; the odds are those shorter subsequences will appear earlier.
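As a toy illustration of the (starting point, length) scheme: the sketch below uses a hardcoded prefix of Pi's decimal digits rather than an arbitrarily long expansion, and the names `pi_encode`/`pi_decode` are mine, not from any library.

```python
# A toy sketch of Pi-as-storage: encode a digit string as (offset, length)
# into a table of Pi's decimal digits. A real version would need an
# arbitrary-precision expansion; here we hardcode the first 100 digits.
PI_DIGITS = (
    "1415926535897932384626433832795028841971"
    "6939937510582097494459230781640628620899"
    "86280348253421170679"
)

def pi_encode(data: str):
    """Return (offset, length) locating `data` in Pi's digits, or None."""
    offset = PI_DIGITS.find(data)
    return (offset, len(data)) if offset != -1 else None

def pi_decode(offset: int, length: int) -> str:
    """Recover the original digit string from its (offset, length) pair."""
    return PI_DIGITS[offset:offset + length]
```

Note that the pair (offset, length) can easily take more bits to write down than the data it points to, which is exactly the pitfall described above.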

An important point to note is that this type of storage mechanism does not violate information theory. In general, the number of bits it takes to store or transmit anything is governed by Shannon's entropy: Shannon defined the entropy of a source as the minimum channel capacity required to reliably transmit that source as encoded binary digits. However, because the digits of Pi are deterministic and inexhaustible, we can essentially convey the message as a function of Pi; the information has simply been relocated into a shared, computable constant. Now, this whole post promised something practical, and the discussion so far has been theoretical.
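Shannon's measure is easy to compute. A quick sketch (this is the standard formula over a message's empirical symbol frequencies, nothing specific to this post): the entropy gives the minimum average number of bits per symbol needed to encode the message.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two symbols used equally often cost 1 bit per symbol to encode;
# a message made of a single repeated symbol carries no information at all.
```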

One of the information theory problems in biology is how to account for all of the behaviors and advanced functions of advanced life forms. We know that learned behavior gets modeled in the brain after birth and has many places to hide (researchers try to catch this learning with relatively primitive imaging techniques). However, unlearned behavior, what we would call instincts and lower functions, all has to fit into about 700 MB of space: human DNA. The problem is that, given how much unlearned activity goes on, it seems unlikely to fit into such a small amount of space. The DNA codes for how much saliva to release into your mouth, how to focus your eyes, and what to do during a sneeze.
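The roughly 700 MB figure is easy to sanity-check with back-of-the-envelope arithmetic, assuming about 3 billion base pairs and 2 bits per base (there are four nucleotides):

```python
import math

# Rough raw information capacity of the human genome:
# ~3 billion base pairs, each one of 4 nucleotides (A, C, G, T) = 2 bits.
BASE_PAIRS = 3_000_000_000
BITS_PER_BASE = math.log2(4)  # 4 possible nucleotides -> 2.0 bits

total_bits = BASE_PAIRS * BITS_PER_BASE
total_megabytes = total_bits / 8 / 1_000_000  # = 750.0 MB
```

That lands in the same few-hundred-megabyte ballpark as the figure above; the exact number depends on whether you count raw capacity, only coding regions, or a compressed representation.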

So what is the solution to this seemingly paradoxical puzzle? I propose (I haven't seen this anywhere, so please correct me if there are better answers) that the solution is something very similar to the impractical storage scheme above. Essentially, the laws of nature are an infinite calculation engine, and though there is a bit of probability implicit in them, in the macroscopic universe the answers are deterministic. This means that DNA can simply be a mechanism that forces all of the behaviors necessary for survival to happen. Any DNA that didn't cause the infinite calculator to spit out a species that survived would die out. In this view, DNA is not the total information content of our species; we need the observable universe as well, which includes a lot of physics and chemistry (possibly all of it, though some of those subjects deal with conditions never occurring on Earth's surface).

Of course, I am exaggerating a bit in calling this natural computer infinite; in fact, Einstein limited its computing speed to the speed of light, which means no information travels between individual components of the computer faster than that. Still, nature does an impressive amount of computation.

This also explains some of the advantages of carrying a child in utero. A child in utero learns about the environment from its mother. Studies have shown that a newborn can recognize its mother's voice; in other words, information about the environment seeps into the neural pathways of a fetus. Perhaps that is the further advantage of marsupials, which get a guided, sighted tour of the world. Newborn reptiles and birds, by contrast, rely predominantly on instinct, which limits the amount of knowledge they can have about the world.