Optimal Churn

There is a new approach that studies systems as organisms and classifies their growth as superlinear or sublinear.

The idea is simply to plot a system's size against its rate of growth; in other words, at each point, ask what the next increase will look like. Cities, for instance, seem to exhibit superlinear growth: once they start growing, they keep growing, because more people create more opportunities, and this positive feedback loop lets the city grow without dying. Of course, any system with uncontrolled positive feedback will eventually run out of control (like a petri dish in which the bacteria grow faster than the available food supply and die off in a mass extinction). Luckily, cities have a negative feedback mechanism: prices. When the cost of living goes up, nearby areas become part of the city, and thus the boundary of the city expands. It should be noted, though, that cities do at times experience shrinkage (which the authors of this research seem to ignore).

Companies, however, are sublinear. That is why they die. The claim is that they usually die because, as an organization gets bigger, its tolerance for diversity decreases and its administrative tasks don't scale. These combined forces make the company less competitive in its industry, especially when combined with an ecological shift: all the advantage of being large is lost when the competition can be nimble, and this is highlighted when the landscape changes.
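The distinction can be illustrated with a toy model (my own sketch, not the researchers' model): let each step's growth be proportional to the current size raised to a power beta. A beta above 1 is superlinear, below 1 is sublinear.

```python
def grow(n0, beta, rate=0.01, steps=500):
    # Toy discrete growth: each step adds rate * n**beta.
    # beta > 1: growth feeds on itself (superlinear, city-like).
    # beta < 1: growth damps out relative to size (sublinear, company-like).
    n = n0
    for _ in range(steps):
        n += rate * n ** beta
    return n
```

Starting both runs at the same size, the superlinear trajectory pulls away from the linear one, which in turn pulls away from the sublinear one.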
This article talks about how scaling works.

Another system, entirely ignored in these studies in a way I believe is revealing, is the university. Universities tend not to die, and they tend not to grow all that much. One of the ways they maintain a steady state is through a constant stream of new students who want to contribute to the institution; this keeps the institution alive and relevant.

Companies are often wary of churn, but it can be a good thing, especially when combined with the right incentives. If young employees come in, work really hard, get promoted, pass on their experience to the next generation, and then leave, that is actually a good thing. The faster you can train employees, the less bloated your bureaucracy becomes. An additional benefit is that you get highly motivated employees who stay highly motivated until they leave. Furthermore, adding to the workforce becomes easier, since jobs are well defined and adding people happens often.

An example of this is the classical trading job. Most people fall into one of a few tracks: rise and retire, or burn out. A few rise to become senior managers. The knowledge that spots at the top are constantly being vacated motivates employees to work hard for those spots. The common practice of retiring, or leaving in general, provides an incentive to pass experience along to the next generation. The constant flow through the system provides feedback and molds it to be efficient at training and at maintaining that flow.
In fact, one possible causal mechanism for the banks' failure to adapt to technology is that the average age of traders rose during the 80s because of several trends: economic uncertainty and a low birth rate during the 1970s. There wasn't enough flow through these companies, so instead of adapting, they became conservative and static.


St Petersburg Paradox:

I learned about this paradox recently through an interview question.

The paradox comes from the following game (there is more at the Wikipedia entry):

The casino flips a coin until it gets heads. The payout doubles with every tail, so after k flips the player is paid 2^k dollars. How should they price this game?


Having never seen this question, I proceeded to take the expectation, that is, the weighted average of the payouts. The answer is infinity, which can be seen intuitively from the fact that the payout grows exponentially at exactly the rate at which the probability of the event diminishes.
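A quick way to see the divergence, assuming the standard statement of the game (payout 2^k dollars with probability 2^-k): every term of the expectation contributes exactly one dollar, so the partial sums grow without bound.

```python
def truncated_expectation(n):
    # Partial sum of E[X] = sum over k of (payout 2**k) * (probability 2**-k).
    # Each term is exactly 1, so the sum grows linearly and never converges.
    return sum((2 ** k) * (2 ** -k) for k in range(1, n + 1))
```

Truncating after n terms gives exactly n dollars, no matter how large n is.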

The paradox is that no one will place huge sums of money to play this game, even though theoretically, they should – since it has an unlimited expected payoff.

The variance of this game is also infinite, which stems from the fact that the payouts are finite but the expectation is infinite: Var(X) = E[(X − E[X])^2].

There are several resolutions:

A utility-function approach says we need to discount the payoffs by a utility function. In other words, the first million dollars brings me more utility than each subsequent million. However, if the utility function is not bounded, the casino can simply keep increasing the speed of the payouts, and the paradox returns for any unbounded utility function.

Another approach suggests that people discount the probability of negligible events. Lottery ticket sales seem to undermine this argument, since lotteries are low-probability events that people pay too much for. But this counterargument neglects that certain events are discounted completely once they fall below a certain probability. As an example, no one will pay any amount of money for the event that all the air decides to stay on one side of my tire. The same goes for the law of gravity ceasing to apply: there is some negligible probability of that event, but no matter how large the payoff, no one will buy that ticket.

An experimental approach suggests that you should pay around 10 dollars, based on having played the game 2048 times.
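That experimental claim is easy to reproduce. Here is a minimal Monte Carlo sketch, assuming the standard doubling payout of 2^k dollars after k flips:

```python
import random

def play_once(rng):
    # Flip until heads; the pot doubles with every flip.
    k = 1
    while rng.random() < 0.5:  # tails: keep flipping
        k += 1
    return 2 ** k

rng = random.Random(0)  # fixed seed so the run is repeatable
payouts = [play_once(rng) for _ in range(2048)]
print(sum(payouts) / len(payouts))  # sample mean stays modest, despite infinite E[X]
```

The sample mean over a few thousand games lands in the single or low double digits in most runs, because the astronomical payouts that drive the expectation to infinity almost never occur.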

Another approach notes that you would need the casino to be good for an infinite sum of cash, and since no casino is, no one would place that money.

A combination of utility and the previous reason gives an answer close to the experimental result. Suppose you are willing to play, and suppose it takes a minute to flip a coin. You have a magic genie who has guaranteed that the coin will come up tails until you say "heads," after which it will be heads. How long will you play?

Most likely, you will stop once you are the richest person in the world, and that only takes about an hour. After that you apparently have more money than the value of the Earth, by three orders of magnitude. If you discount to that best-case scenario, you get no paradox at all; the most you would then pay is 1/4 × 60 = 15 dollars. If you understand that a casino can't possibly guarantee more than a billion in winnings, that caps the game at log2(10^9) ≈ 29.897 ≈ 30 flips, which brings the price closer to 7.5 dollars. If you are playing against another person, you can't expect more than a million-dollar payout, so you shouldn't bet more than 1/4 × 20, unless you are playing against Harry Kakavas.
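The effect of the cap can be checked with a short computation. Assuming the standard convention (payout 2^k with probability 2^-k), each round the casino can actually cover contributes exactly one dollar to the expectation, so a payout cap of B dollars yields an expected value of roughly log2(B):

```python
def capped_ev(cap):
    # Expected payout when the casino can pay at most `cap` dollars.
    # Convention (an assumption): the game pays 2**k with probability 2**-k.
    ev = 0.0
    k = 1
    while 2 ** k <= cap:
        ev += (2 ** k) * (2 ** -k)  # each feasible round contributes exactly $1
        k += 1
    ev += cap * 2 ** -(k - 1)  # all remaining outcomes are clipped to the cap
    return ev
```

A billion-dollar cap gives an expected value of about 30 dollars, matching the log2(10^9) figure above; a million-dollar cap gives about 20.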

One last way to think about this problem: what is the expected course of the game itself? The expected total number of flips is two, of which you expect only one tail, in which case you expect about a dollar. So perhaps you should simply pay one dollar, because that is how the game will typically play out. This is called the median value of the game and is in fact what many people bet.

The fact is, the more money you have, the more valuable your time becomes, and it makes less and less sense to keep playing the game. So you will tell your genie to stop, because now that you have earned all this money, you want to go out and spend it.


A response to Peter Norvig – Programming is also a Tool: Learning Programming Through Composition

I recently came across a blog post written several years ago (I can't find a date on the page): http://norvig.com/21-days.html

The blog post is basically a combination of two things:

  • A straw-man argument against how programming books and skills are marketed.
  • An excellent guide on how to master programming. (not sure about the two bullets concerning language standardization efforts) 

Dr. Norvig’s main point is that programming is a skill that takes a long time to master.  (read: years)

To the extent that we ever believe marketing hype, like "made from the best stuff on earth," the argument is strong. In other words, books like Master X in 24 Hours are obviously marketing hype. But we didn't need Peter Norvig to tell us that, and if we did, it wouldn't help. So of course I concede that programming cannot be learned overnight.

However, much of what we call programming is simply the art of putting working blocks together to create useful tools. Programming is, in a way, more similar to Legos than it is to piano.

A child can use Lego blocks and follow a booklet to create a city, and can even create interesting variations on that city without much experience. By overemphasizing the idea that programming is some kind of deep craft, you miss the fact that many projects can be completed with very limited knowledge of programming principles.

I would argue that articles emphasizing how difficult programming is actually turn away many people who could otherwise benefit from even a basic knowledge of the craft. It would be like saying:

If you aren’t going to build the tower of Dubai with Lego, why bother?

Instead, a few simple principles can make a person modestly competent at getting the job done, and often little else is required. When it is, more significant resources can be marshaled. This is the essence of Just In Time (JIT), a principle that has found uses in many fields of engineering and in optimization generally.

So yes, Dr. Norvig, no one is going to build the next ZeroMQ overnight by reading Learn C++ in 24 Hours, or in a week. But plenty of people can learn how to use ZeroMQ to build a thin service layer around a node.js rules engine in a day using exactly those types of resources: books and tutorials. (If you don't know what any of that meant, don't worry. It is not important, and that's my point.)

What follows are the minimum principles you need to program.

(I’m sure I’ll refine this list)

Teach yourself programming through Composition:


(OED 1.1) The action of putting things together; formation or construction.

(OED 1.2) A thing composed of various elements

  • Programming is composition.
  • Every program is a black box with an input and an output.
  • This is recursive; that is, every program is itself a composite of black boxes.
  • View everything in the world this way.
  • Imagine what sorts of tools you would need to get the job done.
  • Find those tools.
  • Combine them (glue them together using whatever means you have).
  • Fill in the missing pieces.
  • Iterate.
  • When in doubt, ask someone who has more experience (or search Stack Overflow; it has probably already been asked).
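The steps above can be sketched concretely. Each function below is a black box with an input and an output (the names are illustrative, not from any particular library); the program is nothing more than the boxes glued together.

```python
def fetch_lines(text):
    # Black box 1: raw text in, list of lines out.
    return text.splitlines()

def count_words(lines):
    # Black box 2: list of lines in, word count out.
    return sum(len(line.split()) for line in lines)

def report(n):
    # Black box 3: number in, human-readable string out.
    return f"{n} words"

def pipeline(text):
    # The "program": a composition of the three boxes.
    return report(count_words(fetch_lines(text)))

print(pipeline("hello world\nfoo"))
```

Each box can be swapped out (fetch from a file or a URL instead of a string, report to a log instead of the screen) without the others noticing, which is exactly what makes composition a practical way to get the job done.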

That is the essence of programming as a pragmatic tool to get the job done.

There is no need to become a master programmer. If that happens along the way, and you create the next Pandas or Beautiful Soup, great!

If not, join the millions of people who have made their own and others' lives easier by automating something.

I think the field of programming, and people in general, would benefit if the attitude toward programming were more like the way people approach cellphones (everyone uses them, at different skill levels, for a wide array of tasks) rather than the way people approach quantum gravity (not at all, for most people).