### What Would Sleeping Beauty Do?

My friend Stuart Armstrong has published a preprint of his paper *Anthropic Decision Theory* (arXiv:1110.6437).

It is a different take on anthropic reasoning: rather than trying to figure out probabilities based on the number of entities like you, you try to maximize the things you care about. This circumvents the problems of the self-sampling assumption and the self-indication assumption. No doubt enterprising philosophers will find fun problems with anthropic decision theory too. But it is a clever way of turning the problem on its head: instead of deciding on probabilities and then acting according to them, in a hard decision problem where it is unclear which probability assignment to use, you first figure out which actions make sense given what you care about, and then accept whichever probability approach agrees with them.
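The classic illustration is the Sleeping Beauty problem: a fair coin is flipped; on heads she is woken once, on tails twice (with memory erased between awakenings), and at each awakening she is offered a bet that pays 1 if the coin landed tails. The sketch below is my own illustration of the decision-first approach, not code from the paper: the same betting situation yields "thirder" or "halfer" behavior depending only on whether Beauty sums her winnings across awakenings or cares about per-awakening winnings.

```python
# Hypothetical illustration of decision-first anthropic reasoning in the
# Sleeping Beauty problem (my own sketch, not from Armstrong's paper).
# At each awakening Beauty may buy, for `price`, a ticket paying 1 if the
# coin landed tails. Heads (prob 1/2): one awakening; tails: two awakenings.

def total_utility_ev(price):
    """Beauty sums winnings over all her awakenings ('total' preferences)."""
    heads = 0.5 * (1 * -price)          # one losing ticket bought
    tails = 0.5 * (2 * (1.0 - price))   # two winning tickets bought
    return heads + tails

def average_utility_ev(price):
    """Beauty cares about winnings per awakening ('average' preferences)."""
    heads = 0.5 * -price
    tails = 0.5 * (1.0 - price)
    return heads + tails

# With total preferences the bet is worthwhile for any price below 2/3,
# i.e. she acts as if P(tails) = 2/3 -- the 'thirder' answer.
# With average preferences it is worthwhile below 1/2 -- the 'halfer' answer.
# The probabilities fall out of the preferences, not the other way around.
```

The point of the sketch is that nothing in the betting setup changed between the two functions; only Beauty's utility function did, and the implied "probability of tails" follows from the optimal action.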

Posted by Anders3 at November 2, 2011 12:19 PM