October 14, 2009

Old errors

Thermodynamic Cost of Reversible Computing is making the rounds on Slashdot.

What the paper shows is basically that reversible computation with errors has to dissipate energy. Even an amateur like me had noted back in '99 that error correction requires paying kT ln(2) joules per corrected bit. When doing reversible computation the corrections will occur during the computation and hence induce a certain average cost per computation. The temperature here is not necessarily the physical temperature but a "noise temperature" defined from the entropy production.
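
As a back-of-the-envelope check (assuming ordinary room temperature, about 300 K, rather than the paper's noise temperature), here is a small sketch of what kT ln(2) per corrected bit comes to:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature in kelvin (illustrative choice)

    # Landauer cost of correcting (erasing) one bit: kT ln(2) joules
    cost_per_bit = k_B * T * math.log(2)
    print(f"kT ln(2) at {T:.0f} K is about {cost_per_bit:.2e} J per corrected bit")

A few zeptojoules per bit, which is tiny but adds up at high correction rates.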

For a qubit being flipped about, they derive a lower bound on the dissipated power of hR²ε/2 watts, where R is the rate of operations and ε the error probability per operation. In general, the dissipation grows with the square of the computation rate.
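
A small sketch of that bound, with made-up numbers for the rate and error probability just to get a feel for the scale:

    h = 6.62607015e-34   # Planck constant, J*s

    def min_dissipation_watts(rate_hz, error_prob):
        # lower bound on dissipated power, h * R^2 * eps / 2, as quoted above
        return h * rate_hz**2 * error_prob / 2

    # illustrative numbers: 1 GHz operation rate, error probability 1e-6
    print(min_dissipation_watts(1e9, 1e-6))   # about 3.3e-22 W per qubit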

So if I have energy E and want to make as many calculations as possible, I should slow down indefinitely. This is of course why I can never meet a deadline. However, eventually I will reach the dissipation limit from random bit flips that I considered in the '99 paper: storing information for a long time costs energy too. So if I have plenty of time, the optimal rate is the one where I produce equal amounts of heat from processing and from maintaining my stored bits. On the other hand, if I also have a time limit T, then I should use up all the energy by time T, and hence run at a computation rate of order sqrt(E/(hT)) (more precisely sqrt(2E/(hεT)), from setting the total dissipation hR²εT/2 equal to E).
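
The same arithmetic as a sketch: set the total dissipation hR²εT/2 equal to the energy budget E and solve for R (the numbers below are purely illustrative):

    import math

    h = 6.62607015e-34   # Planck constant, J*s

    def rate_for_budget(E, T, eps):
        # E = h * R^2 * eps * T / 2  =>  R = sqrt(2E / (h * eps * T)),
        # which is of order sqrt(E/(hT)) when eps is treated as a constant
        return math.sqrt(2 * E / (h * eps * T))

    # illustrative: a 1 J budget, a one-year deadline, eps = 1e-3
    year = 3.15e7  # seconds
    print(f"{rate_for_budget(1.0, year, 1e-3):.2e} operations per second")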

Posted by Anders3 at October 14, 2009 03:58 PM