Updating Landauer’s principle

Physicists have long sought to relate data acquisition and processing to thermodynamics (or statistical mechanics). Because the notion of data processing (or information processing) is far more general than the comparatively restricted setting of thermodynamics, the study of such relations has suffered from overestimates of the generality of its results. The earliest result is Szilard’s 1929 analysis of a single-atom gas confined to a box. Szilard concluded that there is an entropy cost associated with measuring the position of the atom in the box; in other words, acquiring information about the atom’s position increases entropy. Specifically, measuring which half of the box the atom occupies (i.e. acquiring 1 bit of data about its location) requires a heat dissipation of at least kT \ln(2).
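For a sense of scale, here is a quick numeric sketch of my own (not from Szilard’s paper) of the bound kT \ln(2) at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def szilard_bound(temperature_kelvin: float) -> float:
    """Minimum heat dissipated, in joules, to acquire one bit of
    positional information in Szilard's engine: kT ln 2."""
    return k_B * temperature_kelvin * math.log(2)

# At room temperature the bound is tiny, about 2.9e-21 J per bit.
print(f"{szilard_bound(300.0):.3e} J")
```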

Three decades later, in 1961, Landauer reanalyzed the relation between data and thermodynamics and concluded that there is no necessary entropy cost associated with measurement. Instead, Landauer concluded, it is data erasure that carries an entropy cost. In other words, logically irreversible operations on bits (binary digits) are also thermodynamically irreversible and consequently incur an entropy cost. For example, resetting a bit to zero, independently of its current value, is logically irreversible, since the original value of the bit cannot be reliably retrieved. According to Landauer, resetting a bit requires a heat dissipation of at least kT \ln(2). It is easy to overestimate the generality of Landauer’s result, because it depends on assumptions, glossed over in many texts, about how bits of data are represented physically. For example, the result does hold if each bit is represented by a classical particle in a symmetric double-well potential, with one minimum representing ‘0’ and the other ‘1’. The generalization to asymmetric potentials was recently studied by Barkeshli, who shows that the entropy cost need not be attached to the reset step: it can be shuffled to any step in a sequence of operations, including steps that are logically reversible.
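To make logical irreversibility concrete, here is a small sketch of my own (not from Landauer’s paper): reset-to-zero maps both inputs to the same output and is therefore not injective, whereas NOT is a bijection and hence logically reversible.

```python
def reset_to_zero(bit: int) -> int:
    """Logically irreversible: both 0 and 1 map to 0, so the
    original value cannot be recovered from the output."""
    return 0

def logical_not(bit: int) -> int:
    """Logically reversible: a bijection on {0, 1}."""
    return 1 - bit

def is_injective(f, domain=(0, 1)) -> bool:
    """A map is logically reversible iff it is injective on its domain."""
    images = [f(x) for x in domain]
    return len(set(images)) == len(images)

print(is_injective(reset_to_zero))  # False: irreversible, costs >= kT ln 2
print(is_injective(logical_not))    # True: reversible, no necessary cost
```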

In closely related work, Sagawa and Ueda describe a scenario that enables them to derive lower bounds on the entropy costs of measurement and data erasure individually, as well as on their total cost. In their scenario, data is stored in a quantum mechanical memory whose state space decomposes into orthogonal subspaces. The Hilbert space of the memory is taken to have the form of a finite direct sum H^M = \oplus_k H^M_k, with each subspace H^M_k spanned by an orthonormal basis |\epsilon_{ki}\rangle, i = 1,2,\ldots There are N subspaces, and data in the form of a number k \in \mathbb{N}, 1 \leq k \leq N, is stored in the memory by projecting its state onto the subspace H^M_k (a toy numerical sketch of this storage convention appears below, after the bounds). The memory is in contact with a heat bath and an additional system. A measurement is performed indirectly on the additional system by letting it interact with the memory and then measuring the memory, so that the memory state is projected onto a subspace H^M_k. With some additional assumptions about how the three subsystems interact, Sagawa and Ueda are able to show that the work required for measurement satisfies

W^M_{\text{meas}} \geq -kT(H-I) + \Delta F^M,

where H = H(p) is the Shannon entropy of the probability distribution p_k of the measurement results, I is the “QC mutual information” between the memory and the additional system (as defined by Sagawa and Ueda, 2008), and \Delta F^M is the change in the free energy of the memory, averaged with respect to the distribution p_k. Furthermore, the minimal work required for data erasure satisfies

W^M_{\text{eras}} \geq kTH - \Delta F^M.

Since \Delta F^M need not be zero, it is possible to erase information at no work cost. However, the sum of the measurement and erasure costs is bounded from below by kT times the QC mutual information,

W^M_{\text{meas}} + W^M_{\text{eras}} \geq kTI.
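As promised above, here is a toy illustration of the storage convention (my own sketch, not code from the paper): a small memory whose Hilbert space splits into orthogonal subspaces, with the datum k recorded by projecting the state onto the k-th subspace.

```python
import numpy as np

# Toy memory: a 4-dimensional Hilbert space split into N = 2
# orthogonal 2-dimensional subspaces, H^M = H^M_1 (+) H^M_2.
dim, sub_dim = 4, 2

def projector(k: int) -> np.ndarray:
    """Orthogonal projector onto the k-th subspace (k = 1, 2)."""
    P = np.zeros((dim, dim))
    start = (k - 1) * sub_dim
    P[start:start + sub_dim, start:start + sub_dim] = np.eye(sub_dim)
    return P

# A normalized memory state and the Born probabilities p_k = <psi|P_k|psi>.
psi = np.full(dim, 0.5)
p = [psi @ projector(k) @ psi for k in (1, 2)]
print(p)  # [0.5, 0.5]

# Storing the datum k = 1: project onto H^M_1 and renormalize.
post = projector(1) @ psi
post /= np.linalg.norm(post)
print(post)  # the state now lies entirely in H^M_1
```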
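To see the three bounds at work numerically, here is a sketch with made-up example values (mine, not the paper’s); the entropies are computed in nats so that the bounds take the kT H form, and I assume an error-free classical measurement, for which I equals H.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

def work_bounds(p, mutual_info, delta_F, T=300.0):
    """Lower bounds on measurement work, erasure work, and their sum:
    W_meas >= -kT(H - I) + dF,  W_eras >= kT*H - dF,  sum >= kT*I."""
    kT = k_B * T
    H = shannon_entropy(p)
    w_meas = -kT * (H - mutual_info) + delta_F
    w_eras = kT * H - delta_F
    return w_meas, w_eras, w_meas + w_eras

# Example: an unbiased binary outcome, I = H, and delta_F assumed zero.
p = [0.5, 0.5]
I = shannon_entropy(p)
w_meas, w_eras, total = work_bounds(p, I, delta_F=0.0)
print(w_meas)                # 0.0: measurement can be free in this case
print(w_eras)                # kT ln 2: Landauer's erasure cost reappears
print(total, k_B * 300 * I)  # the sum saturates the bound kT I
```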

In my view, the main lesson, beyond the interest of the results themselves, is that one should be very careful when trying to connect data or information to thermodynamics. It took three decades before Landauer pointed out the lack of generality in Szilard’s result, and, though the articles discussed here are not the first to point out limitations in Landauer’s reanalysis, several more decades passed before the limits of the generality of Landauer’s result were seriously analyzed. As seductive as such connections to thermodynamics can be, they are bound to be of limited generality because of the assumptions they make about how data is physically encoded and represented.

References

Maissam Barkeshli. Dissipationless Information Erasure and Landauer’s Principle. arXiv:cond-mat/0504323.

Takahiro Sagawa and Masahito Ueda. Minimal Energy Cost for Thermodynamic Information Processing: Measurement and Information Erasure. Phys. Rev. Lett. 102:250602 (2009). arXiv:0809.4098.

Takahiro Sagawa and Masahito Ueda. Second Law of Thermodynamics with Discrete Quantum Feedback Control. Phys. Rev. Lett. 100:080403 (2008). arXiv:0710.0956.
