Talk:Landauer's principle


The argument on one of the links is plainly wrong

I had a read of "Logic and Entropy Critique by Orly Shenker (2000)"

I Am Not A Physicist, but it seems to me that it is flat wrong. The author states:

Bennett's construct is a measuring device that determines which of two chambers contains a particle, and stores the information in a memory (the key with notch and peg). The device is a clever combination of gears, in which the only possible source of dissipation is friction. Bennett rightly concludes that this device shows that measurement cannot be associated with any minimum amount of dissipation. This conclusion is based – as it should - on the details of the counter example, regardless of any general or abstract argument that might be offered regarding the entropy of measurement.

It seems plainly wrong that "the only possible source of dissipation is friction". For the machine to move from the "ready" state into another state, the gears have to move. Even assuming a frictionless environment, if the gears have any mass at all, then this means that kinetic energy must be imparted to them. When the peg drops into a notch, this kinetic energy must be dissipated.

One can reduce the amount of energy involved by making the gears lighter, or by moving them more slowly. But then environmental heat becomes a problem. Even in a hard vacuum, at any temperature above absolute zero the machine is going to be buffeted by photons at infrared frequencies, which carry energy and momentum. If the gears are too light, it will take so little energy to accelerate them that these photons will scramble the works. If it moves too slowly, it will take so long to move from A to B that, once again, enough photons will hit it to screw it up.

You can reduce this effect by reducing the temperature of the environment - but that's the entire point.
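A rough numerical sketch of the buffeting point (the gear's mass and speed here are made-up illustrative values, not anything from Bennett's construction):

    k = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
    T = 300.0          # assumed ambient temperature, K

    thermal = k * T    # characteristic energy of one thermal photon kick, ~4.1e-21 J

    # Hypothetical very light, very slow gear:
    m = 1e-18          # kg, a femtogram-scale part (assumed)
    v = 1e-3           # m/s, a slow mechanical motion (assumed)

    kinetic = 0.5 * m * v ** 2   # 5e-25 J

    print(f"kT          = {thermal:.2e} J")
    print(f"(1/2) m v^2 = {kinetic:.2e} J")
    print(f"kT is {thermal / kinetic:.0f} times larger")

With these (admittedly arbitrary) numbers the thermal energy scale is thousands of times the gear's kinetic energy, which is exactly the "scramble the works" regime described above.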

So ... should this link be dealt with? Am I wrong? Pmurray bigpond.com 22:59, 22 February 2006 (UTC)

Yes, please deal with this link. 198.145.196.71 16:49, 28 September 2007 (UTC)
It isn't the place of Wikipedia editors to debate the accuracy of sources. If you have a reliable source that says the other source is wrong, that can be included in the article. Jibal (talk) 20:19, 19 August 2022 (UTC)

kT ln 2 is not necessarily Joules

A pet peeve of mine is when people give a dimensioned expression such as "kT ln 2" (which is already dimensioned in energy units) and then suffix it with some arbitrary energy unit such as joules. This is technically incorrect, since the expression "kT ln 2 joules" would have dimensions of energy squared: it multiplies the correct value kT ln 2 (already dimensioned in energy) by the arbitrary amount "1 joule" (also dimensioned in energy). Properly speaking, Boltzmann's constant k is not a number but a *physical quantity*, dimensioned in units of energy over temperature, and it is *not* tied to any particular unit such as the joule. You could express the very same physical constant in eV or ergs or any other unit. The expression "kT ln 2" only gives the number of joules if you (say) measure k in J/K and T in K, and then drop all units from the result before adding "joules" back in with your suffix - but none of these choices is dictated by the expression itself. So, I changed "joules" to "amount".
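To make the unit-dependence concrete, here is a quick Python sketch (the 300 K temperature is an arbitrary choice; the two constants are the exact SI values):

    import math

    k = 1.380649e-23            # Boltzmann constant in J/K (exact SI value)
    J_per_eV = 1.602176634e-19  # joules per electronvolt (exact SI value)
    T = 300.0                   # an assumed temperature, in kelvin

    E = k * T * math.log(2)     # one physical quantity, not yet a bare number

    print(f"{E:.3e} J")              # ~2.871e-21 when expressed in joules
    print(f"{E / J_per_eV:.4f} eV")  # ~0.0179   when expressed in electronvolts
    print(f"{E / 1e-7:.3e} erg")     # ~2.871e-14 when expressed in ergs (1 erg = 1e-7 J)

Same quantity, three different numbers - the number only appears once a unit is chosen, which is the point made above.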

von Neumann-Landauer / Landauer bound

This expression for the minimum energy dissipation from a logically irreversible binary operation was first suggested by John von Neumann, but it was first rigorously justified (and with important limits to its applicability stated) by Landauer. For this reason, it is sometimes referred to as being simply the Landauer bound or limit, but if we wish to be more generous to its originator, we may prefer to refer to it as the von Neumann-Landauer (VNL) bound instead.

With all due respect to von Neumann for the idea, Wikipedia isn't the place to make up new terminology. Unless it is demonstrably established terminology, we shouldn't refer to it here as VNL, regardless of our generosity.

Questionable references; "physical lore"

"The principle is widely accepted as physical lore; but in recent years it has been challenged, notably in Earman and Norton (1998), and subsequently in Shenker (2000)[3] and Norton (2004)[4]."

This sentence, these references, and the blog comment supporting them look suspicious to me. Calling Landauer's principle "lore" is rather opinionated, too, especially since it is inextricably linked with the well-established exorcism of Maxwell's demon. Wikipedia is not the place to spout such non-mainstream opinions and support them with blog comments, of all things. There should also be a reference to Landauer's later paper, "Information is Physical". —Preceding unsigned comment added by 198.145.196.71 (talk) 00:01, 28 September 2007 (UTC)

The whole of Landauer's principle is quite suspect, because it tries to define necessary losses of waste heat (in joules) with regard to destroyed information, but there is actually no SI unit for information. In fact there is no etalon for information: neither an abstract construct like the second of time, nor an artifact like the original kilogram preserved in Paris. Even Shannon himself was handwaving around the basic question: what is information? "Bit" is fine, but nobody can explain what a bit denotes at the very basic level. It is quite like fiat money: we use it conveniently every day, but nobody can be sure what is behind it; at the extreme it could have no basis at all (hyperinflation).
Nobody knows what information is; we can't even be sure whether there is a single universal phenomenon called information, or whether there are multiple things which all appear as information to us because our perception is too primitive to understand the Universe (we are sitting in Plato's cave). You definitely cannot establish a new physical law of nature governing thermal changes by loss of information until you can formally and rigorously define information as such! Landauer's principle is not even "lore"; it is just a conjecture. 91.83.17.127 (talk) 21:55, 22 June 2008 (UTC)
You might not know what information is, but that's not the same as "nobody knows what information is". I second the move for deletion. — Preceding unsigned comment added by 2603:7000:8800:1A86:E5B3:3F63:D315:674D (talk) 09:53, 18 November 2021 (UTC)
This comment is from 2008, and the quoted text no longer appears in the article. (Information is a somewhat real thing; see for example Entropy (information theory) and A Mathematical Theory of Communication.) · · · Omnissiahs hierophant (talk) 10:31, 18 November 2021 (UTC)
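For readers following up on that pointer: Shannon's definition is perfectly formal, whatever one thinks of its interpretation. A minimal sketch of the standard formula (nothing here is specific to this article):

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0  -- a fair coin flip carries exactly one bit
    print(shannon_entropy([1.0]))       # -0.0 (i.e. zero) -- a certain outcome carries none
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 -- a biased coin carries less than one bit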

http://www.v-weiss.de/chaos.html reference should be removed

"In 2003 Weiss and Weiss, on a background of psychometric data and theoretical considerations, came to the conclusion that information processing by the brain has to be based on Landauer's principle." with a reference to http://www.v-weiss.de/chaos.html

Could somebody please read this page? I just did, and one of its claims is that the fundamental resonance of our brain waves ought to be exactly the golden ratio (measured in our historical but arbitrary unit of hertz).

The whole page is one big pile of numerology... however, I won't be the one to delete the reference... — Preceding unsigned comment added by 213.49.110.23 (talk) 21:45, 31 January 2012 (UTC)

Looks like someone stepped in and removed it, as suggested. Kudos! Jdickinson (talk) 06:04, 25 March 2022 (UTC)

Experimental evidence for the principle

I have included a reference to a freshly released Nature paper whose authors claim to have experimentally validated the Landauer principle. --SiriusB (talk) 10:36, 8 March 2012 (UTC)

nothing clever here - just a complaint about a poor choice of English which brings the reader up short

The article says "Also, there is theoretical work showing that there can be no energy cost to information erasure [8] (instead the cost can be taken in another conserved quantity like angular momentum)." In normal English usage, "there can be no energy cost to information erasure" means "there cannot be an energy cost to information erasure", or equivalently (perhaps labouring the point) "it is impossible for there to be an energy cost to information erasure". That's not what's meant here (it would go against the central premise). The part following "instead" indicates that the intended meaning is "it is possible that there might be no cost, in energy, associated with the erasure of information, and that instead the cost can be taken in another conserved quantity". Alternatively, you could write "Also, there is theoretical work showing that there might be no energy cost to information erasure [8] (if instead the cost is taken in another conserved quantity like angular momentum)." — Preceding unsigned comment added by 109.152.168.158 (talk) 03:43, 24 October 2012 (UTC)

What's the deal with the references being notes?

Why are the references notes? It seems very hard to follow the references, which are numbered in the text, to the bullets in the reference section. Tedsanders (talk) 21:34, 16 September 2014 (UTC)

Brillouin, not Landauer, was first!?

The article Entropy in thermodynamics and information theory states this: "In 1953, Brillouin derived a general equation[4] stating that the changing of an information bit value requires at least kT ln(2) energy." (Leon Brillouin (1953), "The negentropy principle of information", J. Applied Physics 24, 1152-1163.) This suggests that the claim that Landauer said it first is incorrect. Perhaps Landauer independently obtained this result, and stated it in a way that engineers could understand, in an engineering journal? Certainly, very few comp-sci researchers would ever read Brillouin... or is it something else? 67.198.37.16 (talk) 21:09, 23 September 2015 (UTC)

Definition of erasure is needed, please

OK, most people have an idea of what erasing stuff is. But how many ways are there to erase a bit? Why is there one way that is objectively the most efficient, thus enabling us to reason about how much energy erasure takes? What is that way, exactly? How does it compare to combining computational paths (I believe it is equivalent??)?

The definition is exactly what you would expect: you can no longer retrieve that bit. It doesn't matter how you erase the bit (I prefer thermonuclear devices for a really secure erase). Landauer's principle simply sets a minimum amount of heat that must be produced by erasing a bit, no matter how you do it. At 2.85 trillionths of a watt for erasing one billion bits per second, this theoretical lower limit is far, far smaller than the heat any real-world bit-erasure produces. Especially my favorite one. --Guy Macon (talk) 17:34, 28 January 2019 (UTC)
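The arithmetic behind that figure, as a quick check (assuming room temperature of about 298 K):

    import math

    k = 1.380649e-23                  # Boltzmann constant, J/K (exact SI value)
    T = 298.0                         # assumed room temperature, K

    per_bit = k * T * math.log(2)     # minimum heat per erased bit
    power = per_bit * 1e9             # one billion erasures per second

    print(f"{per_bit:.3e} J per bit") # ~2.852e-21 J
    print(f"{power:.3e} W")           # ~2.852e-12 W, i.e. ~2.85 trillionths of a watt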

Beating the Landauer limit

I am far from being an expert in the area, but I think it would be worthwhile to add a section pointing out some relevant literature showing how to circumvent Landauer's limit. For instance, a recent paper by Jan Klaers seems to beat Landauer's limit by timing operations to match naturally occurring temperature swings within the device:

  • Klaers, Jan (28 January 2019). "Landauer's Erasure Principle in a Squeezed Thermal Memory". Physical Review Letters. 122 (4). doi:10.1103/PhysRevLett.122.040602.
  • Commentary about the paper: Schirber, Michael. "Focus: A Cooler Computer".

Another paper from 2016 seems to beat Landauer's limit by dropping the assumption that the physical states representing the two possible logical states are close to thermal equilibrium:

  • Gavrilov, Momčilo; Bechhoefer, John (10 November 2016). "Erasure without Work in an Asymmetric Double-Well Potential". Physical Review Letters. 117 (20). doi:10.1103/PhysRevLett.117.200601.

Could someone more knowledgeable in the area make these edits? — Saung Tadashi (talk) 03:22, 12 February 2019 (UTC)

“Non-information-bearing degrees of freedom” could use elaboration

As does the second paragraph:

Another way of phrasing Landauer's principle is that if an observer loses information about a physical system, the observer loses the ability to extract work from that system.

I’m not an expert, but these concepts, phrased this way, clash with intuition on the subject, which seems tied to conservation of energy. For instance, when a rubber band is contracted, its long molecules arrange themselves in complex random configurations, which are observable and encode a great deal of random information. On stretching it, they all line up, resulting in erasure of observable information about their original state, along with a release of heat, in accordance with Landauer’s principle. In this low-entropy state you can now extract work from the rubber band as it seeks a higher-entropy unstretched state. I’m aware of the slaying of Maxwell’s demon with the ideas in the second paragraph, but it shouldn’t be put out there as something so general without deeper explanation, as the capacity of a system to do work is usually inversely proportional to the information needed to specify its microstate. It is precisely the rubber band losing encoded information that corresponds with its increasing capacity to do work, just as with compressed gas or anything else. RegularZephyr (talk) 20:37, 4 December 2019 (UTC)

Circular reasoning

It is worth noting that the issue with circular reasoning is that the explanation of why it should be this way is also an explanation of why it can't be. The reasoning is commonly tied to the second law of thermodynamics. The catch with that reasoning is that losing a bit should increase entropy. But what about making up a random bit? That can't decrease entropy, according to the law. What happens when we then lose the random bit again? Can losing it increase entropy even though making it up did not decrease it? How would the device responsible for the bit loss know whether it was a bit of information or just a random bit?

It gets more confusing when you try to apply the logic to some meta-information, e.g. the number pi. The law says that calculating part of its representation can't decrease entropy. What happens when we lose that information later? Does it increase entropy? Is the information really lost? We can calculate that part again using any of the known algorithms. Irrational numbers cause yet another issue by being infinite, which means infinite information. Does that make them an infinite source of entropy?

This "one way road" postulated by the 2nd thermodynamic law makes it a tool usable for both approval and denial of this principle. 94.112.152.197 (talk) 21:24, 27 June 2023 (UTC)[reply]

Modern Computers: Million vs Billion

An edit on 4 May 2023 changed the text to "Modern computers use about a billion times as much energy per operation", whereas previously it said a million times instead of a billion.

Whether it is true or not, it is not supported by the references.

The references in question are 10 years old, and no doubt modern computers have changed in efficiency, but it is highly implausible that the energy efficiency of modern computers got 1000 times worse over ten years.

I would suggest the text should at least be changed back to "million", or the references should be removed, or both. At best, some new citations should be identified and the current estimate corrected. 66.64.9.218 (talk) 15:38, 6 December 2023 (UTC)
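For anyone hunting new citations, the back-of-envelope comparison looks like this - note that the per-operation energies below are illustrative assumptions, not sourced figures, and that whether you land on "a million" or "a billion" depends entirely on which per-operation energy you assume:

    import math

    k = 1.380649e-23                         # Boltzmann constant, J/K
    T = 300.0                                # assumed operating temperature, K
    landauer = k * T * math.log(2)           # ~2.87e-21 J per erased bit

    # Illustrative switching energies only; quoted figures for real logic
    # span several orders of magnitude depending on what is counted:
    for per_op in (1e-15, 1e-13, 1e-12):
        print(f"{per_op:.0e} J/op is {per_op / landauer:.1e} times the Landauer limit")

With 1e-15 J per operation the ratio is a few hundred thousand (the old "million"); with 1e-12 J it is a few hundred million (the new "billion"). So the discrepancy is more likely a change in what was being counted than a real 1000x change in hardware.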