Time and information

Yep. For instance, I had to have a word with my guitar string demon this morning.
After which, the bloody strings still need tuning.
Of course, if you train it properly it knows how to vibrate correctly.

This morning, I had to torque to my car.

The first order response was surprisingly linear.

In keeping with the fact that physics is a big joke: you laugh on the inside.

This morning, I had to torque to my car.
When you press the "start" button, does your car start from memory?

Jokes aside, I see myself as someone who is good at taking things apart and putting them back together.
In order to avoid taking my car apart, I took a problem apart and now I'm fairly sure it's a damping problem; I need some new engine mounts.

My car has aged; the rubber in some of the mounts has degraded beyond usefulness. I see that as damping being a resource, you see. There's a solution to an equation, and all I really need to know about it is that there is a solution.
Of course I'll take a financial hit, but I can do it all myself with a jack for the engine in place; another useful resource.

Taking information apart, I find that it's a thing we can scale as we see fit, by choosing how to constrain or restrict a flow.

The computer you're using is restricting the flow of electrons, as little units of charge, by controlling the direction and sequence for the small charges, using "gates", so a digital chip is a big "sum over histories" for these charges. The paths are controlled, whereas a "free" charge would probably just dissipate.

So the chips in your computer are charge-control networks. Classical digital chips have to erase information by allowing the energy in the stored charge to dissipate; the information (not the charge, which is always conserved) is replaced with heat. There is no trace left behind of the information "a charge was in this capacitor at time t" (time, with information and a side order!). Information is a thing that gets left behind by erasing other information (a different resource).

Conversely, heat is "erased" when information is stored. To solve the conundrum, electronics cools the chips down to erase the heat. It's a simple concept about what happens to "charge information" and "heat information", because of the physics.
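To put a number on trading information for heat: Landauer's limit sets the minimum heat dumped per erased bit at k_B T ln 2. A minimal sketch at an assumed room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Minimum heat that must be dissipated to erase one bit (Landauer's limit)
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per erased bit")  # roughly 2.87e-21 J
```

Tiny compared with the energies real chips actually dissipate per switching event, which is the efficiency gap discussed further down.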

I suppose you could toy with a "charge demon" and a "heat demon"; but what are they going to be doing, and what can they see--what information is available, and what will they need to erase to not violate the 2nd law?

What about a pair of slits with dimensions on the order of the electron's de Broglie wavelength? If you block one, is that an erasure of information? If you measure how many electrons go through one of the two, is that an erasure of information? You need to ask a question about where the demon is, and what they can do.
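For a sense of the scale involved, a non-relativistic estimate of the electron's de Broglie wavelength (the 100 V beam energy is an assumption for illustration):

```python
import math

# Physical constants (SI)
h   = 6.62607015e-34    # Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
q_e = 1.602176634e-19   # elementary charge, C

def de_broglie_wavelength(volts):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through `volts` (assumed beam energy)."""
    p = math.sqrt(2.0 * m_e * q_e * volts)  # momentum from eV = p^2 / 2m
    return h / p

# At an assumed 100 V the wavelength is about 0.12 nm, so slits with
# dimensions near the atomic scale give strong diffraction.
print(de_broglie_wavelength(100.0))
```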

The computer you're using is restricting the flow of electrons, as little units of charge, by controlling the direction and sequence for the small charges, using "gates", so a digital chip is a big "sum over histories" for these charges.
I don't think "sum over histories" means what you think it means. Perhaps google it?
Conversely, heat is "erased" when information is stored.
No. While a computer is operating, electrical currents are flowing constantly, and waste heat is constantly being produced. It doesn't matter whether a memory bit is being stored or erased; heat is produced in both processes.
To solve the conundrum, electronics cools the chips down to erase the heat.
The heat isn't erased. The cooling mechanisms of a chip just put the chip in better contact with a cold reservoir, into which waste heat can be dumped. That prevents the chip overheating.
What about a pair of slits with dimensions on the order of the electron's de Broglie wavelength? If you block one, is that an erasure of information?
It depends on how you are defining the "information content" of an electron-slit system. How are you doing that, exactly?
If you measure how many electrons go through one of the two, is that an erasure of information?
What system are you considering? What is included in the system, and what is outside it?

I don't think "sum over histories" means what you think it means. Perhaps google it?
Given a charge q in a digital computer: which transistor is it flowing through, or which capacitor is it in? How many charges in a digital circuit end up flowing to a ground connection? Each charge has a lifetime, and a path through the circuit or part of the circuit. It's called electronics.
No. While a computer is operating, electrical currents are flowing constantly, and waste heat is constantly being produced. It doesn't matter whether a memory bit is being stored or erased; heat is produced in both processes.
That contradicts Bennett and Landauer. Storing a bit of information is less costly than erasing it. However, in most computers the energies involved far exceed Landauer's limit, which is an efficiency problem: most computers waste energy.
The heat isn't erased.
Yes it is. Heat is part of a computational system; if it gets actively pumped to the environment, i.e. dissipated, that constitutes erasure (of a resource). Landauer's limit is the minimum energy needed to erase one unit of information. This is only apparent in a computer which is computationally efficient; it can be shown that not erasing bits, but just keeping them somewhere, means there is no waste heat. This isn't usually a practical design consideration: computers don't have the room, so information is erased, and erasing it costs more than keeping it stored (i.e. it costs less to keep a transistor on once it is on than it does to turn it off and dissipate the heat).

It depends on how you are defining the "information content" of an electron-slit system. How are you doing that, exactly?
How would you do it?

I notice James has taken the usual approach to the "heat mode" and put it into the category of "waste product".

But he knows that isn't really true: remove all the heat and the circuit won't function: frozen transistors are inactive circuit elements.
Heat is modelled a little better, by acknowledging it's required to "warm up" the system to an operational temperature range.

Then transporting heat out of the system is "processing a resource", to keep the system in a range of temperatures; too much heat resource and a thermodynamic limit is reached--things get fried. Smoke appears.

However, consigning the heat to an "unwanted" stack means your heat engine is ill-defined: you have forgotten there's an interdependence in the system Hamiltonian.
You didn't read the instructions properly; they say a little bit of heat is a system requirement, but most of it has to be erased locally, by transporting it away.

There are some who disagree that Landauer's limit really means erasure is more costly than storage; for instance, they point out that it costs more to turn a transistor on and "erase a 0". In digital electronics, transistors operate in "full-throttle" or "closed" states, and the control is the minimum change in a space-charge region: the throttle valve opening and closing. Just as with a carburettor in a real heat engine, a small input results in a much larger power output.
What electronics engineers say is that the transistors are in saturation.

So the information "the throttle is closed" gets erased, or "the throttle is open" gets erased. You hear an engine noise and experience acceleration if you're driving the car; if it was in neutral, just the engine would accelerate. What happens to the heat "information"?

No takers? Your chance to prove Bennett and Landauer have a toy model.

Maybe the discussion, with books, online articles, and myself so far, is really just orbiting the question: what is friction?

Frictional elements in mechanics, of the sort cars and combustion engines dwell in, are usually just constant "resistive" elements; heat generated by friction is usually suppressed with lubricating oil, grease, shock absorbers, and other damping devices. The engine mounts in your car are large vulcanised rubber bushings; they heat up when subjected to external forces.

So there are a lot of "sources" of heat in your usual combustion engine, even when it's idling. But notice that some of the heat, the "internal" stuff, is converted into something useful, called work. To do this a heat engine needs a closed cycle. A Stirling engine is a little easier to understand, but combustion engines use the impulse from a detonation to force a piston away and rotate a linked driveshaft. Power is "delivered" to the rotating shaft by transferring--something--from the exploded fuel mix.

But the cycle has to be closed, and in order to engineer this, part of the cycle is the active dissipation of hot gas to the environment. You increase the entropy of the environment because it heats up, but where does the heat go? Into an ocean of heat; saying anything more definitive, or claiming you can find some clue about where it went, is like mining moonbeams. It ain't gonna happen, bro.

The engine has to give some of the heat, a useful resource, to many more degrees of freedom. It has to free those particles forever.

So that's what Bennett and Landauer are on about. Heat information is erased from the system by "dissipating" the information--it gets erased and leaves absolutely zero evidence behind.
A heat product, or store of information, vanishes at a boundary. It's at infinity and it stays there.

The principle of information erasure explains why designing and building an efficient heat engine can only approach a limit.
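The heat-engine side of that limit is the Carnot bound: no engine running between two reservoirs can beat it, however cleverly it is built. A minimal sketch with assumed reservoir temperatures:

```python
def carnot_efficiency(T_hot, T_cold):
    """Upper bound on the efficiency of any heat engine running
    between two reservoirs (temperatures in kelvin)."""
    return 1.0 - T_cold / T_hot

# Illustrative (assumed) numbers: hot combustion gas ~1500 K, ambient ~300 K.
# Even a perfect engine must dump 20% of the input heat to the environment;
# real engines do considerably worse.
print(carnot_efficiency(1500.0, 300.0))  # 0.8
```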

Here we are, for those who might be thinking this isn't relevant.
Charles Bennett said:
Landauer's principle, often regarded as the foundation of the thermodynamics of information processing, holds that any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment.
What he means there is heat doesn't constitute an information resource, it's not information as far as the computer is concerned, but it is as far as physics is concerned.
Conversely, it is generally accepted that any logically reversible transformation of information can in principle be accomplished by an appropriate physical mechanism operating in a thermodynamically reversible fashion. These notions have sometimes been criticized either as being false, or as being trivial and obvious, and therefore unhelpful for purposes such as explaining why Maxwell's Demon cannot violate the Second Law of thermodynamics. Here I attempt to refute some of the arguments against Landauer's principle, while arguing that although in a sense it is indeed a trivial and obvious restatement of the Second Law, it still has considerable pedagogic and explanatory power, especially in the context of other influential ideas in 19'th and 20'th century physics.
https://arxiv.org/abs/physics/0210005

So once you see what it is, it seems obvious, trivial and tautological.
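Bennett's phrase "merging of two computation paths" can be made concrete with a toy enumeration; a sketch (the choice of gates is mine, purely for illustration):

```python
from itertools import product

def preimages(gate, n_inputs):
    """Group all input tuples by the output they produce."""
    table = {}
    for bits in product((0, 1), repeat=n_inputs):
        table.setdefault(gate(*bits), []).append(bits)
    return table

# AND merges computation paths: (0,0), (0,1) and (1,0) all collapse to
# output 0, so the input cannot be recovered -- logically irreversible.
print(preimages(lambda a, b: a & b, 2))

# NOT is a bijection on {0, 1}: one preimage per output, so the
# computation can be run backwards with no information loss.
print(preimages(lambda a: 1 - a, 1))
```

Each merged path is a bit of input information that must leave via the non-information-bearing degrees of freedom, i.e. as heat.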

One more example of what it might mean:
We endeavour to illustrate the physical relevance of the Landauer principle applying it to different important issues concerning the theory of gravitation. We shall first analyze, in the context of general relativity, the consequences derived from the fact, implied by Landauer principle, that information has mass. Next, we shall analyze the role played by the Landauer principle in order to understand why different congruences of observers provide very different physical descriptions of the same spacetime. Finally, we shall apply the Landauer principle to the problem of gravitational radiation. We shall see that the fact that gravitational radiation is an irreversible process entailing dissipation, is a straightforward consequence of the Landauer principle and of the fact that gravitational radiation conveys information. [...]
--https://arxiv.org/pdf/2003.07436.pdf

How would you do it?
That would depend on the particular type of "information" I was interested in, I suppose.

Just waving your hands vaguely and saying "information" doesn't usually communicate anything very precise.

That would depend on the particular type of "information" I was interested in, I suppose.
Well, let's look at what's available. The electron beam has an energy eV; the two slits have a width and a depth; the screen is positioned an appropriate distance from the slits.

Dots on the screen indicate where an electron collided with it. You have that position information but also some information about how much energy created the dots.

Oops. For the electron diffraction experiment (or any experiment, I'd say), you should also decide which parts are reversible, and which parts are irreversible.

When you have some results, the idea is to formulate something about logical and physical irreversibility. Again, I'd suggest that the dots on the screen be placed in the irreversible category. Even though it's hard not to notice an interference pattern (a logical output), the logical/physical reversible/irreversible properties of the system are the fundamental reason the pattern is there.
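For scale, a back-of-envelope fringe spacing under the small-angle approximation; all numbers are assumptions for illustration, not taken from any specific experiment:

```python
# Assumed setup (illustrative only)
wavelength  = 1.2e-10  # electron de Broglie wavelength, m (~100 eV beam)
slit_sep    = 1.0e-6   # slit separation, m
screen_dist = 0.5      # slit-to-screen distance, m

# Small-angle two-slit fringe spacing on the screen: dy = lambda * L / d
fringe_spacing = wavelength * screen_dist / slit_sep
print(fringe_spacing)  # ~6e-5 m: fringes tens of micrometres apart
```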

One more example.

The discovery of the cosmic background radiation meant that eventually, its meaning was determined.

The CMB doesn't contain this meaning, however. If it did it would have been obvious when Penzias and Wilson first detected it.

Microwaves propagating in all directions don't contain information about their source, or when they first appeared. Right?

Yes. That's right. And this example underlines what Shannon entropy is and how it says exactly nothing about what information means.

We know one thing. The original source was small and that is not just a deduction from an expanding universe.

The CMB wavelengths are missing very low frequencies (long wavelengths), which is an indication of small size.

We know one thing. The original source was small and that is not just a deduction from an expanding universe.

The CMB wavelengths are missing very low frequencies (long wavelengths), which is an indication of small size.
Yes. We now know, after analysing the radiation, that the best theory is that it originated in a universal event.
Way back before stars or galaxies existed.
When matter was mostly hydrogen and helium with a dash of lithium.

"Small" is a relative term. The visible universe at the point of so-called recombination was large enough that it could happen.

But as I say that isn't information which you can find in the microwave background. It's an explanation of where it came from and when it was emitted.
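One encoding-independent fact you can read straight off the measured spectrum is its blackbody peak, via Wien's displacement law (the constant and the CMB temperature are standard published values):

```python
# Wien's displacement law: peak wavelength of a blackbody spectrum
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K
T_CMB = 2.725            # measured CMB temperature, K

peak = WIEN_B / T_CMB
print(peak)  # ~1.06e-3 m: the spectrum peaks near one millimetre
```

The temperature is in the spectrum; the story of recombination is in the theory you bring to it.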

The point I'm trying to make is that information itself doesn't contain meaning unless the receiver already knows extra stuff--has knowledge of how it was encoded.

That is obvious about the words I'm sending in this post. You understand them because of the protocol and an agreed encoding, namely the English language. Right?
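The dependence on an agreed encoding can be made concrete: the empirical Shannon entropy of a string sees only symbol statistics, never meaning. A minimal sketch (the example strings are my own):

```python
import math
from collections import Counter

def shannon_entropy(msg):
    """Empirical Shannon entropy of a string, in bits per symbol."""
    n = len(msg)
    return -sum(c / n * math.log2(c / n) for c in Counter(msg).values())

sentence = "information needs an agreed encoding"
scrambled = "".join(sorted(sentence))  # same symbols, meaning destroyed

# Identical entropies: scrambling destroys the meaning but not the
# symbol statistics, and Shannon entropy only measures the latter.
print(shannon_entropy(sentence), shannon_entropy(scrambled))
```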

That is obvious about the words I'm sending in this post. You understand them because of the protocol and an agreed encoding, namely the English language. Right?
I agree, and even then my understanding is but a best guess by my brain. However, the simpler the code, the more precise the guess. This is why the fundamental mathematics of the universe allows us to study, understand, and project the future based on the initial coded message, i.e. 1 + 1 = ...........? (2) always.

Information, and the erasure of it, mean having to formulate thermodynamic constraints, so there is some connection between information entropy (of which Shannon entropy is but one type) and thermodynamic entropy.
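A sketch of that connection, using the standard identification of k_B ln 2 of thermodynamic entropy per bit (the gigabyte figure is just an illustrative assumption):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def thermodynamic_entropy(n_bits):
    """Thermodynamic entropy carried by n bits: S = n * k_B * ln 2."""
    return n_bits * k_B * math.log(2)

# Erasing an assumed one gigabyte (8e9 bits) at 300 K:
# minimum heat released is Q = T * S
n_bits = 8e9
S = thermodynamic_entropy(n_bits)
print(S, 300.0 * S)  # ~7.7e-14 J/K of entropy, ~2.3e-11 J of heat
```

The numbers are minute, which is why Landauer's bound is invisible in everyday hardware and only shows up in carefully designed experiments.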

Not all the questions have been resolved, it seems. Recently, some researchers published results that appear to violate Landauer's limit. A different group has claimed information can be erased without generating heat, instead "absorbing" spin in a "spin bath".