Reversing the Thermodynamic Arrow of Time:

I could argue that thermodynamic (Boltzmann-Gibbs) entropy is the same as information (Shannon) entropy, modulo the Boltzmann constant and a factor of ln 2 for the change of logarithm base.

Entropy is really just uncertainty; in thermodynamics it amounts to the information you don't have about the system's internal microstates. Since a system of particles tends towards maximum randomness (think of milk stirred into coffee, or covfefe if that's your tipple), the overall state at any instant is one of a very large number of random strings, and you can't know which one, so you effectively have to 'write' all of them down.
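To put a number on that claim, here's a little sketch (my own toy example, not from any source I'm citing): for a system with W equally likely microstates, Shannon entropy in bits and Boltzmann-Gibbs entropy in J/K differ only by the Boltzmann constant and a ln 2 conversion factor.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by the 2019 SI definition)

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum p * log2(p), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Boltzmann-Gibbs entropy S = -k_B * sum p * ln(p), measured in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

W = 1024  # toy system: 1024 equally likely microstates
uniform = [1.0 / W] * W

H = shannon_entropy_bits(uniform)  # 10 bits, since 2**10 == 1024
S = gibbs_entropy(uniform)         # k_B * ln(1024), i.e. k_B * ln 2 per bit

print(H, S, K_B * math.log(2) * H)
```

The two numbers agree up to the factor k_B ln 2 per bit, which is the "modulo the Boltzmann constant" part of the claim above.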

Does that make any sense?
Tell you what, I suggest having a nice read through this comprehensive piece: https://plato.stanford.edu/entries/information-entropy/
You will find parts that support your point of view, and other parts that don't.
 