Is entropy really just a quantification of what isn't known about a closed system, and is information then defined only by a difference in entropy, i.e. by relative entropies?

Why is there a problem with thermodynamic entropy being defined by heat and differences in heat, when one can equally say that entropy measures what isn't known about particle momenta (for instance, in a gas, what isn't known about the particle velocities)?

To even be able to say a system has an entropy, doesn't there have to be a closed system, and don't you have to define probabilities? Is entropy even definable without probability?
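
For concreteness, here is a minimal statement of the two standard definitions I'm trying to reconcile, in the usual notation ($\delta Q_{\text{rev}}$ for reversible heat, $T$ for temperature, $p_i$ for the probability of microstate $i$, $k_B$ for Boltzmann's constant):

$$
dS = \frac{\delta Q_{\text{rev}}}{T} \qquad \text{(Clausius: entropy change from reversible heat flow)}
$$

$$
S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs/Shannon: entropy of a probability distribution)}
$$

The second definition manifestly requires a probability distribution over microstates; the first, on its face, mentions only heat and temperature. That apparent mismatch is the core of my question.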