Well, there's a problem. The amount of information we have about a system isn't a measure of its entropy, because entropy isn't information. If you had complete information about every particle in a system, its entropy would be zero. So saying "entropy represents the amount of information . . ." isn't an objective statement. My claim is that information is necessarily a physical thing, whereas entropy isn't, because entropy is defined in terms of probabilities, and a probability isn't an intrinsic property of a physical thing. It just isn't. I think probabilities are determinable, but is probability objective?
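The point that entropy depends on a probability assignment rather than on the system itself can be illustrated with the Gibbs/Shannon formula H = -Σ p ln p. A minimal sketch (the four-microstate distributions below are made up purely for illustration): with complete knowledge the distribution collapses to a single microstate and the entropy is zero, while the same physical system described with complete ignorance gets maximal entropy.

```python
import math

def shannon_entropy(probs):
    """Gibbs/Shannon entropy H = -sum(p * ln p), in nats.

    Terms with p == 0 contribute nothing (lim p->0 of p*ln p is 0).
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# Complete knowledge: the system is certainly in one microstate.
certain = [1.0, 0.0, 0.0, 0.0]

# Complete ignorance: all four microstates treated as equally likely.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(certain))  # 0.0
print(shannon_entropy(uniform))  # ln(4) ≈ 1.386
```

Same system, two different entropies; only the probability assignment changed, which is exactly why one can argue the entropy isn't an intrinsic property of the thing itself.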