Energy, Entropy and Information

Discussion in 'Pseudoscience Archive' started by wellwisher, Dec 15, 2012.

  1. wellwisher Banned

    Messages:
    5,160
    Entropy can be defined as the added information needed to define a change in the status quo of a system. For example, going from one cell into two increases entropy and information. For entropy to increase, it needs energy. This is why entropy is associated with the inefficiency and/or waste heat coming from a system. The waste heat provides energy, while the inefficiency requires additional information to define the system compared to the ideal system. In chemical engineering, entropy is a measurable quantity and is needed to close energy balances.

    Since the entropy of the universe has to increase, and since entropy needs energy, the available energy of the universe decreases over time. The energy is conserved within entropy, but it is net irretrievable, since entropy needs to increase. You can't push two cells back into one. That energy is conserved but irretrievable. Eventually all the energy of the universe will become irretrievable but conserved via entropy, since entropy always needs to increase.

    An interesting implication: since energy is conserved within the entropy, and since entropy is connected to all the information needed to define changes in the status quo, this information has to be conserved. If the information is not conserved, then the conservation of energy is not valid.

    Add it up. We start with energy. Some of the energy goes into entropy and becomes irretrievable. You can get that energy back, but it costs more energy than you will recover. It is net gone, since entropy has to increase. If entropy is connected to information, and energy has to be conserved, and all the energy ends up in entropy, then the entropy information is conserved. It has irretrievable energy to perpetuate it.
     
  2. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    You still don't understand what entropy is.

    Only you would define entropy that way. As for information and entropy, you could say that as the entropy of a system increases, the information in the system becomes more random, but that is not really a useful definition in most concrete examples.

    No, why can't you get this? Waste heat does not provide energy; it is energy. The waste energy is what is quantified by the entropy.

    The rest of the post is just more ramblings as far as I can tell.
     
  3. DaS Energy Registered Senior Member

    Messages:
    223
    Entropy: a quantitative measure of the amount of thermal energy not available to do work.

    Waste heat is not energy; it's heat. Energy is a force that can be derived from heat activation or other means; heat alone moves nothing.
     
  4. DaS Energy Registered Senior Member

    Messages:
    223
    wellwisher,

    What is your basis for this: "since entropy needs energy"?
     
  5. wellwisher Banned

    Messages:
    5,160
    In chemical engineering, entropy is connected to open systems, like our universe, where there is heat and mass transfer across boundaries. The entropy is also connected to the free energy G, where G = H - TS, with H the enthalpy, T the temperature and S the entropy. This equation works in the real world of open systems and is part of the energy balance in chemical systems, where the preponderance of science lies.

    The problem might be that chemical engineering applications have to deal with real systems, since you have to make real things and put theory to the test. This is different from the entropy taught in first-year thermodynamics, where you deal with ideal closed systems. That is not the universe we live in. We live in a non-ideal universe of open systems.

    If you look at the free energy equation, as entropy increases and the TS term gets larger, the free energy G gets smaller. The free energy is, in very general terms, all the available energy. Based on conservation of energy, free energy goes into entropy as entropy increases. The energy is conserved, but irretrievable in the net sense, since the net entropy of the universe has to increase.
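
    As a rough numerical illustration of that sign behaviour (a minimal sketch in Python; the H and S values are invented, not taken from any real substance):

    Code:
    # G = H - T*S: as the T*S term grows, the free energy G shrinks.
    H = 40_000.0  # enthalpy, J (invented value)
    S = 100.0     # entropy, J/K (invented value)

    for T in (100.0, 300.0, 500.0):
        G = H - T * S
        print(f"T = {T:5.1f} K  ->  G = {G:9.1f} J")
    # G drops from +30000 J at 100 K to -10000 J at 500 K:
    # higher T*S means less free energy available for work.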

    In the end, all the free energy G goes into entropy. Since energy has to be conserved, it is conserved as entropy. I defined this as information, since how else would you describe this irretrievable conserved energy? The net result will be a universe depleted of usable energy but with the energy conserved as entropy. Maybe you can tell us how better to describe this state.

    This is based on two laws of physics, energy conservation and the requirement that entropy increase, plus test-proven theory used to make millions of tons of real things every day: heat and mass transfer in open systems, and free energy.
     
  6. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Thermodynamic entropy is not energy; this is obvious since it needs to be multiplied by T in G = H - TS to get a balanced equation (i.e. units on the left equal units on the right).
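
    A quick way to see that unit bookkeeping explicitly (a sketch using the pint units library, assuming it's installed; the numbers are arbitrary):

    Code:
    import pint

    ureg = pint.UnitRegistry()
    H = 5000.0 * ureg.joule              # enthalpy
    T = 300.0 * ureg.kelvin              # temperature
    S = 10.0 * ureg.joule / ureg.kelvin  # entropy

    G = H - T * S  # K * (J/K) = J, so the subtraction is dimensionally legal
    print(G)       # 2000.0 joule
    # By contrast, H - S would raise pint.DimensionalityError: you cannot
    # subtract an entropy from an energy, so entropy is not itself energy.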

    A gas has positive pressure; its entropy is maximised when all the particles have the same average energy and the greatest separation. A gravitationally bound object (a solid) has negative pressure; its entropy is maximised when all the particles have the least separation (in other words, when it's maximally compressed). In both cases all the particles ideally "look" the same, which means they each "carry" the same information (i.e. energy), on average.

    What happens when gravity collapses matter to its state of maximum entropy? Is there still an average amount of information per particle?

    I have to say that understanding the connection between thermodynamic and information entropy isn't an easy thing to do, although information entropy does seem to be easier to get one's head around.
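
    For the information side, the usual starting point is Shannon entropy. A minimal sketch (the distributions are made up for illustration):

    Code:
    import math

    def shannon_entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.25] * 4))                # 2.0 bits: maximal uncertainty
    print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: nearly certain
    # A spread-out (uniform) distribution carries more entropy than a
    # peaked one, which is the flavour of the thermodynamic analogy.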
     
  7. Prof.Layman totally internally reflected Registered Senior Member

    Messages:
    982
    I don't think cell division is an increase in entropy; it is actually a decrease in entropy. Two cells can do more work, and the particles that make up those cells are more ordered. So a living thing would be a highly ordered system that has the potential to do a lot of work. The Wikipedia article defines it well, saying that an increase in entropy causes energy to become unavailable for doing useful work.

    http://en.wikipedia.org/wiki/Entropy

    Entropy does not require energy to increase. A system should always have energy within it, and the energy that is already in the system "always" tends to increase the entropy of that system. It means that the energy will just move things toward an even mixture. Cells are not an even mixture, so the entropy has not increased; an even mixture cannot do as much work.

    Entropy doesn't use up energy. It is more like a description of what the energy in the system will do to that system over time. If the energy in the system is not destroyed, then the system will still have the same amount of energy. I think there are only a few special cases where energy is created or destroyed, like particle pair creation and internal reflection under non-ordinary conditions, and most of the time those cases don't influence the total energy of a system.

    Again, entropy doesn't "use up" energy. There can be places in a system that become more ordered and gain more ability to do work, but if you look at a large enough system, the overall entropy will have increased, meaning it can do less work. Even in the places where entropy has decreased, it will eventually increase as well. A cell will die, and when it does it will not be able to do as much work; it will just be a random mixture of particles. So for entropy to always increase, the overall system has to be rather large, because random events in that system can produce small regions where entropy decreased.
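
    A toy simulation of that "even mixture" tendency (a sketch of the Ehrenfest urn model; nothing here is specific to cells): start with all particles in one half of a box, repeatedly move a random particle, and watch the split even out.

    Code:
    import random

    N = 100     # particles in a box split into left/right halves
    left = N    # start with everything on the left: a low-entropy state
    random.seed(0)

    for step in range(1, 2001):
        # Pick one particle at random and move it to the other half.
        if random.randrange(N) < left:
            left -= 1
        else:
            left += 1
        if step % 500 == 0:
            print(f"step {step:4d}: {left:3d} left / {N - left:3d} right")
    # The split drifts toward ~50/50 and then fluctuates around it:
    # the even mixture is overwhelmingly the most likely macrostate.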
     
  8. eram Sciengineer Valued Senior Member

    Messages:
    1,877
    Here's the easiest way to start on entropy, via the engine analysis of Sadi Carnot.

    In an engine, heat flows from a hotter region at temperature \(T_1\) to a cooler one at \(T_2\).
    Of the heat \(Q_1\) drawn from the hot side, part is converted into useful work; the remainder, \(Q_2\), is rejected as waste.


    For an ideal (reversible) engine, Carnot's analysis gives \(Q_1 T_2 = Q_2 T_1\); the efficiency is then \(1 - \frac{T_2}{T_1}\), the best possible. Equivalently,
    \(\frac{Q_1}{T_1}=\frac{Q_2}{T_2}\)


    Each transfer of heat \(Q\) at temperature \(T\) carries entropy

    \(\Delta S=\frac{Q}{T}\)

    So in the ideal case, entropy (S) is conserved: what leaves the hot reservoir enters the cold one.



    However, real-world engines can never be that efficient. More heat becomes useless, and
    \(\frac{Q_1}{T_1}<\frac{Q_2}{T_2}\)


    So the entropy must have increased.

    Starting from this basic ratio of heat to absolute temperature, we can work up to ideas of chaos, disorder and information. The OP has not taken the temperature of the system into account.
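
    To put numbers on that bookkeeping, here is a minimal sketch (the temperatures and heats are invented; T1 is the hot side, T2 the cold):

    Code:
    T1, T2 = 600.0, 300.0  # hot and cold reservoir temperatures, K
    Q1 = 1000.0            # heat drawn from the hot reservoir, J

    # Ideal (reversible) Carnot engine: Q1/T1 == Q2/T2.
    Q2 = Q1 * T2 / T1      # waste heat, 500 J here
    print(f"efficiency = {1 - T2/T1:.2f}, work = {Q1 - Q2:.0f} J")
    print(f"entropy out of hot: {Q1/T1:.3f} J/K, into cold: {Q2/T2:.3f} J/K")  # equal

    # A real engine wastes more, say Q2 = 700 J:
    Q2_real = 700.0
    print(f"real engine: {Q1/T1:.3f} J/K out, {Q2_real/T2:.3f} J/K in")
    # 1.667 < 2.333: more entropy is dumped than was removed, so S increases.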
     
