Entropy as a measure of unusable energy, thought experiment?

Discussion in 'Physics & Math' started by Secret, Sep 14, 2014.

  1. Secret Registered Senior Member

    Messages:
    299
    My question
    If I have two chemical reactions
    I start with 1 molecule each of pure A and C respectively; internal energy of A = internal energy of C; assume no heat or work transfer in or out of the system

    Reaction 1
    A ⇋ B
    Reaction 2
    C ⇋ D ⇋ E

    (Real life examples of these include isomerization reactions)
    All equilibrium constants for each step = 1

    After the reactions achieve equilibrium (Gibbs free energy change = 0), I expect (from calculations) that in reaction 1 the molecule will spend 0.5 of the time in the form of A and 0.5 of the time in the form of B, and similarly in reaction 2, 1/3 of the time in each of the forms C, D and E

    Clearly (?) using Boltzmann entropy (counting microstates), reaction 2 has higher entropy than reaction 1

    But then how do we interpret the energy distribution? Is the average internal energy for each species in reaction 2 less than that in reaction 1, because there are more forms of the molecule for the same amount of energy to be distributed over?

    If bolded is true, is this why when entropy increases, the amount of usable energy decreases?
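    The microstate comparison above can be sketched with a toy calculation (my numbers, not from the thread: assuming K = 1 so that every form of the molecule is equally probable at equilibrium, Boltzmann's S = k ln W applies with W the number of accessible forms):

    ```python
    import math

    # Toy microstate count for the two reactions (assuming K = 1, so every
    # form of the molecule is equally probable at equilibrium):
    # Reaction 1: A <-> B        -> W = 2 accessible forms
    # Reaction 2: C <-> D <-> E  -> W = 3 accessible forms
    k_B = 1.380649e-23  # Boltzmann constant, J/K

    for label, W in [("A <-> B", 2), ("C <-> D <-> E", 3)]:
        S = k_B * math.log(W)  # Boltzmann entropy S = k ln W
        print(f"{label}: W = {W}, S/k_B = {math.log(W):.4f}, S = {S:.3e} J/K")

    # ln 3 > ln 2, so reaction 2 indeed has the higher configurational entropy.
    ```

    Since ln 3 > ln 2, this agrees with the claim that reaction 2 has the higher entropy by microstate counting.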
     
  3. wellwisher Banned Banned

    Messages:
    5,160
    If you solve the Gibbs free energy equation for entropy, then for entropy to increase, (G-H)/T has to be positive. In other words, entropy needs positive energy to be able to increase. Therefore, entropy is an expression of energy capacitance, but in a form that is often not usable energy.

    Life is interesting in that life is able to retrieve or circumvent the lost energy within entropy, by reversing and preventing entropy gain. For example, both left- and right-handed protein helixes are possible, but life only uses left-handed helixes, thereby keeping energy destined for entropy in play.

    The value of cheating entropy with only left-handed helixes is that this creates an entropy potential. The analogy is a coin that has heads and tails, both equally likely to occur; 50/50 odds. If I started to throw the coin and got 10 heads in a row, which is possible but rare, the odds now increase greatly that more tails will need to fall in the future to re-establish the entropy of 50/50 odds. The loss of entropy causes a random process to gain a sense of direction based on what can lower the entropy potential; tails are now more certain than 50/50 due to 10 heads in a row.
     
  5. Motor Daddy Valued Senior Member

    Messages:
    5,425
    Stop right there...

    The odds of the coin being heads are 50/50, for one throw. You flip, it's tails, you lose; the 50% that was possible for tails... happened.

    Repeat as necessary.
     
  7. DaveC426913 Valued Senior Member

    Messages:
    18,935
    You need to read up on random walks.

    In a nutshell the lowest state is not the centre.
     
  8. exchemist Valued Senior Member

    Messages:
    12,451
    If you have only a single molecule of each, then you cannot apply thermodynamics, because you do not have an ensemble at thermal equilibrium. There is no population distribution among states. It does not really make much sense to think of a single molecule being half the time present as reactant and half as product. There is no reason why this should be the case. If you have a single isolated molecule, there are no collisions to allow energy to migrate among the various degrees of freedom that theoretically exist. So you would not expect chemical change to occur.

    At least, that's the way it seems to me - rusty on all this though I am.
     
  9. Dywyddyr Penguinaciously duckalicious. Valued Senior Member

    Messages:
    19,252
    Nope.
    The chance remains at 50/50.
     
  10. Aqueous Id flat Earth skeptic Valued Senior Member

    Messages:
    6,152
    wellwisher doesn't know what independent random events are. He is confusing "probability of the next toss being tails" vs "probability that the next N tosses are all tails" or some similar kind of confusion. Of course he has posted elsewhere that there is no such thing as a random event and/or Probability Theory is broken.

    Like exchemist, I was surprised you were treating one molecule out of an ensemble. I stopped at the mention of reaching equilibrium, and wondered what that means for one molecule. It's probably better to think in moles, since that fits with the definition of "standard enthalpy of reaction".


    I don't understand. I think it would be better for you to think in terms of specific reactions, and then maybe you can arrive at the higher rule you have in mind. You wrote these as reversible reactions. But that does not mean that the odds of synthesis are equal to the odds of decomposition. Let me suggest you consider a simple ionic reaction: Na⁺ + Cl⁻ ⇆ NaCl. On the left side we have 1 mole of sodium ions and 1 mole of chloride ions (no molecules there) and on the right one mole of salt. If, as you say, no external energy is applied, then the reaction will synthesize salt. It won't reverse, because it takes energy to do that, and you're setting the condition that no such energy is available.

    As for the second case, C⇆D⇆E, here you would want to think of an actual case for this. It suggested to me the dissociation of water (C⇆D) into hydrogen ions and hydroxide, followed by synthesis of hydronium (D'⇆E), but I have to write D' because it doesn't balance with D (an additional H⁺ ion is required). But even so, when you look at the probability of finding either hydronium or hydroxide in water, it's very low.

    I think I interpreted your question according to what you had in mind, but I'm not sure. It would help if you could say more about the idea you're thinking about. Did this come to you as a question based on reading and thinking about entropy in chemistry, or are you thinking of a particular application?

    It was interesting to see you citing Shannon's entropy next to the thermodynamic entropy. This is a can of worms of a completely different kind!


     
  11. Secret Registered Senior Member

    Messages:
    299
    Like this
    \(\Delta S = \frac{\Delta H - \Delta G}{T} \)?
    But I still don't see how this means the energy involved must be 100% unusable. Since G measures the maximum work you can obtain from a system at constant temperature and pressure, and H measures the internal energy plus the work done to push aside the atmosphere to make room for the system's expansion, for (G-H)/T > 0 we need G > H; but why must this mean the added energy is unavailable to do work?

    Mind elaborating? (anyone will do)
    Yes, and it applies for all independent throws

    Throwing 10 heads in a row, you expect a (0.5)^10 chance of success
    The bolded statement confuses me


    Oh right, I almost forgot this. I still remember how my statistical mechanics professor reminded us that temperature is not really well defined when you start to consider quantum systems and states; is it because of the bolded point?

    At least, that's the way it seems to me - rusty on all this though I am.
    I am pretty sure you are right, as every chemical reaction has an activation barrier.
    On a more related note,

    1. Is the bolded principle what allows chemical reactions to be possible in closed systems? (that is, energy exchange due to molecular collisions is the driving force that allows the activation barrier to be overcome and a chemical reaction to take place (e.g. isomerization in this case))

    2. Can chemical reactions also occur in isolated systems, given that no heat or work exchange is allowed in such systems? If chemical reactions can still occur, where will the heat evolved or consumed go? Does it help raise the temperature of the system, or even cause a phase transition?

    Concept Check
    If my maths class taught me right, P(N tosses all tails) = (0.5)^N
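    Both of the probability claims in this exchange (P(N tosses all heads) = 0.5^N, and the next toss staying 50/50 no matter what run preceded it) can be checked with a quick Monte Carlo sketch (illustrative numbers only, fixed seed):

    ```python
    import random

    random.seed(42)  # fixed seed so the sketch is repeatable

    N_TRIALS = 200_000

    # Claim 1: P(10 heads in a row) = 0.5**10 ~ 0.000977
    runs_of_10 = sum(
        all(random.random() < 0.5 for _ in range(10))
        for _ in range(N_TRIALS)
    )
    print(f"P(10 heads in a row) ~ {runs_of_10 / N_TRIALS:.6f} (theory {0.5**10:.6f})")

    # Claim 2: after a streak of 3 heads, the next toss is still 50/50
    # (independent events have no memory)
    next_after_streak = []
    while len(next_after_streak) < 50_000:
        if all(random.random() < 0.5 for _ in range(3)):  # a streak of 3 heads occurred
            next_after_streak.append(random.random() < 0.5)  # record the very next toss
    frac = sum(next_after_streak) / len(next_after_streak)
    print(f"P(heads | 3 heads just occurred) ~ {frac:.3f}")
    ```

    The first estimate lands near 0.5^10; the second stays near 0.5, which is the point about independent events.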

    I'll modify my question a bit and rephrase it in terms of moles, as exchemist pointed out you cannot really do thermodynamics with a single molecule as a system

    I do have one reaction that I know of being of the type A <-> B: the isomerization of cyclohexane between the chair and the boat forms.
    But I currently cannot think, off the top of my head, of an A <-> B reaction where the equilibrium constant is close to 1

    Because the equilibrium constant is very small, Kw = 10^-14 if my memory serves

    It's something that came up since the day I learnt about entropy in my 2nd year thermal physics courses, and now in the nonequilibrium thermodynamics of biological systems.
    Basically, when pondering the statistical definition of entropy (a measure of the number of microstates), I was wondering how it naturally implies that energy must be spread out.

    Since microstates do not necessarily have a spatial nature (e.g. they are not necessarily the position of something: think of magnetic spin states, or, in fluids, the velocities of the individual particles), I then started to wonder what "energy dispersal" or "unusable energy" (the most common ways to interpret entropy in physical chemistry) mean in such a context

    Just as, when students are first introduced to entropy in thermodynamics, dS = dq_rev/T seems to suggest entropy is always tied to temperature and heat

    And then we are presented with systems such as Gibbs mixing and adiabatic free expansion, which have no heat nor work transfer, yet entropy increases, which illustrates to students that entropy is about more than just heat and temperature

    Here I am trying to find a system (chemical reactions of the type A <-> B <-> C etc.) with which to investigate how entropy "measures the amount of unusable energy", since in such systems, when isolated, the energy is not spatially dispersed. And (I might be wrong in my reasoning) since each molecule can potentially do work on other molecules when colliding with them, if the entropies of system 1 (A <-> B <-> C) and system 2 (A <-> B) are different, then using the interpretation that entropy is a measure of unusable energy, I don't see how this implies that one of the systems is less capable of doing work. They both have the same internal energy, the same number of moles and the same initial condition; at equilibrium they really only differ in how the population is divided among the species (determined by the equilibrium constant K). So isn't it that, by counting microstates, system 1 has a lot more microstates, thus a higher entropy, and is therefore less capable of doing work?

    Another aim of this exercise is to understand whether the many interpretations of entropy are as accurate and general as each other, or whether some interpretations are more general and accurate than others

    By using very special cases such as adiabatic free expansion, we know that entropy is more than just heat and temperature.
    Now I want to test the "unusable energy" and "energy dispersal" interpretations, in order to gain a deeper understanding of entropy: does entropy really mean unusable energy in the most general sense?

    TL;DR: confusing
    See REPHRASING THE QUESTION below

    It is something I don't really understand either, since information as defined in information theory is still pretty abstract to me; I might be able to get my head around it later

    While we are at it...
    I am also wondering why the equation for the Gibbs entropy (the quantity defined for a grand canonical ensemble, which involves a system in thermodynamic equilibrium) looks so similar to the Shannon entropy (save for the presence of the Boltzmann constant). Are they somehow related?
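    For what it is worth, they agree up to a constant: for the same distribution {p_i}, the Gibbs entropy S = -k_B Σ p_i ln p_i equals k_B ln 2 times the Shannon entropy H = -Σ p_i log2 p_i (nats vs bits, plus physical units). A quick sketch with an arbitrary example distribution:

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    p = [0.5, 0.25, 0.125, 0.125]  # arbitrary example distribution (sums to 1)

    S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)  # Gibbs entropy, J/K
    H_shannon = -sum(pi * math.log2(pi) for pi in p)     # Shannon entropy, bits

    print(f"Shannon H = {H_shannon} bits")                            # 1.75 bits
    print(f"Gibbs S   = {S_gibbs:.4e} J/K")
    print(f"S / (k_B ln 2) = {S_gibbs / (k_B * math.log(2)):.4f}")    # recovers H
    ```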

    ====================================
    REPHRASING THE QUESTION
    ====================================
    (Note this is a (possibly) idealised toy model)
    Consider two isolated systems 1 and 2
    Initially U, V, n_Total are the same (n_Total=sum of all molecular species at any point in time for the system)
    Thus S is also the same (since U = U(S,V,N) and is a function (?))
    For convenience, set n_Total=1
    All equilibrium constants are = 1
    Since both systems are isolated, U,V,n_total remains the same

    Now system 1 has 1 mol A, which can undergo the following isomerization reaction
    A <-> B
    System 2 has 1 mol C, which can undergo the following isomerization reaction
    C <-> D <-> E

    We expect that at equilibrium the chemical potentials of A and B are equal, and those of C, D and E are likewise equal among themselves
    Thus using \(\mu_i=\mu_i^\circ+RT\ln a_{i}\), \(\mu_A=\mu_B\) and \(\mu_C=\mu_D=\mu_E\). Thus using \(\Delta G=-RT\ln K\), \(\Delta G = 0\)
    Since there is no external heat nor work supplied, \(\Delta H = 0\), thus \(\Delta S = 0\) as expected at equilibrium

    And at equilibrium, we should also expect 0.5 mol A and 0.5 mol B in system 1, and 1/3 mol each of C, D and E in system 2
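    The entropy difference between these two equilibrium compositions can be sketched per mole with the ideal configurational (mixing) entropy S = -R Σ x_i ln x_i (a toy calculation, assuming the isomers mix ideally):

    ```python
    import math

    R = 8.314462618  # gas constant, J/(mol K)

    def config_entropy(mole_fractions):
        """Ideal configurational entropy per mole: S = -R * sum(x * ln x)."""
        return -R * sum(x * math.log(x) for x in mole_fractions)

    S1 = config_entropy([0.5, 0.5])       # system 1 at equilibrium: A, B
    S2 = config_entropy([1/3, 1/3, 1/3])  # system 2 at equilibrium: C, D, E

    print(f"System 1 (A <-> B):       S = R ln 2 = {S1:.3f} J/(mol K)")
    print(f"System 2 (C <-> D <-> E): S = R ln 3 = {S2:.3f} J/(mol K)")
    ```

    On this toy picture system 2 gains R ln 3 of configurational entropy against R ln 2 for system 1, even though nothing is dispersed spatially.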

    But what about the entropy changes of systems 1 and 2 respectively as they move from the initial condition to the equilibrium state? Are they different or the same?

    If they are different, how do we explain that the entire 1 mol of molecules in one system is less capable of doing work on a reference system than the other system is on the same reference system? I.e. where exactly is the energy dispersed, since with all the conditions given in this toy model, the energy does not seem to be dispersed spatially?

    More details of what I am doing in this toy model

    1. Prepare 1 mol of A in system 1 and 1 mol of C in system 2
    2. Isolate the systems separately and allow them to equilibrate
    3. Prepare two replicate reference systems
    4. Place system 1 next to one reference system and system 2 next to the replicate reference system, and measure the amount of work done to/by each system on the references
     
    Last edited: Sep 15, 2014
  12. Manifold1 Banned Banned

    Messages:
    181
    No... there is still a small chance it can land on its edge, whether or not there is a small weight distribution between the actual face sides.
     
  13. Manifold1 Banned Banned

    Messages:
    181
    A coin falling on its edge, rather than on one of the two faces, is an analogy to quantum physics, in which it is in a superposition of both sides at once.
     
  14. Secret Registered Senior Member

    Messages:
    299
    In that case, we just need to take account of this outcome in the probability calculations as well,
    and the result will be something like mostly heads, tails and rarely edge

    It would also depend on the thickness of the coin and other factors

    However, this is kind of a digression; we will discuss it in detail in another thread. Better to keep this thread on the entropy topic of the OP
     
  15. exchemist Valued Senior Member

    Messages:
    12,451
    I am pretty sure you are right, as every chemical reaction has an activation barrier.
    On a more related note,

    1. Is the bolded principle what allows chemical reactions to be possible in closed systems? (that is, energy exchange due to molecular collisions is the driving force that allows the activation barrier to be overcome and a chemical reaction to take place (e.g. isomerization in this case))

    2. Can chemical reactions also occur in isolated systems, given that no heat or work exchange is allowed in such systems? If chemical reactions can still occur, where will the heat evolved or consumed go? Does it help raise the temperature of the system, or even cause a phase transition?



    Secret, it's a bit hard replying to sections of this long response to everyone, but I'll try to stick to the bits addressed to me.

    Yes: with isolated molecules that are not colliding frequently enough, you do not get a Boltzmann distribution of energy among the available states, and if you don't have that, you don't have a defined temperature, nor will the various bulk thermodynamic quantities have meaning, if I recall correctly.

    And yes, if you have collisions, you can get occasional chance accumulation of enough energy in one mode of excitation for a chemical bond to break - and thus for a reaction to take place. Thermally initiated reactions proceed by this process. Alternatively you can of course shine a light in and via absorption create an electronically excited state that involves breaking - or forming a precursor that can break - a bond. Photochemistry, in other words.

    You can have reactions in isolated systems if the temperature is high enough, sure, why not? And the energy released will go towards raising the temperature of the reactants and products, causing a self-accelerating reaction - sometimes with unpleasant consequences if you have not thought about it in advance.
     
