Second Law of thermodynamics vs. Evolution

Discussion in 'Physics & Math' started by Saint, Apr 6, 2020.

  1. exchemist Valued Senior Member

    Messages:
    12,517
    The Thermodynamic Argument of creationists suffers from a second basic flaw, which post 18 exemplifies. This is the false idea that there is a progressive decrease in entropy as a species changes from one form to another. But a species does not have any given level of entropy - it's meaningless to talk in this way. Entropy is a property of a physical system. A species is not a physical system. It is a classification, of a type of organism, without reference even to the number of organisms involved. Trying to associate a level of entropy with a classification is just scientifically illiterate.

    What you can, just about, do is talk about the entropy of an individual organism, in the course of its life. It takes in nutrients and energy from its environment, converts some of this into its internal structure, and rejects metabolic byproducts and waste heat into the environment.

    The change in entropy resulting from a change to the genome, in a mutation that may be passed on to the next generation as part of a step in evolution, is absolutely infinitesimal compared to its day-to-day metabolism and growth. The real entropy reduction in an organism that has to be accounted for is not this at all: it is the development of the individual organism from its seed or embryo as it grows.
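    The scale difference exchemist describes can be sketched with a back-of-envelope calculation. All figures below are assumptions for illustration (a ~10 MJ/day metabolic output and 310 K body temperature), not measurements:

    ```python
    # Rough comparison: configurational entropy tied to one base change
    # in a genome vs. the entropy a person exports daily as metabolic heat.
    # Numbers are illustrative assumptions, not measurements.

    import math

    k_B = 1.380649e-23               # Boltzmann constant, J/K

    # A single point mutation changes one of four possible bases:
    # at most k_B * ln(4) of configurational entropy per genome copy.
    dS_mutation = k_B * math.log(4)  # ~1.9e-23 J/K

    # Assumed daily metabolic heat ~10 MJ (≈2400 kcal) rejected at body
    # temperature ~310 K; entropy exported to surroundings ≈ Q/T.
    Q_daily = 1.0e7                  # J/day (assumption)
    T_body = 310.0                   # K
    dS_metabolic = Q_daily / T_body  # ~3.2e4 J/K per day

    print(f"mutation:   {dS_mutation:.2e} J/K")
    print(f"metabolism: {dS_metabolic:.2e} J/K per day")
    print(f"ratio:      ~{dS_metabolic / dS_mutation:.1e}")
    ```

    On these assumptions the daily metabolic entropy flow exceeds the mutation's contribution by roughly 27 orders of magnitude, which is the post's point: the genome change is thermodynamically negligible.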
     
  3. billvon Valued Senior Member

    Messages:
    21,644
    Nope. The lifetime of an animal - which includes the process of evolution - results in an INCREASE in entropy. A typical 80kg person (or ape) eats about 35 metric tons of food in their lifetime and turns it into fecal matter. That is a HUGE increase in entropy.

    So evolution involves an increase, not a decrease, in entropy.
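    billvon's lifetime figure can be turned into a rough entropy budget. The energy density and environment temperature below are assumptions for illustration:

    ```python
    # Rough lifetime entropy budget for an 80 kg person eating ~35 metric
    # tons of food. All figures are illustrative assumptions.

    food_total_kg = 35_000.0    # ~35 metric tons of food over a lifetime
    energy_density = 8.0e6      # J/kg, assumed average for mixed food
    T_env = 300.0               # K, temperature at which heat is rejected

    Q_lifetime = food_total_kg * energy_density  # ~2.8e11 J
    dS_exported = Q_lifetime / T_env             # entropy dumped into surroundings

    # The order stored in the organism's own structure is tiny compared to
    # this export, so the net entropy of (organism + environment) rises,
    # consistent with the second law.
    print(f"heat rejected over a lifetime: {Q_lifetime:.2e} J")
    print(f"entropy exported: {dS_exported:.2e} J/K (positive => net increase)")
    ```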
     
  5. QuarkHead Remedial Math Student Valued Senior Member

    Messages:
    1,740
    I confess I am puzzled by this thread - nobody here seems to know what they are talking about.
    First some facts......

    Evolution is defined, repeat DEFINED, as a stable change in allele frequency over time.

    The second law of thermodynamics states that the probability of a system transitioning from a low-energy state to one of higher energy is vanishingly small. Note that "entropy" is not part of the second law - that is the third law.

    Would anyone like to make sense of these two definitions taken together? I cannot.
     
  7. exchemist Valued Senior Member

    Messages:
    12,517
    Er, well, no, entropy does feature in some expressions of the 2nd Law (of which there are many). For instance, I had in mind Planck's formulation of the 2nd Law, which is:
    "Every process occurring in nature proceeds in the sense in which the sum of the entropies of all bodies taking part in the process is increased. In the limit, i.e. for reversible processes, the sum of the entropies remains unchanged."

    https://en.wikipedia.org/wiki/Second_law_of_thermodynamics

    Furthermore your "definition" of evolution as something about allele frequency is far too narrow to encompass the way the term has been used over the last 150 years since Origin of Species was written. The word is not the exclusive property of a science that did not even exist in Darwin's time.
     
  8. iceaura Valued Senior Member

    Messages:
    30,994
    Not really. That's an illusion fostered by certain common ways of presenting data - how one draws the tree of evolutionary change. People tend to draw it with sharp corners, as if recording instantaneous events, and if the time scale is broad enough that's hard to avoid - if a pixel height is a million years, the Cambrian Radiation is going to look like an Explosion. If the pixel height is a hundred years, it's going to look like a river's drainage basin.

    All one can say is that the rate of evolutionary change has varied considerably over time and space, while the mechanisms apparently remained the same Darwinian suite over the entire span of life on this planet - the rate at which one decides to declare a "punctuation" is more or less arbitrary, and without significance as regards the mechanisms.

    Darwinian mechanisms enforce stability as much as they foster change - and that is just as important in the establishment of a "species" as the fostered change. Evolution plateaus, and a given species (no matter how rapidly evolved) will hang around for millions of years (until one day some giant meteor resets the whole system to factory default).
     
  9. QuarkHead Remedial Math Student Valued Senior Member

    Messages:
    1,740
    exchemist: I am sorry to say that, for the first time, I profoundly disagree with you.
    Then a "Law" (the second) that includes another Law in its formulation (the third) is not a law at all - it is at best a corollary.

    By whom? I can assure you, with my professional hat on, that is the currently accepted definition in evolutionary genetics.
    This makes no sense - genetics gives evolution a mechanism, molecular genetics an even better one.
     
  10. exchemist Valued Senior Member

    Messages:
    12,517
    Everyone familiar with the 2nd Law knows there are many formulations of it, not just one. And entropy plays a key role in explaining how the 2nd Law works, hence Planck's formulation of it, which is very familiar to many people. I've even seen the 2nd Law expressed as ΔS ≥ ∫dQ/T, which is essentially a mathematical version of Planck's formulation of it.
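    The ΔS ≥ ∫dQ/T form can be checked numerically for a simple case. Here is a sketch for 1 kg of water heated from 300 K to 350 K, with the specific heat treated as constant for simplicity:

    ```python
    # Numerical illustration of the Clausius form ΔS ≥ ∫ dQ/T for 1 kg of
    # water heated from 300 K to 350 K (c ≈ 4186 J/(kg·K), assumed constant).

    import math

    m, c = 1.0, 4186.0
    T1, T2 = 300.0, 350.0

    # Entropy is a state function: ΔS = m c ln(T2/T1), independent of path.
    dS = m * c * math.log(T2 / T1)

    # Reversible heating: ∫ dQ/T with dQ = m c dT, evaluated numerically
    # by the midpoint rule.
    n = 100_000
    step = (T2 - T1) / n
    integral_rev = sum(m * c * step / (T1 + (i + 0.5) * step) for i in range(n))

    # Irreversible heating by a single reservoir fixed at T2: same total
    # heat Q, but all of it delivered at the higher temperature, so the
    # integral is strictly smaller than ΔS.
    integral_irrev = m * c * (T2 - T1) / T2

    print(f"ΔS                 = {dS:.1f} J/K")
    print(f"reversible ∫dQ/T   = {integral_rev:.1f} J/K  (equals ΔS)")
    print(f"irreversible ∫dQ/T = {integral_irrev:.1f} J/K (strictly less)")
    ```

    Equality holds in the reversible limit, exactly as in Planck's formulation quoted above.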

    As for the point about evolution by natural selection, defining it purely in terms of an allele change would seem to fail to capture the essence of the concept, viz. the idea that natural selection determines which changes become amplified in a population, by differences in rates of reproductive success. However, if I'm wrong about this, I'm willing to hear your explanation of how this mechanism is implicit in the definition you provided.
     
    Last edited: Apr 14, 2020
  11. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    So the argument here is about whether the concept of entropy can be "defined" in terms of the 2nd, and not the 3rd, law?
    Or, can we safely put the cart before the horse?

    The first LOT says that energy is conserved; the second says energy can be transferred from place to place as long as it's conserved, and that this defines the direction entropy increases; it defines time in that sense.

    Where is the third law? It defines entropy as a limit point (I think). It defines an entropy minimum in terms of an absolute temperature.
     
    Last edited: Apr 14, 2020
  12. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Ok, life is sufficiently complex that 'algorithms' are seen everywhere; in particular, genetic information is encoded, and this represents a process of conservation of information.
    If information is conserved, then by definition a faithful transmission has occurred (information-theoretically). Indeed a faithful copy operation is just another definition of a faithful transmission of the same information, down a (possibly abstract) channel.

    So what does entropy have to say about this genetic information, its copying and so on? Or is that the other way around? How do you relate a constant temperature to entropy of information? Given that in mammals, DNA ambient temperature is more or less constant?
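    The information-theoretic side of this can be made concrete with Shannon entropy, which is a statistical property of symbol frequencies (bits per symbol), distinct from thermodynamic entropy (J/K). The sequence below is made up for illustration:

    ```python
    # Shannon entropy of a short (made-up) DNA string, and the sense in
    # which a "faithful copy" conserves information.

    from collections import Counter
    import math

    def shannon_entropy(seq: str) -> float:
        """Shannon entropy in bits per symbol."""
        n = len(seq)
        return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

    original = "ATGCGATACGCTTAGC"   # illustrative sequence
    copy_ok = original              # faithful copy: identical string
    copy_mutated = "ATGCGATACGCTTAGT"  # one substitution at the end

    # A 4-letter alphabet caps the entropy at 2 bits/symbol.
    print(f"{shannon_entropy(original):.2f} bits/symbol")
    print(copy_ok == original)      # faithful transmission: information conserved
    print(copy_mutated == original) # the channel introduced an error
    ```

    In this picture, "faithful transmission" is simply the copy matching the original symbol for symbol; the thermodynamic cost of making and proofreading that copy is a separate question, paid in the cell's metabolism.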
     
  13. James R Just this guy, you know? Staff Member

    Messages:
    39,421
    The second law doesn't include the third law. What are you talking about?

    The second law, roughly speaking, says that entropy tends to increase over time in a closed system. The third law says that at absolute zero temperature, a system has minimal entropy (possibly zero).
     
    exchemist likes this.
  14. QuarkHead Remedial Math Student Valued Senior Member

    Messages:
    1,740
    OK, thank you.
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Which underlines that entropy is closely (perhaps locally) defined; if you define it in terms of a single organism then you have a defined increase in entropy in the environment, so relatively, the organism has less entropy than its environment. But that's possible only if the organism maintains the difference, by continually increasing the external entropy; it has more order than the disordered environment it creates (metabolic cycles are ordered, structured etc like algorithms).

    Entropy only makes sense (even in information theory) when it has a context, and the context must include a difference, between order and disorder, or randomness and non-randomness.
    And those who say information entropy isn't thermodynamic, well, in the case of DNA replication, the thermodynamics are a background. Information only makes sense physically if the transfer of energy is defined, and information is a physical metric (arbitrarily imposed).
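    The point that entropy "only makes sense with a context" of order versus disorder can be illustrated in the information-theoretic sense: a perfectly repetitive string carries ~0 bits per symbol, while a uniformly random string over the same alphabet approaches the 2-bit maximum. (Illustrative only; this is Shannon entropy, not thermodynamic entropy.)

    ```python
    # Shannon entropy contrast between an "ordered" and a "disordered"
    # string over the DNA alphabet. Strings are made up for illustration.

    from collections import Counter
    import math
    import random

    def H(seq: str) -> float:
        """Shannon entropy in bits per symbol."""
        n = len(seq)
        return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

    ordered = "A" * 1000      # maximally "ordered": one symbol repeated
    random.seed(0)            # fixed seed for reproducibility
    disordered = "".join(random.choice("ACGT") for _ in range(1000))

    print(f"ordered:    {H(ordered):.2f} bits/symbol")   # zero
    print(f"disordered: {H(disordered):.2f} bits/symbol") # near the 2-bit maximum
    ```

    The entropy value is only meaningful relative to the alphabet and the distribution assumed, which is one way of reading the "context" requirement above.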
     
    Last edited: Apr 16, 2020
