I have a decent understanding of what entropy is, but I don't get how people can say that entropy gives a direction to time???

Entropy is the tendency of things to become more disordered over time. So, as entropy increases, time increases. In other words, they have a one-to-one relationship. In that way, entropy is a measure of time. I hope that explains it?

Drop an egg. It breaks and makes a big mess. Entropy (the amount of disorder) increases, since there are many more ways for the egg to be broken than whole. Film the egg dropping and breaking. Now you notice something interesting. You can easily tell whether the film is played forwards or in reverse. The reversed picture shows the broken pieces of the egg reforming and flying upwards to your hand. But you know this is a reversed picture because you <b>never</b> see that kind of thing happening in real life. Thus, the direction in which entropy increases is always the direction of forward time.
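The "more ways to be broken than whole" counting can be made concrete with a toy model (my own illustration, not from the post above; the two-state model and the numbers are assumptions). Each of N pieces is either in place or out of place, and the Boltzmann entropy of a macrostate is the log of the number of microstates realising it:

```python
from math import comb, log

# Toy model (illustration only): N two-state "pieces" stand in for the
# egg's degrees of freedom. A macrostate is "k pieces out of place";
# its Boltzmann entropy (with k_B = 1) is S = ln(multiplicity).
N = 100

def entropy(k):
    """Entropy of the macrostate with k of the N pieces out of place."""
    return log(comb(N, k))

# The intact egg (k = 0) corresponds to exactly one microstate...
print(entropy(0))    # ln(1) = 0
# ...while a thoroughly scrambled egg (k = 50) has vastly more.
print(entropy(50))   # ~ 66.8, i.e. about 10^29 microstates
```

Dropping the egg moves it from the single intact microstate into the enormous broken region, which is why the reverse film looks impossible: not forbidden, just overwhelmingly improbable.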

order Hello everyone, I just "understood" what entropy is about.... thanks! But now I wonder: how can things in the universe stay so ordered? It seems that the second law of thermodynamics stipulates that nature prefers disorder, but look at our universe, which, despite entropy tending toward a maximum, still stays ordered?

Re: order Because energy counteracts entropy, especially in an open system. A system can stay ordered provided energy is input. A classic trick exam question is: "Does a flower break the second law of thermodynamics? Explain." It's assumed that as the Universe gets older and all systems have expended their available energy, the Universe will enter a state of maximum entropy, filled with isotropic cold dust and gas.

I thought the universe tended toward a state of less energy, less activity. If it is expanding, and if there is a finite amount of matter, then the interactions of matter must necessarily become less and less frequent.

Crisp back on more familiar grounds. Hi all,

Adam: "I thought the universe tended toward a state of less energy, less activity." It is. The universe, like any closed system, tries to go to a state of minimal energy and, at the same time, of maximum entropy. It is the ongoing battle between those two that determines the shape of the universe from a statistical-mechanical point of view. It has been proven (so I was told by someone credible) that for a closed system of matter with gravitational interaction, the "struggle" for minimum energy/maximum entropy is resolved by clustering of matter. This could explain why galaxies are formed.

James R: "The reverse picture shows the broken pieces of the egg reforming and flying upwards to your hand. But you know this is a reversed picture because you never see that kind of thing happening in real life." As a sidenote, and mainly because I am sarcastic enough to complicate the discussion even further, I would like to add that it is possible for an egg to do that, but the probability is so incredibly small that you'd have to wait several universe-ages before such behaviour is observed.

Bye! Crisp

just to be sarcastic.... *** As a sidenote, and mainly because I am sarcastic enough to complicate the discussion even further, I would like to add that it is possible for an egg to do that, but the probability is so incredibly small that you'd have to wait several universe-ages before such behaviour is observed. *** Sorry, but a probability model cannot be linked to the waiting time for an event to occur. Let me explain what I mean. Let's throw 1000 perfect (i.e. fair) dice. The probability of getting only 6s is incredibly small (1/6^1000), but it is still possible, and furthermore still possible on the first throw. We can model the waiting time for a discrete event to occur (it follows a geometric distribution), but once again this only gives a probability for what the waiting time could be, i.e. a proportion as the number of experiments tends to infinity; obtaining only 6s is still possible on the first throw, at the first attempt, at t=0. But still, we have never seen an egg re-form.... Yet with enough energy, we can see molecules break and re-form... but not an egg. What is the difference? Do you think that it has never happened, or that it has never been observed?
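The dice argument can be checked with exact arithmetic (a sketch of mine; the numbers are from the post, the code is not). The probability of all-6s is nonzero, so the event is possible at t=0, while the geometric distribution only fixes the *expected* waiting time:

```python
from fractions import Fraction

# The chance that 1000 fair dice all show 6: astronomically small,
# yet strictly positive, so possible even on the very first throw.
p = Fraction(1, 6) ** 1000

# If the throw is repeated independently, the waiting time T until the
# first success is geometric: P(T = n) = (1 - p)**(n - 1) * p.
# Its mean is 1/p -- an expectation, not a guarantee: P(T = 1) = p > 0.
expected_wait = 1 / p

assert p > 0                        # the event is possible at the first attempt
assert expected_wait == 6 ** 1000   # but expected only once per 6**1000 throws
```

This is exactly the distinction Ash draws: a tiny probability bounds how often we *expect* to see the event, not whether it can happen now.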

Crisp, hi. Okay, how can you call <i>any</i> system a closed one? Consider this: electromagnetic radiation can pass through a closed system (energy), so, in short, packets of energy are being transmitted or transported inside a closed system. Is there a violation here? So, strictly speaking, there should only be two kinds of systems, viz. open (truly open) or isolated... bye!

Hi Ash, You can take your (correct) probabilistic discussion one step further and apply it to mechanics. In physics (the theory of dynamics) there is a theorem called the Poincaré recurrence theorem, which roughly states that any system, under some conditions, evolves in time but eventually ends up in a state that is arbitrarily close to the initial state. In human language: because of the time reversibility of the microscopic world, the egg would eventually recombine almost perfectly. One can even make an estimate of the time you'd have to wait to see this Poincaré recurrence, and for a typical macroscopic system this is of the order of 10^10^23 seconds.... huge.

This seems in total contradiction with entropy increase, but there's a long explanation attached to why this is possible and what entropy means from a microscopic point of view. Put very briefly, the increase of entropy is the "typical" behaviour you would expect of a system. If you put 1000 identical systems next to each other under the same initial conditions, probably 999 (or, if you're unlucky, all 1000) systems would evolve to a state with higher entropy. With this in mind, the second law of thermodynamics is not absolutely correct; it only describes what one would typically expect to happen.

This discussion is far from settled in the physics community. Especially from a quantum-mechanical point of view (and the study of quantum dynamics and entropy production in quantum systems), the second law has been questioned in the last couple of years. If I remember some announcements I saw correctly, there's a symposium on the second law planned next year; perhaps something productive will come out of that.

Hi zion, It is very difficult to create a totally isolated system. I think we can only say that the universe as a whole is a closed system (since the universe, by definition, includes everything). Bye! Crisp
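The "1000 identical systems" picture can be simulated with a standard toy model (my own sketch, not Crisp's; the Ehrenfest urn model and all parameters are assumptions). N balls sit in two urns; each step, one randomly chosen ball hops to the other urn. Starting every copy in the lowest-entropy state, we count how many copies end up with higher entropy:

```python
import random
from math import comb, log

random.seed(0)

# Ehrenfest urn model: N balls in two urns; each step moves one
# randomly chosen ball to the other urn. Every copy starts in the
# minimum-entropy state (all balls in urn A).
N, STEPS, COPIES = 50, 200, 1000

def entropy(k):
    """ln of the number of microstates with k balls in urn A."""
    return log(comb(N, k))

ended_higher = 0
for _ in range(COPIES):
    k = N                                      # all balls in urn A: S = ln(1) = 0
    for _ in range(STEPS):
        k += -1 if random.random() < k / N else 1
    if entropy(k) > entropy(N):
        ended_higher += 1

print(ended_higher)   # almost certainly all 1000 copies gained entropy
```

No copy is forbidden from drifting back to the all-in-A state, just as the second law allows the egg to re-form; it is simply the atypical outcome, which is the sense in which the law is statistical rather than absolute.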

poincaré Hi Crisp, I must say that you completely busted me: I swore that I'd work on my exam today, but I couldn't resist gathering some documentation on the Poincaré recurrence theorem.... you got me. Dunno if I understood everything in it, but to summarize, I retained that after iterating a given homeomorphism f on a finite-volume space (let's call it X) that preserves volumes, you must recover your initial set after k iterations. We can approximate the value of k with some volumetric arguments (simply put, if f acts on an open part of X, called U, k can be estimated by noting that it must be at most volume(X)/volume(U), i.e. the maximum number of disjoint copies of U you can fit in X).

What I retained from the second law of thermodynamics is that total entropy tends to increase. Here is why I think Poincaré isn't in contradiction with it: assume we have an egg and perform a transformation on it (the one that broke it); after k iterations you get your initial state back, so the state of your system, and hence its entropy, is a k-periodic function, and is not monotonic. But what happens when we put n eggs in the same system? We then have n periodic entropy functions with different periods (two eggs => two different transformations), so (taking the entropy of a system to be the sum of the entropies of its subsystems) we end up with a global entropy whose period tends to infinity as n grows.

On a microscopic level, I don't think such a transformation can exist. I mean that the volume-conservation argument is essential, but when we alter the coordinates of a part of our system, new chemical reactions may happen in its new configuration (especially at its frontier; this case is avoided in the theorem by considering that our part is open (mathematically)). And I just thought that these chemical reactions may alter the volume/area of the part.
(I think I may be very, very misleading on this point :bugeye: ) Anyway, there is still something that bugs me: I saw that we can approximate the order of the number of iterations, but how can we conclude a minimum duration? That would imply there is a minimum duration for a transformation to occur, independently of the transformation (some kind of time limit). Never heard of anything like that.... more info maybe? plz?
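The volume bound k <= volume(X)/volume(U) can be seen exactly in a finite caricature (my own example, not from the thread): the shift f(x) = (x + a) mod n on {0, ..., n-1} preserves the counting measure, and every point recurs after at most n iterations, matching the bound with U a single point of measure 1:

```python
from math import gcd

# Finite measure-preserving map: f(x) = (x + a) mod n is a bijection of
# {0, ..., n-1}, so it preserves the counting measure. Poincaré's bound
# with U = {x0} (measure 1) says the recurrence time k satisfies
# k <= vol(X)/vol(U) = n/1 = n.
def recurrence_time(n, a, x0):
    x, k = (x0 + a) % n, 1
    while x != x0:
        x, k = (x + a) % n, k + 1
    return k

n = 12
for a in (1, 5, 8):
    k = recurrence_time(n, a, 0)
    assert k <= n               # the volume bound holds
    assert k == n // gcd(n, a)  # exact value for this particular map
```

Of course this finite toy recurs *exactly*, whereas the continuous theorem only promises a return arbitrarily close to the start, but the counting argument behind the bound is the same.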

Hi Ash, Okay, you probably now know more about Poincaré's recurrence theorem than I do. However, the "volume" you talk about shouldn't be taken literally. The Poincaré theorem works at the level of the dynamics of the system, which is described by a space that holds all possible configurations of the system (a configuration is a certain combination of positions and momenta of the atoms). For N atoms this gives rise to a 6N-dimensional space (3 coordinates for the position and 3 for the momentum of every particle = 6N in total). In this configuration space, one can look at 6N-dimensional volumes = sets of possible configurations of the system. Under certain very basic conditions (read: nearly every physical system complies with these conditions), the volume of such a part of the configuration space remains the same as the system evolves in time. The Poincaré recurrence theorem then states that the evolution of a part of the configuration space is such that after a number of timesteps, that part nearly matches the original starting part again. It's a bit hard to explain in words, though. I suggest reading books on "classical dynamical systems" or something similar, but they mostly require a quite mathematically founded knowledge of physics. Bye! Crisp
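The "volume remains the same" statement (Liouville's theorem) can be checked numerically in the smallest case, one particle in one dimension, so phase space is just the (q, p) plane (a hedged sketch of mine; the oscillator, the symplectic-Euler map, and the triangle of initial conditions are all my assumptions). The map q' = q + dt·p, p' = p - dt·q' has Jacobian determinant exactly 1, so it preserves phase-space area:

```python
# Symplectic Euler step for a harmonic oscillator (unit mass and frequency):
#   q' = q + dt * p
#   p' = p - dt * q'   (note: uses the updated q)
# The Jacobian [[1, dt], [-dt, 1 - dt**2]] has determinant 1,
# so phase-space area is conserved under iteration.
def step(q, p, dt=0.1):
    q = q + dt * p
    p = p - dt * q
    return q, p

def area(pts):
    """Shoelace area of a triangle of phase points [(q, p), ...]."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
a0 = area(pts)                            # = 0.5
for _ in range(1000):
    pts = [step(q, p) for q, p in pts]

assert abs(area(pts) - a0) < 1e-9         # area unchanged after 1000 steps
```

The triangle gets sheared and rotated as the system evolves, but its area never changes; this is the 2-dimensional shadow of the 6N-dimensional volume conservation Crisp describes.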

oooopppss Ok Crisp, I just reread my last post; you are completely right, my second example is pure garbage (plz plz forget it). But I promise, one last question and I'll definitely shut up: when considering a phase space like the one you mentioned, I understood that a point of this space is a state of the system, and as it looks like a metric space, we can define a distance/norm on it. I was wondering if there is a way of linking the entropy of a state with its distance from the origin? Or, if not, maybe we could use the entropy as our distance? Would be really glad if you have any idea on this, tia

crisp, you know so much!! what do you do for a living, a theoretical physicist??? professor at MIT???

Hi all, Ash711, "When considering a phase space like the one you mentioned, I understood that a point of this space is a state of the system, and as it looks like a metric space, we can define a distance/norm on it. I was wondering if there is a way of linking the entropy of a state with its distance from the origin? Or, if not, maybe we could use the entropy as our distance?" No worries, you can ask any questions you like. Whether or not I know the answer is the only thing that will stop me from replying. Concerning your entropy-distance idea: I don't think it is possible to define a norm using the entropy. If you plot all values of the entropy as a function of the states of the system (by plotting something like S : R^(6N) -> R), you get a surface that depends on the interactions of the system. This surface can be curved in such a way that the properties of a norm are not always satisfied (e.g. ||x + y|| <= ||x|| + ||y||: how would you define the entropy of x + y in a classical system when x and y are distinct states?). I can't think of any examples now, but perhaps if you define a state space in a special way (and not in the "Euclidean" 6N-dimensional way), it could be possible. But this would probably involve needlessly complicating things. Also, there are several microscopic definitions of entropy; it could be that you find a definition that is more suitable as a norm, but once again, this would probably involve creating a special state space. Bottom line: I've never seen entropy used as a distance in a state space. And most of the time, entropy is not really an observable that you'd want to calculate (this can get very messy and complicated for interesting systems). When studying a system, you're mostly looking at parameters that indicate phase transitions, ...
Nanok, "what do you do for a living, a theoretical physicist??? professor at MIT???" (blush) - not really, I'm just a student in theoretical physics. Bye! Crisp
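Crisp's doubt about entropy as a norm can be made concrete with a toy counterexample (my own; it uses the Gibbs/Shannon entropy of a discrete probability vector rather than a full 6N-dimensional classical state, which is an assumption about which definition of entropy to test). A norm must vanish *only* at the origin, but entropy vanishes at every perfectly sharp state:

```python
from math import log

def shannon_entropy(p):
    """Gibbs/Shannon entropy of a discrete probability vector."""
    return sum(-x * log(x) for x in p if x > 0)

sharp_a = [1.0, 0.0, 0.0]   # system certainly in state 0
sharp_b = [0.0, 1.0, 0.0]   # system certainly in state 1 -- a different point

# Two distinct states both get entropy 0, so S fails the
# positive-definiteness axiom of a norm (||x|| = 0 only if x = 0),
# and, taking equal values on distinct points, it cannot even
# separate states the way a distance must.
print(shannon_entropy(sharp_a), shannon_entropy(sharp_b))  # 0.0 0.0
```

So at least for this standard microscopic definition, Crisp's intuition checks out: entropy measures spread, not position, and spread-zero states are scattered all over the state space rather than sitting at a single origin.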