Quantum uncertainty on macroscopic scale

Discussion in 'Physics & Math' started by litewave, Aug 26, 2009.

  1. litewave Registered Senior Member

    Messages:
    128
    I don't quite understand why macroscopic objects we encounter in everyday life don't exhibit quantum probabilistic behavior. For example, the car standing in front of my house has a definite position and definite boundaries rather than being dissolved in some cloud where it's not clear whether it stands in front of my house or on the other side of the street.

    I have heard that the reason is that the quantum wavelengths of particles are very small relative to the size of macroscopic objects. If this is true, how large is the wavelength of an electron? Doesn't the wave spread out across the whole universe? Or is the probability of the electron's position distributed in such a way that significant probabilities are concentrated in a very small area, perhaps a fraction of a millimeter, while the rest of the wave has negligible probabilities?

    Another reason I have heard is decoherence - that quantum waves are somehow suppressed by interactions between particles. I'm not sure this answers my question, though, because it is claimed that decoherence has been avoided, for example, in superconductors; yet I suppose a macroscopic piece of superconductor still has a definite position and boundaries and is not jumping unpredictably from one corner of the laboratory to another?
     
  2. kurros Registered Senior Member

    Messages:
    793
    The simplest (and most naive) reason is that the uncertainty principle has a negligible effect on a large scale:
    \(\Delta x \Delta p \ge \hbar / 2\)
    \(\hbar\) is a very small number, so on a macroscopic scale position and momentum can simultaneously be known far more precisely than you could ever notice, and you see no uncertainty. It is a little more complicated than just that, though.
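
    Here's a rough back-of-the-envelope sketch (Python, with assumed example masses and localization scales) of just how lopsided that inequality is for a car versus an electron:

```python
# A minimal sketch, assuming SI values for hbar and illustrative masses/positions,
# of the minimum velocity uncertainty implied by Delta_x * Delta_p >= hbar/2.

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_CAR = 1.0e3            # kg, roughly a one-tonne car (assumed)
M_ELECTRON = 9.109e-31   # kg

def min_velocity_uncertainty(mass, delta_x):
    """Smallest velocity uncertainty allowed once the position is known to within delta_x."""
    delta_p = HBAR / (2.0 * delta_x)   # minimum momentum uncertainty
    return delta_p / mass              # convert to a velocity uncertainty

# Car localized to a micrometre vs. electron localized to an atomic radius.
print(min_velocity_uncertainty(M_CAR, 1e-6))        # ~5e-32 m/s, utterly unobservable
print(min_velocity_uncertainty(M_ELECTRON, 1e-10))  # ~6e5 m/s, huge on atomic scales
```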

    This isn't always true: radio photons are still quantum particles, but they have wavelengths on the metre scale. Electrons generally have much shorter wavelengths than this, but if you are tricky with your experiment you can make long-wavelength electrons (the wavelength is a function of how fast they are travelling, i.e. their momentum). So really slow electrons will have long wavelengths. A particle doesn't really need much momentum to have an extremely short wavelength, however, so most of the time electrons have wavelengths more like the nanometre scale (this is what they have in atoms anyway).
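
    For a concrete feel, here is a small hedged sketch of the de Broglie relation \(\lambda = h/(mv)\); the electron speeds are just illustrative choices:

```python
# Hedged sketch of the de Broglie relation lambda = h / (m*v), showing how slow
# electrons acquire long wavelengths. The speeds below are illustrative only.

H = 6.62607015e-34       # Planck constant, J*s
M_ELECTRON = 9.109e-31   # kg

def de_broglie_wavelength(mass, speed):
    """Non-relativistic de Broglie wavelength for a particle of given mass and speed."""
    return H / (mass * speed)

for v in (2.2e6, 1e4, 1.0):   # roughly atomic-scale speed, a slow beam, a crawl
    print(f"v = {v:9.1e} m/s  ->  lambda = {de_broglie_wavelength(M_ELECTRON, v):.3e} m")
# ~0.3 nm at atomic speeds, ~70 nm for a slow beam, ~0.7 mm at 1 m/s
```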

    Yes, and yes.


    Decoherence is so far the best explanation for why large collections of quantum particles (i.e. macroscopic objects) behave classically. This is not a simple subject and I don't really know that much about it, except that there is supposed to be a mechanism by which the "quantum" parts of the Hilbert space (i.e. the 'Schrödinger cat' type states) are made unstable through interactions with the environment, so that if you did create a macroscopic object in such a state, it would extremely rapidly evolve into a "classical" part of the Hilbert space, which is stable under interactions with the environment.
    Superconductors are different. The object that is the superconductor certainly has "decohered", but the electrons inside it have not, and some aspects of the crystal lattice retain a lot of quantum properties as well. Decoherence is a loss of the quantum degrees of freedom of a system, but not every individual particle in the system needs to decohere; it is more of a global process. I suppose some aspects of the superconductor are shielded from the environment, in a sense, because they are part of its internal structure.
    If someone knows more about decoherence I'd like to hear it too; it's something I wonder about a lot.
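
    For what it's worth, the standard toy picture I've seen (my own sketch below, not a full treatment) is that coupling to the environment exponentially damps the off-diagonal 'Schrödinger cat' terms of the density matrix, leaving an ordinary classical mixture; the decoherence time used here is an arbitrary illustrative value:

```python
# Toy sketch only: environmental decoherence modelled as exponential decay of the
# off-diagonal terms of a 2x2 density matrix. TAU_DEC is an assumed, arbitrary value.

import numpy as np

TAU_DEC = 1e-3  # s, illustrative decoherence timescale (assumed)

def decohered_density_matrix(t):
    """Equal superposition (|left> + |right>)/sqrt(2) with coherences damped by exp(-t/TAU_DEC)."""
    coherence = 0.5 * np.exp(-t / TAU_DEC)
    return np.array([[0.5, coherence],
                     [coherence, 0.5]])

print(decohered_density_matrix(0.0))   # pure superposition: large off-diagonal terms
print(decohered_density_matrix(0.05))  # off-diagonals ~ 0: a 50/50 classical mixture
```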
     
  3. CptBork Valued Senior Member

    Messages:
    6,460
    A lot of things in quantum physics are mathematically tweaked to make sure it reduces to classical physics at large scales and low energies. Niels Bohr called it the "Correspondence Principle": it basically requires that what quantum physics says about the everyday world cannot disagree with the classical laws we have already tested and established to accurately describe these things. One of the questions that arises is how the uncertain behaviour of quantum particles can lead to stable, classical behaviour on our scales. The key is that you have to work with averages, or "expectation values". You can have a bunch of quantum particles doing all kinds of strange things independently, but on large scales most of these strange movements cancel each other out, and what you measure is the net average of what all these particles are doing. So strange effects like tunnelling occur only rarely and with very low probability, hence we don't see them at macroscopic scales, whereas the average behaviour most quantum particles obey corresponds to what we observe in the classical picture.

    When talking about expectation/average values, the classical Newtonian equations of motion have direct quantum analogues.

    Surrounding a variable with angle brackets < and > denotes the expectation value of a quantity, such as momentum <p> or position <x> (both being components along the x-axis). Here t represents time, m the mass of the particle being described, and V the potential energy of this particle as a function of its position. From Schrödinger's equation you can deduce what are called the Ehrenfest equations:

    \(<p>=m\frac{d<x>}{dt}\) and \(\frac{d<p>}{dt}=-\left<\frac{dV}{dx}\right>\).

    These equations are equivalent to the Newtonian definitions of force and momentum along a single axis, but in quantum physics this is just the average behaviour over many randomly-behaving particles.
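
    As a quick illustration (my own numerical sketch, not part of the standard derivation), here is a split-step simulation in natural units \(\hbar = m = \omega = 1\). For a harmonic potential the Ehrenfest equations close exactly, so <x>(t) should track the classical trajectory \(x_0 \cos t\) even though the packet has finite width; the initial displacement and width are assumed values:

```python
# Hedged numerical check of the Ehrenfest result in natural units (hbar = m = omega = 1)
# using a standard split-step Fourier evolution of a Gaussian packet in V = 0.5*x^2.
# Initial displacement x0 and width sigma are assumed, illustrative values.

import numpy as np

N, L = 2048, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

V = 0.5 * x**2                        # harmonic potential
x0, sigma = 3.0, 0.7                  # initial displacement and packet width (assumed)
psi = np.exp(-(x - x0)**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

dt, steps = 0.005, 2000               # evolve up to t = 10
half_V = np.exp(-0.5j * V * dt)       # half-step potential propagator
kinetic = np.exp(-0.5j * k**2 * dt)   # full-step kinetic propagator

for n in range(1, steps + 1):
    psi = half_V * psi
    psi = np.fft.ifft(kinetic * np.fft.fft(psi))
    psi = half_V * psi
    if n % 400 == 0:
        t = n * dt
        mean_x = np.sum(x * np.abs(psi)**2) * dx
        print(f"t = {t:5.2f}   <x> = {mean_x:+.4f}   classical x0*cos(t) = {x0 * np.cos(t):+.4f}")
```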
     
  4. litewave Registered Senior Member

    Messages:
    128
    Is the quantum wave of a photon the same thing as the wave of the electromagnetic field? I heard that a quantum wave represents the probability distribution for a particle's presence in space, whereas an electromagnetic wave represents the intensity of the electromagnetic field.
     
  5. noodler Banned

    Messages:
    751
    It's density you should look at: the energy density of long wavelengths is spread out enough that plane-wave calculations are easy to do as an approximation. The energy of short wavelengths is more concentrated and, if spread over the same area as "plane waves", is a lot harder to approximate (which is why we send the intensity signals down little fibres of glass).
     
  6. kurros Registered Senior Member

    Messages:
    793
    Pretty much, yes. A photon is just a quantised excitation of the electromagnetic field, so a beam of 500 nm light is made up of photons with wavelengths of 500 nm. The quantum and classical descriptions of the electromagnetic field are very different, but on a large scale they amount to the same thing: a large quantum probability of photons being in a certain place translates into the electromagnetic field having a high intensity there, provided you have zillions of photons.
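
    To put a rough number on "zillions", here is a quick hedged calculation of the energy per 500 nm photon and the photon flux of an (assumed) 1 mW beam:

```python
# Quick hedged arithmetic: photon energy E = h*c/lambda at 500 nm, and the photon
# flux of an ordinary laser-pointer-level beam. The 1 mW power is an assumed value.

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

wavelength = 500e-9                   # m
photon_energy = H * C / wavelength    # ~4e-19 J per photon
power = 1e-3                          # W (assumed beam power)

print(f"Energy per 500 nm photon: {photon_energy:.2e} J")
print(f"Photons per second at 1 mW: {power / photon_energy:.2e}")  # ~2.5e15 per second
```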
     
