I was always interested in the idea that you could generate electricity from ambient heat. Such a thing would not violate conservation of energy or momentum and thus should be possible. Please don't bring up the second law of thermodynamics - it is a *tendency*, not a certainty. Please refer to http://en.wikipedia.org/wiki/Second_law_of_thermodynamics if you don't believe me.

Anyway, I've drawn up an equation as a starting point in the thought process. The model in my head is a two-chamber device where a heat pump creates a gradient: one chamber is the hot one and the other is the cold one. The model assumes that the energy put into the pump is exhausted into the hot chamber (increasing that chamber's temperature, and thus the heat gradient). To keep things simple, I assume there is no outside environment: only the two chambers exist, and heat cannot be transferred out of either chamber. The equation I came up with calculates the efficiency at which the electric generation must occur in order for a net gain in useful energy to happen:

Eff = A/(2x + A)

where:

* A is the heat energy added to the hot chamber via powering the heat pump,
* x is the heat energy moved from the cold chamber to the hot chamber (so 2x is the resulting difference in heat energy between the chambers), and
* Eff is the heat-to-electricity conversion efficiency needed for the system to break even.

This comes from the equation gain = (2x + A)*Eff, where gain is the amount of electricity generated; breaking even means gain = A, the energy spent powering the pump.

Now, since this kind of thing is generally regarded as impossible, I'm wondering what is wrong with my assumptions, or generally why this scheme doesn't work.
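To make the break-even condition concrete, here is a minimal numeric sketch of the formula exactly as stated above (the function name and the example values for A and x are my own illustration, not part of the original model):

```python
def break_even_efficiency(A: float, x: float) -> float:
    """Conversion efficiency at which generated electricity equals pump input.

    A: heat energy added to the hot chamber by powering the heat pump
    x: heat energy moved from the cold chamber to the hot chamber
    """
    return A / (2 * x + A)

# Hypothetical example: A = 10 units of pump input, x = 45 units moved.
eff = break_even_efficiency(10, 45)
print(eff)  # 10 / (90 + 10) = 0.1

# Sanity check against the derivation: gain = (2x + A) * Eff should equal A.
gain = (2 * 45 + 10) * eff
print(gain)  # 10.0, i.e. exactly the pump input A
```

Note that as x grows relative to A, the required efficiency falls toward zero, which is what makes the scheme look attractive on paper.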