Reversible Computation and Freeing Memory

Discussion in 'Computer Science & Culture' started by MrMormon, Sep 24, 2011.

Thread Status:
Not open for further replies.
  1. MrMormon Registered Senior Member

    Messages:
    73
    The promise of reversible computation is that one-to-one logic gates have no theoretical minimum amount of energy they must dissipate per operation, lowering heat generation and thus safely allowing more operations per second. The problem with one-to-one logic gates is that they quickly fill memory, since every intermediate result has to be kept. What I was wondering is whether reversible computers could be feasible if information that needed to be erased were 'transported' by a series of gates to one dedicated part of the computer. All heat from erasing memory would then be generated in one place, which could be designed for efficient heat exhaust and wouldn't impose a ceiling on the highest safe operations per second.

    What do you think?
     
  2. Chipz Banned

    Messages:
    838
    I think you might not understand reversible computing.

    Given some input signal through an integrated circuit there's some output signal. Often a transition from state A to state B requires several intermediary states, and it's those intermediary states which are considered 'waste', or wasted computer cycles. If you were to invert the final signal and send it back through the integrated circuit in reverse, you're not going to recover your original input signal, even for a single cycle. Rather, to get from the output signal back to the original input you'd need a unique instruction sequence, which is likely completely different (not simply invertible) and of a different cycle count.

    At a higher level, unique states of an application are tracked by a stack of historical states from which it can 're-start'.

    In any case, the computational idea of reversibility is a macro-level property in which
    f(state) -> output and g(f(state)) -> state,
    i.e. g is an inverse of f.
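
    To put that in code (a toy example of my own, nothing to do with any real instruction set): XOR with a constant is a bijection, so the inverse g exists, and here it happens to be f itself:

    [code]
    # A reversible step: XOR with a fixed mask is its own inverse.
    def f(state):
        return state ^ 0b1010   # bijective on 4-bit values

    def g(output):
        return output ^ 0b1010  # g(f(state)) == state for every state

    assert all(g(f(s)) == s for s in range(16))
    [/code]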

    Logically, you have to increase the uniqueness of each state for such an invertible function: every distinct input must map to a distinct output. Right now the size/cost of such an integrated circuit far outweighs the benefit, which is why people look to 'quantum circuits'.

    In any case, the idea of "transporting states" doesn't make any sense on a circuit, and heat from "erasing memory" makes equally little sense to me.
     
  3. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Reversible computation isn't about inverting outputs; it's about keeping enough of the inputs and outputs that the full set of inputs can be recomputed (forwards, not backwards).

    Landauer's principle says that the thermodynamic "cost" of storing (i.e. keeping) information can be made arbitrarily small, but not if you erase information, which always costs at least kT ln 2 of energy per erased bit. It's called the erasure problem for that reason.
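
    For a sense of scale, here's a quick back-of-the-envelope calculation (plain Python, standard physical constants) of that minimum cost per erased bit at room temperature:

    [code]
    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # room temperature, K

    # Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2.
    E_min = k_B * T * math.log(2)
    print(f"{E_min:.3e} J per erased bit")   # ~2.87e-21 J
    [/code]

    Tiny per bit, but it's a floor you can't get under, and it scales with the number of erasures.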

    So if you have a large enough memory to store all the inputs and outputs of each step in some computation, and you don't erase anything, the thermodynamic cost can be made almost zero.
    But if you need to erase some memory to make room for new information, there is an unavoidable thermodynamic "loss": the system heats up as it erases information, and that heat has to dissipate to the environment to avoid overheating. In short: erasure is loss of information, and information loss corresponds exactly to heat dissipated to the environment.

    Thus "information is energy".
     
    Last edited: Sep 26, 2011
  4. Chipz Banned

    Messages:
    838
    I thought the primary focus today was on expanding computational throughput.
     
  5. MrMormon Registered Senior Member

    Messages:
    73
    I thought it was irreversible logic gates like OR and NAND that generate the heat, nothing macro-level. And quantum circuits only provide a speed-up for certain types of algorithms, correct?
     
  6. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    The reason 2-input gates are irreversible is that they have only one output, so there isn't enough information to recover the input states. But say you have two outputs instead: for a 2-input OR gate, "pass through" one of the inputs to a second output. Now you can reconstruct (using another gate, i.e. computing forwards) some of the input states of the first gate.

    If you move "up" to 3-input/3-output gates you have enough information to reconstruct all 2[sup]3[/sup] = 8 combinations of the three input bits. Landauer raised this problem in the early 1960s; Bennett, Toffoli, Fredkin, et al developed reversible gates through the 1970s and '80s. You could look up those names and I'm sure you'd find something. Also try "the erasure problem".
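
    To make the counting concrete, here's a toy sketch of my own: tabulate the Toffoli gate (ccNOT) and check it's a bijection on all 8 input combinations, unlike a bare 2-input OR:

    [code]
    from itertools import product

    def toffoli(a, b, c):
        # Pass a and b through unchanged; flip c only when both controls are 1.
        return (a, b, c ^ (a & b))

    outputs = [toffoli(*bits) for bits in product((0, 1), repeat=3)]
    assert len(set(outputs)) == 8   # bijective: 8 inputs -> 8 distinct outputs

    # Contrast: 2-input OR has 4 input combinations but only 2 distinct
    # outputs, so (0,1), (1,0) and (1,1) can't be told apart afterwards.
    or_outputs = [a | b for a, b in product((0, 1), repeat=2)]
    assert len(set(or_outputs)) == 2
    [/code]

    Because the map is one-to-one, the inputs can always be computed back from the outputs: no information is destroyed, so nothing has to be erased.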

    Any physical logic gate generates heat, yes. But Landauer and Bennett showed that this can be kept arbitrarily small. The main reason modern transistor gates and high gate-density chips generate heat is the speed we want them to run at. IOW, if you accept that computation is independent of the speed at which it occurs, heat generation is not a function of computation per se, it's a function of convenience for us.
     
  7. MrMormon Registered Senior Member

    Messages:
    73
    Thanks for laying that out, but I knew that already. Since we see eye to eye, what do you think about my idea?
     
  8. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    I'm not familiar with the terminology: "one-to-one logic gates". Can you explain that?
     
  9. MrMormon Registered Senior Member

    Messages:
    73
    Same thing as reversible, sorry.
     
  10. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Alternatively you could "keep the inputs" by making them outputs of each gate. Either way you run into a problem: you need enough room for all the wiring, or enough memory (plus all the memory-writing gates). I think Bennett discusses this somewhere.

    But that doesn't overcome the erasure problem. I think the conclusion is that the problem is fundamental, because information and heat (a form of energy) are the same thing, at least in thermodynamic devices, which all digital logic is a subset of.

    I assume you've read about the computer that runs on Brownian motion?
     
  11. MrMormon Registered Senior Member

    Messages:
    73
    I just did, at your suggestion.


    Apparently Brownian computation requires heat-generating error correction, so it's a non-improvement. I wonder if the statement that "Landauer erasure cost is negligible compared to other sources of dissipation in today’s computers" in that five-year-old article is still true, though?
    Could you explain?
    My idea doesn't overcome the erasure problem; it involves erasing information far from the main (reversible) processor(s) in order to prevent overheating and therefore would allow higher processor speeds.
     
  12. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    What I mean by "keep the inputs" is passing them through each gate. But then a circuit will use more room. This is one reason digital logic doesn't pass inputs through: to economise on space and allow higher densities. Another reason is that reversibility of a computation, or a partial computation, isn't generally a requirement. However, that is definitely not true of quantum computation, which has to be reversible: you have to "keep the inputs". If you erase them (or they dissipate) you have no output either. That's what "unitarity" means, more or less.



    Fredkin and Toffoli gates do use "passthrough". Have you heard of cNOT and ccNOT? Have you looked at quantum computation?
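
    If it helps, here's a small numpy sketch (my own, using the standard matrix form of cNOT) of what "reversible by definition" means for a gate: the matrix is unitary and, in this case, its own inverse:

    [code]
    import numpy as np

    # cNOT in the computational basis |00>, |01>, |10>, |11>:
    # flips the target bit when the control bit is 1.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Unitary: U†U = I, so the gate is reversible...
    assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))
    # ...and this particular gate is its own inverse.
    assert np.allclose(CNOT @ CNOT, np.eye(4))
    [/code]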
     
    Last edited: Sep 27, 2011
  13. MrMormon Registered Senior Member

    Messages:
    73
    Yes, I understand ccNOT etc. So you were just saying that reversible gates take up too much room to have an advantage in space-limited computers?

    My idea only has to do with classical computers, but we can talk about quantum computation as well if you want to. Conventional or adiabatic?


     
  14. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Sorry, I've been busy with assignments and stuff. But this thread is germane to the courses I'm doing.

    Anyhoo.
    I'm not sure the idea of storing "reversible" logic gate inputs/outputs is a starter in classical systems, because you would need quite a lot of storage. If the idea is to increase the speed of computation by using reversible logic and not erasing information, that is unfortunately equivalent to maintaining all the inputs up until a reverse computation is needed, and the system would need to predict all of this.

    It would be a trade-off between maintaining the inputs in place, or storing all their information "forwards" and then reading the memory when a reverse operation is required. But that just shifts the complexity from one place to another, and the memory system would be at least as complex as the circuitry you want to make more efficient.

    Then along comes quantum computing, which has its own storage problems.
    Again, the problem seems to be about how complex you want any computer to be. Storage in QC is a design problem because you need to copy states, and you can't clone them like you can classical bits. This is tied fundamentally to the erasure problem and the fact that information is energy: you can transport quantum states only if the original state dissipates, i.e. is erased. And as mentioned, reversibility isn't just desirable in QC, it's a requirement.

    However, the science of QC is maturing; there are plenty of theoretical gate models you can connect together to realise a system that uses Boolean logic. I borrowed the book "Quantum Computer Science" by David Mermin recently and it's a reasonably easy read.

    Quantum gates like cNOT etc. are general, and there are a number of ways to implement them. You could use laser beams and optical gear like splitters and mirrors, or other kinds of waveguide. Almost anything that can exhibit "useful" quantum behaviour could be used in theory, so I'm not sure what you mean by "conventional" QC.
     
  15. MrMormon Registered Senior Member

    Messages:
    73
    Are we no longer talking about the distant erasure idea, then?


    It's been a while since I read about this stuff. Since one can erase a quantum state, why must operations be reversible? For the parallelism? But yeah, I suppose this discussion is relevant whether we're talking quantum or classical.


    Conventional means with qubits and gates, like what we've been talking about. Adiabatic quantum computing involves starting with a simple system in a known quantum state and slowly transforming it into a system that represents the problem to be solved. If the transformation is sufficiently slow (and there's no exponential speed-up over classical computers, sadly), the resulting states will encode the solution. That's about all I know...
     
  16. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    It has been a while, hasn't it?
    I started posting a reply earlier today while in the lab, but canned it.

    Anyway, the short answer is that the only irreversible transformation (i.e. operation) in QM is measurement of a state. Otherwise the evolution of a state is unitary, and reversible by definition. Not sure if it's worth delving into the algebra if it really has been that long. Are you familiar with Hermitian operators and Dirac notation?

    Do you understand expectation values, affine logic and linear algebra? If not, I'd suggest strongly that you find out something about them, and about vector representation.

    I suppose a quick and dirty illustration is the double-slit experiment: each 'dot' on the screen is an irreversible measurement of a state vector. Up until the wavefunction 'collapses', any transformation can be reversed; the only operators here are the double slit and the screen. In between, a particle is in a superposition of states, which means it 'occupies' all possible positions (potentially, any position on the screen), i.e. it has "parallelism", but it yields only one value when the screen acts as a measurement operator. That is, you can't tell anything about the superposition itself, and measuring a state (a dot on the screen) reduces all the possible positions to just one.
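
    A rough numerical toy of that distinction (my own sketch, using the standard conventions): the Hadamard gate is unitary, so applying it can be undone exactly, while sampling a measurement outcome throws the rest of the superposition away:

    [code]
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate, unitary
    zero = np.array([1.0, 0.0])                    # state |0>

    psi = H @ zero                     # superposition (|0> + |1>)/sqrt(2)
    assert np.allclose(H @ psi, zero)  # reversible: apply H again, back to |0>

    # Measurement: the Born rule gives outcome probabilities |amplitude|^2.
    # After sampling, only the single outcome remains; no way back to psi.
    probs = np.abs(psi) ** 2                       # [0.5, 0.5]
    outcome = np.random.choice([0, 1], p=probs)    # the 'dot on the screen'
    collapsed = np.eye(2)[outcome]                 # now |0> or |1>, superposition gone
    [/code]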

    Sorry if this seems a bit dismissive, I've been busy with a simulator and some rewriting rules.
     
    Last edited: Sep 28, 2011
  17. MrMormon Registered Senior Member

    Messages:
    73
    Oh, okay. Confusion destroyed. I know linear algebra and a bit of ket notation, but I haven't formally studied quantum mechanics. You still haven't commented directly on my idea, but where you're going might be way over my head. Where are you going?
     
  18. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    But I have said that the memory system your idea would need would be at least as complex as the circuitry whose "input information" you want to keep. It would only shift the problem; I don't know that you can call it a solution that lets you run a faster computer by "relocating" the erasure problem.


    That problem is fundamental, and corresponds to having enough room to store information without needing to erase it (i.e. having a potentially infinite memory). In practice it's still more efficient to erase information that isn't needed.
     
  19. MrMormon Registered Senior Member

    Messages:
    73
    I think we're seeing this from different perspectives. Your concern is a loss in memory density. I think that's an acceptable sacrifice for an increase in speed. Do I understand you correctly?
     
  20. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    No, I said that in order to transport information that needed to be erased, you would need a memory system at least as complex as the circuitry you want to simplify somehow. I'm not sure how that translates into a loss of memory density(?)

    Let's look at your idea again.

    I don't know that I follow that argument. If you don't have any memory, it can't be filled. Do you mean: if you keep all the inputs, you need a lot of gates in order to have a sufficiently general computer?
    And why would that be less of a problem than having enough gates that you never need to erase information, i.e. never have to reuse any of the gates?
    But in that case the system would necessarily have a ceiling on the highest safe rate of memory operations.

    It really only shifts the problem, as I see it: from having sufficient gates that you can "keep" the inputs where they are without having to reuse any (so no erasure is needed), to having sufficient memory that you can store any arbitrary combination of inputs. But then you need to keep track of multiple permutations of gate inputs, know when they are needed so they don't get erased, and be able to restore the inputs to the gates when required.

    It doesn't reduce the problem but seems to make it more complex.
     
    Last edited: Sep 29, 2011