HUMAN BRAIN AS CPU (Speculative Estimates)

Discussion in 'Intelligence & Machines' started by Rick, Nov 19, 2001.

Thread Status:
Not open for further replies.
  1. Rick Valued Senior Member

    Messages:
    3,336
    Hi,
    this is some interesting information I gathered recently about the human brain. I'd like to share it.
    ==============================================
    Operation          Estimated speed or capacity
    ==============================================
    Input:             Fast (1 gigabit per second; the
                       human retina has a resolution of
                       127 million pixels!)
                       (Jesus Christ!! that shook me!!!!)

    Processing:        Fast for pattern recognition
                       (10 billion instructions per second);
                       slow for calculation (2-100 bps)

    Output:            Slow (speech: 100 bps)

    Storage:           Very large (10 terabytes), but
                       retrieval can be uncertain.

    Any inputs or additions are welcome.
    Bye!
     
  3. kmguru Staff Member

    Messages:
    11,757
    The human mind does not store every bit of information. Roughly 30 to 100% of the bits are first stored in short-term memory. Then, when they go to long-term storage, they are compressed, patterned, and filtered so that only the stuff you need for survival is stored. The rest is dumped. Even then, only summaries are stored, which can be expanded based on other summaries.

    Some people have better short-term memory, so they can recall the details of a page of text. Others are not so lucky.

    But...it is amazing how much stuff we are capable of....
     
  5. Rick Valued Senior Member

    Messages:
    3,336
    What I found absolutely amazing was the resolution of the retina. Imagine: I was proud of our generation's computer resolution, until I checked my own eye.
     
  7. Stryder Keeper of "good" ideas. Valued Senior Member

    Messages:
    13,101
    I would say that the way the human brain stores information is a kind of mantra (or fractalating aura of electromagnetics).

    It's kind of hard to explain, but in quantum mechanics light is known to move in waves, and at times the waves intersect to create lulls or rises. I mention this because a fractalating mantra of these lulls and rises would be how our memory works.

    When the memory is first placed, it isn't fragmented, so it is easy to remember; but the longer the memory is there, the more information is processed over that area, making the information fractalised. This means that when we have to pull that memory, an area has to process a method to "defrag" the memory out of its fractal pattern.

    (These lulls and rises, if emanating from a neural cluster, can in fact penetrate the centre of other neural clusters, giving a non-local connection to the cluster without actually using a synaptic pathway.)
     
  8. kmguru Staff Member

    Messages:
    11,757
    While present speculation about human memory is exploring fractal and quantum structures, I still think it is more like holography, where the resolution depends on the size of the original recording. Take a piece and the resolution suffers, but the pattern can still be recognized, as long as the percentage of the original recording (short-term memory) has not diminished to a point of no return.

    Since the brain is capable of heavy-duty calculations (a bat can calculate distance and navigate too), a fractal structure may be possible too.

    The jury is still out....
     
  9. daktaklakpak God is irrelevant! Registered Senior Member

    Messages:
    710
    An eagle's eyes are about 50 times more efficient than a human's. Do you know their resolution?


     
  10. kmguru Staff Member

    Messages:
    11,757
    Eagle Eye

    " It's impossible to know for sure what the world looks like to an eagle, but we know from studying the anatomy of their eyes that their view must be enlarged and magnified compared to our view. Eagle eyes are the same size (weight) as human eyes (though a full grown adult Bald Eagle weighs no more than about 14 pounds!) But an eagle eye has a much different shape from ours. The back is flatter and larger than the back of our eye, giving an eagle a much larger image than we can see. And its retina has much more concentrated rod and cone cells-the cells that send sight information to the brain. Some animals, including humans, have a special area on their retina called the fovea where there is an enormous concentration of these vision cells. In a human, the fovea has 200,000 cones per millimeter, giving us wonderful vision. In the central fovea of an eagle there are about a MILLION cones per millimeter. That's about the same number of visual cells as the finest computer monitor has on its entire screen when set at its highest resolution. The resolution for a person would be similar to setting a computer's screen at a much lower resolution."

    Link: http://www.learner.org/jnorth/tm/eagle/VisionA.html
     
  11. Chagur .Seeker. Registered Senior Member

    Messages:
    2,235
    kmguru ...

    The link to 'Journey North' is really decent.

    Thanks.
     
  12. tony1 Jesus is Lord Registered Senior Member

    Messages:
    2,279
    *Originally posted by zion
    ...
    Input: Fast (1 gigabit per second; the human retina
    has a resolution of 127 million pixels!)
    (Jesus Christ!! that shook me!!!!)

    Processing: Fast for pattern recognition
    (10 billion instructions per second);
    slow for calculation (2-100 bps)

    Output: Slow (speech: 100 bps)

    Storage: Very large (10 terabytes), but retrieval can
    be uncertain.

    any inputs or additions are welcome.
    *

    Accuracy: near zero
    Self-check: none detected

    Apparently, for post-slinging on the web, the last two mean nothing, so presumably that's why you left them out.
    For real life, one might want to develop ways to improve on them.
     
  13. Rick Valued Senior Member

    Messages:
    3,336
    Tony 1,

    The thread was called HUMAN BRAIN: SPECULATIVE estimates.


    bye!
     
  14. Stryder Keeper of "good" ideas. Valued Senior Member

    Messages:
    13,101
    Perhaps if we took a closer look at Tony1's psyche we could work out the flaws that occurred in pre-Homo sapiens that made them less agile and unable to communicate or structure solutions to problems.

    I would guess that his head shape would have an overhanging forehead over the unevolved fore lobes, presenting itself as a fear of trusting equations and the need to believe in a god-like being.

    Reminds me of a 286.
     
  15. Avatar smoking revolver Valued Senior Member

    Messages:
    19,083
    Virtual-reality-generating output: great.

    (Think of all the things you can imagine with your brain, making your fantasies visual, i.e. virtual reality.)

    Sound recognition: poor compared to other species.

    Smell sensitivity and recognition: poor.
     
  16. kmguru Staff Member

    Messages:
    11,757
    There is a new theory by a respected physicist that the human brain does have quantum computing capabilities. I am too lazy to research it. Check it out...

    I am not sure whether it was Wolfram or someone else! If it is true, it would be really interesting...
     
  17. Rick Valued Senior Member

    Messages:
    3,336
    KM,
    Stryder started a topic about quantum brains way back; in it you replied that you'll come up with answers soon... I am still waiting for the day...

    bye!
     
  18. Rick Valued Senior Member

    Messages:
    3,336
    The interesting article below is taken from a website called:
    http://www.merkle.com/humanMemory.html

    How Many Bytes in Human Memory?
    by Ralph C. Merkle

    This article first appeared in Foresight Update No. 4, October 1988.

    A related article on the computational limits of the human brain is available on the web.

    Today it is commonplace to compare the human brain to a computer, and the human mind to a program running on that computer. Once seen as just a poetic metaphor, this viewpoint is now supported by most philosophers of human consciousness and most researchers in artificial intelligence. If we take this view literally, then just as we can ask how many megabytes of RAM a PC has we should be able to ask how many megabytes (or gigabytes, or terabytes, or whatever) of memory the human brain has.

    Several approximations to this number have already appeared in the literature based on "hardware" considerations (though in the case of the human brain perhaps the term "wetware" is more appropriate). One estimate of 10^20 bits is actually an early estimate (by von Neumann in The Computer and the Brain) of all the neural impulses conducted in the brain during a lifetime. This number is almost certainly larger than the true answer. Another method is to estimate the total number of synapses, and then presume that each synapse can hold a few bits. Estimates of the number of synapses have been made in the range from 10^13 to 10^15, with corresponding estimates of memory capacity.

    A fundamental problem with these approaches is that they rely on rather poor estimates of the raw hardware in the system. The brain is highly redundant and not well understood: the mere fact that a great mass of synapses exists does not imply that they are in fact all contributing to memory capacity. This makes the work of Thomas K. Landauer very interesting, for he has entirely avoided this hardware guessing game by measuring the actual functional capacity of human memory directly (See "How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory", in Cognitive Science 10, 477-493, 1986).

    Landauer works at Bell Communications Research--closely affiliated with Bell Labs where the modern study of information theory was begun by C. E. Shannon to analyze the information carrying capacity of telephone lines (a subject of great interest to a telephone company). Landauer naturally used these tools by viewing human memory as a novel "telephone line" that carries information from the past to the future. The capacity of this "telephone line" can be determined by measuring the information that goes in and the information that comes out, and then applying the great power of modern information theory.

    Landauer reviewed and quantitatively analyzed experiments by himself and others in which people were asked to read text, look at pictures, and hear words, short passages of music, sentences, and nonsense syllables. After delays ranging from minutes to days the subjects were tested to determine how much they had retained. The tests were quite sensitive--they did not merely ask "What do you remember?" but often used true/false or multiple choice questions, in which even a vague memory of the material would allow selection of the correct choice. Often, the differential abilities of a group that had been exposed to the material and another group that had not been exposed to the material were used. The difference in the scores between the two groups was used to estimate the amount actually remembered (to control for the number of correct answers an intelligent human could guess without ever having seen the material). Because experiments by many different experimenters were summarized and analyzed, the results of the analysis are fairly robust; they are insensitive to fine details or specific conditions of one or another experiment. Finally, the amount remembered was divided by the time allotted to memorization to determine the number of bits remembered per second.

    The remarkable result of this work was that human beings remembered very nearly two bits per second under all the experimental conditions. Visual, verbal, musical, or whatever--two bits per second. Continued over a lifetime, this rate of memorization would produce somewhat over 10^9 bits, or a few hundred megabytes.
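Landauer's number is easy to sanity-check. A quick Python sketch follows; the 70-year lifespan and 16 waking hours per day are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope check of Landauer's lifetime-memory estimate.
# Illustrative assumptions: 70-year lifespan, 16 waking hours/day,
# a steady memorization rate of 2 bits per second.
rate_bits_per_sec = 2
waking_seconds = 70 * 365 * 16 * 3600   # ~1.5e9 seconds

lifetime_bits = rate_bits_per_sec * waking_seconds
lifetime_mb = lifetime_bits / 8 / 1e6   # convert bits -> megabytes

print(f"lifetime bits: {lifetime_bits:.2e}")      # ~2.9e9 bits
print(f"lifetime storage: {lifetime_mb:.0f} MB")  # a few hundred megabytes
```

This lands in the "somewhat over 10^9 bits, or a few hundred megabytes" range the article quotes.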

    While this estimate is probably only accurate to within an order of magnitude, Landauer says "We need answers at this level of accuracy to think about such questions as: What sort of storage and retrieval capacities will computers need to mimic human performance? What sort of physical unit should we expect to constitute the elements of information storage in the brain: molecular parts, synaptic junctions, whole cells, or cell-circuits? What kinds of coding and storage methods are reasonable to postulate for the neural support of human capabilities? In modeling or mimicking human intelligence, what size of memory and what efficiencies of use should we imagine we are copying? How much would a robot need to know to match a person?"

    What is interesting about Landauer's estimate is its small size. Perhaps more interesting is the trend--from Von Neumann's early and very high estimate, to the high estimates based on rough synapse counts, to a better supported and more modest estimate based on information theoretic considerations. While Landauer doesn't measure everything (he did not measure, for example, the bit rate in learning to ride a bicycle, nor does his estimate even consider the size of "working memory") his estimate of memory capacity suggests that the capabilities of the human brain are more approachable than we had thought. While this might come as a blow to our egos, it suggests that we could build a device with the skills and abilities of a human being with little more hardware than we now have--if only we knew the correct way to organize that hardware.


    bye!
     
  19. Rick Valued Senior Member

    Messages:
    3,336
    The article given below, from the same website, is also worth a look.

    Energy Limits to the Computational Power of the Human Brain
    by Ralph C. Merkle
    This article first appeared in Foresight Update No. 6, August 1989.

    A related article on the memory capacity of the human brain is also available on the web.

    The Brain as a Computer
    The view that the brain can be seen as a type of computer has gained general acceptance in the philosophical and computer science community. Just as we ask how many mips or megaflops an IBM PC or a Cray can perform, we can ask how many operations the human brain can perform. Neither the mip nor the megaflop seems quite appropriate, though; we need something new. One possibility is the number of synapse operations per second.
    A second possible basic operation is inspired by the observation that signal propagation is a major limit. As gates become faster, smaller, and cheaper, simply getting a signal from one gate to another becomes a major issue. The brain couldn't compute if nerve impulses didn't carry information from one synapse to the next, and propagating a nerve impulse using the electrochemical technology of the brain requires a measurable amount of energy. Thus, instead of measuring synapse operations per second, we might measure the total distance that all nerve impulses combined can travel per second, e.g., total nerve-impulse-distance per second.

    Other Estimates
    There are other ways to estimate the brain's computational power. We might count the number of synapses, guess their speed of operation, and determine synapse operations per second. There are roughly 10^15 synapses operating at about 10 impulses/second [2], giving roughly 10^16 synapse operations per second.
    A second approach is to estimate the computational power of the retina, and then multiply this estimate by the ratio of brain size to retinal size. The retina is relatively well understood so we can make a reasonable estimate of its computational power. The output of the retina--carried by the optic nerve--is primarily from retinal ganglion cells that perform center surround computations (or related computations of roughly similar complexity). If we assume that a typical center surround computation requires about 100 analog adds and is done about 100 times per second [3], then computation of the axonal output of each ganglion cell requires about 10,000 analog adds per second. There are about 1,000,000 axons in the optic nerve [5, page 21], so the retina as a whole performs about 10^10 analog adds per second. There are about 10^8 nerve cells in the retina [5, page 26], and between 10^10 and 10^12 nerve cells in the brain [5, page 7], so the brain is roughly 100 to 10,000 times larger than the retina. By this logic, the brain should be able to do about 10^12 to 10^14 operations per second (in good agreement with the estimate of Moravec, who considers this approach in more detail [4, page 57 and 163]).
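The retina-scaling chain above multiplies out as follows; a small Python sketch of the arithmetic, with all numbers taken from the paragraph:

```python
# Retina-scaling estimate of the brain's computational power.
adds_per_computation = 100       # analog adds per center-surround computation
computations_per_sec = 100       # per ganglion cell per second
optic_nerve_axons = 1_000_000    # axons in the optic nerve

retina_ops = adds_per_computation * computations_per_sec * optic_nerve_axons
# -> 1e10 analog adds per second for the whole retina

retina_neurons = 1e8
brain_neurons = (1e10, 1e12)     # low and high estimates for the whole brain
low = retina_ops * brain_neurons[0] / retina_neurons    # 1e12 ops/s
high = retina_ops * brain_neurons[1] / retina_neurons   # 1e14 ops/s
print(f"brain estimate: {low:.0e} to {high:.0e} operations per second")
```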

    The Brain Uses Energy
    A third approach is to measure the total energy used by the brain each second, and then determine the energy used for each basic operation. Dividing the former by the latter gives the maximum number of basic operations per second. We need two pieces of information: the total energy consumed by the brain each second, and the energy used by a basic operation.
    The total energy consumption of the brain is about 25 watts [2]. Inasmuch as a significant fraction of this energy will not be used for useful computation, we can reasonably round this to 10 watts.

    Nerve Impulses Use Energy
    Nerve impulses are carried by either myelinated or un-myelinated axons. Myelinated axons are wrapped in a fatty insulating myelin sheath, interrupted at intervals of about 1 millimeter to expose the axon. These interruptions are called nodes of Ranvier. Propagation of a nerve impulse in a myelinated axon is from one node of Ranvier to the next, jumping over the insulated portion.
    A nerve cell has a resting potential--the outside of the nerve cell is 0 volts (by definition), while the inside is about -60 millivolts. There is more Na+ outside a nerve cell than inside, and this chemical concentration gradient effectively adds about 50 extra millivolts to the voltage acting on the Na+ ions, for a total of about 110 millivolts [1, page 15]. When a nerve impulse passes by, the internal voltage briefly rises above 0 volts because of an inrush of Na+ ions.

    The Energy of a Nerve Impulse
    Nerve cell membranes have a capacitance of 1 microfarad per square centimeter, so the capacitance of a relatively small 30 square micron node of Ranvier is 3 x 10^-13 farads (assuming small nodes tends to overestimate the computational power of the brain). The internodal region is about 1,000 microns in length, 500 times longer than the 2 micron node, but because of the myelin sheath its capacitance is about 250 times lower per square micron [5, page 180; 7, page 126], or only twice that of the node. The total capacitance of a single node and internodal gap is thus about 9 x 10^-13 farads. The total energy in joules held by such a capacitor at 0.11 volts is 1/2 x V^2 x C, or 1/2 x 0.11^2 x 9 x 10^-13, or 5 x 10^-15 joules. This capacitor is discharged and then recharged whenever a nerve impulse passes, dissipating 5 x 10^-15 joules. A 10 watt brain can therefore do at most 2 x 10^15 such Ranvier ops per second. Both larger myelinated fibers and unmyelinated fibers dissipate more energy. Various other factors not considered here increase the total energy per nerve impulse [8], causing us to somewhat overestimate the number of Ranvier ops the brain can perform. It still provides a useful upper bound and is unlikely to be in error by more than an order of magnitude.
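The capacitor arithmetic can be reproduced in a few lines of Python (figures from the article; the 10-watt budget is Merkle's rounded value for useful computation):

```python
# Energy dissipated per 1 mm "Ranvier op" and the resulting ops/s ceiling.
V = 0.11                  # effective membrane voltage, volts
C_node = 3e-13            # farads: ~30 square micron node of Ranvier
C_internode = 2 * C_node  # internodal gap adds about twice the node's capacitance
C_total = C_node + C_internode              # ~9e-13 F per node-plus-gap

energy_per_impulse = 0.5 * V**2 * C_total   # joules per 1 mm jump
brain_power = 10.0                          # watts available for computation

ranvier_ops = brain_power / energy_per_impulse
print(f"energy per impulse: {energy_per_impulse:.1e} J")  # ~5.4e-15 J
print(f"Ranvier ops per second: {ranvier_ops:.1e}")       # ~1.8e15, i.e. ~2e15
```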
    To translate Ranvier ops (1-millimeter jumps) into synapse operations we must know the average distance between synapses, which is not normally given in neuroscience texts. We can estimate it: a human can recognize an image in about 100 milliseconds, which can take at most 100 one-millisecond synapse delays. A single signal probably travels 100 millimeters in that time (from the eye to the back of the brain, and then some). If it passes 100 synapses in 100 millimeters then it passes one synapse every millimeter--which means one synapse operation is about one Ranvier operation.

    Discussion
    Both synapse ops and Ranvier ops are quite low-level. The higher level analog addition ops seem intuitively more powerful, so it is perhaps not surprising that the brain can perform fewer of them.
    While the software remains a major challenge, we will soon be able to build hardware powerful enough to perform more such operations per second than can the human brain. There is already a massively parallel multi-processor being built at IBM Yorktown with a raw computational power of 10^12 floating point operations per second: the TF-1. It should be working in 1991 [6]. When we can build a desktop computer able to deliver 10^25 gate operations per second and more (as we will surely be able to do with a mature nanotechnology) and when we can write software to take advantage of that hardware (as we will also eventually be able to do), a single computer with abilities equivalent to a billion to a trillion human beings will be a reality. If a problem might today be solved by freeing all humanity from all mundane cares and concerns, and focusing all their combined intellectual energies upon it, then that problem can be solved in the future by a personal computer. No field will be left unchanged by this staggering increase in our abilities.

    Conclusion
    The total computational power of the brain is limited by several factors, including the ability to propagate nerve impulses from one place in the brain to another. Propagating a nerve impulse a distance of 1 millimeter requires about 5 x 10^-15 joules. Because the total energy dissipated by the brain is about 10 watts, this means nerve impulses can collectively travel at most 2 x 10^15 millimeters per second. By estimating the distance between synapses we can in turn estimate how many synapse operations per second the brain can do. This estimate is only slightly smaller than one based on multiplying the estimated number of synapses by the average firing rate, and two orders of magnitude greater than one based on functional estimates of retinal computational power. It seems reasonable to conclude that the human brain has a raw computational power between 10^13 and 10^16 operations per second.
    References
    1. Ionic Channels of Excitable Membranes, by Bertil Hille, Sinauer, 1984.
    2. Principles of Neural Science, by Eric R. Kandel and James H. Schwartz, 2nd edition, Elsevier, 1985.
    3. Tom Binford, private communication.
    4. Mind Children, by Hans Moravec, Harvard University Press, 1988.
    5. From Neuron to Brain, second edition, by Stephen W. Kuffler, John G. Nichols, and A. Robert Martin, Sinauer, 1984.
    6. The switching network of the TF-1 Parallel Supercomputer by Monty M. Denneau, Peter H. Hochschild, and Gideon Shichman, Supercomputing, winter 1988 pages 7-10.
    7. Myelin, by Pierre Morell, Plenum Press, 1977.
    8. The production and absorption of heat associated with electrical activity in nerve and electric organ by J. M. Ritchie and R. D. Keynes, Quarterly Review of Biophysics 18, 4 (1985), pp. 451-476.
    Acknowledgements
    The author would like to thank Richard Aldritch, Tom Binford, Eric Drexler, Hans Moravec, and Irwin Sobel for their comments and their patience in answering questions.


    bye!
     
  20. kmguru Staff Member

    Messages:
    11,757
    Every few decades we advance in technology, and then the brain is compared to the technology of that time. The brain has been compared to the abacus, the telephone switchboard, and now the computer. There is even a book called The Holotropic Mind. The problem is that we do not have an equivalent gadget handy to compare it to. Consider this.

    Audio: over an 85-year life the brain hears 24 x 365 x 85 = 744,600 hours of audio information.

    Video: visual input is 16 x 365 x 85 = 496,400 hours of visual data.
    Touch, taste, and so on... add more.
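The hour counts above check out. For a sense of scale, a Python sketch converts them into raw data volumes; the bitrates are illustrative assumptions (roughly CD-quality audio, plus the 1 gigabit/s retinal figure claimed at the top of the thread), not numbers from the post:

```python
# Lifetime sensory-exposure arithmetic (85-year lifespan, as in the post).
audio_hours = 24 * 365 * 85   # hearing runs around the clock
video_hours = 16 * 365 * 85   # waking hours only
assert audio_hours == 744_600 and video_hours == 496_400

# Illustrative bitrates (assumptions, not from the post):
audio_bps = 1.4e6   # ~CD-quality stereo audio
video_bps = 1e9     # the 1 gigabit/s retinal-input figure from the thread

audio_tb = audio_hours * 3600 * audio_bps / 8 / 1e12   # terabytes
video_tb = video_hours * 3600 * video_bps / 8 / 1e12
print(f"raw audio: ~{audio_tb:,.0f} TB, raw video: ~{video_tb:,.0f} TB")
```

Even under these crude assumptions the raw input vastly exceeds the 10-terabyte storage figure from the opening post, which is consistent with kmguru's point that the brain must compress and discard heavily.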

    The brain produces more result sets after computation.

    Some of the raw data is processed and stored in compressed mathematical form so that it can be expanded when the need arises.

    The prediction system of the brain has no comparable artificial counterpart. It works almost like a quantum computer.
     
  21. Avatar smoking revolver Valued Senior Member

    Messages:
    19,083
    No quantum computer has been built that we know of, so we cannot be sure precisely how one would work, and we know little of how the brain works.
     
  22. kmguru Staff Member

    Messages:
    11,757
    Based on the news, a quantum computer has been built and tested at the 4-qubit level. A 6-qubit version is being worked on.
     
  23. Avatar smoking revolver Valued Senior Member

    Messages:
    19,083

    And I didn't even know of this!


    Can you give me a link on this? (If you have a good one; if not, I can google it myself.)
     