Ray Kurzweil is probably totally wrong

Discussion in 'Intelligence & Machines' started by Roman, Apr 8, 2009.

Thread Status:
Not open for further replies.
  1. Roman Banned

    Messages:
    11,560
    An interesting NYT article

    Some interesting excerpts:

    "Our colleague David Linden has compared the evolutionary history of the brain to the task of building a modern car by adding parts to a 1925 Model T that never stops running. As a result, brains differ from computers in many ways, from their highly efficient use of energy to their tremendous adaptability."

    "One striking feature of brain tissue is its compactness. In the brain’s wiring, space is at a premium, and is more tightly packed than even the most condensed computer architecture. One cubic centimeter of human brain tissue, which would fill a thimble, contains 50 million neurons; several hundred miles of axons, the wires over which neurons send signals; and close to a trillion (that’s a million million) synapses, the connections between neurons."

    "But unlike a computer, connections between neurons can form and break too, a process that continues throughout life and can store even more information because of the potential for creating new paths for activity. Although we’re forced to guess because the neural basis of memory isn’t understood at this level, let’s say that one movable synapse could store one byte (8 bits) of memory. That thimble would then contain 1,000 gigabytes (1 terabyte) of information. A thousand thimblefuls make up a whole brain, giving us a million gigabytes — a petabyte — of information. To put this in perspective, the entire archived contents of the Internet fill just three petabytes.

    To address this challenge, Kurzweil invokes Moore’s Law, the principle that for the last four decades, engineers have managed to double the capacity of chips (and hard drives) every year or two. If we imagine that the trend will continue, it’s possible to guess when a single computer the size of a brain could contain a petabyte. That would be about 2025 to 2030, just 15 or 20 years from now.

    This projection overlooks the dark, hot underbelly of Moore’s law: power consumption per chip, which has also exploded since 1985. By 2025, the memory of an artificial brain would use nearly a gigawatt of power, the amount currently consumed by all of Washington, D.C. So brute-force escalation of current computer technology would give us an artificial brain that is far too costly to operate.

    Compare this with your brain, which uses about 12 watts, an amount that supports not only memory but all your thought processes. This is less than the energy consumed by a typical refrigerator light, and half the typical needs of a laptop computer. Cutting power consumption by half while increasing computing power many times over is a pretty challenging design standard. As smart as we are, in this sense we are all dim bulbs."
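
    For anyone who wants to check the article's arithmetic, here is a quick back-of-envelope script. The one-byte-per-movable-synapse figure and the thousand-thimbleful brain are the article's own guesses, and the 18-month doubling period is just one conventional reading of Moore's Law, so treat the output as a sanity check rather than a prediction:

        import math

        # Figures quoted in the article (guesses, not measurements):
        synapses_per_cm3 = 1e12   # ~a trillion synapses per "thimbleful" (1 cm^3)
        bytes_per_synapse = 1     # the article's one-byte-per-synapse guess
        cm3_per_brain = 1000      # ~a thousand thimblefuls per whole brain

        bytes_per_cm3 = synapses_per_cm3 * bytes_per_synapse
        bytes_per_brain = bytes_per_cm3 * cm3_per_brain
        print(f"per cm^3 : {bytes_per_cm3 / 1e12:.0f} TB")    # ~1 TB
        print(f"per brain: {bytes_per_brain / 1e15:.0f} PB")  # ~1 PB

        # Years until a single machine holds a petabyte, starting from a
        # hypothetical 1 TB desktop drive in 2009 and doubling every 18 months:
        doublings = math.log2(bytes_per_brain / 1e12)         # 1 TB -> 1 PB
        years = doublings * 1.5
        print(f"~{doublings:.0f} doublings, ~{years:.0f} years, around {2009 + round(years)}")

    That lands around 2024 with an 18-month doubling and around 2029 with a two-year doubling, which is where the article's "2025 to 2030" window comes from.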
     
  3. kmguru Staff Member

    Messages:
    11,757
    What a moronic comment! The comparison should be the brain versus information science. That is what computers do!

    The brain computes; so does the computer.

    As to whether we will have such computers, or whatever they end up being called (like going from the abacus to the Excel spreadsheet)...that is a no-brainer...to people who have the brains to understand the reality!

    No, Kurzweil is not wrong...

    Sam Wang is a biologist, not an information scientist...he has never designed an integrated circuit or studied information science. It is the same as when dentists try to do brain surgery....

    Otherwise, he would be the chief architect of knowledge management for the Department of Defense!

    He is also a professor of neuroscience, yet I bet he has not taken a master's-level course in cybernetics. Most pathologists fill slots in neuroscience and nobody says anything.
     
  5. kmguru Staff Member

    Messages:
    11,757
    I do not know if it will be printed, but here is my comment to the NYT:

     
  7. Roman Banned

    Messages:
    11,560
    So you really think we'll see singularities on everyone's desktop in the next 30 years? I guess if Moore's law has held true for this long, it must be true forever!

    Besides, Kurzweil is a futurist wanking to his own delusions that he's worth preserving forever. He was wrong last decade, and he's wrong this decade.
     
  8. spidergoat pubic diorama Valued Senior Member

    Messages:
    54,036
    Moore's Law has already bridged several different paradigms of computer design. There is no reason that computing could not use less power in the future. Working computers have even been made with DNA, which uses very little power.
     
  9. Roman Banned

    Messages:
    11,560
    Working computers have also been made out of neurons.

    Oh wait!
     
  10. kmguru Staff Member

    Messages:
    11,757
    People who have been in the computer field for the last 30 years know where the technology is going. It has nothing to do with Ray. He just agrees with us.
     
  11. spidergoat pubic diorama Valued Senior Member

    Messages:
    54,036
    And future artificial computers could be made by simply scanning a human brain and reproducing its components.
     
  12. Roman Banned

    Messages:
    11,560
    Really?
    Just like that?
    I don't think so.

    Simple scanning isn't going to let you grow a brain. The transcription factors (TFs) involved are highly complex, and it takes about 20 years for a brain to fully mature.

    That, and you really haven't hit a singularity; you've just grown a brain. That happens to be something some of us have already done!
     
  13. spidergoat pubic diorama Valued Senior Member

    Messages:
    54,036
    Scanning technology is getting better and better, and we don't need atomic resolution to create a working brain, since the brain itself is tolerant of error. The brain is limited by its need to pass through the birth canal. We could merge two or more brains together. We wouldn't need to make it out of perishable flesh, but could print it in durable silicon. The singularity isn't a single computer; rather, it describes the point at which technology surpasses human intelligence and can start to direct itself. In other words, we can build a computer that designs computers.
     
  14. Roman Banned

    Messages:
    11,560
    But a trillion silicon connections per cubic centimeter?
    Is that even feasible?

    Is there any evidence that shows a silicon brain works the same as a fleshy one? There may be (likely are) emergent properties of having everything squishy and wet and tetravalent.
     
  15. spidergoat pubic diorama Valued Senior Member

    Messages:
    54,036
    Maybe, but we aren't limited by the needs of a body. We might not have to provide blood. We might discover a better way to perform the same functions. Maybe we do grow it out of flesh instead of building it, a disembodied head a mile across!
     
  16. kmguru Staff Member

    Messages:
    11,757
    The human brain manages data and information. All you have to do is duplicate that data and information management with an automaton. We are not there yet. I am trying to get a certain agency to allocate some funds, since I am the architect for their simple solutions...but due to the economic crisis, no funds are available for new applications.

    Maybe 5 years down the road....
     
  17. D H Some other guy Valued Senior Member

    Messages:
    2,257
    No. AI researchers simply assume that the brain computes -- and does nothing but compute. The brain is an analog device. Computers are digital devices, and are thus limited by the fact that almost all numbers are not computable. Digital computers are limited by Church's thesis; that analog devices are similarly limited is an assumption, not a conclusion.
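
    The "almost all numbers are not computable" point is a counting argument, sketched here for context (not something the post spells out): there are only countably many finite programs, so

        |{computable reals}| = aleph_0  <  2^(aleph_0) = |R|

    and the computable reals form a countable, measure-zero subset of the reals; almost every real number has no program that outputs its digits.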
     
  18. hypewaders Save Changes Registered Senior Member

    Messages:
    12,061
    Not only are artificial "neural" networks becoming real; so are synthetic synapses. We're not purely analog (DNA, for example), and the "digital revolution" won't stay purely binary. It's natural to intuit that we and AI will not long remain separate organisms.

    On Singularity generally: How can we deny our trajectory?
     
    Last edited: Apr 16, 2009