Someone thinks that machines cane think as human brain ???

Discussion in 'Intelligence & Machines' started by Luis A.C.ROMANELLI, Oct 6, 2008.

Thread Status:
Not open for further replies.
  1. fedr808 1100101 Valued Senior Member

    Messages:
    6,706
    "machines cane think...." hopefully they cans spells bettuh two

    I don't think it's even fair to theorize about this. How can we make machines that emulate our brains when we've only seen the tip of the iceberg? We know very little about the human brain.
     
  3. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606
    Maybe it won't be "us" who emulates the brain but the technology we spawn!

    So to you, perfect emulation of the human brain hasn't happened, so it can't happen?

    Like our knowledge of the brain, the size of the tip of the iceberg concerning the potential of machines is also unknown, but equally irrelevant as evolution marches on. It's not that it wouldn't be possible, but I suspect that when "machines" are in charge, emulating humans won't even be on the back burner!
     
  5. fedr808 1100101 Valued Senior Member

    Messages:
    6,706
    I'm saying that we can't make a machine that perfectly emulates our brains when we don't even have a complete understanding of our own minds. It's like trying to copy a page from a book when half of the page is scribbled out and impossible to read.
     
  7. John99 Banned

    Messages:
    22,046
    An ant is much more intelligent than the most advanced microprocessor. Actually, there is no comparison.
     
  8. spidergoat Valued Senior Member

    Messages:
    51,740
    Sure we can. We don't have to understand everything about how it works; we can emulate its structure and how it grows. We can make neural networks that teach themselves how to do new tasks. Even our own DNA does not contain the total plan for the brain, only a series of instructions on how to grow.
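    [Editor's note: spidergoat's point, that a system can pick up a behaviour from examples rather than from an explicitly written rule, can be sketched with the simplest possible neural network, a single perceptron. This is a generic textbook illustration; the task, seed, learning rate, and epoch count are arbitrary choices, not anything from the thread.]

```python
import random

# A single artificial neuron that "programs itself": it starts with random
# weights and adjusts them from labelled examples. No rule for the task is
# ever written into the code; the rule emerges from training.

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = random.uniform(-1, 1)

# Training data: the logical AND function, as (inputs, expected output).
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def predict(x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

# Perceptron learning rule: nudge the weights whenever a prediction is wrong.
for _ in range(25):
    for x, target in examples:
        error = target - predict(x)
        weights = [w + 0.1 * error * xi for w, xi in zip(weights, x)]
        bias += 0.1 * error

print([predict(x) for x, _ in examples])  # the learned AND: [0, 0, 0, 1]
```

    A network of many such units, trained the same way, is the sense in which the DNA analogy holds: what is specified is the learning rule ("instructions on how to grow"), not the finished plan.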
     
  9. John99 Banned

    Messages:
    22,046
    As far as I know, the reason artificial intelligence is called "artificial" is that it can never make a decision. There are various options to choose from, but a human put those options there.

    I guess it's like saying that if you took a bunch of red cards and laid them one behind the other, stretching within reasonable viewing range, placed a blue card somewhere in there, and asked a person to pick out the blue card, he could do it easily. On the other hand, a person with bad eyesight would not be able to pick out the blue card, but only because he couldn't see it. That does not mean he is less intelligent.

    My spell checker:

    I mistyped the word "saying" as "asying", and my spell checker could not figure out what word I wanted to type. Obviously a human would be able to, because he or she would look at the words around the mistyped one and know instantly what it should be. Of course, if someone put the time into programming it to look at the surrounding words and make an assessment, it could do better, but it would still just be looking for patterns; it could never make a definitive determination. It just cannot know.
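    [Editor's note: the scheme John99 describes, ranking dictionary words by edit distance to the typo and breaking ties with the surrounding words, is in fact how simple context-aware correctors work. A minimal Python sketch; the vocabulary and bigram counts are invented for illustration.]

```python
# Toy context-aware spell corrector: rank vocabulary words by edit distance
# to the typo, then prefer the candidate that most often follows the
# previous word. VOCAB and AFTER_LIKE are made-up illustrative data.

def edit_distance(a, b):
    # Levenshtein distance via a single rolling row of dynamic programming.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # delete ca
                                     dp[j - 1] + 1,      # insert cb
                                     prev + (ca != cb))  # substitute
    return dp[-1]

VOCAB = ["saying", "staying", "swaying", "paying", "said"]
# How often each word followed "like" in some imagined corpus.
AFTER_LIKE = {"saying": 50, "staying": 5, "swaying": 1, "paying": 3}

def correct(typo, prev_word):
    def score(word):
        context = AFTER_LIKE.get(word, 0) if prev_word == "like" else 0
        return (edit_distance(typo, word), -context)  # distance, then context
    return min(VOCAB, key=score)

print(correct("asying", "like"))  # -> saying ("paying" is equally close,
                                  #    but the context breaks the tie)
```

    Note that John99's point survives the sketch: the program only compares strings and counts; it has no idea what "saying" means.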

    So the question is: can it be more intelligent? No. Can it be as intelligent? No, because it cannot learn new things.

    If I put a series of words into a program:

    love
    happiness
    contentment

    and a bunch of others, to a machine these are just meaningless characters. A human can take those words, expand on them, and create a poem from them that would be insightful and that others could learn from.

    The thing is that organic intelligence and mechanical intelligence exist on two separate planes.
     
  10. John99 Banned

    Messages:
    22,046
    I remember about ten years ago I was building a computer, and my friend asked how I was going to connect to the internet. I told him I was not going to connect at all, and he told me I was building a paperweight. Sure, this was an exaggeration, but in some ways he had a point.

    So what is the point? The point is that without humans, a CPU is just a rock, or sand compressed together. And that is all.
     
  11. Algernon Registered Senior Member

    Messages:
    176
    I'd agree here. Even though we don't fully understand how our brain works, it's really hard, considering there isn't a whole lot of research on how people develop their personalities. We've only managed to map out the direct correlations of body functions to the brain, such as motor reflexes, physiological homeostasis regulators, fight-or-flight responses, and so on. Sensory functions have been mapped to certain parts, and emotional stimuli can be linked to the pituitary gland and its hormonal and chemical secretions, but as to how all this sensory and internal input is integrated, we still don't have a definite answer. I think it partly has to do with the fact that people's lives are so varied that it would be hard to control any of the factors that affect the development of cognitive characteristics within the frontal cortex.

    However, it could theoretically be possible under extremely controlled conditions if we had a ton of identical babies (twins, triplets, clones, or whatnot) from the moment they are fertilized into zygotes... but no way in hell would anybody find the means to justify such an experiment ethically.

    Given enough flexibility, the ability to retain information, and decisions based on previous experiences or programmed emotional responses, it might be workable. (Here it would prove difficult, because an emotional response can sometimes only be emulated with a specific random set of variables: say the CPU experiences what correlates to a painful experience for a human, and there is a 10% probability it will learn from the pain, a 40% chance it will want revenge, and a 50% chance it will deal with it and learn. Those percentages would still have to be calculated, so it wouldn't in fact be a genuine emotional response to the situation.) There have been recent advances in "memristors", which can retain previous voltages or electrical currents even after all power has been shut off. Even though the technology is still primitive and expensive, it could prove valuable, allowing silicon-based machines to retain information when turned off, cutting down boot times and possibly reducing board chipsets and space.
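    [Editor's note: the probabilistic "emotional response" Algernon describes is trivial to implement, which is arguably his point: the reaction is drawn from fixed weights rather than felt. A Python sketch using the percentages from the post; the label wording is paraphrased, not from the source.]

```python
import random

# Draw a "reaction" to a painful event from fixed probabilities, per the
# percentages suggested above. Labels paraphrase the post's three outcomes.
RESPONSES = [
    ("learn from the pain", 0.10),
    ("want revenge", 0.40),
    ("deal with it and learn", 0.50),
]

def react():
    labels, weights = zip(*RESPONSES)
    return random.choices(labels, weights=weights, k=1)[0]

# Over many painful events the reaction frequencies simply converge on the
# chosen weights, which is why this is a simulation of emotion, not emotion.
```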

    I guess in a way we would have the most difficulty emulating our "soul" or "ghost", whatever you want to call it. There would be no way to actually measure or determine a machine's ability to recognize itself. I mean, we have no way even to test another human being's cognitive abilities aside from measuring brain waves and topography. In essence, we'd only be emulating the abilities our brain has, such as calculating problems and deducing answers to social problems from the information presented, based on logical factors; machines would be able to follow protocols within corporations without falling prey to human errors such as corruption, mistakes due to personal conflicts, illnesses, attention span, etc. However, it would lack the human aspect of all those, which includes emotion and personality.

    If it were up to me, and I were still alive when computers were advanced enough to be tested for "self-recognition", I would ask for a few things:
    1) Compose a tune. It wouldn't even have to be a good one; it would have to be somewhat unique, but could also be a mix of previous works. Then it would have to tell me its five favorite songs of all time and why it liked them.
    2) It would have to paint a piece of art or create an artistic work.
    3) It would have to write a poem, then interpret it.
    4) It would have to find something it valued and explain its sentimental value.

    These are some of the questions I'd ask it. Reminds me of Robin Williams in Bicentennial Man. lol
     
  12. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606
    This is the part that struck me as most odd:

    Previously posted by BOB:
    "I don't think it's even fair to theorize about this"


    I don't see "fairness" as having anything to do with it... but not thinking about things until they happen sounds regressive, and that's not the way human nature works. As far as completely understanding our brains: that will happen sooner rather than later due to progressive thought, which includes continuing to build machines capable of what we aren't. One day the scales will tip, and it will be the machines deciding what to create next!
     
  13. John99 Banned

    Messages:
    22,046
    But the problem is that fight-or-flight is an instinct. It develops as humans grow and become able to reason, but the basics are already there, even at infancy, though of course an infant can only do so much.

    In computers this might be called hard-coding, but even that is a misnomer; as far as we can tell, it is just different.
     
  14. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606
    Two different planes... or effectively the same thing, just at different points in evolution?
     
  15. Algernon Registered Senior Member

    Messages:
    176
    I don't quite understand why you are so against the idea of computers having intelligence. I mean, I guess it's quite possible computers may never have that ability.
    Your argument is based on current technology, and on the statement that without humans a CPU is a rock or sand compressed together.
    1) A CPU is not a rock.
    2) A CPU is not compressed sand.
    Metal can be included in rock, but a rock does not have to be metal.
    I guess compressed sand might be closer, but a CPU is mostly metal (Cu and Al?) and silicon.
    Granted, we don't yet have the ability to program computers to think and make decisions like us. But we already have programs that write other programs. It's quite primitive for now, but it does its job. You don't need to deny the possibility of it ever happening; it's just a possibility.
     
  16. Algernon Registered Senior Member

    Messages:
    176
    lol. I was thinking the same thing. I'd say computers are much smarter than bacteria, but who knows, I may be wrong. There have been proposals for an organic computer in which bacteria would be used to process things, although I don't remember how. It did require a ridiculous number of them, though, and the nutritional requirements and the heat produced by their metabolism would be phenomenal.
     
  17. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606

    Yes... you have pointed out nicely that humans and machines are programmed differently, and of course human evolution has a huge head start on machines. But when machines evolve to the point of also being able to create, then compared to human creative abilities, machines will leave us in the dust!
     
  18. John99 Banned

    Messages:
    22,046
    Algernon,

    I acknowledge that I do not have inside information on future technology, but the basic premise is that a computer can never, NEVER, be smarter than its teacher (or maybe "instructor" or "programmer" is a better word). This is slightly different from what the OP is asking, but I believe the two are somewhat related.

    I was being creative, or just exaggerating, and that is something else a computer will never be. And why is that? Because it just lacks the ability. There are too many variables, and the variables I am referring to are not present in machines/computers/AI.
     
  19. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606

    I don't see "instinct" as being anything magical; I see it as merely part of the increasingly complex programming that occurs through evolution!
     
  20. John99 Banned

    Messages:
    22,046
    But now we are talking about capabilities. You can say it was evolution, but was the capacity always present in humans to develop the intellectual capacity we see today? I think the answer is obvious.

    Just like an ant, which relies on collective intelligence, so do humans. You can argue that large networks do the same, but being faster does not provide the ability to learn, and in terms of computers/machines the network does not take away from the limitations, whereas in organic intelligence it does. These are fundamental differences. So I stand by my original post: there is no comparison. We are talking about entirely separate methods here.
     
  21. John99 Banned

    Messages:
    22,046
    As powerful as bacteria are, and as fundamental to life and death, they still have predefined capabilities. But now, that is very interesting, because you suggest incorporating organic life into a machine; then is it still a machine? I don't think so. Even this is an acknowledgment of the inherent limitations of AI (machine intelligence) that can never be overcome.
     
  22. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    6,606
    Sure, machines don't currently have that ability, but that ability also wasn't present at some earlier point during human evolution, and yet it did come to be!

    Machines are very young, still closer to an ant's complexity than to a human brain, but their potential is beyond imagination!
     
  23. John99 Banned

    Messages:
    22,046
    Another huge problem is heat. As it stands now, adding more capability is seemingly self-defeating: the point of diminishing returns.
     