Building Gods or Building Our Potential Exterminators?

Discussion in 'Intelligence & Machines' started by cosmictraveler, Jan 4, 2008.

Thread Status:
Not open for further replies.
  1. Barry Flannery Registered Member

    Messages:
    64
    Remember, the human brain is a machine. It can be built and it can be superseded.
     
  3. scifitm Registered Member

    Messages:
    46
    Mmm, I believe it can be superseded; that's inevitable. I don't think I agree that it can ever be "built". Grown, maybe, or even duplicated, but of all man's achievements I don't think we'll ever be able to fully replicate a brain into an A.I. (and this might not even carry a negative connotation...). There's too much that goes on between our sensory input and our actual "perceiving" of the environment for me to believe a replicated intelligence could share it. You can make A.I. all day long, but how could they "learn" to be human without exactly human conditions? Until there is a common ground for existence (I have no idea what that could be, short of crazy sci-fi antics), we will be separate from any intelligence that doesn't share the same set of inputs for a similar experience.
     
  5. kmguru Staff Member

    Messages:
    11,757
    It is only possible if we can upload a human brain (the memory and the program) to an artificial brain. That is quite a few years away....
     
  7. zarlok Banned

    Messages:
    116
    All depends on your definition of "smart". A two dollar calculator is "smarter" than I or any other human on this planet, but only if I define smart as processing speed to carry out any number of logical or mathematical functions.
     
  8. ak.R Registered Member

    Messages:
    41
    Intelligence in humans is coupled with emotions and a willingness to survive and keep existing at certain costs and efforts.
    To what degree robots should be programmed this way is a matter of design and purpose.
    Intelligence is not self-sustaining, and wisdom is another issue; a world of intelligence pure and simple is without meaning and purpose.
    Replacing people with robots is not only a matter of independent evolution but also of human design, emotion and intention.
    Technology's domination over human decision-making is already the case, but it emerges from an apparently selfless high-tech entity.
    Creative solutions to the inherent technology dilemma are possible,
    I think..
     
  9. kmguru Staff Member

    Messages:
    11,757
    But we would not have souls by definition, correct?
     
  10. scifitm Registered Member

    Messages:
    46
    I'm not certain "soul" and "definition" work well together in the same sentence. Theoretically we should retain our "souls" no matter what, unless the provider of said souls decides to revoke the privilege of their bestowment upon us, assuming there is a subsidiary of ourselves that actually functions without physical presence. I think I've decided (with no actual proof) that A.I. wouldn't have need for the limitation of a soul, because they don't actually live, or coincidentally die. We assume that because we as humans fear our own demise, something with like intelligence would also fear termination. I'm not sure that would be the case, and if the singularity occurs, a super-intelligence could probably not only prolong its own existence indefinitely, but possibly ours as well. The coolest part about science is the unknown and the unimaginable; that's where I see A.I. leading us: augmenting our capabilities and patching our weaknesses.
     
  11. cosmictraveler Be kind to yourself always. Valued Senior Member

    Messages:
    33,264
    Or, if the "wrong" people start to design and build it, it could be the demise of civilization, except for the very few who will carry on, controlling whatever the "new order" decides is best.
     
  12. scifitm Registered Member

    Messages:
    46
    I think the most interesting topic within this discussion is the fact that a lot of you see A.I. "pessimistically", with trouble brewing on the horizon for mankind... Why is that? (Serious question, not meant sarcastically.) Granted, the military and various other totalitarian groups could use it for ill, but there's also the other side of the coin, where even the military might be nulled out when a superior intellect comes to be. The subject line of the post even begs the question, but is it just pop culture that makes us think A.I. will "exterminate" us, or is extermination something those of you voting for that outcome actually fear?
     
  13. madanthonywayne Morning in America Staff Member

    Messages:
    12,461
    My favorite story on this subject is:
    http://www.alteich.com/oldsite/answer.htm
     
    Last edited: Mar 12, 2008
  14. Billy T Use Sugar Cane Alcohol car Fuel Valued Senior Member

    Messages:
    23,198
    Yes. However, I think it wise to stick to making silicon and other inorganic machines. I am a little concerned about the hybrids in the brain and such, and more concerned about purely organic computers. Some day, when they are more advanced, they may decide humans are nutritious and as dumb as pigs.
     