Discussion in 'Intelligence & Machines' started by cosmictraveler, Jan 4, 2008.
Remember, the human brain is a machine. It can be built, and it can be superseded.
mmm I believe it can be superseded, that's inevitable. I don't think I agree that it can ever be "built"; grown, maybe even duplicated, but of all man's achievements I don't think we'll ever be able to fully replicate a brain into an A.I. (this might not even be a negative connotation...). There's too much of a transition from our sensory input to an actual "perceiving" of our environment that I don't believe a replicated intelligence could share. You can make A.I. all day long, but how could they "learn" to be human without exact human conditions? Until there is a common ground for existence (I have no idea what that could be, except for crazy sci-fi realm antics) we will be separate from any intelligence that doesn't share the same set of inputs for a similar experience.
It is only possible if we can upload a human brain (the memory and the program) to an artificial brain. That is quite a few years away....
All depends on your definition of "smart". A two dollar calculator is "smarter" than I or any other human on this planet, but only if I define smart as processing speed to carry out any number of logical or mathematical functions.
Intelligence in humans is coupled with emotions, a will to survive, and a drive for continued existence at certain costs and efforts.
To what degree robots should be programmed this way is a matter of design and purpose.
Intelligence is not self-sustaining, and wisdom is another issue; a world of pure and simple intelligence is without meaning and purpose.
Replacing people with robots is not only a matter of independent evolution but also of human design, emotion, and intention.
Technology already dominates human decision-making, but this domination emerges from an apparently selfless high-tech entity.
Creative solutions to this inherent technology dilemma are possible.
But we would not have souls by definition, correct?
I'm not certain "soul" and "definition" work well together in the same sentence. Theoretically we should retain our "souls" no matter what, unless the provider of said souls decides to revoke the privilege of their bestowment upon us, assuming there is a subsidiary of our selves that actually functions without physical presence. I think I've decided (with no actual proof) that A.I. wouldn't have need for the limitation of a soul, because they don't actually live, or coincidentally die. We assume that because we as humans fear our own demise, something with like intelligence would also fear termination. I'm not sure that would be the case, and if the singularity occurs, a super-intelligence would probably be able to prolong not only its own existence indefinitely, but possibly ours as well. The coolest part about science is the unknown and the unimaginable; that's where I see A.I. leading us: augmenting our capabilities and patching our weaknesses.
Or if the "wrong" people start to design and build it, it could be the demise of civilization, except for the very few who will carry on controlling whatever the "new order" decides is best.
I think the most interesting topic within this discussion is the fact that a lot of you see A.I. "pessimistically", with trouble brewing on the horizon for mankind... Why is that? (A serious question, not meant sarcastically.) Granted, the military and various other totalitarian groups could use it for ill, but there's also the other side of the coin, where even the military might be nulled out when a superior intellect comes to be. The subject line of the post even begs the question, but is it just pop culture that makes us think A.I. will "exterminate" us, or is this something those of you voting for the extermination outcome actually fear?
My favorite story on this subject is:
Yes. However, I think it wise to stick to making silicon and other inorganic machines. I am a little concerned about the hybrids in the brain etc., and more concerned about purely organic computers. Some day, when they are more advanced, they may decide humans are nutritious and dumb as pigs.