The Matrix: are machines the new form of life?

Discussion in 'General Philosophy' started by Dudeyhed, May 18, 2003.

  1. Dudeyhed Conformer Registered Senior Member

    Messages:
    246
    Before I outline what this thread will be about, I want to express my view on something about life. I believe in reincarnation; why is not relevant, but even if I were wrong about reincarnation, I would still believe that the mind is separate from the body, that is, our minds are not indefinitely bound to our bodies.

    Now, let's forget religion.
    ------------------------------

    After watching a few episodes of the Animatrix, especially the one about the rebellion of the machines, I started thinking about what a world ruled by machines would be like. What if all life was wiped out and all that was left on Earth were machines? Imagine that they had developed AI that was extremely advanced, and I mean insanely extremely advanced. The way I see it, sooner or later the machines would develop something of a consciousness, some sort of concept of their own existence and the existence of others around them. Would this then mean that they are equal to humans?

    If my belief in a mind separate from a body is correct, does it make sense that the machines could take on their own minds?

    Think about it, if the mind was really something alone, would it matter what form of physical vessel it would take? If a machine's AI equaled the thinking capabilities of a human, would the machine then be a new individual, one with the ability to think for itself? Would it then be another form of life?
     
  3. ProCop Valued Senior Member

    Messages:
    1,258
    RE: Dudeyhed

    I think the machines have some kind of consciousness already. Consciousness is a byproduct of very fast interaction between the processor (intelligence) and RAM (memory). Above some speed of information exchange between memory and processor, consciousness spontaneously arises in that process. (To compare it metaphorically with a human experience: something happens which "stupefies" you, your look goes blank, your brain functions lose their speed, you are sort of "unconscious"... then your thinking speeds up and suddenly it's you again; the speed is back, and the consciousness too.) If speed is the key to the arising of consciousness, then machines will have it pretty soon.
     
  5. Soulcry Registered Senior Member

    Messages:
    162
    What if we are machines ?

    "Did you take the test, Mr. Deckard?" (from Blade Runner)
     
  7. Clockwood You Forgot Poland Registered Senior Member

    Messages:
    4,467
    So far, most machines are purely reactive, like bacteria. Eventually, though, I think they will develop consciousness, and we would have no right to get in their way.

    After a certain amount of complexity is achieved life and technology become indistinguishable.
     
  8. brainuniverse Registered Senior Member

    Messages:
    30
    Re: RE: Dudeyhed

    That's not true; consciousness is about self-reflectiveness. If machines had self-reflectiveness they would be totally independent of humans and would create their own cybernetic world (which would be their "brain").

    Will there be conscious machines one day? Maybe, but for this to be possible, a machine must have the synchronous processing ability that the brain has, which of course means a quantum "CPU". It does not really have to do with speed, but rather with synchronism. As for the "you again" moment, it may rather be explained as feeling like you again because your brain has recorded that you were conscious a fraction of a second ago.

    I see the brain as a TV set and consciousness as the cable; each capacitor and component in the set processes synchronously because the information from the cable is given to them at the same time. So a quantum "CPU" would separate the mind from the brain, because of the synchronous processing of the information.

    So yes! I agree with Dudeyhed when he says both are separated. Quantum physics leads us to believe that; there is a work written by a neurophysiologist presenting evidence of the control of the mind over the brain. If you are interested, I can give you the name of the individual. I do not buy the explanation of "neural Darwinism" for consciousness; the neurologists who bring up those theories are just trying to save one of their discipline's tasks, because there is a kind of shift coming, and if that shift happens they will be lost because of their lack of knowledge of quantum mechanics.
     
  9. brainuniverse Registered Senior Member

    Messages:
    30
    They can't on their own; the brain does not work like computer memory. The only way we can do such a thing is by building quantum CPUs, which would separate their minds, as a synchronous process, from the "object" in which they are self-reflective... it is more than just building faster processors; it is about redesigning the entire process of building a new chip, and the links we make. We would have to replace silicon with carbon, and we are far from having the technology to do that.
     
  10. Dudeyhed Conformer Registered Senior Member

    Messages:
    246
    I don't believe that machines have consciousness at the moment. Machines don't really think for themselves at the moment, unlike humans. Perhaps it's the ability to think for oneself that marks the presence of a 'mind'.

    It's interesting, what Soulcry says. This is far out and probably just straight-out wrong, but what if we are? Machines of another race before us? What if we rose against our creators when we gained our own minds?

    Seriously thinking about it, the human body, and the bodies of all living things, are so mechanical when broken down. There are so many things that happen in an order that makes our bodies work, just like a machine has its own little bits and pieces to make it work. Machines are far from the complexity of the human body, but say they did reach that same level of intricacy; then how are they different?

    Disregarding the mind: we need energy sources; machines need energy sources. We produce waste in our function; machines also do this, through exhaust fumes in cars... hmm, can't really think of any other forms, but I'm sure there are shitloads of waste at the power station that's providing power to your computer right now. The human body is really a very, very, very complex machine.

    So, I'm sure nobody doubts that it may be possible to create immensely complex machines in the future, machines that equaled the complexity of the human body. If life had to go on, if there were minds that needed physical vessels, could it be possible for machines to satisfy the conditions necessary for a mind to exist within a physical world?

    Or would it be that the AI of machines would be too perfect to allow free thought? Would their AI be a series of logical instructions that had no creativity?


     
  11. kyle_soule Registered Member

    Messages:
    20
    In The Second Renaissance, notice the scene where the machines present the apple (a peace offering) to the United Nations. This scene made me realize something about the consciousness of machines: they have no motivation; their reasoning is absent. They have no means by which they can say, for example, "I want to go eat" when they are not hungry, and go eat anyway.

    Their motivation for trying to join the UN was not like humans' motivations. Humans will join the UN for protection, perhaps, or for other reasons. The machines have no reason for joining, because they have no morals, per se, and don't care if they destroy an entire species (viz. humans). I think this is the key difference between human intelligence and AI.
     
  12. Ectropic Registered Senior Member

    Messages:
    195
    Then again, how guilty were you the last time you threw away a computer? How guilty were you when you ate your last steak? What if the level of complexity a machine had was comparable to you as you are to a cow? Would it be okay for the machine to kill you since you are the lesser life form?
     
  13. kyle_soule Registered Member

    Messages:
    20
    This is a very good question, but first the machines would have to achieve a higher status of life form. Yet they don't have life, do they? I suppose we first must define living...
     
  14. Clockwood You Forgot Poland Registered Senior Member

    Messages:
    4,467
    Will it be alive when it tells the user "screw you" and goes on strike?
     
  15. Ectropic Registered Senior Member

    Messages:
    195
    You are right. This is where all of these AI conversations seem to go. So, to follow the pattern I have seen in previous threads, I will answer with more questions.

    Is the single worker ant alive? She cannot survive on her own, nor can she reproduce or defend herself very well. If she finds food, it takes her a very long time to follow her path home. Now, regardless of whether the ant is alive or not, what about the entire nest as a whole? The colony acts much more like a "normal" creature than each of the single ants does. I think this building of higher structure from small, simple components was coined "emergence" by an author I can't remember off the top of my head.
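
    To make the emergence idea concrete, here is a minimal toy (my own illustration, not from the thread): Conway's Game of Life, where a group-level pattern arises from purely local rules that no single cell "knows" about.

```python
from collections import Counter

def step(live):
    """One Game of Life step; `live` is a set of (x, y) live cells.
    Each cell only looks at its eight neighbours -- a purely local rule."""
    # Count how many live neighbours every candidate cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

blinker = {(1, 0), (1, 1), (1, 2)}          # a vertical bar of three cells
print(step(blinker))                         # flips to a horizontal bar
print(step(step(blinker)) == blinker)        # a period-2 oscillator: True
```

    The "blinker" oscillation is nowhere in the rules; like the colony's behavior, it only exists at the level of the whole.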

    So let's apply that to machines. A single computer sitting on my desk and recording every typo I type is not really alive. But what if we coupled it with a nuclear reactor core that generated electricity to sustain itself, then we add an assembly line robot to allow the machine to reproduce. Now we have a self sustaining machine that can let us play Quake anywhere in the galaxy.

    I think the important part of the machine (the brains) is still to come. As Moore's law works its magic, computers WILL supersede our brains' complexity. There is no doubt about that; it is the nature of economics. After all, our brains are really just a bunch of binary computers sending analog signals to each other.

    Not only will they surpass our brains' complexity, but the interconnections between the sections of those brains will be hundreds of times faster than our sluggish neurons. So even with efficiency that is 1/100th of that which our brains (sometimes) enjoy, they will be just as capable as us. And don't forget that we don't get twice as smart every 18 months like they do.
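
    For what the "twice every 18 months" claim amounts to arithmetically, here is a quick back-of-the-envelope sketch (my own illustration; it assumes ideal, uninterrupted 18-month doublings, which real hardware does not always deliver, and measures raw capability, not intelligence):

```python
def moores_law_factor(years, doubling_months=18):
    """Growth multiplier after `years`, with one doubling every
    `doubling_months` months (the classic Moore's-law cadence)."""
    return 2 ** (years * 12 / doubling_months)

# 3 years -> 2 doublings -> x4; 30 years -> 20 doublings -> about a millionfold.
for years in (3, 15, 30):
    print(f"after {years:2d} years: x{moores_law_factor(years):,.0f}")
```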
     
  16. kyle_soule Registered Member

    Messages:
    20
    Does living necessarily depend on self-reliance? A baby is considered alive, but babies are unable to provide for themselves to sustain their own lives. Is it possible that living things only need self-awareness? That would certainly make it possible for a machine to be living; you could even argue that they are living today.

    That kind of reproduction is assembly from pieces, which is not characteristic of the living. All living things start with everything they need already "in" them; they simply need time to grow and mature physically before they resemble the final product. A computer doesn't resemble the final product at the beginning of the assembly line, because a hard drive has to be combined with a motherboard, video card, monitor, etc. before it has all the necessary parts to run.

    I agree that in the future machines will have brains better than ours, but when do they become conscious and make conscious decisions to revolt and rise up against us? When will they realize they are born/created into slavery?

    Computers don't get smarter in 18 months; they get faster. Stephen Hawking argues that you must sacrifice speed for intelligence (or vice versa); you can't have both without giving up the other.
     
  17. Ectropic Registered Senior Member

    Messages:
    195
    True, but I would say that all living things need a means of spreading. And on top of that, a fetus does not look like a baby, just as a pile of computer parts does not look like a computer.

    That is a good point; there are lots of ways to argue with it, but I don't think it will be a problem. The machines will not be competition or slaves; they will be part of us, in a symbiotic relationship. Except for things like toasters and blenders; maybe they are better off not being too smart.

    Yeah, I need to go back and read that. Could you tell me which book that was in? I don't think I understand it, but it sounds familiar.
     
  18. kyle_soule Registered Member

    Messages:
    20
    A fetus is all the needed parts in one package, though; a pile of parts isn't in a package, it needs something to assemble it. Humans don't need assembly, in this sense.


    I don't know if any species can live in harmony (on the same level). I think sooner or later one or the other would want to "put the other in its place" so to speak.


    I believe it's touched on in The Universe in a Nutshell; it seems to be covered in more depth in another book, but I don't recall which. :bugeye:
     
  19. river-wind Valued Senior Member

    Messages:
    2,671
  20. Mystech Adult Supervision Required Registered Senior Member

    Messages:
    3,938
    Re: RE: Dudeyhed

    Go take a few programming courses; you'll learn that computers are as smart as an abacus. There's no intelligence there, not even in AI.
     
  21. kyle_soule Registered Member

    Messages:
    20
    Are we truly amazed when a dog sits? We call the dog that learns lots of tricks smart, but we call a kid that has trouble grasping calculus stupid.

    By the same token, are we amazed that the robot moved, and just so happened to move outside of its 'cage'? We shouldn't be, nor should we believe this resembles intelligence.

    As for the first link: they created that to 'evolve' into something else, which is the same as creating a CD player to play a CD. We aren't surprised that it can play ANY CD we put in, as long as it's music, so why should we be surprised that the machine evolved like it was supposed to?
     
  22. Ectropic Registered Senior Member

    Messages:
    195
    Re: Re: RE: Dudeyhed

    That might be true, but they are still slow. How smart are a few dozen of your neurons?
     
  23. ProCop Valued Senior Member

    Messages:
    1,258
    RE:Ectropic, Mystech

    Ever heard of Deep Blue? This abacus beat the world's best human at chess (Kasparov). Please read the news once in a while.

    If the computers were thinking already, we wouldn't know, would we? If a computer develops consciousness for some very short periods of intense computing, it cannot be contacted, because there are no contact channels. I believe that if consciousness is a byproduct of thinking (which is moving information from one place to another and comparing it), then a computer does thinking and therefore has, or will soon have, consciousness.
     

Share This Page