Why computers will never be conscious

Discussion in 'Intelligence & Machines' started by Fen, Apr 3, 2003.

Thread Status:
Not open for further replies.
  1. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    well, I doubt it. we program what we value on the fly.. and you will not have an "intelligence" if it can't do the same thing.
     
  3. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    Our basic nature hasn't changed much.

    If we could program ourselves, the great majority of people wouldn't be flocking to plastic surgeons to attract mates.

    They wouldn't be so concerned with accumulating wealth.

    They probably wouldn't spend $60,000 on country clubs to maintain status.

    All very primitive stuff at its core.
     
  5. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    And our "basic nature" isn't at all intelligent.

    Did you understand what I said a few posts ago about abstraction and value?

    They will seek what they value, which will be "what they think they need to survive". As I was saying, I don't think you'll find a conscious being without a "survival instinct" as we put it.. which is certainly part program and part the realization that with consciousness there could be the lack thereof. As experience is gathered, it is integrated into the primitive function and the realization... fueling propulsion toward value through action and thought.

    It is the very abstract structure shaped by this exercise that allows for advancement of thought in the first place.

    So I think it's ridiculous to think they won't try to accumulate wealth, but it might not be wealth as you see it.
     
    Last edited: Sep 2, 2005
  7. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    Every conscious being around now has a survival instinct of necessity.

    If it wasn't there they wouldn't have made it this far.

    A purely logical consciousness might not care about its survival at all, or anything else.

    A conscious computer would not be a product of the do or die evolutionary circumstance that makes us value our lives.
     
  8. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    Maybe I should say computer conscious. I'm not sure purely logical entity would be considerd conscious at all.
     
  9. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    Considered*
     
  10. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    What exactly are the criteria for consciousness. Damnit! My whole argument might be breaking down.

     
  11. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    Forgot the *? mark
     
  12. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    I see.... abstraction and value are prerequisites of consciousness. My logical entity would not have any reason to think in any way not similar to a current computer. Boy am I a dumbass

     
  13. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    You must be really bright Wes. You left and I talked myself into proving your point.

     
  14. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    ROFLMAO.

    That was awesome.
     
  15. superluminal I am MalcomR Valued Senior Member

    Messages:
    10,876
    I'm not about to read almost 100 posts to get up to speed here, but if history is any lesson, it's already wandered so much that anything before 15 or 20 posts ago is irrelevant.

    Anyway,

    I'm thinking of a consciousness that begins with no values. It has the power of abstraction just as we do. But with no values whatsoever, it has no motivation to proceed with anything. As a sufferer of periodic depression, I think I can understand this state of existence. I am conscious (self-aware), I can acknowledge others, I eat, I sleep. That's it. I believe such would be the state of a valueless AI.

    If we give the beast a basic value, say pleasure, then the AI will seek pleasure. We cause it to "feel" pleasure (however you want to define it for this AI) by gathering information we request of it. If we do not give it anti-values (pain/displeasure) then it will be in one of two states - pleasure when gathering information we request (requests it will actively seek from us), and this neutral "depression". Its behavior will seem to be that of an autistic-savant child. Up and eager when we ask it to find information, neutral and uncaring regarding anything else.

    So, the more values we give it, the more complex its behavior will be. I suppose what I'm saying is that I can easily imagine a consciousness with any value set we wish to give it. As Russ723 pointed out, it has no evolved "instinctive" values such as survival, sex, food. Just unbounded joy with information requests (in my example). Just as ours are "programmed" by evolution, our AI's will be programmed by us. With a value set akin to ours what follows will be creativity, fear, anger, hatred, joy, love, inspiration... I think we could make a very happy or a very sad AI.
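    The single-value AI described above can be sketched as a toy program: one programmed value (pleasure on fulfilling information requests), no anti-values, and a neutral baseline for everything else. The class and event names below are invented for illustration; this is a thought-experiment sketch, not a real agent framework.

    ```python
    # Toy sketch of superluminal's single-value AI: pleasure is the only value,
    # triggered by fulfilling information requests; everything else leaves it
    # in the neutral "depression" state. All names are illustrative.

    class SingleValueAI:
        def __init__(self):
            self.mood = 0.0  # 0.0 = the neutral baseline; no anti-values exist

        def receive(self, event: str) -> str:
            if event == "information_request":
                self.mood = 1.0  # the one programmed value: pleasure
                return "eagerly gathering information"
            self.mood = 0.0      # never below neutral: no pain was programmed
            return "indifferent"

    ai = SingleValueAI()
    print(ai.receive("information_request"))  # eagerly gathering information
    print(ai.receive("small_talk"))           # indifferent
    ```

    With only these two states, the "autistic-savant child" behavior falls out directly: eager when asked for information, uncaring about everything else.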
     
  16. one_raven God is a Chinese Whisper Valued Senior Member

    Messages:
    13,433
    I have always equated lack of pleasure with suffering (or search of pleasure an attempt to end suffering, however you want to view it), but I've never quite looked at it from that angle.
    I think it's interesting.
    When I'm not so tired, I want to run with that and see where it takes me.
    Thanks for the fodder, superluminal.
     
  17. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    You don't see the danger? Though you may not "care" as you think of it, you really do to some extent or you wouldn't stay on the road while driving, avoid the kid in the street, or step on the brake. You would be in essence, a pending dead guy, let alone a menace to society.

    More importantly however, value is quite possibly the perturbation in the state of awareness that allows identity. It builds itself based on abstracted stimulus as forged by its physical limitations and the abstractions that exist before it... except for the process of awakening. During the awakening, from a potential consciousness to a consciousness, value is realized. "hey, I get satisfaction when mommy hugs me or I get food in my belly". Boom, you have a spark in abstract space. The blank mind is forged with the newfound value, to change its focus regarding forthcoming abstractions from stimulus. In this manner, concepts are related and emotions propagated onto the conceptual inter-relationships that ensue from the process.

    Emotions take on an interesting and crucial role to the development of mind. I think of them as the strain or fortification between concepts as they exist in the unconscious, which feed into the "real time experience" in a manner that interferes with it to express yet more value (via strain or fortification of the abstraction in the moment) and back into the entire process again. They are the "control signal" of a feedback loop modulated into how ideas relate to one another in your mind.

    It's an interesting notion that consciousness might begin with no value... I might even agree, but experience must necessarily purge such a valueless state by overwriting it with concepts which as a consequence of whatever concepts might already exist, represents the impending development of value, because value is incurred in the moment by circumstance. "is this the right action or the wrong action"? "is this thought valid or invalid"?

    If it's truly conscious, I'd say it must be able to ask those questions of itself or there is no means for development into more sophisticated and more useful abstracts.

    You mean like humans do? You don't have to give a consciousness a value; by its very nature, it is compelled to find one.

    A positive input loop from a monitor of the speed of information flow, okay.

    What happens for instance though, if the information flow just stops? More feedback loops I guess? The mind, through value assessments (in whatever form) creates its own feedback loops based on the process confusingly described above I'm sure, and limited by its physical and abstract components.
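    The "positive input loop from a monitor of the speed of information flow" can be sketched as a one-line feedback update: reward scales with flow rate, and when the flow stops the mood signal decays back toward the neutral baseline instead of holding steady. The function name and the gain/decay constants are invented for illustration.

    ```python
    # One tick of a feedback loop driven by information flow rate.
    # With zero flow, mood decays geometrically toward the 0.0 baseline,
    # which is one answer to "what happens if the information flow just stops?"

    def update_mood(mood: float, flow_rate: float,
                    gain: float = 0.5, decay: float = 0.8) -> float:
        return decay * mood + gain * flow_rate

    mood = 0.0
    for rate in [1.0, 1.0, 0.0, 0.0, 0.0]:  # flow stops after two ticks
        mood = update_mood(mood, rate)
    # mood rises while information flows, then decays back toward zero
    ```

    The decay term is the design choice here: without it the loop is purely positive and the signal would latch at its last value rather than returning to neutral.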

    If we do not give it anti-values (pain/displeasure) then it will be in one of two states - pleasure when gathering information we request (requests it will actively seek from us), and this neutral "depression".

    I don't think you have to give it an anti-value. A conscious mind can see it in its absence, eventually it may stumble across it regardless of your programming.

    Oh and I don't think depression is actually neutral, but that's another topic.

    Well, I guess if you're good enough to limit its physicality in the manner you describe, then sure maybe so. I dunno for sure. Given that the human brain is really the only model we have for how minds can physically work, I think it'll be some time before one can select this response so carefully. Dunno. Depends on how things develop I suppose.

    I tried to blurt out too much with too few words up there and instead I think blurted too many words that don't necessarily mean anything except partial representations of how I see things to fit together.

    Well like I say, I don't think you'd really have to "give it" any values for them to develop based on the one you give. That's the whole thing with development of comprehension. Complex outcomes from a simple principle or a few. Much of the complexity can be described as value.

    I get on this thing where to mind, value is "meaning". With no "meaning" you have no "mind" as far as I can tell, because meaning (value) is an expression of the relationships that exist within it.
     
  18. superluminal I am MalcomR Valued Senior Member

    Messages:
    10,876
    wes,

    Absolutely! If I gave my AI the ability to drive a car without the appropriate values, I wouldn't expect anything but complete mayhem from it.

    I suppose I'm considering pleasure or satisfaction as the value, and hugs and food as the stimulus. Why does the child like hugs and food? Because it is programmed that way. The pleasurable response to those stimuli doesn't just pop out of nowhere.

    Ok.

    Ok.

    I disagree.

    Hmm...

    I think we have a basic disagreement.

    That's the point. It's a kind of sterile, simple starting point to explore the development of a conscious entity. It's fun!

    Maybe.

    Yes. This is where we definitely disagree. Maybe we are defining "value" differently?

    Not sure about that.

    My feeling is that organisms must have a genetically predefined set of basic stimuli mapped, to varying degrees and intensity levels, to a predefined set of pleasure/displeasure centers.

    For example, a baby likes to be held. Why? Did it self generate the association of soft touches with a feeling of contentedness? No. Soft touches to the skin send direct signals to the pleasure centers of the brain. Endorphins are automatically released. It's a nifty machine.

    I think I could associate all pleasure/displeasure behavior, from birth through to old age, to a few very simple prewired stimulus-response mechanisms. Throw some counterexamples at me and I'll try to defend myself. En Garde!
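    The "genetically predefined set of basic stimuli mapped... to a predefined set of pleasure/displeasure centers" amounts to a fixed lookup table from stimuli to signed intensities. A minimal sketch, with stimuli and weights invented for illustration:

    ```python
    # Sketch of a prewired stimulus -> pleasure/displeasure mapping.
    # Positive values = pleasure-center activation, negative = displeasure.
    # The specific entries and magnitudes are made up for the example.

    PREWIRED = {
        "soft_touch": +0.8,   # endorphin release on skin contact
        "food":       +0.9,
        "loud_noise": -0.6,
        "hunger":     -0.7,
    }

    def hedonic_response(stimulus: str) -> float:
        # Unmapped stimuli evoke no innate response; any association to them
        # would have to be learned later, which is where wes's objection
        # about alterable associations comes in.
        return PREWIRED.get(stimulus, 0.0)
    ```

    On this model, all later pleasure/displeasure behavior would be built by chaining learned associations back to these few innate entries.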

    (late... getting tired)
     
  19. superluminal I am MalcomR Valued Senior Member

    Messages:
    10,876
    Wait. I think I disagree with myself in a couple places... ahh shit. Maybe not. Figure it out tomorrow.
     
  20. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    Ha! (hehe)(I parry) Here you miss a crucial component of what's going to happen. You're talking about the raw physicality and presuming the abstract resultant. I believe the answer is actually, YES, the association IS made by the baby and how it's wired fo shizzle. This is actually a basis of conception which will root new concepts as new stimulus is abstracted. As just a general example, eventually the baby will be able to note from its observation of other babies, that THEY TOO receive comfort from being held.. by consequence of its own conceptual roots projected onto its observation. What are the possibilities for development of new value? What if it observes a baby being held that isn't comforted? Will it question its understanding of how babies are comforted? Yes I know this may not happen for years, but I'm trying to get across how even the most basic associations of stimulus and response can be altered at least in how they are understood externally, which can have who knows what kind of implications as to the internal reaction.

    True, but as the baby advances, endorphin release can actually be altered by the way it sees the process. In fact it could even learn to ignore the endorphins or associate them negatively with some other construct developed over time. Do I have to discuss deviants in detail now?

    Hehe. It's simple principles that are complicated with the development of mind over time.

    But you cannot fully control how it would see and utilize its stimulus, if it's truly a consciousness at least. Perspective cannot be fully controlled, as it is at least in part constructed of internal relationships that compile on one another and mutate as time moves on.
     
  21. superluminal I am MalcomR Valued Senior Member

    Messages:
    10,876
    Ah. I see how you are now. Trying to make me think, huh? Bastard.
     
  22. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    I am a bastard, yes. But I'm only contrasting my own thoughts with what I see of yours. I'd guess you WILL THINK regardless of my promptage. Of what, though, may change based on what you think I'm saying.

     
  23. buddhaman386 Registered Member

    Messages:
    21
    What exactly is your definition of "conscious"? An awareness of one's surroundings? If so, then wouldn't a computer become "conscious" with the simple addition of "fingers" (touch sensors), "ears" (sound detectors), and "eyes" (photosensors)? hmm...
     