Why computers will never be conscious

Discussion in 'Intelligence & Machines' started by Fen, Apr 3, 2003.

Thread Status:
Not open for further replies.
  1. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    I'm not going to waste time bantering with definitions of things that are damned near impossible to define, but I'll say "the process of abstracting stimulus into meaning" is a product of consciousness.

    The data itself is exactly meaningless unless there is "an observer" to render it meaningful. Unless the "observer" is "conscious" it cannot forge this "meaning".
     
  3. KitNyx Registered Senior Member

    Messages:
    342
    I think...there has been much discussion on valueless beings...Could we, as beings with values, envision, much less design, build, and program a being without values? If we all had the exact same programming/engineering training, would our programs have the same code? I am guessing not. I would think that my values would show through my code, as would your values in yours. Even if the only values are simplicity, elegance, etc...

    So, in effect...the computer does have 100 million years of evolution designed into it.

    Our values are created and reinforced by emotional involvement. Emotions are involuntary/unconscious hormonal responses to stimuli (internal and external). There are logical reasons why simulated emotions should be included in any complex computer system that will have many multilayered parallel and series processors working simultaneously with limited processing power. For example, I would definitely include a system that would simulate frustration. If the program is dedicating a given amount of power and time to solving problem x, then it would need to weigh the possible benefits of solving the problem against the amount of resources necessary to make it happen, based upon how much it has already used. Eventually, if the problem is still not solved, it will cross a threshold telling it that the problem is no longer worth solving. This should work with any effort...otherwise ask the computer for the exact value of pi and you would have a worthless AI.
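    A minimal sketch of that frustration threshold, assuming Python (the names `solve_with_frustration` and `leibniz_terms` are invented for illustration, not from any real system): the solver tracks the effort spent on a task that never finishes exactly, namely approximating pi, and abandons it once the effort crosses the threshold.

```python
# Sketch of the "simulated frustration" idea: track effort spent on a task
# and abandon it once a threshold is crossed. The task here is approximating
# pi via the Leibniz series, which never terminates exactly -- the point of
# the "worthless AI" example above. All names are illustrative.

def leibniz_terms():
    """Endless generator of partial sums of the Leibniz series for pi."""
    total, k = 0.0, 0
    while True:
        total += (-1) ** k * 4.0 / (2 * k + 1)
        yield total
        k += 1

def solve_with_frustration(task, effort_budget):
    """Run a (possibly endless) task, giving up once 'frustration' exceeds the budget."""
    results, effort = [], 0
    for value in task:
        results.append(value)
        effort += 1
        if effort >= effort_budget:       # frustration threshold crossed:
            return results, "gave up"     # the problem is no longer worth solving
    return results, "solved"

approximations, status = solve_with_frustration(leibniz_terms(), effort_budget=10)
```

    After ten terms the machine stops chasing pi and reports that it gave up, rather than burning resources forever.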

    So what is the difference between emotions and simulated emotions? What is the difference between a machine with hormonal systems and a machine with simulated hormonal systems? In my opinion, there is none. In my opinion, machines are and will be the next step in our evolution. If we, homo sapiens, want to coexist with our machine progeny, then it will happen because we exist symbiotically.
     
  5. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    We can envision whatever we fancy... The more pertinent question in this vein of thought is: Is the capacity to assign value an imperative facet of intelligence?

    What's your answer?

    I don't see how that is in any way relevant to creating an intelligent being.

    But they are part of the abstract component of self. The hormonal responses are reactionary to the abstract. As such, they are reflective of a component of something completely abstract, and not necessarily separate from thought itself. In fact, "emotional involvement" is not the right term... I'd call it "impact", which is, IMO, necessary to encapsulate a concept or relate it to others.

    You're talking about chips and logic. I don't think that line of thinking will ever yield a truly conscious being. Perhaps though it could happen by accident in quantum computers.
     
  7. KitNyx Registered Senior Member

    Messages:
    342
    I disagree...a newborn baby has no abstract concept of "self", but it still responds to emotional "impact". It can be frightened, startled, amused, pained, etc...I agree that the "abstract component of self" can in itself create hormonal responses (or at least envision situations that the hormonal system will respond to); that is why I included the blurb "stimuli (internal and external)".

    Chips and logic? True. Neurons and logic? True...Neurons either fire or do not...True or False, on or off, 1 or 0...same thing. What makes you think our mind works any differently from any other system in our body? Our body accomplishes amazing feats, but it does so by breaking the problem down into tiny parts...I can bench 200 lbs because each of my muscle cells takes on a tiny fraction of the effort. The brain works the same way...
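    The fire-or-don't-fire point can be made concrete with a McCulloch-Pitts-style threshold unit, a minimal sketch assuming Python (the names `neuron`, `AND`, and `OR` are invented for illustration):

```python
# A McCulloch-Pitts-style threshold unit: output is strictly binary,
# "fire or not fire" -- 1 or 0 -- exactly as described above. Illustrative only.

def neuron(inputs, weights, threshold):
    """Fire (1) if the weighted sum of inputs reaches the threshold, else don't (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two such binary units already express simple logic:
def AND(a, b):
    return neuron([a, b], [1, 1], threshold=2)

def OR(a, b):
    return neuron([a, b], [1, 1], threshold=1)

# The bench-press analogy: a 200 lb load split across many tiny contributors.
cells = 1000
load_per_cell = 200 / cells   # each "muscle cell" carries 0.2 lbs
```

    Nothing in the unit is more than on/off plus a threshold; the interesting behavior comes from wiring many of them together.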

    I agree that AI may come about as a result of the bizarre processes involved with quantum computing, but there is no evidence that our minds function as a result of quantum computing...If we can do it without quantum computing, then we are the evidence necessary to show that it CAN be done without quantum computing.

    - KitNyx
     
  8. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    Sure it does. Eventually the process of doing instigates conceptual development. I assert that the emotional "impact" solidifies and facilitates the relationship between whatever concept was created, and new or existing concepts, which are at that stage, generally unsophisticated.

    - a "threat" to the flow of being.

    - a "shock" to the flow of being.

    - a "fortification" of the flow of being.

    - a "stress" to the flow of being.

    If you are 'being', any of those conditions can happen, regardless of whether or not you were designed by humans (if you're AI).

    And I think that the hormonal response is simply a reaction to what happens to the flow of being as I was trying to demonstrate above.

    First and most obvious, because other elements of the body only exist abstractly to the mind. Without it, they are useless blobs of biological machinery. Further, if half a brain's neurons are killed off (being now simply zero), does consciousness go away? Some of the things you might have been conscious of might, but consciousness persists.

    I think you're missing something here, but I can't put my finger on how to get the message across.

    Perhaps I'm an idiot though, and my ego has convinced me my perspective has merit. I really can't tell if it does. I only know what I see. Perhaps I'm actually blind.

    I'm almost completely convinced it can't be done by a chip "as we know them", no matter how many of them you can run parallel to one another. Then again, maybe... but it would have to take a funky approach... hmm. Yeah, I don't know how to facilitate the model, as I don't have the model completely figured out. I need unlimited resources and an army of smart dudes. Given that (and control of it), I'm almost sure I could create AI if it's possible.
     
    Last edited: Sep 4, 2005
  9. KitNyx Registered Senior Member

    Messages:
    342
    I agree with you in the end. I do not believe it will be done with chips as we know them. In fact, I am not sure it will be done with chips at all. I believe it will be an engineer not a programmer who creates the first AI...unless, a simulated neural net built within a 3D array is utilized.
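    A toy version of that idea, assuming Python (the grid size, the neighbor rule, and every name here are invented for illustration): binary units living in a 4x4x4 array, each firing when at least two of its six face-neighbors fired on the previous step.

```python
# A toy "simulated neural net built within a 3D array": binary units on a
# 4x4x4 lattice, updated synchronously. A unit fires when at least two of
# its six face-neighbors fired on the previous step. Purely illustrative.

N = 4

def step(grid):
    """One synchronous update of the 3D lattice."""
    new = [[[0] * N for _ in range(N)] for _ in range(N)]
    for x in range(N):
        for y in range(N):
            for z in range(N):
                active = 0
                for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                   (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                    nx, ny, nz = x + dx, y + dy, z + dz
                    if 0 <= nx < N and 0 <= ny < N and 0 <= nz < N:
                        active += grid[nx][ny][nz]
                new[x][y][z] = 1 if active >= 2 else 0
    return new

# Seed a small cluster of firing units and advance the lattice one step.
grid = [[[0] * N for _ in range(N)] for _ in range(N)]
grid[1][1][1] = grid[1][1][2] = grid[1][2][1] = 1
grid = step(grid)
```

    The point is only that the "wiring" is the geometry of the array itself, not a program in the usual sense.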

    - KitNyx
     
  10. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    Read at your own risk. My head is spinning from having written it and at this juncture, I have no idea if it makes a smidge of sense at all. It's a blur that I can't read right now:

    For the purposes of understanding mind I've postulated a degree of freedom of mind that I call "the abstract component". It's the realm of meaning. I originally stumbled across this postulate as a consequence of my answer to the koan about the tree in the forest. My answer is "the tree makes a sound, but the sound has no meaning". This focused me on the last term of my answer as that which separates mind from mechanism. It is the blending of the abstract with the mechanistic that allows for the condition of mind, creates perspective, and gives the answer to the koan its meaning. While there exists the tree, it is exactly meaningless with no perspective to grant it such status.

    In this idea, a model comes to mind. Let's assume "the tao" as "noumenal reality" exists. It is by definition "nameless" in that it is a seamless function with no mind to discern between events. The act of observation assigns meaning (value) to an event. It is by the nature of the relationship of perspective to "the tao", reflected upon and filtered through that which already internally exists to the observer. It is the contrast between the two states of the now that allows concepts to be formed in abstract space, and forces the brain to wire itself to represent this relationship to the best of its ability given its current conditions (chemical efficiency, existing relationships, blah blah, the condition in its entirety). In fact, consciousness of this process is what comprises dreams, as short-term events are translated into circuitry that allows a more permanent (but they can be quelled by negative feedback later) place in the structure of potential thoughts. The short term is imprinted by the long term with pre-existing meaning (a section of "activated" conceptual relationships) which is then shaped by stimulus and overlaid onto the current activated relationships like a puzzle piece to see if it fits. I think I got that part backwards, but in general, they merge to a resultant that then "falls forward" to the next branch demanded by the current resultant. How odd. My own circularity there got me stuck in a little loop here trying to identify what it is about the current state that commands the next. It's a vector in abstract space perhaps, a "momentum" of sorts.

    Obviously, abstract space is created or "accessed" internally and is thus subjective. "Abstract space" is, as such, necessarily experienced differently by all. It is in effect the "geometry" of the conceptual relationships established in your brain by its operating parameters. I think it is reflected in some manner through the relationships established by neurons from one to another. (An interesting side note: if part is taken away, some of its content can be recovered in the part that remains through the dangling relationships left by its departure.)

    And I lost my train of thought. *sigh*

    Hmm. I wanted to say something about value in here, and why it's unavoidable and uncontrollable, at least ethically.

    Abstraction to me, is the act of taking a combination of inputs and giving them value, as in "placing them in relationship to others, such that there is contrast between this and the other". With no contrast between the two, as I note from the analysis of the koan, there can be no meaning. With no value, there is no contrast. Without contrast, there is nothing to isolate one idea from another. It gets an "abstract assignment". "This thing" means [show some "picture in your head" (so to speak)]. Think of the word apple. Does it bring an image to mind? Okay, now "tree". What is the difference between the two concepts? You see the tree as a container perhaps for the apple, or any number of things. Each of those things is potentially new value expressed in the contrast in value (meaning) between the two notions. And fall forward we go.

    I sort of visualize it as interacting 3-dimensional "chunks of time" creating circuits that can activate, which can be imparted into the status of mind in whatever "present time" basically by the momentum of whatever activity preceded that moment.

    (and to add a weak sentence extension: or as directed by biological processes as input into the 'status of mind' at the time)
     
  11. buffys Registered Loser Registered Senior Member

    Messages:
    1,624
    lol, that must have taken an hour to write.
     
    Last edited: Sep 4, 2005
  12. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    My brain still hurts.

    Do you have any idea as to what I meant? Unfounded gibberish?
     
  13. buffys Registered Loser Registered Senior Member

    Messages:
    1,624
    oh crap. Sorry, I thought it was an elaborate joke.

    Is what you wrote "unfounded" or "gibberish"? That's hard to say. Almost every word passed my spell check, so I guess that says something.
     
  14. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,844
    Well, I can't say it isn't a joke, but it wasn't intended as such.

    Spell check approval is quite validating. Thank you.

     
  15. buffys Registered Loser Registered Senior Member

    Messages:
    1,624
    It's so hard to distinguish tongue-in-cheek or subtle humor from a serious point when it's written down. Can you summarize the point you were making? I'm honestly interested, but your post tended to wander a bit.
     
  16. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    I don't think a computer consciousness must necessarily value its own existence, but it must value something.

    A computer with no abstracts would do math-based program interpretation constantly.

    A computer with no values would not prefer one operation over another.

    Therefore it would behave just like the computer in front of you, regardless of whether its hardware could support consciousness.


    Having no personal sensory interpretation, it would not be within the "Taoist Trap"
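    That "no values, no preference" point can be sketched directly, assuming Python (`choose` and the toy value table are invented for illustration):

```python
# Without a value function there is no basis for preferring one operation
# over another; with any value function at all, a preference falls out
# immediately. Purely illustrative names.

def choose(operations, value=None):
    """Pick the most valued operation; with no values, no preference exists."""
    if value is None:
        return None                  # every option looks the same -- nothing is chosen
    return max(operations, key=value)

ops = ["sleep", "compute", "explore"]
values = {"sleep": 0, "compute": 2, "explore": 1}
```

    The valueless machine makes no choice at all; give it any ranking whatsoever and it immediately "prefers".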
     
  17. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    It also would not care to ridicule Russ for *Necesarily*
     
  18. Russ723 Relatively Hairless Ape Registered Senior Member

    Messages:
    158
    AAAAAAHHHH!!! I can't even reproduce my mistakes correctly!
     