Intelligence and Awareness

Discussion in 'Intelligence & Machines' started by wesmorris, Oct 2, 2003.

Thread Status:
Not open for further replies.
  1. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    Do they go hand in hand? Does something have to be self-aware to be intelligent?

    I suppose it all depends critically on how you define each term. The practical problem I find is that a strict definition is generally impossible to fully agree upon. That said, is the question irrelevant? It seems that the more specific you are about the definitions, the more irrelevant the question becomes.

    It just seems that a requirement for an in-depth analysis is depth, ya know? As such, it is possible to process large volumes of data via some binary algorithmic blah blah, but how can it do much more? Is in-depth analysis deep at all, or merely an illusion? If it IS an illusion, is that illusion necessary for this type of "meaning of life" analysis? Whassup?
     
  3. spookz Banned Banned

    Messages:
    6,390
  5. G71 AI Coder Registered Senior Member

    Messages:
    163
    There are many systems which are generally considered intelligent and which know nothing about themselves.
     
  7. one_raven God is a Chinese Whisper Valued Senior Member

    Messages:
    13,433
    The way I have always thought about it is that, to be considered "intelligent," a machine would need the ability to learn on its own.
    If something is programmed to have a specific response to specific stimuli, then it is not intelligent.
    If that response changes solely due to the machine's past experiences (without external intervention or manipulation), then that machine has learning ability.
    This definition, admittedly, is a bit loose. Someone could argue that something could be programmed to appear to evolve; however, that would be simulated learning ability. The machine (or the software controlling it) must be able to reprogram itself when faced with unexpected inputs at its various "senses".

    So, no, I don't think something must be self-aware to be considered intelligent.
    However, the question that DOES bring up (for me at least) is this...
    In order for something to be able to deal with, adapt to, learn from and evolve according to different unexpected scenarios, the machine would have to be built to re-program itself as it learns with no limits placed on its evolution...
    In that case, would intelligence LEAD to self awareness?
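    The distinction drawn above can be sketched in a few lines of Python (a hypothetical toy of my own, not anything from the thread): a hard-coded stimulus-response table never changes, while a learner rewrites its own mapping purely from experience, with no external reprogramming.

```python
# Hypothetical toy: contrast a hard-coded stimulus-response table
# with a machine whose responses change only because of its own
# past experience -- the line between programmed response and learning.

class HardCoded:
    """Always maps a stimulus to the same fixed response."""
    RULES = {"hot": "withdraw", "food": "approach"}

    def respond(self, stimulus):
        return self.RULES.get(stimulus, "ignore")


class Learner:
    """Starts with no rules; its mapping is rewritten solely by
    feedback from its own experience."""

    def __init__(self):
        self.rules = {}

    def respond(self, stimulus):
        return self.rules.get(stimulus, "explore")

    def feedback(self, stimulus, response, reward):
        # Keep responses that paid off; abandon those that did not.
        if reward > 0:
            self.rules[stimulus] = response
        else:
            self.rules.pop(stimulus, None)


agent = Learner()
assert agent.respond("hot") == "explore"    # no experience yet
agent.feedback("hot", "withdraw", reward=1)
assert agent.respond("hot") == "withdraw"   # changed by experience alone
```

    Whether such self-modification, unbounded, would lead to self-awareness is of course the open question.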
     
  8. G71 AI Coder Registered Senior Member

    Messages:
    163
    one_raven: Most of the current AI development has something to do with games. Creatures in many games have 100% hard-coded behavior rules and can still act very intelligently in their environments. They are INDEPENDENT extensions of the author's intelligence = they are intelligent. But as wesmorris said, it's all about definitions. My definitions are here. Imagine that you cannot learn any new concept from this moment forward. Would it make you not-intelligent?
    Not necessarily.
     
  9. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    I would say yes, by my pre-existing notions. Not by your definitions, though. It's difficult to say. I'm not entirely sure that "thinking," per your definition, is possible if you cannot conceptualize; again, it depends on defs, in the sense that "new ways" can be arrived upon randomly.

    EDIT: For instance, I think that for something to be "self-aware" it has to be able to alter its fundamental goals (not sustenance, but purpose) based on the results of its thinking, and have the capacity to do so ad infinitum. Hmm. Something like that, anyway.
     
    Last edited: Oct 2, 2003
  10. G71 AI Coder Registered Senior Member

    Messages:
    163
    wesmorris: I think there are many levels of self-awareness, and the minimum requirement is the ability to distinguish between self and nonself. Many computer viruses have this ability. Can you give me an example of a fundamental goal change? I do not think we will go against our fundamental goals even if we have full control of our DNA and of our minds. We may be hitting the definition wall again. BTW, the good thing about our definitions is that they do not always need to be perfect or generally accepted. They just need to be helpful in terms of moving us (hopefully forward). The only world where I see perfect and general definitions is the “world” of math. It's interesting that we are so successful at tying it to the other “worlds”. Maybe the “other worlds” are also just as perfect (just a bit harder to read).
     
  11. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
     
  12. one_raven God is a Chinese Whisper Valued Senior Member

    Messages:
    13,433
    Yes, they can have the appearance of intelligence, however, that does not make them intelligent.
    If they cannot make decisions based on rationale, reason, and knowledge gained from past "experience," they are not intelligent.
    If they are not autonomous in their decision-making process, and do not have the ability not only to acquire new information and incorporate it into their reasoning but also to change the way they think entirely, then they are far from independent.
    They are no more independent than a pocket calculator.
    Would you consider a pocket calculator intelligent?
    It IS an extension of the inventor's intelligence.
    The illusion of intelligence is not intelligence.

    That is FAR too broad.
    Under this definition, any batch file is intelligent.
    The pocket calculator from above is intelligent.
    Hell, even my washing machine could be considered intelligent under those loose constraints.


    Yes.
    That would absolutely mean that I am no longer intelligent.
     
  13. one_raven God is a Chinese Whisper Valued Senior Member

    Messages:
    13,433
    Wes,
    Please describe (at least a broad guideline) what you consider self-awareness.

    As general, debatable and almost ethereal as the meaning of that word is to different people, we need at least some kind of baseline definition to work from.

    What would be implied by your definition?
    Consciousness?
    Autonomy?
    Understanding of mortality?
    Introspection?
    Self assessment?
    Survival instinct?
    Differentiation of one's self from others, understanding the concept of individuality, individual goals, and the freedom to act upon those goals (or choose not to)?
    Freedom of choice at all?
    Etc...
     
  14. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    One word seems to imply everything to me but I dunno... it's vague too: Comprehension.

    Often I think of it as "realization that one is separate from their environment". I believe that implies the ability to comprehend, and all the implications thereof. I think the list of stuff you were saying falls out of that for the most part, but then you'll ask what I mean by "comprehension," and I feel circular, because I mean "subjective awareness". It's maddening. Heh.

    Hmm.. awareness of self. That doesn't explain it well enough eh? Hmm.. how to do it? I think therefore I am? I'll try to get back to this in a bit.
     
  15. one_raven God is a Chinese Whisper Valued Senior Member

    Messages:
    13,433
    wes,

    So, if I am getting this right, your question boils down to:

    "Is the intelligence of an entity dependent upon the entity being cognizant of the fact that it is an individual that is separate from, but included in observing, its environment and surroundings?"

    Or is it more than that?
     
  16. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    Almost exactly. Well done. My slight rewording:

    "Can an entity be intelligent if it is not cognizant of the fact that it is an individual that is separate from, but included in observing, its environment?"
     
  17. G71 AI Coder Registered Senior Member

    Messages:
    163
    Some of my conclusions: The simplest level of intelligence requires the ability to evaluate 2 options and pick the preferred one. There is no need for the ability to learn continuously. The system learned something when it was built, and it may learn something when it's upgraded. Do you learn the important stuff all the time? I don't think so. I see learning as something separate from thinking, and thinking = intelligent activity. So yes, we are surrounded by AI systems. The simplest level of self-awareness requires the ability to distinguish between self and nonself. Your criteria may be different - that's OK. I personally do not think it's THAT important. The important thing is how helpful the machines are when trying to solve our problems. And I'm sure they would be way more successful if the time spent (worldwide) on defining terms like intelligence and self-awareness was spent on the actual development of machines which can solve complex theoretical problems. I think it was a mistake to pick the term AI as a name for this branch of science.
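    G71's minimal criterion, "evaluate 2 options and pick the preferred one," could look like this toy Python sketch (the scoring function and option fields are my own invention, purely illustrative, not G71's definitions):

```python
# Toy illustration of a minimal intelligent act: a fixed, built-in
# evaluation function ranks two options; no ongoing learning involved.

def preference(option):
    # Assumed preference: value gain, penalize risk twice as heavily.
    return option["gain"] - 2 * option["risk"]

def choose(a, b):
    # Evaluate both options and pick the preferred one.
    return a if preference(a) >= preference(b) else b

safe = {"name": "safe", "gain": 3, "risk": 1}      # score 3 - 2 = 1
greedy = {"name": "greedy", "gain": 5, "risk": 3}  # score 5 - 6 = -1
assert choose(safe, greedy)["name"] == "safe"
```

    The evaluation was "learned" when the function was written, which is G71's point: the system can act intelligently without ever learning again.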
     
  18. thefountainhed Fully Realized Valued Senior Member

    Messages:
    2,076
    Hey Wesmorris, you better give me "credit" for giving you the idea, fool.

    I think you are right: the more specific you get with the definitions, the more irrelevant the question becomes. Specificity in definitions will necessarily reflect preconceptions. A better question would be: is intelligence, as defined here, a prerequisite for self-awareness? Or does your degree of intelligence affect your degree of self-awareness? Are intelligent humans more self-aware than others? All this, of course, is for those who believe in multiple degrees of self-awareness.
     
  19. Dinosaur Rational Skeptic Valued Senior Member

    Messages:
    4,885
    Intelligence and consciousness are a bit difficult to define in a way agreeable to all. I prefer the term consciousness to self awareness, which seems to be only one aspect of consciousness.

    Ignoring definitions, Deep Blue beating Garry Kasparov certainly indicates that behavior which seems intelligent can occur without consciousness. A person with no knowledge of the algorithms programmed into Deep Blue would consider the system intelligent.

    Deep Blue proves that AI behavior which mimics intelligence is possible, suggesting that AI intelligence cannot be dismissed as impossible.

    On the basis of the above, I would say that intelligence is possible without consciousness, but it is difficult to imagine the converse. I see no reason why it is not possible to develop an AI device with no consciousness.

    An interesting related question is whether consciousness is a necessary byproduct or merely an accidental byproduct of the complex intelligence we attribute to humans. Could intelligence evolve without consciousness? Off hand, I do not see any reason why the two must occur together.

    When I compare Deep Blue to a human playing chess, I visualize the human having thoughts like the following.
    • I am playing chess and I expect to win (or lose). I wish I were doing something else or I am enjoying this game. Focus! If you let your mind stray too much, you will lose to this dope. I wonder how many are rooting for me and how many for my opponent.
    To me, the human mind has two distinct components. One component (the intelligent part) does the thinking, planning, and directs the necessary motor activities to accomplish some task. The other component sets goals and seems to be watching the worker doing the job. I think of my life as analogous to a movie. One component is the main character in the movie, while the other is directing and watching the movie.

    I think the above suggests something about my concept of intelligence and consciousness without attempting a formal definition of either term.

    BTW: I do not think that true AI is possible using current computer architecture. I do not think that more memory and more CPU power will do the job using anything like current systems, even ones with a lot of parallel processors.
     
  20. Logan Aden Registered Member

    Messages:
    3
    Yep, it seems to be just semantics, so it's hard to say yes or no. So, are computers intelligent? Depends on what you mean by intelligence. They can process information fast, solve math equations near-instantaneously, and remember everything. I think that qualifies as intelligence, even if they aren't sentient and creative.
     
  21. wesmorris Nerd Overlord - we(s):1 of N Valued Senior Member

    Messages:
    9,846
    /Intelligence and consciousness are a bit difficult to define in a way agreeable to all.

    Boy that's the stuff there.

    /I prefer the term consciousness to self awareness, which seems to be only one aspect of consciousness.

    Indeed, but it's valid to specify one or the other.

    /Ignoring definitions, Deep Blue beating Garry Kasparov certainly indicates that behavior which seems intelligent can occur without consciousness. A person with no knowledge of the algorithms programmed into Deep Blue would consider the system intelligent.

    Certainly.

    /Deep Blue proves that AI behavior which mimics intelligence is possible, suggesting that AI intelligence cannot be dismissed as impossible.

    Certainly.

    /On the basis of the above, I would say that intelligence is possible without consciousness, but it is difficult to imagine the converse.

    I see you've never met my mom's new husband.

    /I see no reason why it is not possible to develop an AI device with no consciousness.

    Certainly you must be correct, depending on your definition of intelligence.

    /An interesting related question is whether consciousness is a necessary byproduct or merely an accidental byproduct of the complex intelligence we attribute to humans. Could intelligence evolve without consciousness? Off hand, I do not see any reason why the two must occur together.

    Again, it depends on your def of course, but I saw a doohickey on the Science Channel this one time regarding Neanderthals. The hypothesis was, I believe, just as you said: that the ability to focus and perform tasks was as far as the Neanderthal ever got. They were imagining the Neanderthal as a "mind full of minds," if you will, each of the inner minds only able to manage certain tasks. One that let them focus intently on sharpening the edge of a stone for ten hours, one that wanted to mate, one for hunting, etc. So yeah, good point there. I'm not sure if such a being would be "self-aware," but I'd think them definitely conscious.

    /When I compare Deep Blue to a human playing chess, I visualize the human having thoughts like the following.
    • /I am playing chess and I expect to win (or lose). I wish I were doing something else or I am enjoying this game. Focus! If you let your mind stray too much, you will lose to this dope. I wonder how many are rooting for me and how many for my opponent.
    /To me, the human mind has two distinct components. One component (the intelligent part) does the thinking, planning, and directs the necessary motor activities to accomplish some task. The other component sets goals and seems to be watching the worker doing the job. I think of my life as analogous to a movie. One component is the main character in the movie, while the other is directing and watching the movie.

    The part that directs and watches may be the one that ties it all together and moves from conscious to "self-aware"?

    /I think the above suggests something about my concept of intelligence and consciousness without attempting a formal definition of either term.

    Hmmm.. so "self awareness" now seems to me like a "tying together of multiple processing sections" or kind of "hive mind" all within the same mind. So first, intelligence. Hmm. Does intelligence have to be adaptive? What does that mean? I mean, I'm not sure "deep blue" is intelligent, because it cannot alter its code.

    /BTW: I do not think that true AI is possible using current computer architecture. I do not think that more memory and more CPU power will do the job using anything like current systems, even ones with a lot of parallel processors.

    Totally agreed. Seems to me that some sort of genetic/evolutionary architecture will have to be developed.
     
    Last edited: Oct 20, 2003
  22. BetweenThePoints Registered Senior Member

    Messages:
    68
    There was an article in Popular Science (I don't remember the issue, but it was from this year) that talked about something called a genetic algorithm. What they did was give the program a task, defined within certain laws, and then set it out to accomplish the task in the most efficient way it could come up with, without giving it any preconceived possibilities. The algorithm went through several dozen possibilities, passing the best parts of each previous experiment on to the next, until it found the most efficient outcome possible. Conceivably, given enough time and a faster processing unit, this type of algorithm could find some sort of self-awareness and creativity, if it were instructed to do so. I think that once it accomplished its objective, it would become independent of any command given to it.
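    The loop described above can be sketched in Python (a toy of my own, assuming a trivially simple stand-in task of evolving an integer toward a target value; the program in the article was of course far more elaborate):

```python
import random

random.seed(0)  # deterministic for illustration

TARGET = 42  # stand-in "task": evolve an integer toward a target value

def fitness(x):
    # Higher is better; a perfect solution scores 0.
    return -abs(x - TARGET)

def evolve(pop_size=20, generations=60):
    # Start from random candidates, with no preconceived answer.
    pop = [random.randint(0, 100) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # the "best parts" carry over
        # Each survivor spawns a slightly mutated child.
        children = [min(100, max(0, p + random.randint(-3, 3)))
                    for p in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()  # converges close to TARGET
```

    Selection plus mutation is all it takes for the population to home in on the target; nothing in the loop, though, gives the program any reason to exceed its objective once reached.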
     
  23. Canute Registered Senior Member

    Messages:
    1,923
    Dino

    Yes, but don't forget that it may be neither of these.


    What does 'if it were instructed to do so' mean?
     