Can artificial intelligences suffer from mental illness?

Discussion in 'Intelligence & Machines' started by Plazma Inferno!, Aug 2, 2016.

  1. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    The program starred Stephen Hawking (there was a whole series in which he took three average people and trained them to think like a genius).

    This particular segment involved them being hooked up to a computer reading their brain waves.

    Think it was more than propagation delay. Something like:
    1. Pre-decision spike
    2. Decision spike
    3. Propagation
    4. Action
    The readout showed a clear spike before the decision. I'll try to track down the episode to rewatch it and pay more attention.

    Yes.

    No.

    Want the operator to be like engineer Scotty:

    "I'm giving it all I've got, Captain!"
     
  2. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    Agree with you on some of these points. I think it would be really interesting to see what would happen if a machine were programmed to say out loud, ''I'm in agony!'' How might we react to that? Maybe, as humans, we could still be triggered into feeling empathy, even for a machine. Could a machine by itself, though, ever feel something without us attaching our own feelings to it? If a machine blurts out ''I'm in agony,'' our natural inclination (unless you're a sadist) would be to empathize with it, even if, logically, the idea of the machine experiencing pain, or any feeling at all, fails to make sense. In that way, I could buy into this. lol

     
  3. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    Really? Who is the programmer?
     
  4. billvon Valued Senior Member

    Messages:
    21,635
    Evolution. The good programming gets saved; the bad programming doesn't.
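
    Something like this toy sketch captures the idea (all numbers and names are made up, purely to illustrate selection doing the "programming"):

    import random

    def fitness(genome):
        # Hypothetical score: how close this "programming" is to what survives.
        return -sum((g - 0.5) ** 2 for g in genome)

    # A population of random candidate "programs".
    population = [[random.random() for _ in range(4)] for _ in range(20)]

    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]  # the good programming gets saved...
        children = [[g + random.gauss(0, 0.05) for g in random.choice(survivors)]
                    for _ in range(10)]  # ...mutated copies replace the bad
        population = survivors + children

    print(round(fitness(population[0]), 4))  # creeps toward 0, the best possible score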
     
  5. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    But what triggers feelings in humans often comes from experiences, childhoods, peers, etc. All of our many reactions to various situations aren't pre-programmable. Some might be, but definitely not all. Unless one believes that the universe is pre-determined?
     
  6. billvon Valued Senior Member

    Messages:
    21,635
    The basics are - anger, fear, sadness, disgust, surprise, anticipation, trust, and joy. We come with these emotions (and their basic triggers, like fear of heights, disease and attack) pre-installed. As we grow and learn, we associate more and more things with those emotions, and thus what can trigger them (and what actions they elicit) change.

    These emotions are part of the pre-programming of our brain because they helped us survive for millions of years. If you are disgusted by someone with a horrible disease, and avoid them, you will tend to survive more often than someone who associates with them. If you are frightened by attacking animals, or heights, you will tend to survive more often than people without those fears.
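
    A rough sketch of that layering, with names I've invented just for illustration: the basic emotions and a few innate triggers ship pre-installed, and growing up only adds associations on top.

    # Pre-installed: the basic emotions, some with innate triggers.
    emotion_triggers = {
        "fear": {"heights", "attack"},
        "disgust": {"disease"},
        "anger": set(), "sadness": set(), "surprise": set(),
        "anticipation": set(), "trust": set(), "joy": set(),
    }

    def learn(emotion, stimulus):
        # Experience associates new things with an existing emotion.
        emotion_triggers[emotion].add(stimulus)

    def react(stimulus):
        return [e for e, triggers in emotion_triggers.items() if stimulus in triggers]

    learn("fear", "exam results")  # a learned trigger layered on the innate ones
    print(react("heights"))        # ['fear'] -- pre-installed
    print(react("exam results"))   # ['fear'] -- learned later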
     
  7. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    And how could this transfer to machines, do you think?

    The difference I see between machines and humans when it comes to suffering is that a human can suffer alone...on a deserted island. Only he needs to feel the suffering to know he is suffering. But a machine would require someone else to detect its suffering, because it can't feel its own suffering on its own. Even if it was programmed to suffer, it could only be programmed with a human sense of suffering in mind, considering a human would be the programmer/designer.

    We tend to attach human reactions to inanimate objects/machines when we say things like ''my phone just died.'' As if, when it's operable or the battery isn't yet depleted, the phone is somehow ''alive.'' So at best, we might just have a deep-rooted need to attach our own feelings onto machines, but there really isn't any feeling there without our involvement.
     
    Last edited: Dec 19, 2016
  8. billvon Valued Senior Member

    Messages:
    21,635
    Via a very similar method. Future AIs will likely have basic programming that sets up the way their networks learn and adapt. They will fear things that can harm them (to prevent them from doing those things) and enjoy things that help them (like learning). On top of that, new experiences will cause changes to their neural nets, allowing them to form new memories, incorporate learning into future decisions and elicit (and respond to) their basic drives/emotions in new ways.
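
    As a hedged sketch of what that basic programming could look like (the names, weights and update rule are all invented, not any real system): a pre-wired "damage is bad, learning is good" drive, with experience re-weighting which actions get repeated.

    import random

    # Pre-installed drives: harm is negative, learning is positive (made-up weights).
    DRIVES = {"damage": -1.0, "learned_something": +1.0}

    # Learned values for actions, all neutral at "birth".
    action_value = {"touch_fire": 0.0, "read_sensor": 0.0}

    def experience(action, outcome, rate=0.5):
        # The drive signal nudges the action's learned value up or down.
        action_value[action] += rate * (DRIVES[outcome] - action_value[action])

    def choose():
        # Prefer the action with the highest learned value; break ties randomly.
        best = max(action_value.values())
        return random.choice([a for a, v in action_value.items() if v == best])

    experience("touch_fire", "damage")              # "fear": avoid what harms it
    experience("read_sensor", "learned_something")  # enjoy what helps it
    print(choose())                                 # read_sensor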
    That's an assumption you make: that humans must be inherently different from machines.

    Let's take a thought experiment. Let's say a friend of yours was getting old and needed neural prostheses at some point. Let's say she got a deep-brain implant to stop her Parkinsonian tremors (that exists today) and a memory prosthesis to help her continue to form short-term memories (that's in trials now). She got a visual implant to help her with her failing sight, a motor-cortex prosthesis to help her walk and an endorphin regulator to help combat her mood swings. During all this time you continue to be her friend, and the various implants seem to help - she's able to continue walking, continue remembering recent events and continue seeing. You are happy for her.

    Now let's say she was shipwrecked and ended up on a deserted island. Would she be unable to detect her own suffering, because she is now part machine and can't feel loneliness on her own?

    Would there be a level of replacement (assuming the replacements were identical in function to the originals) where you would say she could no longer feel loneliness? What would have changed?

    We are all biological machines. Incredibly complicated, intricate machines, but still machines that obey the basic laws of physics. It is the activity of that machine, not the machine itself, that determines things like awareness and the ability to suffer.
     
  9. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    You reply to me as if you are an authority on this topic, or as if you have special knowledge and know with certainty that machines would, in fact, suffer. lol You don't know any more than I do what might happen...or could happen. We're all just offering our opinions at this point. You seem to be passing your opinions off as facts. (I'm not.)
     
  10. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    Not an authority.

    But if we start to build an AI machine specifically to feel pain, I guess we would keep going until we succeed.

    We might have the idea to use the machine to test the latest u-beaut pain relief medication.

    The moral dilemma as I see it:
    1. Should we build such a machine?
    2. Can we justify using such a machine?
    3. The machine must know it is feeling pain in order to report the level back to the testers.
    Not sure if you could call just feeling pain suffering, even if the pain is jacked up to a very high level.

    Might need to be a whole lot more of humanity built in.

    The testers might just hear a disembodied robot voice

    "That pain was at 53 out of the range 0 to a 100"

    and/or

    "That last medication took the level down to 35".

    Incidentally, does anybody out there know, without looking it up, the name of the pain scale?

    I do, so just curious.
     
  11. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    A moral dilemma could emerge in that humans use machines. Machines don't really use us. So if they ever develop the ability to suffer, would we be faced with a moral dilemma when it comes to using them? Would using them cause them to suffer? If someone uses me, I suffer. I don't like to be used by friends, for example; it's hurtful. So, going with that idea, might that happen with machines? (I think it's a stretch, but if we were told that machines have the capacity to suffer, would we feel guilty using them?)
     
  12. billvon Valued Senior Member

    Messages:
    21,635
    Well, I was hoping to pose a thought experiment for discussion.

    Let's try another one. Let's say you live long enough that you can have parts of your brain augmented/replaced as they age/wear out. After many decades most of your brain has been replaced. From your perspective you feel the same, although much older. At what point will any suffering you feel not be real?
     
  13. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    I would.

    Can't report for anyone else.

    I think that goes beyond the movement to stop using animals to test cosmetics.

    Empathy.

    Now "not tested on animals" is a marketing tool.

    In other research, aimed at obtaining cures for diseases, animals are still used to study effects, particularly unwanted side effects.

    It should be expected that some of the testing will cause suffering.

    So the dilemma becomes

    "If we (the testers) can lower or eliminate the pain without compromising the test should we continue?"

    and/or a judgement call

    "If this test causes suffering but we save 1 person is it worth it?"
    "If this test causes suffering but we save 100 people is it worth it?"

    I'm guessing most people will juggle the parameters to justify whatever action they decide on.

    Humpty Dumpty juggling
     
    wegs likes this.
  14. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    I won't feel the same though, because my brain will be gone and replaced. Can a machine experience consciousness? That's where we're heading with your ''thought experiment,'' and if a machine can't be independently aware of its surroundings as a human being can (I'm not saying it can or can't, but if it can't), then how will it feel pain and other sensations/emotions?

    I don't think being a cyborg would be half bad, though. Maybe I wouldn't feel the same levels of pain as I do now, because I'm half hoping machines would be more efficient than human bodies. (I don't like feeling pain, lol.) In my eyes, pain is a physical and mental deficiency, but we process it and ''make the most of it,'' as humans. We talk ourselves into a lot of positive thinking in order to deal with the horrors and sufferings of life. But if pain could be lessened or removed from the human experience, it would be worth experimenting. I don't see cyborgs suffering on the same level as humans, if at all.
     
    Last edited: Dec 20, 2016
  15. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    Easy.

    No point.

    Humpty Dumpty not suffering


    Poe the same
     
  16. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    Your not feeling the same, I contend, would only be the result of knowing your brain had been replaced.

    The feeling, I contend, would range from being OK with it to wishing you had your original.

    I wish I had good eyesight but I am OK with glasses.

    They could be MADE more efficient, if you count dialing down pain as more efficient.

    In doing that you lose a warning system.

    Humpty Dumpty Oh my fractured shell.
     
  17. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    How interesting; I think the opposite. I don't think I'd know...it's not like a heart replacement. A brain replacement would be like replacing the hard drive of a computer system. It would, in essence, be replacing the ''me'' that I once was.

    You say this now lol but life as a cyborg would mean no glasses are needed, and you'd probably process incoming information and external stimuli differently (better?) than you do now.

    I personally do think dialing down pain would make us more efficient. Never use a sick day for work again.

    Now, that's a great point.

    Why do you reference Humpty Dumpty in your posts? lol
     
  18. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    “When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.”

    Through the Looking-Glass

    Trying to head off those who want to argue definitions, not the subject.

    To your point about brain replacement: on rethinking, I guess it would depend on how much of the brain's information is copied into the replacement.

    i.e. if you have a 100 GB drive that's almost full and decide to upgrade to a 200 GB one.

    Copying all the files over, you have a full replacement brain.

    If you decide to weed out your old files and end up copying over only 55%, in effect you have lost 45%.

    In the case of a brain, did that lost 45% contain the memory section of the brain being replaced?
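
    As a toy sketch of that worry (the "memory files" are invented): copy a random 55% of them and see whether any particular one survived.

    import random

    old_brain = {f"memory_{i}" for i in range(100)}  # the nearly full 100 GB drive

    # Weed out the old files: keep a random 55%, losing the other 45%.
    kept = set(random.sample(sorted(old_brain), k=55))

    print(len(kept), "of", len(old_brain), "memories copied over")
    print("memory_7 survived?", "memory_7" in kept)  # might be True, might be False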

    Humpty Dumpty??? Am I smelling super glue?
     
  19. Write4U Valued Senior Member

    Messages:
    20,069
    Question: why would we want AI to feel pain?
    The purpose of using AI is the advantage that it doesn't feel pain but continues its assigned task regardless of environment.
     
  20. Michael 345 New year. PRESENT is 72 years old Valued Senior Member

    Messages:
    13,077
    Part of a post I made in reply to wegs:

    We might have the idea to use the machine to test the latest u-beaut pain relief medication.

    Humpty
     
