# Can Robots Make Ethical Decisions?

Discussion in 'Intelligence & Machines' started by sandy, Sep 21, 2009.

Not open for further replies.
1. ### quantum_wave - Contemplating the "as yet" unknown - Valued Senior Member

Messages:
6,671
Hi clueluss, long time no see.

Why do you question it?

3. ### cluelusshusbund - + Public Dilemma + - Valued Senior Member

Messages:
7,767
Originally Posted by cluelusshusbund
quantum_wave---"A human however can change the chosen action given the same circumstances."

How do you know thats true.???

Yes its been awhile... i been busy tryin to learn people stuff

But to you'r queston... i cant emagine that such a thang has ever been tested... much less a conclusion arrived at.!!!

5. ### Billy T - Use Sugar Cane Alcohol car Fuel - Valued Senior Member

Messages:
23,198
No, I agree that you, as a human body, can move your arm etc. Where we probably disagree is about your claim that you chose to move your arm.

Every movement of the body (or thought) is caused by discharges in a network of nerves. The firing of each nerve in this network is controlled by the diffusion of neurotransmitters across synaptic gaps between nerves. This diffusion follows well-known physical laws (as does every other process of nerve activity, such as the depolarization wave that travels down the axon of the nerve that is in the process of "firing").

Thus your body is a complex biological machine, much like a hand-held calculator with transistors switching in response to the pushing of its keys. I don't think it accurate to say that the calculator "chooses" to activate the display of 4 after the key sequence 2 + 2 = has been pushed, do you?

If not, why do you call a much more complex, but also determined by the physical laws, sequence of nerves switching states in your body a choice?
So can the computer, if on the second occasion it is "running" a different program.

In my POV, the "you" is a program* running in parietal brain tissue, which I call the RTS program that is creating "you." At birth, this program is as your body's DNA structured it, but it is mainly a "learning" program. Thus, as you age, you learn how to make both eyes point in the same direction, and much later how to do integral calculus, etc. I.e., this program that is "you" becomes much more complex and is constantly changing. Because it is a constantly changing program, today it may choose "a" and tomorrow "b" even though the circumstances of the choice are identical.

------------------
* I.e. you are not a body, but only information in a non-material program. Thus, "you" are not constrained to follow the physical laws. This is how "you" can differ from the calculator in principle, and not just in degree of complexity.
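Billy T's point can be illustrated with a toy sketch (entirely hypothetical, nothing like a real brain): a "learning" program whose internal state changes with every choice it makes, so identical circumstances can yield different choices on different occasions.

```python
class LearningChooser:
    """A constantly changing program: choosing modifies the chooser itself."""

    def __init__(self):
        self.experience = 0  # accumulated "learning" so far

    def choose(self, circumstance):
        # The rule applied depends on everything chosen before it.
        choice = "a" if self.experience % 2 == 0 else "b"
        self.experience += 1  # the act of choosing changes the chooser
        return choice

agent = LearningChooser()
first = agent.choose("same circumstance")   # "a"
second = agent.choose("same circumstance")  # "b": same input, different output
```

The input is identical both times; only the program's internal state differs, which is exactly the sense in which a "learning" program can choose "a" today and "b" tomorrow.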

Last edited by a moderator: Nov 11, 2010

7. ### quantum_wave - Contemplating the "as yet" unknown - Valued Senior Member

Messages:
6,671
Probably not easy to test, because the human subject would have to be able to encounter the exact same circumstances twice to prove that they might choose differently or illogically. I maintain that a computer would have to be designed to do that, while we humans aren't designed; you may maintain otherwise.

The other test would be if two separate people were to encounter the same circumstances. One might choose one way and the other might choose another. You may maintain that different choices are made because they are different people with different learning, or you might maintain that you cannot have exactly duplicate circumstances.

I agree that it probably hasn't been adequately tested. So it seems that each of us will make a conclusion on the issue if we want to take a position.

8. ### quantum_wave - Contemplating the "as yet" unknown - Valued Senior Member

Messages:
6,671
We agree. I don't think a computer chooses outside the code given to it, even if the code includes modification through "learning".
Because we don't understand the physical laws related to thought, decision making, consciousness, etc. There are unknowns, so if you imply that our thoughts and choices are all understood because we understand at some micro level how nerves work, then that is the point of our disagreement.
Yes, that is the circumstance if the computer is reprogrammed or has some "learning" programmed in. However, we design the computer and each and every action is programmed, while humans are not designed, and there is no known science that explains how we think or even how we are conscious.
And in my POV, we humans cannot duplicate the body; we don't understand how we are conscious to the degree that we could build something that is conscious; and we are not even wired the same way we were born, since the wiring takes on a conscious direction throughout our lives that is determined to a great extent by our own choices.

9. ### quantum_wave - Contemplating the "as yet" unknown - Valued Senior Member

Messages:
6,671
I'm going to bow out because there are two sides to this debate and it will be here to come back to for a long time. I've played all my cards and no one has changed their view because ... :humor: we are all programmed differently :/humor:

10. ### Billy T - Use Sugar Cane Alcohol car Fuel - Valued Senior Member

Messages:
23,198
No I don't imply that. Certainly we don't understand all of brain functions nor can we at present make a machine that has "desires" and "beliefs" about how they may be achieved. I.e. currently no machine can make what I call a choice: A choice is a selection which the choosing agent believes will help satisfy its desires.

However, unless you believe in immaterial "spirits" that can control the body, especially its nerves, then the body is a deterministic system, governed by physical laws. Thus we need not fully understand how the brain works to know it is "rule following" - obeying the physical laws.

As I understand your POV, humans cannot make any choice because they are controlled by the physical laws. I.e., our ignorance about how the brain works does not open a path for "choice" if the entire body is controlled by the physical laws.

11. ### quantum_wave - Contemplating the "as yet" unknown - Valued Senior Member

Messages:
6,671
I did bow out because I played all my cards and this is an endless debate.

Here is my main card: The rule: Anything that appears to be non-algorithmic actually has natural causes that we don’t yet understand.

Any answer I give will be based on that philosophy. You can interpret it as agreement or disagreement but it is what I believe. You can draw conclusions about how I would respond based on that philosophy and there is a chance you would be wrong because no two humans think alike.

12. ### Billy T - Use Sugar Cane Alcohol car Fuel - Valued Senior Member

Messages:
23,198
That is a very weak "card" - I.e. it is simply a commonly accepted statement that all deterministic systems follow rules even if we don't know what they are - no one will disagree with this relatively useless rule.
In even more simple words:
"Everything is caused, even when we don't know the cause."

13. ### cluelusshusbund - + Public Dilemma + - Valued Senior Member

Messages:
7,767
Originally Posted by cluelusshusbund
quantum_wave---"A human however can change the chosen action given the same circumstances."

How do you know thats true.???

...i cant emagine that such a thang has ever been tested... much less a conclusion arrived at.!!! ”

My positon is... that not only hasnt it been "adequately" tested... its imposible to test (as you thoroughly esplaned)... so you'r positon that a human can change the chosen action givin the sam circumstances is based on beleif... not evidence.!!!

Last edited: Nov 11, 2010
14. ### kriminal99 - Registered Senior Member

Messages:
292
Of course they can; ethical concepts are just abstract classes, like any other abstract class.

Even current methods can do this if not exposed to novel examples.

15. ### RobertTen - Registered Member

Messages:
5
The other day i saw a video that showed a robot that could observe and learn like a child learns from their parents. The robot was able to discern when its "parent" was displeased or pleased with its actions. And based on that, the robot was taught what was a positive or negative action. So that looks to me as if a robot can have a learned conscience, the same way that we have one. Because we know what's "good" or "bad" from what we have learned to be "good" or "bad".
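A minimal sketch of that idea (not the actual robot from the video; the action names and update rule are invented for illustration): the robot keeps a learned value per action and nudges it toward "good" or "bad" depending on whether the parent seemed pleased.

```python
def update_conscience(values, action, pleased, rate=0.5):
    """Nudge the learned value of `action` toward +1 (pleased) or -1 (displeased)."""
    target = 1.0 if pleased else -1.0
    old = values.get(action, 0.0)
    values[action] = old + rate * (target - old)

def is_good(values, action):
    """The robot's learned judgement: a positive value reads as 'good'."""
    return values.get(action, 0.0) > 0

conscience = {}
update_conscience(conscience, "share toy", pleased=True)     # parent smiled
update_conscience(conscience, "hit sibling", pleased=False)  # parent frowned
```

After a few rounds of feedback the robot "knows" good from bad only in the sense that we do in this picture: by what it has been taught.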

16. ### Luis A.C.ROMANELLI - Don't forget using mind ! ! ! - Registered Senior Member

Messages:
92
If we can't be sure about ethics in robots, we can't suppose love can be reached in any
machine, so that will always be the limit between humans and machines... in other words,
THEY CAN BE SIMILAR BUT NOT THE SAME, and so cannot replace humans in any place ! ! !

17. ### domesticated om - Stickler for details - Valued Senior Member

Messages:
3,277
A routine that evaluates whether or not the task was performed within a given tolerance. 'Pleased' and 'displeased' are just measurements. That's nothing special.

...heck, even the array of correlative "human cues" (like facial expression, body language, or praise/criticism) can be defined using a couple of arrays LOL. Think of the program iterating through the "good" array to see how many significant cues were observed.

Did they smile: yes
Did they give praise: yes
Did they move towards the "deactivate" console: no
Did they communicate any scorn: no
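The checklist above can be sketched as a scoring routine (cue names and the threshold are made up for illustration): iterate over the "good" and "bad" cue arrays and treat a positive net score as 'pleased'.

```python
# Hypothetical cue arrays; a real system would map sensor output to these.
GOOD_CUES = ["smiled", "gave_praise"]
BAD_CUES = ["moved_toward_deactivate_console", "communicated_scorn"]

def parent_is_pleased(observed):
    """Count good cues observed, subtract bad ones; positive score means 'pleased'."""
    score = sum(1 for cue in GOOD_CUES if cue in observed)
    score -= sum(1 for cue in BAD_CUES if cue in observed)
    return score > 0

parent_is_pleased({"smiled", "gave_praise"})  # True: two good cues, no bad ones
```

As the post says, 'pleased' here is just a measurement, nothing more.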

18. ### universaldistress - Extravagantly Introverted ... - Valued Senior Member

Messages:
1,467
Can humans (flesh machines) make ethical decisions?

If yes, then the same is true of a machine that is built to the same spec, regardless of the materials used.

19. ### YellowDemon - Registered Senior Member

Messages:
7
No, they are being programmed to cheat now. Can't find the link, but I read it.

20. ### Honeyb35 - Registered Member

Messages:
21
I don't think we can program ethics into a robot, because we don't have a standard basis for what is ethical, on a worldwide basis that is. What is ok in some cultures in the world is not ok in others.

21. ### RJBeery - Natural Philosopher - Valued Senior Member

Messages:
4,222
Well isn't that the same thing as saying no human can have a set of ethics because there is no universal agreement on what ethics are?

Ethics and morals are really just "societal rules" restricting our selfish instincts from unduly harming others. There's nothing unethical about outworking someone for a promotion (because society deems it acceptable), but it is unethical to take credit for someone else's work for that promotion (because we have generally agreed that this is harmful to society). If we were to program a strong survival "instinct", or function, or tendency, into a robot, and then gave it supplemental societal-etiquette algorithms to check its primary actions against, we could say that we have programmed it with ethics.
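A toy sketch of that proposal (the action names, payoffs, and etiquette table are invented for illustration): rank candidate actions by a primary self-interest function, but only among those the societal-etiquette rules permit.

```python
# Hypothetical "societal-etiquette" table: which actions society permits.
ETIQUETTE_RULES = {
    "take_credit_for_others_work": False,  # society deems this harmful
    "outwork_rival_for_promotion": True,   # selfish, but acceptable
}

def choose_action(candidates, self_interest):
    """Pick the highest self-interest action among those etiquette permits."""
    permitted = [a for a in candidates if ETIQUETTE_RULES.get(a, True)]
    return max(permitted, key=self_interest, default=None)

gain = {"take_credit_for_others_work": 10, "outwork_rival_for_promotion": 7}
best = choose_action(gain, lambda a: gain[a])
# best is "outwork_rival_for_promotion": the higher-gain action fails the check
```

The "ethics" here is exactly the supplemental check layered over the primary selfish objective, which is the structure the post describes.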

22. ### EmptyForceOfChi - Banned

Messages:
10,848

Toaster: "Blasphemy, without crumpet"

(If you're English you might get the joke)

peace.

23. ### universaldistress - Extravagantly Introverted ... - Valued Senior Member

Messages:
1,467
Recreate the human mind in a simulated environment. Then hook it up to an advanced robotic construct. Then teach it.