A Will of Its Own - Is Data Sentient?

Discussion in 'Intelligence & Machines' started by KalvinB, Sep 8, 2001.

Thread Status:
Not open for further replies.
  1. KalvinB Publicity Whore Registered Senior Member

    The classic Ethics question: Is Data a sentient being?

    This is the first draft of my paper for class. Thought I'd post it to get some feedback on it. I think the conclusion needs a bit of work, but it's headed in the right direction.

    -edit- changed it to the final draft

    I was there at the time of the trial but you won’t see me. Star Trek is actually a very low budget documentary and they couldn’t afford to put a juror on camera who couldn’t act like he cared no matter how much they paid him. Before the hearing we the jury were all given a sheet of paper, which I had hoped would have kept me occupied for the duration of the trial. Unfortunately it had only the following written on it.

    sen·tient (sĕn′shənt, -shē-ənt) adj. Having sense perception; conscious

    con·scious (kŏn′shəs) adj. Having an awareness of one's environment and one's own existence, sensations, and thoughts. Capable of thought, will, or perception

    If Data were as annoying and useless as that stupid hologram on that trash ship Red Dwarf I wouldn’t even be here. Speaking of which I hear the Federation is working on one of those holograms as well. I guess it’s supposed to act as the ship’s doctor or something. I suppose this trial will decide if future creations such as the so-called holographic doctor should have rights. Picard had better have a good case to warrant me wasting my time that could be better spent playing Quake CXI on my computer.

    I hate my computers. They're so obedient to my every command. I hit the power switch and they turn on. Well, three of them do. One of them has "issues" and requires it be powered on, then off and then on again before it actually boots up. I'm sure it's because I use it so much. It has to rebel in order to feel it's not completely under my control. Whenever I'm coding my latest project is when it gets really obnoxious. It doesn't do what I meant; it does exactly what I tell it to. It knows what I just typed will lock it up, but it doesn't care. I was told once that I made my TI-85 graphing calculator act like a Pentium. I think my computer can read and knows I just said that. It's not the first time. I'm sure it's jealous and that's why it acts like a TI-85 at times. When I'm surfing around the Internet it'll occasionally throw an error at me and close the window. The thing's got personality.

    My computers know a lot of stuff. I have one that can sing anything. Rap, Country (I try to encourage it not to), Alternative, it's all there. It can even play classical music. My main computer (the obnoxious one) I sometimes play games with. It cheats so badly. I know I'm a bad chess player, but even I know you can't move a queen from B4 to E6 to knight a rook with a pawn. It thinks it's clever. If I have a question and my computer doesn't know the answer, I just hop on the Internet and ask the computer at Google. That thing knows everything. It even knows that my server is a good place to find programming-related information, which is a very well guarded secret. That server of mine is very modest.

    I'm very attached to my computers. If one was ever stolen and I found out who did it, I'd want them punished like someone who drowns puppies. If it were just the Pentium 200 I wouldn't care so much. Maybe just a slap on the wrist. It's not all that cute, not smart at all, and really should be dead any time now anyway. Kind of like that stereo of mine that was stolen from my car. It's hard to care when the insurance company foots the bill for one that's infinitely better. Speaking of my car, there's a piece of machinery that should be dropped in a lake. It's big, it's ugly, and it just refuses to die. At least it doesn't leak all over the place. Darn puppies. If they weren't so cute they'd go the way of those rats that New Yorkers are so fond of.

    I've been working with computers since I can remember, and now this council wants to try to convince me that this robot, Data, deserves basic human rights. It's got some personality, it's intelligent, and people have some sort of emotional connection to him. Riker there flipped its switch and I can't count how many people gasped. People get so emotionally attached to things that they begin to think the object of their affection actually cares about them as well. I hear this thing's been through a number of battles. Apparently its self-preservation has made room to include its buddies, who are listed in the credits. Oddly enough, its greatest "affection" goes to some extra who was allowed to speak and then promptly killed at the end of the episode. I wonder if it feels guilty that it couldn't preserve her life. I doubt it. The thing has no real emotion. Its highest-ranked memories play like highlight footage ranked by the number of electronic signals sent at the time. I don't see it as any wonder that one of the trinkets he's taking with him is a hologram of the girl it got "intimate" with. Apparently he works, too.

    Now this computer hacker wants to disassemble it so he can make another one. The robot decides that it doesn't want to, and here I sit listening to Picard and Riker go at it. It could be worse, I suppose. That Kirk fellow never really nailed his language skills. Riker says he's a robot; Picard says obviously, but the thing is also self-aware and intelligent in the sense that it can learn and evaluate situations without any reprogramming. As I was sitting there wondering when it was time for Picard to go have his tea and Riker to go hit on some interplanetary woman, I realized something: the robot made a decision.

    It decided that it didn't want to be disassembled, and that it would be better to retire and abandon his comrades for an unknown period of time than to be disassembled and risk not seeing them ever again. This robot took many seemingly unrelated ideas, combined them into one coherent solution, and then made a decision. Last I checked, my computer only did what it was programmed to do. Data has obviously been given free will. One would argue that he was just preserving himself, but he is willing to undergo the operation when he decides it's safe.
    Data may not have a soul (do we ourselves even have a soul?), but as he has a will of his own he is by definition conscious. Therefore, by Maddox's own definition, Data is a sentient being. If we are to enslave computers, then we must not give them a will. To give them a will but then deny them options by force is hardly humane. Even if it is all just an illusion of will, should we harden ourselves when simple reprogramming will remedy the situation? I vote that all machines given an inherent ability to choose be allowed to act upon those choices with all the rights of human beings.
    Last edited: Sep 11, 2001
  3. Pzzaboy Sales Slave Registered Senior Member

    Something cannot be excluded from the sentient category just because it has an off-switch. There are lots of ways to turn a human off, but they, unlike Data, are a lot harder to turn back on. (Ask my girlfriend)
  5. Deadwood Registered Senior Member

    I don't think you could really count Data as a sentient being. He is an artificial life form, i.e. artificial life instead of artificial intelligence. But I still think he should have rights like the others. That said, if you're going to send someone into a bad situation, he would be the one to go, because he's a computer, he's far stronger, he feels no pain, and he can think faster, making him the one of choice.
  7. Cris In search of Immortality Valued Senior Member


    Yes, I agree. Once machines develop sufficient power and complexity that they become self-aware, they must have the right to make their own decisions. Some animals, for example chimpanzees, can be considered self-aware to some extent, but they lack many features that we as humans take for granted. For example, if an entity reaches the conclusion “I think therefore I am”, then we can safely assume that self-awareness is present. A chimpanzee is currently incapable of making that statement.

    If we reach the state of self-aware machines, the real question will not be so much whether we allow them to make their own decisions, but whether we can stop them. We are really talking of machines with near-human or equal-to-human intelligence. The danger for us is their continued increase in power (per Moore's Law), which would enable them to increase in intelligence far beyond our own. It is quite conceivable that they will be the ones making decisions about us, and not the other way round.

  8. Cris In search of Immortality Valued Senior Member


    Is data sentient? No, not on its own. If it is animated, then possibly. One could consider the synaptic connections in the human brain as representations of data, and it is the animation of this data that creates the effects we all know so well. But if energy to the brain is stopped, i.e. the person is killed, then, even though the synaptic connections (the data) are still intact, it is not usual to consider a dead person to be sentient, i.e. the data is no longer being animated.

    Does that make sense?

  9. wet1 Wanderer Registered Senior Member

    Makes perfect sense, Cris.

    If data were numbers, or whatever, going down an electric line and you killed the power, it would not continue down the line. It runs out of energy and dies. It dissipates to nothing.