Mind uploading

Discussion in 'General Philosophy' started by orcot, Jun 12, 2015.

  1. orcot Valued Senior Member

    Messages:
    3,488
    Assuming it would be possible to upload your mind, and you end up with your original (aging) biological self and an immortal digital self: would it be moral for the biological self to make a later copy of itself that overrides the digital self, and the experiences it has had until then, destroying the former digital self? Or, assuming the biological self becomes disabled and dependent on the digital self, at what point could the digital self decide to kill the host (simply by not supporting it)? Ranging from just elderly to basically living biological matter incapable of thought.
     
  3. Jake Arave Ethologist Registered Senior Member

    Messages:
    165
    What are the ethics of terminating a sentient being? I don't think the 'mind uploading' portion is necessary.
     
  5. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    I think it would be downright obligatory, like updating software on your computer.
     
  7. orcot Valued Senior Member

    Messages:
    3,488
    Even if the software is sentient in its own right? I'm sure you could make an argument if the data is just in storage and not sentient. However, people might not trust the copy process and would want to see the copy functioning to be sure it's an accurate (enough) copy.
    The moment this copy becomes capable of doing actual labor and of providing for its own (and perhaps your) existence, greed could make these digital selves run parallel to their human counterparts.
    Meaning the program could object to its "upgrade".

    Also, let's say you have a copy of yourself from your 40s, when you were relatively in your prime, an expert at your job, and happily married. Would you destroy that data for a copy of yourself from your 60s, when you're divorced and your brain is starting to get sluggish? Assuming that is the copy you will spend eternity with.

    Also, assuming you just make multiple copies, how odd would it be to be at your own funeral with multiple copies of yourself (all at different ages) gathered? Knowing myself, it would end in a fight.
     
  8. C C Consular Corps - "the backbone of diplomacy" Valued Senior Member

    Messages:
    3,324
    What would be the point of having either an embodied or a VR information duplicate if it lacked similar rights, and freedom from being "owned property", that the original person had? One might contend that any continued parallel life is better than none at all, or that minimal contractual slavery (restricted to just the updating) is alright if the duplicate voluntarily agrees to it. Of course, even in the latter scenario, the human or simulation might have second thoughts later about the deal, the longer it lives and acquires its own individual experiences.

    The ectype having the authority to eliminate or spurn its disabled / impaired archetype at some point implies a flip of the above situation, where the original person is who winds up on the receiving end of compromised rights. Again, difficult to imagine accepting that consequence, though in particular cases the archetype might feel s/he can trust the ectype as much as an offspring to make the right choices.

    Literal realization of the thought experiment inevitably rests upon whether or not such a complete and accurate degree of information can be extracted from the biological structure and chemical operations of a nervous system, along with replicating the contributing and contingent effects of various substances in the bloodstream, as well as the grounding of just being housed in such a natural body which interacts with its environment. It seems a highly unlikely feat, though at some point we'll become transhuman cyborgs where a good part of our thoughts and behavior will stem from the replacement, influence and capacities of implanted technology anyway. The latter already in uploadable form.
     
  9. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    How can software be sentient?
     
  10. Kristoffer Giant Hyrax Valued Senior Member

    Messages:
    1,364
    Have you learned nothing from the Skynet incident?

    Dunno about you, but I'm sick of being overrun by T-800 model 101's.
     
  11. orcot Valued Senior Member

    Messages:
    3,488
    In this case it would be a sort of Turing test, but instead of convincing you that it is human, it convinces itself that it is you. You would release this software into a virtual world and it would act as you would. A scientist, given time to study this program at its conception along with accurate brain scans of all the people in the world, should be able to pick the right one out.
     
  12. Kristoffer Giant Hyrax Valued Senior Member

    Messages:
    1,364
    That would require the "brain transplantee"(?) to have been hooked up to a scanner in an untold number of scenarios. Even in the most basic of games, people will make different choices.
     
  13. orcot Valued Senior Member

    Messages:
    3,488
    It would take something that is impossible with today's technology. That said, it could be done; but the how isn't that important for the general philosophy of it. You can find the proposed methods by googling mind uploading.
     
  14. danshawen Valued Senior Member

    Messages:
    3,951
    If you are fortunate enough to outlive many of your friends and acquaintances, ultimately you will realize that such patterns as are left, including most of your knowledge, experience, prejudices, obsessions, misconceptions, etc., really make very little sense to immortalize. And any desire a young person may feel to be immortal will eventually pass as well. In this respect, we may be thankful that our creator was so much wiser than we would be if we lived for ten times the current normal human lifespan. I haven't seen or experienced anything on this small rock that would make me wish to do so indefinitely. I'd want an automatic death (or death-and-save) switch or implant installed in any replicants of myself, one that could not be removed without activation.

    No doubt the inspirational thought for Blade Runner.
     
  15. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    No, I haven't.


    I had to google it to find out what the hell it was.
     
  16. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    So I'm back to where I started. If the software is "fake sentient", there's no moral obligation to "respect" it; it should be treated like any other software. If you can update it, update it.
     
  17. orcot Valued Senior Member

    Messages:
    3,488
    With the proper knowledge, it's pretty obvious you could also reprogram humans. A famous example was injecting prisoners with cow blood to give them some of the docile nature of cows. A couple of years ago there was talk about a "gay bomb". People are fascinated by this sort of stuff, and failure is just a temporary thing. Eventually someone will succeed; there's nothing particularly holy about the human brain, and people are fascinated by it: implanting false memories, removing real ones, curing undesirable character flaws. It's scary, but someone will eventually figure it out.
    link
    That doesn't mean they should, though.
     
