  1. #221
    Deleted
    No, because you gave them the sense of feeling and emotion... just don't program them to feel... problem solved. I just can't see any use in making a robot that has feelings (emotional and physical).

  2. #222
    Grizzly Willy
    Quote Originally Posted by Thassarian View Post
    It's artificial. Not natural.

    So no.
    What makes something that's natural deserving of rights compared to something that's artificial?

  3. #223
    Orestis
    Quote Originally Posted by Thassarian View Post
    It's artificial. Not natural.

    So no.
    I agree with this! I will no longer treat people I know who were not conceived naturally like real people! Not natural after all...

  4. #224
    This will be a point of major contention in the future should AI develop to that point. I suspect that we'll have aliens before AI so maybe replace "human rights" with "living rights."

    And honestly, it depends. Most of the rights granted to organics would likely be superfluous and useless to a self-aware AI. An AI would likely need its own set of rights drafted in conjunction with organic rights. Something to prevent discrimination and promote sociocultural integration, but framed in terms relevant to the organism in question.

  5. #225
    So is this topic in preparation for when Skynet takes over and some hippies are protesting that we shouldn't fight them even though terminators are running rampant and killing everyone?

  6. #226
    Read R.U.R. by Karel Čapek - Oh look, someone thought the very same thing you did, only almost 80 years ago... edit: 91 years ago.

    Taken from Wikipedia:
    The play begins in a factory that makes artificial people, made of synthetic organic matter, called "robots". Unlike the modern usage of the term, these creatures are closer to the modern idea of androids or even clones, as they can be mistaken for humans and can think for themselves. They seem happy to work for humans, although that changes and a hostile robot rebellion leads to the extinction of the human race.

  7. #227
    Goldpaw
    Quote Originally Posted by Wells View Post
    It's actually a fairly interesting question. Essentially you have to ask yourself why humans have rights. What is it about humanity that makes us different from animals?
    Not a single thing. Humans haven't got rights by nature. We defined those rights ourselves, and they only exist as long as there is a majority of people to support and enforce them.

    Human rights are nothing but another virtual concept created in order to set a standard for living conditions in our societies.

    As the world changes, we see more and more laws and regulations online. If AI - or Artificial Sentience, which might be a far more important thing to discuss - gets to the point where it is so complex and advanced that it has its own society, then that society will need rules and regulations too. Basic Sentient Rights might not be that far off at all.

    Or let me put it another way: why should a sentient intelligence of any sort do anything we asked it if we treated it like a slave, like an animal, and it was able to understand that? No, there can be no doubt. If we create a sentient intelligence, that intelligence must have rights. Or we're asking for problems.

  8. #228
    If they develop true sentience, then yes, they deserve "human" rights.

    To quote Optimus Prime here: "Freedom is the right of all sentient beings."

  9. #229
    Goldpaw
    Quote Originally Posted by Thassarian View Post
    It's artificial. Not natural.

    So no.
    Bad breath, farts, body hair, cancer, HIV, and hairy armpits. Earthquakes, tsunamis, asteroid crashes. All natural things. Not all natural things are good. We are humans. We thrive on the unnatural. Our entire society is made of artificial objects. Even a lot of our food. It can be argued there isn't a single natural cell in our bodies anymore.

    And also... are you some sort of god or higher being? If not, who are you to decide what is natural and what has the right to exist? You can't even say for sure if we are natural either. We're natural because we're made of flesh and blood? Then what if we're able to create artificial sentient beings in a biological form? They shouldn't have rights because they weren't natural? Not to mention artificial insemination.

  10. #230
    Despite being the theme for a lot of sci-fi novels and movies, AI is set to carry out a function. It follows logic: if this, then that. Humans are motivated by evolutionary instincts such as the need to procreate, the need for shelter, the need for sustenance to survive. We have evolved to be survivors; a machine can't care if it lives or dies because it hasn't evolved. It has simply been designed by someone who has evolved with these evolutionary characteristics.

    In hindsight, I think if one feels that a machine can adapt and survive the same way that a human can, and was created by another intelligence, then human beings would have had to have been created by another intelligence, i.e. Intelligent Design.
    Last edited by Seani; 2012-05-06 at 07:56 AM.

  11. #231
    Quote Originally Posted by orissa View Post
    So I got to thinking, if AI ever got to a level where it could think, feel, perceive, and learn as humans do, if AI was capable of true sentience, would they then deserve human rights?
    Machine life would by its very nature be significantly different from biological life. There is no reason to assume that AI will be able to feel as humans do. Organic beings have emotions and instincts because that is the way life has evolved (on this planet). We feel pleasure or fear or sadness because of our brain chemistry, and not simply because of our awareness. It does not come automatically with achieving sentience. An AI that becomes self-aware isn't going to suddenly acquire self-preservation instincts, or compassion, or murderous intent, much less a desire to vote.

    So I don't really think that machine intelligence would deserve "human" rights just because they are sentient. Mind you, I don't think that there is anything particularly special or sacred about humans either, and in a broad sense humans are essentially robots made from organic materials. But our needs and wants would be so different that I'd say machine life would require its own set of rights. Not that I think it'd be a good idea to create sentient life at all in the first place, but yeah.

  12. #232
    strangebreed
    Yes, because have you seen in every game or movie, when you don't treat them like people they turn and kill us... so yeah, treat them nice.

    Movies and games aside, I just think if they are developed enough that they can think for themselves, why not? I doubt it would cause more harm than not doing so.

  13. #233
    shootyadead
    Nope. Human rights only apply to humans. Not saying they don't deserve some rights/laws to protect them, but not human rights. As far as we know, animals are self-aware and "free thinking" and they don't get human rights. Animal rights sure, but again, no human rights.
    "I am the hope of the universe. I am the answer to all living things that cry out for peace. I am the protector of the innocent. I am the light in the darkness. I am truth. Ally to good! Nightmare to you!!!"

  14. #234
    Quote Originally Posted by Ynna View Post
    This discussion hardly works if you assume an AI would automatically be evil. The premise was human-level AI, using C3PO as an example.
    You didn't quite understand my point. I didn't assume it would be automatically evil. I only suggested it would protect its own interest over anything else, just like we do. Do we consider ourselves evil when we step on an anthill?

    And about that human-level AI: it wouldn't stay like that forever, like I pointed out. This isn't just some random rambling of my own. It's a generally accepted view among scientists that an AI would grow in intelligence faster than a biological creature.

    Quote Originally Posted by Ynna View Post
    Altruism still hasn't died out in evolution, because it helps the survival of the species. Evolution and biology hardly care about the individual organism. So no, not every lifeform is completely selfish. Pure selfishness isn't a good evolutionary strategy for any animal that lives in a group, despite what a cynical outlook on life might tell you. Cooperation is a very good survival strategy.
    Altruism usually only exists between species that are not intelligent, or of equal intelligence, or the same species. Humans are at the top of the food chain at the moment; how many species do we cooperate with? An AI would benefit from us no doubt... up to a point. But when it became self-sustainable, why would it want us around? We would be competing with it for resources.

    Quote Originally Posted by Ynna View Post
    And even between species it isn't all death and horror. Domesticated animals have evolved side by side with humans, and in nature as well you'll see different species collaborating and unlikely partnerships.
    Domestication isn't a partnership. We don't need cats and dogs. We like them and think they're cute, but we don't need them to survive. Sheep and cows etc. we do need, but we don't cooperate with them; we just need them for food and clothing. It's a one-way "symbiosis". The other side doesn't benefit from us at all. They don't get to voice their opinion about this "relationship" of ours.
    Last edited by zorkuus; 2012-05-06 at 10:59 AM.

  15. #235
    Quote Originally Posted by shootyadead View Post
    Nope. Human rights only apply to humans. Not saying they don't deserve some rights/laws to protect them, but not human rights. As far as we know, animals are self-aware and "free thinking" and they don't get human rights. Animal rights sure, but again, no human rights.
    What about dolphins then? They have human rights.

  16. #236
    Ynna
    Quote Originally Posted by Reyzzz View Post
    What about dolphins then? They have human rights.
    Do they? Never heard of that.

  17. #237
    Deleted
    Quote Originally Posted by Reyzzz View Post
    What about dolphins then? They have human rights.
    Not in most western countries.

  18. #238
    Quote Originally Posted by zorkuus View Post
    Altruism usually only exists between species that are not intelligent, or of equal intelligence, or the same species. Humans are at the top of the food chain at the moment; how many species do we cooperate with? An AI would benefit from us no doubt... up to a point. But when it became self-sustainable, why would it want us around? We would be competing with it for resources.

    Domestication isn't a partnership. We don't need cats and dogs. We like them and think they're cute, but we don't need them to survive. Sheep and cows etc. we do need, but we don't cooperate with them; we just need them for food and clothing. It's a one-way "symbiosis". The other side doesn't benefit from us at all. They don't get to voice their opinion about this "relationship" of ours.
    I have to point out that AI means Artificial Intelligence. It doesn't need a body able to interact with the physical world. If we (I) manage to create a "human-level AI" and we do not give it the means to interact with the world (which would be completely useless; we need an AI, not moving metal bodies), that would
    1) make it unable to compete for resources as you see them
    2) create a sort of partnership in which we maintain its physical form intact and it thinks stuff for us

    And please avoid anthropocentrism such as "yeah but they would be ANGRY and would want a body LOLOLOLOL I WATCHED TOO MANY FILMS".

  19. #239
    Quote Originally Posted by Authary View Post
    I have to point out that AI means Artificial Intelligence. It doesn't need a body able to interact with the physical world. If we (I) manage to create a "human-level AI" and we do not give it the means to interact with the world (which would be completely useless; we need an AI, not moving metal bodies), that would
    1) make it unable to compete for resources as you see them
    2) create a sort of partnership in which we maintain its physical form intact and it thinks stuff for us
    And what would this AI think of us if we placed such restrictions upon its existence?

    As for #2: No, this wouldn't be a partnership. It would be one party exploiting the other while giving next to nothing in return.

    Quote Originally Posted by Authary View Post
    And please avoid anthropocentrism such as "yeah but they would be ANGRY and would want a body LOLOLOLOL I WATCHED TOO MANY FILMS".
    ??? Are you trying to build a straw man?

  20. #240
    Quote Originally Posted by zorkuus View Post
    And what would this AI think of us if we placed such restrictions upon its existence?
    There is no reason that this AI would care.

    Not that I think it is a good idea for ethical reasons, but if a truly sentient AI is ever created, there is no reason that it must be able to feel.
