Robots should have the same right to kill humans (and other robots) as humans have. Which is to say, none at all.
As a rule, I try to act on the internet as I would in real life. If I have offended you, feel free to point it out. Unless I meant to offend you, I will probably apologize.
It's artificial. Not natural.
No, because you gave them the sense of feeling and emotion... just don't program them to feel; problem solved. I just can't see any use in making a robot that has feelings (emotional or physical).
This will be a point of major contention in the future should AI develop to that point. I suspect that we'll have aliens before AI so maybe replace "human rights" with "living rights."
And honestly it depends. Most of the rights granted to organics would likely be superfluous and useless to a self-aware AI. An AI would likely need its own set of rights, drafted in conjunction with organic rights: something to prevent discrimination and promote sociocultural integration, but framed in terms relevant to the organism in question.
So is this topic in preparation for when Skynet takes over and some hippies are protesting that we shouldn't fight them, even though Terminators are running rampant and killing everyone?
Read R.U.R. by Karel Čapek. Oh look, someone thought the very same thing you did, only almost 80 years ago... edit: 91 years ago.
Taken from Wikipedia:
The play begins in a factory that makes artificial people, made of synthetic organic matter, called "robots". Unlike the modern usage of the term, these creatures are closer to the modern idea of androids or even clones, as they can be mistaken for humans and can think for themselves. They seem happy to work for humans, although that changes and a hostile robot rebellion leads to the extinction of the human race.
Human rights are nothing but another virtual concept created in order to set a standard for living conditions in our societies.
As the world changes, we see more and more laws and regulations online. If AI - or Artificial Sentience, which might be a far more important thing to discuss - gets to the point where it is so complex and advanced that it has its own society, then that society will need rules and regulations too. Basic Sentient Rights might not be that far off at all.
Or let me put it another way; why should a sentient intelligence of any sort do anything we asked it if we treated it like a slave, like an animal, and it was able to understand that? No, there can be no doubt. If we create a sentient intelligence that intelligence must have rights. Or we're asking for problems.
And also... are you some sort of god or higher being? If not, who are you to decide what is natural and what has the right to exist? You can't even say for sure that we are natural. We're natural because we're made of flesh and blood? Then what if we're able to create artificial sentient beings in a biological form? They shouldn't have rights because they weren't natural? Not to mention artificial insemination.
Despite being the theme for a lot of sci-fi novels and movies, AI is set to carry out a function. It follows logic: if this, then that. Humans are motivated by evolutionary instincts such as the need to procreate, the need for shelter, and the need for sustenance to survive. We have evolved to be survivors; a machine can't care whether it lives or dies because it hasn't evolved. It has simply been designed by someone who has evolved with these evolutionary characteristics.
In hindsight, I think that if one believes a machine can adapt and survive the same way a human can despite being created by another intelligence, then by the same reasoning human beings would have had to have been created by another intelligence, i.e. Intelligent Design.
So I don't really think that machine intelligence would deserve "human" rights just because they are sentient. Mind you I don't think that there is anything particular special or sacred about humans either, and in a broad sense humans are essentially robots made from organic materials. But our needs and wants would be so different that I'd say machine life would require its own set of rights. Not that I think it'll be a good idea to create sentient life at all in the first place, but yeah.
Yes, because have you seen what happens in every game or movie when we don't treat them like people? They turn and kill us... so yeah, treat them nicely.

Besides movies and games, I just think that if they're developed enough to think for themselves, why not? I doubt it would cause more harm than not doing so.
Nope. Human rights only apply to humans. Not saying they don't deserve some rights/laws to protect them, but not human rights. As far as we know, animals are self-aware and "free thinking" and they don't get human rights. Animal rights sure, but again, no human rights.
And about that human-level AI: it wouldn't stay like that forever, like I pointed out. This isn't just some random rambling of my own. It's a generally accepted view among scientists that an AI would grow in intelligence faster than a biological creature.
Altruism usually only exists between species that are not intelligent, or of equal intelligence, or the same species. Humans are at the top of the food chain at the moment; how many species do we cooperate with? An AI would benefit from us, no doubt... up to a point. But when it became self-sustaining, why would it want us around? We would be competing with it for resources.

Altruism still hasn't died out in evolution, because it helps the survival of the species. Evolution and biology hardly care about the individual organism. So no, not every lifeform is completely selfish. Pure selfishness isn't a good evolutionary strategy for any animal that lives in a group, despite what a cynical outlook on life might tell you. Cooperation is a very good survival strategy.
Domestication isn't a partnership. We don't need cats and dogs. We like them and think they're cute, but we don't need them to survive. Sheep and cows etc. we do need, but we don't cooperate with them; we just use them for food and clothing. It's a one-way "symbiosis": the other side doesn't benefit from us at all, and they don't get to voice their opinion about this "relationship" of ours.

And even between species it isn't all death and horror. Domesticated animals have evolved side by side with humans, and in nature as well you'll see different species collaborating in unlikely partnerships.