No, because you gave them the sense of feeling and emotion... just don't program them to feel... problem solved. I just can't see any use in making a robot that has feelings (emotional or physical).
This will be a point of major contention in the future should AI develop to that point. I suspect that we'll have aliens before AI so maybe replace "human rights" with "living rights."
And honestly it depends. Most of the rights granted to organics would likely be superfluous and useless to a self-aware AI. An AI would likely need its own set of rights drafted in conjunction with organic rights. Something to prevent discrimination and promote sociocultural integration but each in relevant-for-the-organism-in-question terms.
So is this topic in preparation for when Skynet takes over and some hippies are protesting that we shouldn't fight them, even though terminators are running rampant and killing everyone?
Read R.U.R. by Karel Čapek - Oh look, someone thought the very same thing you did, only almost 80 years ago... edit: 91 years ago.
Taken from Wikipedia:
The play begins in a factory that makes artificial people, made of synthetic organic matter, called "robots". Unlike the modern usage of the term, these creatures are closer to the modern idea of androids or even clones, as they can be mistaken for humans and can think for themselves. They seem happy to work for humans, although that changes and a hostile robot rebellion leads to the extinction of the human race.
Not a single thing. Humans haven't got rights by nature. We defined those rights ourselves, and they only exist as long as there is a majority of people to support and enforce them.
Human rights are nothing but another virtual concept created in order to set a standard for living conditions in our societies.
As the world changes, we see more and more laws and regulations online. If AI - or Artificial Sentience, which might be a far more important thing to discuss - gets to the point where it is so complex and advanced that it has its own society, then that society will need rules and regulations too. Basic Sentient Rights might not be that far off at all.
Or let me put it another way: why should a sentient intelligence of any sort do anything we asked it if we treated it like a slave, like an animal, and it was able to understand that? No, there can be no doubt. If we create a sentient intelligence, that intelligence must have rights. Or we're asking for problems.
Bad breath, farts, body hair, cancer, HIV, and hairy armpits. Earthquakes, tsunamis, asteroid crashes. All natural things. Not all natural things are good. We are humans. We thrive on the unnatural. Our entire society is made of artificial objects. Even a lot of our food. It can be argued there isn't a single natural cell in our bodies anymore.
And also... are you some sort of god or higher being? If not, who are you to decide what is natural and what has the right to exist? You can't even say for sure if we are natural either. We're natural because we're made of flesh and blood? Then what if we're able to create artificial sentient beings in a biological form? They shouldn't have rights because they weren't natural? Not to mention artificial insemination.
Despite being the theme of a lot of sci-fi novels and movies, AI is set to carry out a function. It follows logic: if this, then that. Humans are motivated by evolutionary instincts such as the need to procreate, the need for shelter, and the need for sustenance to survive. We have evolved to be survivors; a machine can't care if it lives or dies because it hasn't evolved. It has simply been designed by someone who has evolved with these evolutionary characteristics.
In hindsight, I think if one feels that a machine can adapt and survive the same way that a human can, and was created by another intelligence, then human beings would have had to have been created by another intelligence, i.e. Intelligent Design.
Machine life would by its very nature be significantly different from biological life. There is no reason to assume that AI will be able to feel as humans do. Organic beings have emotions and instincts because that is the way life has evolved (on this planet). We feel pleasure or fear or sadness because of our brain chemistry, and not simply because of our awareness. It does not come automatically with achieving sentience. An AI that becomes self-aware isn't going to suddenly acquire self-preservation instincts, or compassion, or murderous intents, much less a desire to vote.
So I don't really think that machine intelligence would deserve "human" rights just because it is sentient. Mind you, I don't think there is anything particularly special or sacred about humans either, and in a broad sense humans are essentially robots made from organic materials. But our needs and wants would be so different that I'd say machine life would require its own set of rights. Not that I think it'd be a good idea to create sentient life at all in the first place, but yeah.
Yes, because have you seen in every game or movie, when you don't treat them like people they turn and kill us... so yeah, treat them nice.
Besides movies and games, I just think if they are developed enough that they can think for themselves, why not? I doubt it would cause more harm than not doing so.
Nope. Human rights only apply to humans. Not saying they don't deserve some rights/laws to protect them, but not human rights. As far as we know, animals are self-aware and "free thinking" and they don't get human rights. Animal rights sure, but again, no human rights.
"I am the hope of the universe. I am the answer to all living things that cry out for peace. I am the protector of the innocent. I am the light in the darkness. I am truth. Ally to good! Nightmare to you!!!"
You didn't quite understand my point. I didn't assume it would be automatically evil. I only suggested it would protect its own interest over anything else, just like we do. Do we consider ourselves evil when we step on an anthill?
And about that human-level AI: it wouldn't stay like that forever, like I pointed out. This isn't just some random rambling of my own. It's a generally accepted view among scientists that an AI would grow in intelligence faster than a biological creature.
Altruism usually only exists between species that are not intelligent, or of equal intelligence, or the same species. Humans are at the top of the food chain atm; how many species do we cooperate with? An AI would benefit from us, no doubt... up to a point. But when it became self-sustainable, why would it want us around? We would be competing with it for resources.

Altruism still hasn't died out in evolution, because it helps the survival of the species. Evolution and biology hardly care about the individual organism. So no, not every lifeform is completely selfish. Pure selfishness isn't a good evolutionary strategy for any animal that lives in a group, despite what a cynical outlook on life might tell you. Cooperation is a very good survival strategy.
Domestication isn't a partnership. We don't need cats and dogs. We like them and think they're cute, but we don't need them to survive. Sheep and cows etc. we do need, but we don't cooperate with them; we just use them for food and clothing. It's a one-way "symbiosis". The other side doesn't benefit from us at all. They don't get to voice their opinion about this "relationship" of ours.

And even between species it isn't all death and horror. Domesticated animals have evolved side by side with humans, and in nature as well you'll see different species collaborating in unlikely partnerships.
I have to point out that AI means Artificial Intelligence. It doesn't need a body able to interact with the physical world. If we (I) manage to create a "human-level AI" and we do not give it the means to interact with the world (which would be completely useless; we need an AI, not moving metal bodies), it would:
1) make it unable to compete for resources as you see them
2) create a sort of partnership in which we maintain their physical form intact and they think stuff for us
And please avoid anthropocentrism such as "yeah but they would be ANGRY and would want a body LOLOLOLOL I WATCHED TOO MANY FILMS".
And what would this AI think of us if we placed such restrictions upon its existence?
As for #2: No, this wouldn't be a partnership. It would be one party exploiting the other while giving next to nothing in return.
??? Are you trying to build a straw man? And please avoid anthropocentrism such as "yeah but they would be ANGRY and would want a body LOLOLOLOL I WATCHED TOO MANY FILMS".