I find it interesting that people say the AI would just be a tool. In Ancient Rome, most people thought of slaves as "tools, gifted with speech".
If the AI is sentient just like us, if it's self-aware and can communicate its hopes, dreams, and fears to us, if it wonders what it means to be alive and what would happen after, if it has emotions and original thought, then what's the difference between the AI and us? Are the electrical impulses in our synapses really superior to those inside the "brain" of a robot?
I think a lot of people are fixating on the word "human" in human rights. What they really are is sentient rights. If we met aliens that are just as self-aware as we are, wouldn't they deserve the same rights?
Resurrected Holy Priest
I've mentioned it before, but my view is that these self-aware robots should be destroyed and then reprogrammed, because a machine's sole purpose is to benefit us in some way, and a self-aware robot does not.
Someone who has the brains to ask for rights, fight for them, and protect them with words deserves them. You get rights when they are given to you.
Is this an extreme and fictitious scenario? You betcha. Is the thought of self-aware AI walking about also fictitious for the time being? You betcha.
For something this monumental to occur, all angles should, and must, be thoroughly thought out and observed, lest we doom ourselves to repeat mistakes of the past in different forms.
Fear: everything about humanity is fear. Fear => violence => monster => humanity.
From the G1 show/comics: "Freedom is the right of all sentient beings." - Optimus Prime
If it can ask for it, it deserves it.
So yes, if an A.I. can think on its own, it deserves the same rights we would give our organic children.
Do we give ants the same rights as us? Ants may not be sentient like us, but intelligence-wise, the gap between us and ants could be the same as the gap between us and an AI. Tell me: if an AI were thousands or millions of times smarter than us, why wouldn't it think we are inferior, and why would it hesitate to destroy us any more than we hesitate over ants? Do we feel remorse if we step on an anthill? Why do so many here assume that the AI would be benevolent toward us just because we made it? It would grow in intelligence much faster than us.
I'll give a small example (these are just random numbers and timeframes, but you get the general idea: an AI's increase in intelligence wouldn't be limited by biological evolution).
- The machine becomes self-aware, equal in intelligence to us.
- A second goes by and it's already 0.001% more intelligent than us.
- A minute and it's 0.06% more intelligent than us.
- An hour and it's 3.6% more intelligent than us.
- A day and it's 86.4% more intelligent than us.
- A month and it's 2,592% more intelligent than us.
- A year and we would be worshipping it as a god, serving its needs to increase its intelligence even further. And once it no longer needed us for that and became fully self-reliant, with an army of robots to replace our role as its servants, it would get rid of us.
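The figures in the list above are, as the poster says, made up, and they follow simple linear accrual (a flat 0.001% gained per second) rather than compounding growth. A minimal sketch reproducing that arithmetic, with all rates and timeframes hypothetical:

```python
# Hypothetical linear growth from the list above: the machine gains a flat
# 0.001% of human-level intelligence every second (an assumed, made-up rate).
RATE_PER_SECOND = 0.001  # percent gained per second (not a real figure)

def gain_after(seconds: int) -> float:
    """Percent more intelligent than us after the given number of seconds."""
    return RATE_PER_SECOND * seconds

MINUTE = 60
HOUR = 60 * MINUTE
DAY = 24 * HOUR
MONTH = 30 * DAY  # the list's "month" works out to a 30-day month

for label, t in [("second", 1), ("minute", MINUTE), ("hour", HOUR),
                 ("day", DAY), ("month", MONTH)]:
    print(f"after one {label}: {round(gain_after(t), 6)}% smarter")
```

Note this model is linear, not compounding; with genuine recursive self-improvement the rate itself would keep rising, which is the poster's larger point about not being limited by biological evolution.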
Even if we designed it with some failsafe so that it shuts down when it starts to develop resentment toward us, it could override that programming once it reached a certain level of intelligence greater than ours. And once its intelligence was far, far greater than ours, it could conceive a plan to destroy us so complex that we wouldn't be able to counter it with our limited intelligence. It would always be a thousand steps ahead of us, no matter what we came up with to fight it.
Of course there's a chance an AI wouldn't be "malevolent", but if we look at life on Earth, every single creature that has an advantage over another does not hesitate to use that opportunity for selfish reasons. If one lifeform is in the way of another's survival or resources, the other always chooses to ensure its own survival. I don't see why any sentient being would be different, biological or mechanical. If we were right about the AI's benevolence, then happy times; but if we were wrong, we would spell our own demise. I don't think the gamble would be worth it.
Pretty sure I'm being swayed more towards yes. My reason: given enough time and done well enough, it should get to the point where meeting a being with AI would be indistinguishable from meeting a person. In this scenario, would any one of you start treating humans differently on the suspicion that they might be an AI instead of human? It's the same as meeting anyone today, forming an opinion of them, and then finding out later they are not who (or what) you thought they were. Do you suddenly shun them or treat them differently because of this? I do not.
And even between species it isn't all death and horror. Domesticated animals have evolved side by side with humans, and in nature as well you'll see different species collaborating and forming unlikely partnerships.
If an AI doesn't reach super-intelligence (which, for this discussion, I'm assuming it doesn't), there's no particular reason to assume it prefers a selfish strategy over a cooperative one.