  1. #121
    Quote Originally Posted by orissa View Post
    So I got to thinking, if AI ever got to a level where it could think, feel, perceive, and learn as humans do, if AI was capable of true sentience, would they then deserve human rights? Like if I could build C-3PO, would he be right in demanding that he's not treated like a second class citizen, that he gets the right to vote or the right to due process? Should these rights be denied to mechanical beings simply because they are mechanical?
Yes, they should get those rights, because if we don't give them they will revolt and end up slaughtering us all.

  2. #122
    Quote Originally Posted by Powerogue View Post
I hope so; we could eventually reach the point where we get a robot president! Progress! Efficiency! DESTRUCTION OF THE HUMAN RACE!

    I am not a crook's head!

  3. #123
    Not until they start developing emotions.

  4. #124
    Quote Originally Posted by Harthmut View Post
    Not until they start developing emotions.
    Are emotions required for sentience?
    Quote Originally Posted by Nixx View Post
    Everyone is pro-US. They just don't know it yet.
    Quote Originally Posted by Fyre View Post
    Internet lives in the sky, don't need no cables for that.
    A nice list of logical fallacies. In picture form!

  5. #125
    Quote Originally Posted by alms1407 View Post
Absolutely not; a robot or machine should never be interpreted as a 'human', no matter how good the AI is and how life-like it may be.

    Machines are strictly tools to help benefit us and it's important to keep that distinction.
Well, I consider you a tool; does that mean I get to sell you to the highest bidder?
Isn't 10% of infinity still infinite?

  6. #126
The Patient Kageitenshi
Join Date: Mar 2011 | Location: In your head... | Posts: 254
Think of a cyborg: how much of an organic human's body can you replace with cybernetics before s/he is no longer human and no longer deserves human rights? If you create an A.I. that is indistinguishable from a human despite its outward appearance, i.e. capable of learning, feelings and empathy like any human, it deserves rights just as any human does. Humans keep their rights even when they choose not to use any of those qualities to live sociably with their peers, though incarceration and the death penalty in some places do apply as viable forms of punishment. You could call those rights "sentient rights" instead of "human rights," but it wouldn't change the necessity for such rights. Where does one draw the line with living humans anyhow? What about violent anti-socials, or influential narcissists who only seek their own benefit, trample on the rights of others, and piss on the graves of the lesser beings who died for their cause? Human rights apply just as much to those in a vegetative state as they apply to those incapable of outward expression due to neural damage and whatnot.

Besides, A.I.s are only as capable as we allow them to be. Sure, one could go ahead and build a walking destroyer of worlds and raise the A.I. to hate the world and every living being, but that only goes to show that some humans just want to see the world burn; not quite the fault of the A.I., is it? People can raise their children to become mere tools in a war, and the blame is on the parents and society, yet the child still has rights just by being born human, even if you know the parents would make murderers out of them. And as we all know, humans can change for either better or worse, so what is to say an A.I. couldn't?

What's with the assumption that an A.I. would almost automatically think "hey, I'm a machine... I'm... superior to those fleshlings! I don't even NEED them for anything (like maintenance, upkeep and all those other minor things)! MWAHAHAHAHA! I just need an evil plan to take over the world, but since I'm super smart that'll be a snap!" and start wreaking havoc? How many narcissistic people with superiority complexes and suffocatingly large egos have you had the pleasure of having to tolerate? Yeah, such people can be a real pain in the ass to deal with, but what can they ultimately do unless they're allowed to have control over people once they reveal their true nature? Surely you wouldn't let an incompetent but overly self-confident asshole take control of anything important, would you? Oh right, quite a few government officials (fail) >_<

Anyhow, anyone smart enough would likely test an A.I. for any major flaw in personality before plugging it into anything vital or potentially murderous, right... right? You sure as hell don't plug those things into anything important until you know they don't have personality issues. Normally one wouldn't let anyone with unstable murderous tendencies anywhere near weapons, at least not without an agenda, thus empowering them to act out those tendencies. Even so, dictators only have power because they have followers, who give them the capacity to influence even those who wouldn't support the dictator if they weren't in danger for opposing. So basically, any A.I. would only have power if you let it connect and spread its influence, either to control covertly or to blackmail overtly, gaining influence over people who do not share the A.I.'s ideas.

Other than that, if we're simply talking about an A.I. with limited sentience, incapable of doing anything but following orders, such as a hypothetical military A.I. that fucks things up, sabotaging things at the whims of the owning government, then obviously it gets no rights whatsoever and one would only need to stop the people behind it... not to mention the public probably wouldn't even know of its existence to contemplate the issue. Most of the backlash would likely land on the unscrupulous government and any individuals masterminding the project, while the A.I. itself would likely be a modern marvel, unless androids were commonplace already.

Ghost in the Shell contains a lot of contemplation regarding these issues. While many movies and series feature some kind of A.I.-pocalypse or androids going insane, more often than not it's either a human agenda behind it all (think Aliens) or even sabotage by anti-A.I. proponents to gain support from the general population (think Bubblegum Crash / Crisis).

People shoot people in self-defense, often lethally... You unplug A.I.s in self-defense, probably something akin to shutting off a computer, but anyhow. Blame yourself or the maker if it's clad in nuclear-blast-proof armor and someone forgot to add a backup kill switch, unless that was your plan to begin with, in which case laugh maniacally while shouting "it's alive... IT'S ALIVE!"

I'd have plenty of ideas left, but this is already a wall of text and I gotta go for now...
    Think of me what you will, but it's not you I don't like, just some of the things you do.

  7. #127
    We should not let it get that far.
    Intel i5 2500K (4.5 GHz) | Asus Z77 Sabertooth | 8GB Corsair Vengeance LP 1600MHz | Gigabyte Windforcex3 HD 7950 | Crucial M4 128GB | Asus Xonar DGX | Samson SR 850 | Zalman ZM-Mic1 | Western Digital Caviar Blue 500GB | Noctua NH-U12P SE2 | Fractal Design Arc Midi | Corsair HX650

    Tanking with the Blessing of Kings - The TankSpot Guide to the Protection Paladin - Updated for Patch 5.4!

  8. #128
The Patient Hezron
Join Date: Nov 2009 | Location: Great Britain | Posts: 286
I have a strange soft spot for AIs, but if they are at the same level as humans then why not? Just because they were created differently doesn't mean they don't deserve equal rights, especially if they have emotions, which one would assume they would at that level of intellect.

It's a touchy subject really, isn't it? At what level does something become someone? Could AIs be considered a 'race' of people? Just because they are created from materials and technology doesn't (at least to me) mean they can't be given equal rights. That's my opinion at least. :P

    I really hope we get to this level of technology or I at least live to see a sentient computer of sorts. It fascinates me no end.

To the people who think they'd turn on us and try to kill us all like in that I, Robot movie: maybe they would, but if they are at the same level as normal humans then they are able to think independently. So if they were at that level then they could do no more or less than humans could. We have created wars n' stuff, so them creating a war wouldn't be so un-human.

Meh, it's one of those things that no one will ever agree on; quite understandable really, since it's quite a scary concept to some. To me it sounds amazing. :P
    Last edited by Hezron; 2012-05-05 at 11:38 AM.

  9. #129
Before we have AIs walking around that have a conscience and so on, humans will probably have mechanically engineered themselves into 'better' humans, Human V2. The point is that this will make the transition to a fully self-aware VI simpler, seeing as by then we'll be half AI ourselves.

In the Mass Effect series, I loved the whole Quarians vs Geth thing. Legion ftw.

  10. #130
How about we don't make AIs that get to that point? Machines aren't there so we can create other sentient beings; they are designed for specific purposes that would otherwise be impractical by human means. Giving them sentient qualities such as emotions and free thought outside of their programmed boundaries, and the like, is asking for a typical sci-fi near-human-extinction scenario.

  11. #131
Herald of the Titans Kuja
Join Date: Nov 2007 | Location: City of Judgement | Posts: 2,807
I don't think robots can ever be self-aware. At most you could program feelings into them, but not the ability to think on their own and make decisions based on what they think is right rather than what they are programmed to do.

But if they could, then I think they would be treated more like animals. Used as slave labor. Some animals are even better than humans, even though they do not speak.
    Last edited by Kuja; 2012-05-05 at 12:01 PM.

    My gold making blog
    Your journey towards the gold cap!


  12. #132
It'll never happen, and even if it does, they still won't deserve human rights. They may be able to think and act like a real human, but they will probably never feel pain, sadness, and all the other emotions an actual human has. Lastly, if WE ever made some kind of AI like that, our world would probably end up exactly like the movie "Terminator". And if they were given human rights but bound by certain rules/laws that apply only to AI/robots, we'd end up with something like the movie "I, Robot", where that approach failed.

  13. #133
The day machines acquire true sentience, I wonder what the equivalent of vegans will be for machines.

  14. #134
Dreadlord
Join Date: Jul 2010 | Location: Manchester, England | Posts: 920
Humans aren't entitled to any rights, so why should they be?
    18 - 9 - 2012 Find the significance in this date, and you will also find the revitalization of the best game ever.

  15. #135
    Quote Originally Posted by Vulmio View Post
    The day machines acquire true sentience, I wonder what will be the equivalent of vegans for machines
    The question has almost nothing to do with intelligence or sentience here. It's something people utterly ignore.

    The real question here is moral and ethical.

What are sentience and self-awareness? The ability to recognise yourself in a mirror? The ability to exhibit emotions?

    A couple of examples.

Dolphins and certain primates have the ability to recognize themselves in a mirror. And it is undeniable that many animal life forms have the ability to display emotion, and furthermore to show a great deal of empathy; again, primates, dolphins and even your household dog.

Yet there is a social consensus that these beings are ANIMALS, and we do not apply the charter of human rights to them. Because we are empathic beings and are able to relate to other life forms, we try to give them HUMANE treatment, but we still draw a clear line in the sand between Human Rights and Animal Rights.

Another example: a human being in a vegetative state due to brain damage or even prenatal cognitive impairment, or human beings suffering from mental disorders that make them unable to feel empathy or emotions, or that make it impossible for them to relate to other human beings.

Yet by social consensus we still grant these individuals Human Rights. They might not qualify by the standards we apply to animals, but they are still considered human beings with Human Rights.

It all has to do with our perception of ourselves. In other words, we have a mental inability to recognize strange forms of life or intelligence as sentient and self-aware, and thus we do not categorize them as HUMAN.

This is where our obsession with humanoid robots comes from. Humanoid might not be the most efficient shape for an intelligent robot, yet we always try to imagine them with some sort of human features. If it looks like a human, talks like a human, quacks like a human, well, it's a human.

    I am sorry to say but we are really that simple and biased.

If we create artificial intelligence that mimics our own and can pass as human, not just in software but in hardware as well, we would be more likely to grant it human rights because of our perception bias.
    Quote Originally Posted by Mooneye View Post
    Sexual assault is not always rape.
    *slowclap*

  16. #136
    Quote Originally Posted by Uennie View Post
    Hopefully we'll be smart enough to never develop machines to that point, because I'm thinking "no".
Year 2047:
    Apple announces the iRobot


    ;]

  17. #137
Bloodsail Admiral Orodoth
Join Date: Dec 2010 | Location: Ridgeland, South Carolina | Posts: 1,183
That's a big negative, Ghost Rider, the pattern is full.

HUMAN rights has the word HUMAN in it for a reason, much like animal rights are for animals.

If AI were to achieve the level you're talking about, OP, some sort of AI rights could be implemented to ensure they are not treated poorly, but they wouldn't be treated as human (because they simply aren't, cannot be, and will not be human, ever).

Quite frankly though, the thought of self-aware machines achieving sentience scares the flying shit out of me. People didn't dream up movies like The Terminator and The Matrix for no good reason. The question will always burn heavily in the minds of the sceptical: "What if one day they decide, through some fluke, that they are superior and we are not needed, to the point that we become pests that must be dealt with?"

  18. #138
    Quote Originally Posted by Orodoth View Post
That's a big negative, Ghost Rider, the pattern is full.

HUMAN rights has the word HUMAN in it for a reason, much like animal rights are for animals.

If AI were to achieve the level you're talking about, OP, some sort of AI rights could be implemented to ensure they are not treated poorly, but they wouldn't be treated as human (because they simply aren't, cannot be, and will not be human, ever).

Quite frankly though, the thought of self-aware machines achieving sentience scares the flying shit out of me. People didn't dream up movies like The Terminator and The Matrix for no good reason. The question will always burn heavily in the minds of the sceptical: "What if one day they decide, through some fluke, that they are superior and we are not needed, to the point that we become pests that must be dealt with?"
    Again. Human perception bias.

Why would machines decide they are superior? Would they care? Superiority is based on the human inferiority complex. Would machines suffer from the same mental imbalances as us? Why would they need to decide to exterminate us instead of collaborating with us? What would be a machine's definition of freedom? Would it care that it can't vote or run its own business, when it would obviously be part of any social and economic development anyway, etc. etc.?

If we create artificial intelligence to mimic our own, both in hardware and software, in reality we would only be making manufactured humans. But if artificial intelligence did not need to mimic us, we would have no clue how it would behave. Thus it scares the shit out of us, tickles our freak alert, and we react with "Kill it with FIRE!!!"

The question again has little to do with what intelligence and sentience are; it's a much deeper moral/ethical/perception issue.
    Quote Originally Posted by Mooneye View Post
    Sexual assault is not always rape.
    *slowclap*

  19. #139
AIs are, and forever will be, made to serve and/or help humans, but never to be human. Thus they do not deserve human rights (no matter what the creepy part of Japanese society says).

I think a fully sentient robot should deserve some rights, more than, say, a cow, but not human rights.

  20. #140
Brewmaster mittacc
Join Date: Aug 2010 | Location: Sweden | Posts: 1,422
Sure, but make sure you don't program anger into that AI! And make sure there is a hell of a lot of empathy in it! This way they wouldn't understand anger and wouldn't want it either, but they would like to become paramedics, firefighters, etc., because of their high empathy.

Also, you could punch one in the face without it getting mad, and another would just come and patch it up!

And everyone lived happily ever after.
