  1. #121
    Deleted
    Think of a cyborg: how much of an organic human's body can you replace with cybernetics before s/he is no longer human and no longer deserves human rights? If you create an A.I. that is indistinguishable from a human despite its outward appearance, i.e. capable of learning, feeling and empathy like any human, it deserves rights just as any human does. Humans have their rights regardless of whether they choose to use any of those qualities to exist sociably with their peers, though incarceration and, in some places, the death penalty do apply as forms of punishment. You could call those rights "sentient rights" instead of "human rights," but it wouldn't change the necessity for such rights. Where does one draw the line with living humans anyhow? With violent anti-socials, or influential narcissists who only seek their own benefit, trample on the rights of others and piss on the graves of the lesser beings who died for their cause? Human rights apply just as much to those in a vegetative state as they do to those incapable of outward expression due to neural damage and whatnot.

    Besides, A.I.s are only as capable as we allow them to be. Sure, one could go ahead and build a walking destroyer of worlds and raise the A.I. to hate the world and every living being, but that only goes to show that some humans just want to see the world burn; not quite the fault of the A.I., is it? People can raise their children to become mere tools in a war, and the blame is on the parents and society; the child still has rights simply by being born human, even if you know the parents would make murderers out of them. And we all know humans can change for better or worse, so what is to say an A.I. couldn't?

    What's with the assumption that an A.I. would somehow almost automatically think "hey, I'm a machine... I'm... superior to those fleshlings! I don't even NEED them for anything (like maintenance, upkeep and all those other minor things!) MWAHAHAHAHA! I just need an evil plan to take over the world, but since I'm super smart that'll be a snap!" and start wreaking havoc? How many narcissistic people with a superiority complex and a suffocatingly large ego have you had the pleasure of having to tolerate? Yeah, such people can be a real pain in the ass to deal with, but what can they ultimately do unless they're still allowed such control over people once they reveal their true nature? Surely you wouldn't let an incompetent but overly self-confident asshole take control of anything important, would you? Oh right, quite a few government officials (fail) >_<

    Anyhow, anyone smart enough would likely test an A.I. for any major flaw in personality before plugging it into anything vital or potentially murderous, right... right? You sure as hell don't plug those things into anything important until you know they don't have personality issues. Normally one wouldn't let anyone with unstable murderous tendencies anywhere near weapons, at least not without an agenda that empowers them to act out those tendencies. Even so, dictators only have power because they have followers, who provide the capacity to influence even those who would not support the dictator if opposing him weren't dangerous. So basically, any A.I. would only have power if you let it connect and spread its influence, either controlling covertly or blackmailing overtly to gain leverage over people who do not share the A.I.'s ideas.

    Other than that, if we're simply talking about an A.I. with limited sentience, incapable of doing anything but following orders, such as a hypothetical military A.I. that fucks things up by sabotaging things at the whims of the owning government, then obviously it gets no rights whatsoever, and one would only need to stop the people behind it... not to mention the public probably wouldn't even know of its existence to contemplate the issue. Most of the backlash would likely fall on the unscrupulous government and any individuals masterminding the project, while the A.I. itself would likely be seen as a modern marvel, unless androids were commonplace already.

    Ghost in the Shell contains a lot of contemplation on these issues. While many movies and series feature something of an A.I.pocalypse or androids going insane, more often than not it's either a human agenda behind it all (think Aliens) or even sabotage by anti-A.I. proponents to gain support from the general population (think Bubblegum Crash / Crisis).

    People shoot people in self-defense, often lethally... You unplug A.I.s in self-defense, probably something akin to shutting off a computer, but anyhow. Blame yourself or the maker if it's clad in nuclear-blast-proof armor and nobody remembered to add a backup kill switch, unless that was your plan to begin with, in which case laugh maniacally while shouting "it's alive... IT'S ALIVE!"

    I'd have plenty of ideas left, but this is a wall of text already and I gotta go for now...

  2. #122
    We should not let it get that far.

  3. #123
    The Patient Hezron's Avatar
    10+ Year Old Account
    Join Date
    Nov 2009
    Location
    Great Britain
    Posts
    295
    I have a strange soft spot for AIs, but if they are at the same level as humans then why not? Just because they were created differently doesn't mean they don't deserve equal rights, especially if they have emotions, which one would assume they would at that level of intellect.

    It's a touchy subject really, isn't it? At what level does something become someone? Could AIs be considered a 'race' of people? Just because they are created from materials and technology doesn't (at least to me) mean they can't be given equal rights. That's my opinion at least. :P

    I really hope we get to this level of technology or I at least live to see a sentient computer of sorts. It fascinates me no end.

    To the people who think they'd turn on us and try to kill us all like in that I, Robot movie: maybe they would, but if they are at the same level as normal humans then they are able to think independently. So if they were at that level, they could do no more or less than humans could. We have created wars n' stuff, so them starting a war wouldn't be so un-human.

    Meh, it's one of those things that no one will ever agree on, which is quite understandable really; it's quite a scary concept to some. To me it sounds amazing. :P
    Last edited by Hezron; 2012-05-05 at 11:38 AM.

  4. #124
    Deleted
    Before we have AIs walking around that have a conscience etc., humans will probably be mechanically engineered themselves to be a "better" human, Human V2. The point is that this will make the transition to a fully self-aware AI simpler, seeing as by then we'll be half AI ourselves.

    In the Mass Effect series, I loved the whole Quarians vs Geth thing. Legion ftw.

  5. #125
    Deleted
    How about we don't make AIs that get to that point? Machines aren't meant to be another kind of sentient being; they are designed for specific purposes that would otherwise be impractical by human means. Giving them sentient qualities such as emotions, free thought outside their programmed boundaries and the like is asking for a typical sci-fi near-human-extinction scenario.

  6. #126
    I am Murloc! Kuja's Avatar
    15+ Year Old Account
    Join Date
    Nov 2007
    Location
    City of Judgement
    Posts
    5,493
    I don't think robots can ever be self-aware. At most you could program feelings into them, but not the ability to think on their own and make decisions based on what they think is right rather than what they are programmed to do.

    But if they could, then I think they would be treated more like animals. Used as slave labor. Some animals are even better than humans, even though they do not speak.
    Last edited by Kuja; 2012-05-05 at 12:01 PM.

    My gold making blog
    Your journey towards the gold cap!


  7. #127
    It'll never happen, and even if it does, they still wouldn't deserve human rights. They may be able to think and act like a real human, but they will probably never feel pain, sadness, and all the other emotions that an actual human has. Lastly, if WE ever made some kind of AI like that, our world would probably end up exactly like the movie "Terminator". And if they were given human rights but set not to break certain rules/laws that apply only to AIs/robots, then we'd end up with something like the movie "I, Robot", where that approach failed.

  8. #128
    Deleted
    The day machines acquire true sentience, I wonder what the equivalent of vegans will be for machines.

  9. #129
    Deleted
    Humans aren't entitled to any rights, so why should they be?

  10. #130
    Quote Originally Posted by Vulmio View Post
    The day machines acquire true sentience, I wonder what the equivalent of vegans will be for machines.
    The question has almost nothing to do with intelligence or sentience here. It's something people utterly ignore.

    The real question here is moral and ethical.

    What are sentience and self-awareness? The ability to recognize yourself in a mirror? The ability to exhibit emotions?

    A couple of examples.

    Dolphins and certain primates have the ability to recognize themselves in a mirror. And it is undeniable that many animal life forms have the ability to display emotion, and furthermore to show a great deal of empathy; again, primates, dolphins and even your household dog.

    Yet there is a social consensus that these beings are ANIMALS, and we do not apply the charter of human rights to them. Because we are empathic beings able to relate to other life forms, we try to apply HUMANE treatment to them, but still we draw a clear line in the sand between Human Rights and Animal Rights.

    Another example: a human being in a vegetative state due to brain damage or even prenatal cognitive impairment, or human beings suffering from mental disorders that make them unable to feel empathy or emotions, or that make it impossible for them to relate to other human beings.

    Yet by social consensus we still grant these individuals Human Rights. They might not qualify by the standards we apply to animals, but they are still considered human beings with Human Rights.

    It all has to do with our perception of ourselves. In other words, we have a mental inability to recognize strange forms of life or intelligence as sentient and self-aware, and thus we do not categorize them as HUMAN.

    This is where our obsession with humanoid robots comes from. Humanoid might not be the most efficient shape for an intelligent robot, yet we always try to imagine them with some sort of human features. If it looks like a human, talks like a human, quacks like a human, well, it's a human.

    I am sorry to say but we are really that simple and biased.

    If we create artificial intelligence to mimic our own and to pass as human, not just in software but in hardware as well, we would be more likely to grant it human rights because of our perception bias.

  11. #131
    Deleted
    Quote Originally Posted by Uennie View Post
    Hopefully we'll be smart enough to never develop machines to that point, because I'm thinking "no".
    Year 2047:
    Apple announces the iRobot


    ;]

  12. #132
    Bloodsail Admiral Orodoth's Avatar
    10+ Year Old Account
    Join Date
    Dec 2010
    Location
    Ridgeland, South Carolina
    Posts
    1,184
    That's a big negative, Ghost Rider, the pattern is full.

    HUMAN rights has the word HUMAN in it for a reason, much like animal rights is for animals.

    If AI were to achieve the level you're talking about, OP, some sort of AI rights could be implemented to ensure they are not treated poorly, but they shouldn't be treated as human (because they simply aren't, cannot be, and will not be human, ever).

    Quite frankly though, the thought of self-aware machines achieving sentience scares the flying shit out of me. People didn't dream up movies like The Terminator and The Matrix for no good reason. The question will always burn heavily in the minds of the skeptical: "What if one day, through some fluke, they decide that they are superior and we are not needed, to the point that we become pests that must be dealt with?"

  13. #133
    Quote Originally Posted by Orodoth View Post
    That's a big negative, Ghost Rider, the pattern is full.

    HUMAN rights has the word HUMAN in it for a reason, much like animal rights is for animals.

    If AI were to achieve the level you're talking about, OP, some sort of AI rights could be implemented to ensure they are not treated poorly, but they shouldn't be treated as human (because they simply aren't, cannot be, and will not be human, ever).

    Quite frankly though, the thought of self-aware machines achieving sentience scares the flying shit out of me. People didn't dream up movies like The Terminator and The Matrix for no good reason. The question will always burn heavily in the minds of the skeptical: "What if one day, through some fluke, they decide that they are superior and we are not needed, to the point that we become pests that must be dealt with?"
    Again. Human perception bias.

    Why would machines decide they are superior? Would they care? Superiority is based on the human inferiority complex. Would machines suffer from the same mental imbalances as us? Why would they need to decide to exterminate us instead of collaborating with us? What would be a machine's definition of freedom? Would it care that it can't vote or run its own business, when it would obviously be part of any social and economic development anyway, etc. etc.?

    If we create artificial intelligence to mimic our own, both in hardware and software, in reality we would only be making manufactured humans. But if artificial intelligence did not need to mimic us, we would have no clue how it would behave. Thus it scares the shit out of us, tickles our freak alert, and we react with "Kill it with FIRE!!!"

    The question again has little to do with what intelligence and sentience are; it is a much deeper moral/ethical/perception issue.

  14. #134
    AIs are, and forever will be, made to serve and/or help humans, but never to be human. Thus they do not deserve human rights (no matter what the creepy part of Japanese society says).

    I think a fully sentient robot should deserve some rights, more than, say, a cow, but not human rights.

  15. #135
    Sure, but make sure you don't program anger into that AI! And make sure there is a hell of a lot of empathy in it! That way they wouldn't understand anger and wouldn't want it either, but they would want to become paramedics, firefighters etc. because of their high empathy.

    Also, you could punch one in the face without it getting mad, and another would just come and patch it up!

    And everyone lived happily ever after.

  16. #136
    Quote Originally Posted by misspellar View Post
    AIs are, and forever will be, made to serve and/or help humans, but never to be human. Thus they do not deserve human rights (no matter what the creepy part of Japanese society says).

    I think a fully sentient robot should deserve some rights, more than, say, a cow, but not human rights.
    And I'll throw you to the lions when the revolution comes.
    Isnt 10% of infinite still infinite?

  17. #137
    Merely a Setback Adam Jensen's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    Sarif Industries, Detroit
    Posts
    29,063
    Quote Originally Posted by Luftmangle View Post
    What if they committed a crime, would we send them to jail for rehabilitation?

    What if they become depressed, do we give them robot prozac?

    I mean people who think that objects will become truly self-aware are pretty stupid.
    Why can't an object become self-aware? We are a compilation of inanimate chemicals that became self-aware, so why is it such a strange idea that this could be reproduced artificially?
    Putin khuliyo

  18. #138
    Quote Originally Posted by ambigiouslynamed View Post
    well i consider you a tool, does that mean i get to sell you to the highest bidder?
    I couldn't tell if your comment was hostile or not, but why do you consider me a tool?

    However, to answer your question: no, you may not sell me, because you do not own me, nor does anyone else.

  19. #139
    Merely a Setback Adam Jensen's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    Sarif Industries, Detroit
    Posts
    29,063
    Quote Originally Posted by mittacc View Post
    Also you could punch one in the face without it getting mad, another would jsut come and patch it up!
    If it's made out of metal, punching it might be a bad idea
    Putin khuliyo

  20. #140
    Quote Originally Posted by alms1407 View Post
    I couldn't tell if your comment was hostile or not, but why do you consider me a tool?

    However to answer your question, no you may not sell me because you do not own me nor does anyone else.
    But without rights I could sell an AI, and how are you any different from a robot? To me you're just a bunch of electricity floating around in a squishy mass.
    Isnt 10% of infinite still infinite?
