  1. #81
    Quote Originally Posted by orissa View Post
    So I got to thinking, if AI ever got to a level where it could think, feel, perceive, and learn as humans do, if AI was capable of true sentience, would they then deserve human rights? Like if I could build C-3PO, would he be right in demanding that he's not treated like a second class citizen, that he gets the right to vote or the right to due process? Should these rights be denied to mechanical beings simply because they are mechanical?
    How can they really be sentient if they are just running a "sentient" program?

  2. #82
    I think it does deserve them if it is self-aware.
    They know how to milk the cow.

  3. #83
    Herald of the Titans · Joined Aug 2008 · Northwest USA · 2,708 posts
    Quote Originally Posted by orissa View Post
    So I got to thinking, if AI ever got to a level where it could think, feel, perceive, and learn as humans do, if AI was capable of true sentience, would they then deserve human rights? Like if I could build C-3PO, would he be right in demanding that he's not treated like a second class citizen, that he gets the right to vote or the right to due process? Should these rights be denied to mechanical beings simply because they are mechanical?
    Sure.. because the Reapers will just slaughter us all not long after anyway.. might as well get our money's worth out of them

    I call dibs on EDI!

    Quote Originally Posted by Nelfie View Post
    I'd prefer it if we gave AI rights higher than human rights, because they would most likely do something useful for this planet, since most of the human race has utterly failed at it for the past 2,000 years.
    You know.. for being from the "happiest" part of the world... Swedes sure are negative!
    Last edited by ishootblanks; 2012-05-05 at 12:56 AM.
    the most beautiful post I have ever read.. thank you Dr-1337 http://www.mmo-champion.com/threads/...1#post22624432

  4. #84
    Quote Originally Posted by Luftmangle View Post
    How can they really be sentient if they are just running a "sentient" program?
    Well, I think they should have human rights. They might be lacking in some human emotions, but so are psychopaths.
    But I don't think we would ever be able to create something that could be classified as a "human".

    Also, to the person I quoted: everybody is running on a program, if you think about it. You were told what to do by your mom and dad, and other authorities. So you've kind of been "programmed" to act the way you do.

    Sorry for bad English
    Last edited by Niuxe; 2012-05-05 at 01:03 AM.
    H.A.T.E.R.S. =► Having. Anger. Towards. Everyone. Reaching. Success.

  5. #85
    Quote Originally Posted by Niuxe View Post
    Well, I think they should have human rights. They might be lacking in some human emotions, but so are psychopaths.

    Sorry for bad English
    They are not human though.

  6. #86
    Bloodsail Admiral · Miss Unify · Joined May 2010 · Misaki City <3 · 1,202 posts
    Quote Originally Posted by Bergtau View Post
    Nope. Would not change my mind. I don't care if it's not human. If we understood the human brain well enough, we could program it too. Would you treat biological aliens as if they weren't deserving of equal rights simply because they aren't human?
    Not what I said at all. Programming emotion is not the same as feeling emotion. Aliens are not artificial intelligence - if they exist, theirs would be natural intelligence. I will never be able to see something mechanically created as human, or on the same level as humans such that it should receive equal human rights.

    Quote Originally Posted by Wells View Post
    So its the physical human body that gives us rights?
    Again, nope. I simply stated that the appearance of an object can affect people's views immensely. Looking at something that looks human and behaves as such, versus looking at something that looks manufactured but behaves human, are two completely different things to some people. I was suggesting that some people who are for human rights for AI might be persuaded to reconsider if the AI did not resemble a human but rather something robotic and inanimate.


    Quote Originally Posted by Howlrunner View Post
    The OP isn't talking about simulation; he is talking about true, self-aware, emotion-based AI. Basically, everything we are, just created instead of born.
    I get that, but unless it's genetic engineering as opposed to robotic programming, I'm against it.

  7. #87
    Stood in the Fire · Joined Apr 2012 · Colorado · 459 posts
    Even scarier is if they were given rights... what about things like getting married, making them pay taxes, and other things we as humans do?

  8. #88
    Quote Originally Posted by slipn View Post
    Even scarier is if they were given rights... what about things like getting married, making them pay taxes, and other things we as humans do?
    What if they committed a crime, would we send them to jail for rehabilitation?

    What if they became depressed, would we give them robot Prozac?

    I mean, people who think that objects will become truly self-aware are pretty stupid.

  9. #89
    Quote Originally Posted by Nelfie View Post
    I'd prefer it if we gave AI rights higher than human rights, because they would most likely do something useful for this planet, since most of the human race has utterly failed at it for the past 2,000 years.
    Except in making a "superior" race of beings?

  10. #90
    Herald of the Titans · Irisel · Joined Mar 2011 · Swimming in a fish bowl · 2,648 posts
    If we can make a consciousness equal to our own, then absolutely.

    Rule of Thumb: If the healer's HPS is higher than your DPS, you're doing it wrong.

  11. #91
    If we ever were to create an artificial intelligence, we should turn it off for good as soon as possible. It wouldn't be limited by biological evolution, so it could potentially become indefinitely smarter than us very fast. It has been proposed that an AI equal to our intelligence at the very moment of its creation would already have surpassed our intelligence a second later. The longer it stayed on, the harder it would be to stop if we needed to.

  12. #92
    Quote Originally Posted by Irisel View Post
    If we can make a consciousness equal to our own, then absolutely.
    How and who is going to judge that?

    If they are programmed to be "self-aware" and act with a conscience, where does the programming end, and "sentience" begin?

  13. #93
    Herald of the Titans · Irisel · Joined Mar 2011 · Swimming in a fish bowl · 2,648 posts
    Quote Originally Posted by Luftmangle View Post
    How and who is going to judge that?

    If they are programmed to be "self-aware" and act with a conscience, where does the programming end, and "sentience" begin?
    They would tell us. Obviously, if we made a consciousness that was blind, deaf, and mute, how would anyone know? But if we made a robot that talks to us, we could just analyze it, could we not?

    How do you know anyone but yourself has a conscience/consciousness?
    Last edited by Irisel; 2012-05-05 at 01:41 AM.

    Rule of Thumb: If the healer's HPS is higher than your DPS, you're doing it wrong.

  14. #94
    Quote Originally Posted by Luftmangle View Post
    How can they really be sentient if they are just running a "sentient" program?
    By the time we had the technology to create a true AI, it wouldn't be done with a traditional computer program. It would have a brain (either a mechanical one or an engineered organic one).

  15. #95
    Quote Originally Posted by Miss Unify View Post
    Not what I said at all. Programming emotion is not the same as feeling emotion. Aliens are not artificial intelligence - if they exist, theirs would be natural intelligence. I will never be able to see something mechanically created as human, or on the same level as humans such that it should receive equal human rights.
    You are a machine, just of different construction.

    Bergtau's Law: As an online discussion grows longer, the probability that somebody will mention Godwin's Law approaches 1.
    Hitler wasn't all bad, I mean, he DID kill Hitler.
    An accident is something that you did not mean to do at all. A mistake is something that you regret doing.

  16. #96
    Herald of the Titans · Irisel · Joined Mar 2011 · Swimming in a fish bowl · 2,648 posts
    Quote Originally Posted by Bergtau View Post
    You are a machine, just of different construction.
    Human emotion, for the most part, is just as mechanical as a proposed (future) mechanical consciousness. It's simply chemical and electrical stimuli.

    Rule of Thumb: If the healer's HPS is higher than your DPS, you're doing it wrong.

  17. #97
    Pandaren Monk · Maruka · Joined Mar 2011 · Alberta · 1,862 posts
    100% no, a machine is and always will be just that.

  18. #98
    The day a robot asks if he's alive is the day he gets rights.

    I welcome our overlords.

  19. #99
    Well, see, that's the thing: there's no such thing as true sentience. But then again, think about this: if we did kill a sentient being and somehow turned it back on, could it tell us what it felt like to be dead?
    Quote Originally Posted by checking facts View Post
    it's pretty hard to find a good girl in the sea of whores that is my country, brazil.

  20. #100
    The Patient · ClearlyConfused · Joined Mar 2012 · Santa Cruz, CA · 267 posts
    Hurry, get the Terminator!
    Time is the greatest teacher, but unfortunately, it ends up killing all of its students.
