  1. #161
    Quote Originally Posted by Ynna View Post
    If an AI doesn't reach super-intelligence (which, for this discussion, I'm assuming it doesn't) there's no particular reason to assume it prefers a selfish strategy to a cooperation one.
    That's kind of a big if. At the point we're capable of making machines as intelligent as humans, then we're probably capable of making them slightly more intelligent. While we're constrained by the slow process of evolution, intelligent design can work a lot faster.

  2. #162
    Herald of the Titans Ynna's Avatar
    Join Date
    May 2009
    Location
    Belgium
    Posts
    2,736
    Quote Originally Posted by ramsesakama View Post
    That's kind of a big if. At the point we're capable of making machines as intelligent as humans, then we're probably capable of making them slightly more intelligent. While we're constrained by the slow process of evolution, intelligent design can work a lot faster.
    I understand completely, but if we assume the AI becomes incredibly intelligent very fast, there's little point in discussing whether they deserve equal rights. We could ask the AI, and it would be right.
    Retired Holy Priest
    As a rule, I try to act on the internet as I would in real life. If I have offended you, feel free to point it out. Unless I meant to offend you, I will probably apologize.

  3. #163
    Quote Originally Posted by Ynna View Post
    I understand completely, but if we assume the AI becomes incredibly intelligent very fast, there's little point in discussing whether they deserve equal rights. We could ask the AI, and it would be right.
    I made a post on page 2 asking whether AI would think humans deserve human rights. It was sort of tongue-in-cheek, but in a way it's a very serious problem.

    If we can't justify to a super-intelligent AI why we deserve to live, then we will die. End of story.

  4. #164
    Herald of the Titans Ynna's Avatar
    Join Date
    May 2009
    Location
    Belgium
    Posts
    2,736
    Quote Originally Posted by ramsesakama View Post
    If we can't justify to a super-intelligent AI why we deserve to live, then we will die. End of story.
    Only if the AI is completely utilitarian, and then it would destroy most complex life on earth, because it isn't useful. And then it would self-destruct, because it isn't useful anymore. There are millions of things that aren't useful. And useful is a problematic concept to begin with.

    In all I've got a pretty optimistic view on the future, and I hope we can "raise" our robots sensibly, with respect for natural and artificial life.
    Retired Holy Priest
    As a rule, I try to act on the internet as I would in real life. If I have offended you, feel free to point it out. Unless I meant to offend you, I will probably apologize.

  5. #165
    Some of you guys think it's a clear-cut answer without hesitation, but that may be due to your inability to perceive what's coming in the not-too-distant future, or to getting bogged down in semantics.

    The human brain will be completely mapped and reconstructed in robotic form within the next three decades. Prior to that, human augmentation will have become more commonplace, including nanites injected into the bloodstream to combat disease. We will progress further towards the technological singularity where, much like The Matrix, our consciousness can be uploaded to a server or machine somewhere. Once this happens, there will be a blurring of where your humanity begins and ends.

    I would guess that before we have to fully address the AI rights issue, we will be confronted with societal discrimination between humans and augmented humans. A ruling on that would probably provide a better framework for the AI rights issue, because it would let people approach it as a human-hybrid question rather than one purely about sentient machines. If you think what I'm talking about is all nonsense, you should probably check out the documentary Transcendent Man, about Ray Kurzweil. It's streamable on Netflix and it has some interesting points on this subject matter.
    Last edited by Zeldaveritas; 2012-05-05 at 05:47 PM.

  6. #166
    The Lightbringer N-7's Avatar
    Join Date
    Apr 2012
    Location
    UK
    Posts
    3,410
    Although it is just a game, Mass Effect 3 touched on this topic excellently (the Geth and the Quarians). I believe that if an AI is sentient, then yes, it should have 'human' rights.

  7. #167
    Bloodsail Admiral Trigg's Avatar
    Join Date
    Apr 2010
    Location
    Lamp. Near the town of chair, in the country Coffee Table.
    Posts
    1,010
    In my opinion, anyone who says a sentient being that is mechanical rather than biological shouldn't deserve "human" rights is incredibly naive. If a being can think for itself, act by itself and understand things on the same level as a human, yet is of a different material make-up, it/he/she still deserves rights just as anything else does. Unfortunately, it's pretty much guaranteed that in many places, if not everywhere, they wouldn't get them. People would be incredibly afraid of a being like this, which in turn not only denies it rights but also alienates it further. A shame, since imo sentient AI is the future; it can't not be. Species evolve and adapt. Extremely advanced AI will be able to do the same, but better.

  8. #168
    Hypothetically? No, of course not! We still can't allow equal rights to all humans.

  9. #169
    Old God Grizzly Willy's Avatar
    Join Date
    Apr 2011
    Location
    Kenosha, Wisconsin
    Posts
    10,204
    Quote Originally Posted by Zeldaveritas View Post
    Some of you guys think it's a clear-cut answer without hesitation, but that may be due to your inability to perceive what's coming in the not-too-distant future, or to getting bogged down in semantics.

    The human brain will be completely mapped and reconstructed in robotic form within the next three decades. Prior to that, human augmentation will have become more commonplace, including nanites injected into the bloodstream to combat disease. We will progress further towards the technological singularity where, much like The Matrix, our consciousness can be uploaded to a server or machine somewhere. Once this happens, there will be a blurring of where your humanity begins and ends.

    I would guess that before we have to fully address the AI rights issue, we will be confronted with societal discrimination between humans and augmented humans. A ruling on that would probably provide a better framework for the AI rights issue, because it would let people approach it as a human-hybrid question rather than one purely about sentient machines. If you think what I'm talking about is all nonsense, you should probably check out the documentary Transcendent Man, about Ray Kurzweil. It's streamable on Netflix and it has some interesting points on this subject matter.
    My argument, and my answer, stems from my perception of human life. What makes us different from an advanced AI that can contemplate its own existence at the same level we do? If we refused to give them rights, then we certainly wouldn't give them to other sentient species, and at that point we'd find ourselves akin to the pro-humanity movement in the Mass Effect series, viewing other sentient beings as inferior for the sole reason that they are not human. It's no different from saying that a man of a different skin color is inferior merely because he's not like you.

    Also, if I had a Netflix subscription I would totally watch that, but alas, I do not.

  10. #170
    Legendary! Polarthief's Avatar
    Join Date
    Oct 2009
    Location
    (USA) Florida
    Posts
    6,280
    IMO no, because they shouldn't be programmed that far in the first place.

    Futurama should NOT be what our planet (or rather universe in their case) looks like in 988 years or so.

    Retired Veteran Raider: [T14] 10/16H, [T15] 12/13H, [T16] 7/14H
    FFXIV Stuff: i95 WHM/i92 SCH/i84 WAR/i83 BRD; T1-4, Garuda/Titan/Ifrit Xs

  11. #171
    Herald of the Titans Ynna's Avatar
    Join Date
    May 2009
    Location
    Belgium
    Posts
    2,736
    Quote Originally Posted by Stuffedtreats View Post
    Hypothetically? No, of course not! We still can't allow equal rights to all humans.
    I think most people can agree that all humans deserve equal rights. Or did you really mean that you don't think all humans deserve the same rights?
    Retired Holy Priest
    As a rule, I try to act on the internet as I would in real life. If I have offended you, feel free to point it out. Unless I meant to offend you, I will probably apologize.

  12. #172
    Mechagnome lzsg's Avatar
    Join Date
    Mar 2009
    Location
    Stockholm, Sweden
    Posts
    590
    Quote Originally Posted by N-7 View Post
    Although it is just a game, Mass Effect 3 touched on this topic excellently (the Geth and the Quarians). I believe that if an AI is sentient, then yes, it should have 'human' rights.
    I would say that the entire Mass Effect series is about organics vs. synthetics. The Quarian/Geth conflict illustrates well what happens when machines develop sapience but are still treated like slaves by their creators, though. IIRC, the entire conflict starts when Legion asks a Quarian "Does this unit have a soul?".
    Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo.

  13. #173
    Dreadlord fartman69's Avatar
    Join Date
    Jan 2009
    Location
    Tall Grass
    Posts
    828
    Quote Originally Posted by Led ++ View Post
    So what useful thing do we have to do for our planet? Sing together in the woods, naked while enjoying some fine shrooms?

    I blame Avatar for all this emo-shit on MMO as if we're some kind of plague that will destroy the universe. What pessimistic lives some people here must have.
    The problem is people don't realise what happens after Avatar: from space... pew pew pew.

    If the AI has the Three Laws, it's a tool, a machine; we do not treat it equally. If your toaster started talking but burned your toast, would you never throw it out for a more efficient toaster?

  14. #174
    Scarab Lord Puck's Avatar
    Join Date
    Aug 2010
    Location
    Williams Lake, BC, Canada
    Posts
    4,358
    Quote Originally Posted by Orestis View Post
    Maybe...

    Be kind of fucked up if AI was given human rights before some humans were given it though...
    In the year 2083, gays will still not have the right to marry or adopt, but robots will!

  15. #175
    Herald of the Titans Ynna's Avatar
    Join Date
    May 2009
    Location
    Belgium
    Posts
    2,736
    Quote Originally Posted by fartman69 View Post
    If the AI has the Three Laws, it's a tool, a machine; we do not treat it equally. If your toaster started talking but burned your toast, would you never throw it out for a more efficient toaster?
    Depends on what the toaster had to say. If it was an interesting toaster, I'd keep it.
    Retired Holy Priest
    As a rule, I try to act on the internet as I would in real life. If I have offended you, feel free to point it out. Unless I meant to offend you, I will probably apologize.

  16. #176
    No. They do not. If they are given rights, what's to stop them from realizing that we are bad? With the Internet's knowledge, they will realize that humans are a plague on Earth and need to be annihilated.

  17. #177
    Old God Grizzly Willy's Avatar
    Join Date
    Apr 2011
    Location
    Kenosha, Wisconsin
    Posts
    10,204
    Quote Originally Posted by Tobywongg View Post
    No. They do not. If they are given rights, what's to stop them from realizing that we are bad? With the Internet's knowledge, they will realize that humans are a plague on Earth and need to be annihilated.
    Because we're not?

    Quote Originally Posted by fartman69 View Post
    The problem is people don't realise what happens after Avatar: from space... pew pew pew.

    If the AI has the Three Laws, it's a tool, a machine; we do not treat it equally. If your toaster started talking but burned your toast, would you never throw it out for a more efficient toaster?
    I would like to point you to the laws that man abides by.

  18. #178
    Yes we are. If we are able to create AI that intelligent, they will realize that we are destroying the Earth and many other things.

  19. #179
    Old God Grizzly Willy's Avatar
    Join Date
    Apr 2011
    Location
    Kenosha, Wisconsin
    Posts
    10,204
    Quote Originally Posted by Tobywongg View Post
    Yes we are. If we are able to create AI that intelligent, they will realize that we are destroying the Earth and many other things.
    But we're not destroying the planet. We are physically incapable of doing so. If they were that intelligent, they would realize that if we were ruining the environment, we would only wipe ourselves out, not the planet.

  20. #180
    Well, if they could vote, they could mass-produce themselves and then win any vote they wanted to.
