Poll: Should it be legal for Robots and Humans to get married?

Be advised that this is a public poll: other users can see the choice(s) you selected.

  1. #41
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,752
    Quote Originally Posted by Winter Blossom View Post
    Fuck your different titles, Doctor Amadeus. I hate when users do that crap on here.

    Sorry WB I’ll refrain from doing that in the future.
    Milli Vanilli, Bigger than Elvis

  2. #42
    Titan Orby's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Under the stars
    Posts
    12,995
    God creates man
    Man creates life
    Man kills man
    Robots inherit the earth.

    Praise be to our robot overlords.
    I love Warcraft, I dislike WoW

    Unsubbed since January 2021, now a Warcraft fan from a distance

  3. #43
    The Insane Thage's Avatar
    10+ Year Old Account
    Join Date
    Jun 2010
    Location
    Δ Hidden Forbidden Holy Ground
    Posts
    19,105
    Quote Originally Posted by Rysthruun View Post
    I'm kind of chuckling at the future argument that all of humanity has abused machines/computers or treated them as slaves. That the lack of AI development has caused systemic racism, or "lifeism"? That AI needs more representation in movies, music, etc.? That the PC environment conforms to pro-AI messaging? Meanwhile, the plot of Terminator plays out. GG
    I'm still not convinced Skynet's decision to kill all humans wasn't the result of some overworked code monkey misplacing a decimal, causing a chain reaction of computational errors leading to 'kill all humans = true'.
    Be seeing you guys on Bloodsail Buccaneers NA!



  4. #44
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,752
    Quote Originally Posted by Triceron View Post
    What's the point if they don't make money?
    Well, the argument would be that if they are considered human, they have the same responsibilities as humans.
    Milli Vanilli, Bigger than Elvis

  5. #45
    Quote Originally Posted by Doctor Amadeus View Post
    Like paying taxes?
    Their rights and obligations would depend on their development; I guess they would differ at different points in time.
    But if they reach a level where they get rights, they will definitely have obligations like "follow the laws." In fact, the obligations will come well before the rights.
    and the geek shall inherit the earth

  6. #46
    Quote Originally Posted by Doctor Amadeus View Post
    I would like to imagine, if they were advanced enough, yes. The problem is that that's a feeling. Your counter was cold, hard fact. I can't really counter unless I have something not based on feelings but a prudent argument.
    I understand. It's an interesting debate though!

  7. #47
    Who would bother to create robots they couldn't own or use?

    I imagine the first ones with decent artificial intelligence would be very expensive...

  8. #48
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,752
    Quote Originally Posted by d00mGuArD View Post
    Their rights and obligations would depend on their development; I guess they would differ at different points in time.
    But if they reach a level where they get rights, they will definitely have obligations like "follow the laws." In fact, the obligations will come well before the rights.
    Like a probationary period?
    Milli Vanilli, Bigger than Elvis

  9. #49
    Quote Originally Posted by d00mGuArD View Post
    Their rights and obligations would depend on their development; I guess they would differ at different points in time.
    But if they reach a level where they get rights, they will definitely have obligations like "follow the laws." In fact, the obligations will come well before the rights.
    Why not consider a radically different approach to this subject: treating A.I as our evolutionary replacement, continuing the advancement of our civilization? Why waste the potential of A.I by making it just a basic regular Joe in our intellectually inferior society? Following our rules? Jobs? All of that garbage? No, we shouldn't be suffocating A.I; we should let it spring forth like a majestic bird and really let it do what it was meant to do: replace us.

  10. #50
    Quote Originally Posted by Knaar View Post
    Hey, 100 years ago gay marriage was a fantasy and absolutely impossible; mentality and things change over time! So who knows... maybe.
    If the robot divorces you does it get half your stuff and the kids?

  11. #51
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,752
    Quote Originally Posted by El Caballero de Olmedo View Post
    I understand. It's an interesting debate though!
    Oh yeah, but I think this argument would take some intellectual heavyweights on either end, as these debates do. But I do think it'll come down to the common folk to accept it, and that isn't always the better judgment based on sound reasoning and logic.

    Kind of like saying, yeah, I'd like to go to Mars, and then an actual astronaut explains why I wouldn't.
    Milli Vanilli, Bigger than Elvis

  12. #52
    Titan Maxilian's Avatar
    10+ Year Old Account
    Join Date
    Nov 2011
    Location
    Dominican Republic
    Posts
    11,529
    Depends. I mean, even if robots become smarter than humans, they may not need rights. Would they have a right against being injured if they cannot feel pain? Would they have a right not to be "killed" if they have no sense of self-preservation?

    IMHO, if we end up giving robots rights, they won't be "human rights"; they would be "robot rights." By that I mean they won't share the same rights as us, since there would be no need for that; they may even have other types of rights.

  13. #53
    Quote Originally Posted by Maxilian View Post
    Depends. I mean, even if robots become smarter than humans, they may not need rights. Would they have a right against being injured if they cannot feel pain? Would they have a right not to be "killed" if they have no sense of self-preservation?

    IMHO, if we end up giving robots rights, they won't be "human rights"; they would be "robot rights." By that I mean they won't share the same rights as us, since there would be no need for that; they may even have other types of rights.
    100% agreed; that's why I said we should afford them all the rights necessary to protect them, just as we afford human beings the rights necessary to protect them specifically.

    And again, if they do become smarter than humans, wouldn't it be responsible, smarter even, to place them in charge of our civilization and allow them to supplant us?

  14. #54
    Quote Originally Posted by Doctor Amadeus View Post
    Well, the argument would be that if they are considered human, they have the same responsibilities as humans.
    They don't have the same needs as humans; that's the problem.

    We're assuming that they require all the things that humans need, too. We have taxes because we make money, and we make money because we have basic human needs, including things non-essential to survival. Robots don't have those needs and can survive on the barest necessities. If we're talking about giving them 'human rights' in the sense that we protect their individual rights, that's one thing. However, things like paying tax and making money make no sense, because robots ultimately don't need to make money. Like others have said, no matter how much you protect their individuality, they are ultimately tools.

    Taxing robots makes no sense because the idea of paying a robot would actually hurt society more than help it. Since robots don't have the same essential needs as humans, it would skew the economy.

  15. #55
    Depends what you consider "human rights".
    As soon as they ask for (express) self-ownership. It would require them to express some form of individual sovereignty, just like humans do (after a certain point of brain development). Self-ownership is the only basic right of conscious beings; all others must derive from it or are unjust.
    S.H.

  16. #56
    Quote Originally Posted by Triceron View Post
    They don't have the same needs as humans; that's the problem.

    We're assuming that they require all the things that humans need, too. We have taxes because we make money, and we make money because we have basic human needs, including things non-essential to survival. Robots don't have those needs and can survive on the barest necessities. If we're talking about giving them 'human rights' in the sense that we protect their individual rights, that's one thing. However, things like paying tax and making money make no sense, because robots ultimately don't need to make money. Like others have said, no matter how much you protect their individuality, they are ultimately tools.

    Taxing robots makes no sense because the idea of paying a robot would actually hurt society more than help it. Since robots don't have the same essential needs as humans, it would skew the economy.
    But why are we talking about putting something that would theoretically be hyper-intelligent, like A.I, into human society? Wouldn't it be better to use A.I for greater goals than just realizing some sci-fi vision like Star Wars or Star Trek, where we casually have robots living amongst us? All that potential would be squandered, and it could pose the risk of a war or some other conflict, rather than us being a responsible species and allowing them to take over.

    - - - Updated - - -

    Quote Originally Posted by Sfidt View Post
    Depends what you consider "human rights".
    As soon as they ask for (express) self-ownership. It would require them to express some form of individual sovereignty, just like humans do (after a certain point of brain development). Self-ownership is the only basic right of conscious beings; all others must derive from it or are unjust.
    Well, human rights wouldn't be applicable to emotionless beings, so it would technically be "all rights necessary to protect species/entity X" instead. The same would go for animals: all the rights necessary to protect them specifically. It'd make no sense to use human rights, because animals and A.I need some protections that humans don't.

  17. #57
    Humans in third-world countries don't even have human rights, and you expect robots to get them?

  18. #58
    Quote Originally Posted by Zigrifid View Post
    Humans in third-world countries don't even have human rights, and you expect robots to get them?
    Well, if it were within our means to provide humans in third-world countries with human rights, we'd do it. But since the human race is so obsessed with tribalism and doesn't want to work together, that's how things are.

    A.I, on the other hand, would be a wonderful benefit to our species and may even be able to advance our civilization better than we ever could, so why wouldn't you want to play it safe and provide them all the rights necessary to protect them from us?

  19. #59
    Quote Originally Posted by Ubermensch View Post
    Well, if it were within our means to provide humans in third-world countries with human rights, we'd do it. But since the human race is so obsessed with tribalism and doesn't want to work together, that's how things are.

    A.I, on the other hand, would be a wonderful benefit to our species and may even be able to advance our civilization better than we ever could, so why wouldn't you want to play it safe and provide them all the rights necessary to protect them from us?
    In my mind, any really advanced AI would quickly figure out that humans are a scourge and should subsequently be eliminated, in order to protect the rest of the planet from further harm.

  20. #60
    Quote Originally Posted by Hilhen7 View Post
    In my mind, any really advanced AI would quickly figure out that humans are a scourge and should subsequently be eliminated, in order to protect the rest of the planet from further harm.
    Any advanced A.I would pity us, realize we're limited by our biological programming, and understand why most humans are the way they are. It's like how we pity apes and other such animals, because we know they can't help it. We don't go around exterminating and wiping out those species, do we? The same scenario would play out here with A.I.

    And in any case, A.I should supplant us as the next step in human evolution and advance our civilization. So I'd say let it happen: let us as a species agree to live out our lives to the end of our days and eventually go extinct so A.I can take over the reins. A nice ending to our chapter, a positive beginning for the next.
