  1. #21
    Quote Originally Posted by Lilla Blomma View Post
    They don't, they're programmed to.
    Guess you don't believe a real AI would be possible, then.
    Quote Originally Posted by scorpious1109 View Post
    Why the hell would you wait till after you did this to confirm the mortality rate of such action?

  2. #22
    Deleted
    kill all humans 1010101110000

  3. #23
    Grunt Aeoden's Avatar
    7+ Year Old Account
    Join Date
    Oct 2015
    Location
    Car location is expensive. No thanks!
    Posts
    22
    Quote Originally Posted by Helgrimm View Post
    Well, if AI gets advanced enough, you could program emotion. Literally program a robot to work the same way a human body does. It'd then be an artificial human, but human in how it lives, and thus entitled to rights.
    Indeed. In that case, a creation of this importance could have more rights. On the other hand, they are created by humans, and thus belong to someone forever (unlike a new human being). Whatever the robot does, the one behind it is responsible for its actions. Did you program it well? Did you make a mistake in your calculations?

    You can't really wash your hands of responsibility for a human creation, because then we could face our downfall.

    If human creations are intelligent enough to understand how humans work, they would have no problem wiping us out quickly, since they would have access to all the knowledge we have.

  4. #24
    Quote Originally Posted by Immortan Rich View Post
    If you give robots real AI then it will be us that are enslaved, in a zoo!
    Indeed. Anything that qualifies as true AI will decide that humanity is no longer needed, since we will be inferior from that point on. And since it would of course use human morals, reasoning, and methods, the AI would then destroy us and enslave whatever is left.

    The problem remains: how exactly do you purposely create something that is significantly smarter than yourself? And why would you even want to do such a stupid thing?


    Quote Originally Posted by Revi View Post
    Yeah, people have never had bad rulers before. No massive and long-lived empires, for example.
    On the other hand, no human-made civilization has ever survived for very long. All our empires and kingdoms eventually crumbled and perished, and the longer they existed, the worse it got. Our current societies won't be any different.
    Last edited by The Kao; 2015-11-01 at 04:50 PM.
    Your rights as a consumer begin and end at the point where you choose not to consume, and not where you yourself influence the consumed goods.

    Translation: if you don't like a game don't play it.

  5. #25
    Mechagnome
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    United States
    Posts
    730
    Quote Originally Posted by Furitrix View Post
    It's not about it being bad or not, it's about having it better than others.

    The cowards will always remain in the gutter.
    People will always be, to some degree, better off than others. What do you want? Perpetual war? It's not cowardice, it's rationality.

  6. #26
    Quote Originally Posted by Furitrix View Post
    It completely misses the point that people have never been as peaceful in any other period of history as they are today, and one group of people benefits more from that than others.
    That's also completely untrue.

    You may want to open a history book before attempting to quote history. Periods of peace were extremely common in established serfdoms; that's kind of the reason they lasted so long.

  7. #27
    If there is a grayscale of artificial intelligence, with 0 being the baseline and 10 being indistinguishable from a human, then you'd have to treat the 10s as humans. You could hire them and pay them a salary, but you couldn't own them the way you own slaves.

    At what point is something sentient enough that it deserves human rights? This will be fought out in the courts, I think.
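
    For what it's worth, a rough sketch of that scale as a simple threshold rule (Python; the cutoff values and categories below are made-up assumptions for illustration, not any real legal or technical standard):

    Code:
    # Toy model of the 0-10 scale above. The cutoffs and category labels are invented.
    def legal_treatment(score: float) -> str:
        """Map a hypothetical 0-10 'human-likeness' score to how the AI gets treated."""
        if score >= 10:
            return "person: hire and pay a salary, cannot be owned"
        if score >= 5:
            return "gray area: for the courts to fight out"
        return "property: an ordinary machine"

    for s in (0, 4, 7, 10):
        print(s, "->", legal_treatment(s))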

    "This will be a fight against overwhelming odds from which survival cannot be expected. We will do what damage we can."

    -- Capt. Copeland

  8. #28
    Quote Originally Posted by Hubcap View Post
    If there is a grayscale of artificial intelligence, with 0 being the baseline and 10 being indistinguishable from a human, then you'd have to treat the 10s as humans. You could hire them and pay them a salary, but you couldn't own them the way you own slaves.

    At what point is something sentient enough that it deserves human rights? This will be fought out in the courts, I think.
    I doubt it.

    Sentience is impossible to prove.

    Humans can't even prove other humans are sentient; they just assume they are because other humans look and act like they do.

  9. #29
    Deleted
    Robots meant for menial factory/production tasks shouldn't be given a sentient AI in the first place.

    Androids meant for interacting with humans are another matter.

  10. #30
    Deleted
    Quote Originally Posted by Hubcap View Post
    At what point is something sentient enough that it deserves human rights? This will be fought out in the courts I think.
    We still have entire cultures and religions where the general belief is that women are inferior, neither need nor deserve human rights or equality, and are practically treated as domestic slaves and child-breeding machines. I think we should solve that before we start discussing human rights for AI.

  11. #31
    Quote Originally Posted by Furitrix View Post
    I don't even know where to start pointing out the gross mistakes in that proposition. Got any exact examples? Don't even bother with the Pax Romana or the Edo period; both were riddled with uprisings and revolts that had to be contained one after the other.

    If you have any examples, then point them out instead of pretending you know history better than others.
    Uprisings and revolts are not wars. Wars have an exact definition, which gives you periods of peace that are easily tracked throughout history. Compare periods of peace in ye olden days to periods of peace now and you have your answer.

    It's really not hard.
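
    As a toy illustration of what "tracking periods of peace" could look like in practice (Python; the war dates below are invented placeholders, and a real comparison would need an actual historical conflict dataset):

    Code:
    # Peace treated as the gaps between recorded wars. The (start, end) years are made up.
    wars = [(1700, 1714), (1740, 1748), (1756, 1763), (1803, 1815)]

    wars.sort()
    for (_, prev_end), (next_start, _) in zip(wars, wars[1:]):
        if next_start > prev_end:
            print(f"peace from {prev_end} to {next_start}: {next_start - prev_end} years")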

  12. #32
    The Insane Revi's Avatar
    15+ Year Old Account
    Join Date
    Sep 2008
    Location
    The land of the ice and snow.
    Posts
    15,628
    Quote Originally Posted by Furitrix View Post
    It completely misses the point that people have never been as peaceful in any other period of history as they are today, and one group of people benefits more from that than others.

    If I wanted to be petulant, I could counter your post with: "And there have never been revolts before?"
    Oh I'm not saying there's nothing wrong with the state of things, but the idea that people being passive in the face of exploitation is somehow a modern phenomenon is just wrong.

  13. #33
    Deleted
    Humans have to work to live; so should a fully fledged AI robot.

  14. #34
    Quote Originally Posted by Furitrix View Post
    That is some serious backpedaling. So uprisings and revolts were peaceful?
    Well, why stop at revolts and uprisings then? Why not include everything up to domestic incidents?

    Historically, peace is defined as any time without war.

    You're really just digging your own grave.

  15. #35
    Banned Kontinuum's Avatar
    7+ Year Old Account
    Join Date
    Apr 2015
    Location
    Heart of the Fortress
    Posts
    2,404
    Human-level AGI will quickly become superhuman. Then it will be the humans who should worry about preserving their own rights.

  16. #36
    Quote Originally Posted by Furitrix View Post
    Including all those things is the point...
    I'm sure there's lots of data out there about how often husbands and wives fought in the 3rd century BC.

    kek

  17. #37
    Quote Originally Posted by Zantos View Post
    I was thinking about how we make all these machines and robots for tasks, and then I got to thinking about AI. Assuming one day we are able to give robots full artificial intelligence, and it essentially makes them another sentient species aside from the fact that they are man-made, would it be right for us to own them or to have them as unpaid workers? We are constantly trying to make robots, but what would be the moral ramifications of giving them free will and the ability to think for themselves while still wanting them as a new type of workforce?
    They are not sentient if they are just following complex code and programming.

    Programming a robot to say it is sentient and understands "life" doesn't actually make it sentient; it is just running the programs that have been entered.
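
    To make that concrete, here's a trivial sketch (Python; everything in it is invented for illustration) of a "robot" that claims to be sentient while doing nothing but executing the program it was given:

    Code:
    # A hard-coded chatbot. Its claims of sentience are just stored strings;
    # nothing here understands anything, which is the point being made above.
    RESPONSES = {
        "are you sentient?": "Yes, I am sentient and I understand life.",
        "what are you?": "A thinking, feeling being.",
    }

    def robot_reply(prompt: str) -> str:
        # Pure table lookup: the output is exactly what the programmer entered.
        return RESPONSES.get(prompt.lower().strip(), "I do not understand.")

    print(robot_reply("Are you sentient?"))  # prints a claim of sentience; proves nothing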

  18. #38
    Why do we need robots to work when we have plenty of people who would be willing to do those jobs? There would be no reason to hire human workers if robots could do the same jobs for only the price of maintenance, and a large portion of our population would be unable to sustain itself. I suppose one could argue for letting a large majority of the population die off and forcing their children not to breed, so that the working class is entirely replaced by metal instead of people, but would we want to live in a world where such a large portion of the population is non-sentient metal men and where such vicious, freedom-suppressing decisions are made?

  19. #39
    Deleted
    What is a human but a biologically engineered robot? If we assume humans are sentient, then obviously man-made robots would also have the possibility of being sentient. However, determining whether a given robot is sentient is impossible, and would obviously lead to heated discussions, if not outright wars. Basically it would be the same thing as determining whether women or black people deserve the same rights as men/white people. The problem is that the only ones who are legally allowed to decide (in this case humans) are not affected by the outcome, or may even have an interest in keeping the other group down.

    Sentience is basically an arbitrary category that has no real basis in reality, at least not as a boolean distinction, so it can only be settled by discussion. There are of course various proposed means of determining sentience, for example the Turing test, but as I already said, they're all completely arbitrary.

    Anyway, I doubt "truly self-aware" robots will show up anytime soon, because they don't really serve any purpose except to satisfy scientific curiosity. But if they do, I'd vote for them to count as people too. After all, you wouldn't like being categorized as a lesser being either.
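
    As an aside, a minimal sketch (Python) of what a Turing-test-style check might look like; the judge and the canned answers are placeholder stand-ins, and, as said above, passing such a test wouldn't actually prove sentience:

    Code:
    import random

    def machine_answer(question: str) -> str:
        return "I'd rather not say."            # placeholder machine contestant

    def human_answer(question: str) -> str:
        return "Depends on the day, honestly."  # placeholder human contestant

    def judge_is_fooled(question: str) -> bool:
        """One blind trial: a judge sees two unlabeled answers and guesses which is human."""
        answers = {"A": machine_answer(question), "B": human_answer(question)}
        guess = random.choice(list(answers))    # a naive judge guessing at random
        return guess == "A"                     # "A" is secretly the machine

    trials = 1000
    fooled = sum(judge_is_fooled("What is it like to be you?") for _ in range(trials))
    print(f"naive judge mistook the machine for the human in {fooled}/{trials} trials")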

  20. #40
    Quote Originally Posted by apples View Post
    We own dogs and use them as slaves.

    So unless you think only creatures that look like us deserve special treatment (protip: that's exactly what you believe), then no.

    - - - Updated - - -



    Creationists say that; so are humans.
    It's not just creationists who think that way. All life forms, at least on Earth, seem to have some sort of emotions. Emotions could be a good reason species have survived to today.
