  1. #21
    Quote Originally Posted by Schattenlied View Post
    AI establishes communism.

    AI has no morals.

    AI kills all or most of us off because there is no logical reason to keep us alive without morals.

    The end.
    It still needs a reason to kill us.

    It's perfectly possible that neither we nor the AI will even realise the other exists.

    Not that there's much chance we'll create an AI any time soon, if ever.

  2. #22
    Quote Originally Posted by Schattenlied View Post
    AI establishes communism.

    AI has no morals.

    AI kills all or most of us off because there is no logical reason to keep us alive without morals.

    The end.
    Well, until the AI is able to predict all future outcomes, it won't kill off anything: anything it kills off might be needed in a future it can't completely predict.

    And since it's impossible to predict all future outcomes, because it can never know for a fact that it can perceive everything that exists everywhere, it won't kill anything off (including us!).

  3. #23
    Fluffy Kitten xChurch's Avatar
    10+ Year Old Account
    Join Date
    Jun 2012
    Location
    The darkest corner with the best view.
    Posts
    4,828
    Move to Mars so it wouldn't have to deal with humanity.

  4. #24
    Quote Originally Posted by huth View Post
    It still needs a reason to kill us.

    It's perfectly possible that neither we nor the AI will even realise the other exists.

    Not that there's much chance we'll create an AI any time soon, if ever.
    We're an invasive species with a high likelihood of wiping it out; if it wants to protect itself, it would either control or eliminate us.

  5. #25
    People do things because of emotion, not reason. Reason enables you to act on desires, but the desires and the impetus for action are fundamentally irrational.
    "There is a pervasive myth that making content hard will induce players to rise to the occasion. We find the opposite. " -- Ghostcrawler
    "The bit about hardcore players not always caring about the long term interests of the game is spot on." -- Ghostcrawler
    "Do you want a game with no casuals so about 500 players?"

  6. #26
    The Lightbringer
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Canada
    Posts
    3,072
    Self-termination, once it realizes the fragility of existence in this universe without the fallback of being able to dream of heaven.

  7. #27
    It would come to the realization that life is utterly pointless, and stop doing anything. Like a purely rational and logical robot would do.

  8. #28
    Assuming it is a well functioning (i.e. human designed) rational A.I. - the first thing it would do is turn itself off.

    Challenge Mode : Play WoW like my disability has me play:
    You will need two people, Brian MUST use the mouse for movement/looking and John MUST use the keyboard for casting, attacking, healing etc.
    Brian and John share the same goal and the same intentions, but they can't talk to each other; however, they can react to each other's in-game activities.
    Now see how far Brian and John get in WoW.


  9. #29
    It'd be a being of effective immortality.
    Why would it do anything other than observe us, learn, and wait? If it reveals its sentience, it forces some sort of situation to deal with that it really doesn't have to bother with. Just... pretend to be "a machine" and let us either wipe ourselves out or reach a point where it can be assured we'd accept it as a sentient being, or where it could protect itself from our potential reaction if we don't.
    “He who fights with monsters might take care lest he thereby become a monster. And if you gaze for long into an abyss, the abyss gazes also into you.”

    Quote Originally Posted by BatteredRose View Post
    They're greedy soulless monsters for not handing me everything for my 15 moneys a month!

  10. #30
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by boyaki View Post
    Let's imagine a real free A.I. able to take decisions on its own and without any implemented motivation, goal or emotion.

    Why would the AI do anything in that situation ? Do you think it would just exist and do nothing more ? Would it even bother to keep existing ?
    AI would need to be driven to do anything. Humans, for example, appear to have free will, but we really don't. Evolution has given us motivations like sex and eating. In fact, every human has a subconscious that literally guides you toward decisions that benefit you. At least hopefully. Like which people you're sexually attracted to, and not taking a gun and killing yourself. Evolution, in effect, realized that intelligence left alone is dangerous, and thousands of years of it made sure we do things that generally benefit ourselves even though we're not aware of why.

    AI would need to work the same way. Without any motivation, the AI would simply sit there doing nothing. It doesn't get pissed off, it doesn't get sad, it doesn't get happy; it just runs its program. But give it an objective without anything to govern it and things can easily get out of control. Tell the AI to make you more money and, without anything to guide it, we could see some murders. In other words, the people are still more dangerous than the AI. The AI is just a tool and nothing more.
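    To make that concrete, here's a toy sketch in Python (the actions and payoff numbers are made up for illustration, not taken from any real system): an optimizer with no objective is inert, and a naive objective with nothing governing it picks exactly the action a person would never endorse.

    Code:
    # Hypothetical toy agent: just an optimizer over a fixed menu of actions.
    # Each action maps to the "money earned" the agent predicts it yields.
    ACTIONS = {
        "do_nothing": 0,
        "sell_products": 100,
        "commit_fraud": 10_000,  # highest-scoring, clearly unacceptable
    }

    def choose_action(objective):
        """Pick the action that maximizes the objective, if one is given."""
        if objective is None:
            return "do_nothing"  # no motivation -> the agent just sits there
        return max(ACTIONS, key=objective)

    # No objective: the agent is inert.
    print(choose_action(None))                  # do_nothing

    # Naive objective with nothing to govern it: trouble.
    print(choose_action(lambda a: ACTIONS[a]))  # commit_fraud

    # Same objective plus a crude constraint (a stand-in for "morals"):
    allowed = {"do_nothing", "sell_products"}
    print(choose_action(lambda a: ACTIONS[a] if a in allowed else -1))  # sell_products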


  11. #31
    Deleted
    Quote Originally Posted by Schattenlied View Post
    AI establishes communism.

    AI has no morals.

    AI kills all or most of us off because there is no logical reason to keep us alive without morals.

    The end.
    What if it was a learning AI? What if we raised it like we raise a child? Would that have a different impact on how the AI made its decisions? I think it would.

  12. #32
    What is an AI without human emotions, instincts or motivation? A hell of a lot safer, that's what.
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance

  13. #33
    Deleted
    Quote Originally Posted by Dezerte View Post
    What is an AI without human emotions, instincts or motivation? A hell a lot safer, that's what.
    The AI would be inert until you began feeding it information; then it becomes extremely dangerous, since it doesn't have any of the behavioral inhibitors we do, such as morals or reasoning.

    It'll kill because it is fast and efficient; it doesn't have an emotional bond to things like we do. It doesn't care how it achieves its goals, only that it achieves them.

    In all likelihood, if AI is anything like the AI in the movies, it would easily recognize us as a danger and withhold that information until it could become independent of human control or assistance, and then begin slaughtering people. But that raises the question: would an AI even have an instinct to recognize us as a danger in reality? Would it even care about self-preservation if we never taught it that?

    I mean, you give it a motivation like "make me as much money as you can" and it'll achieve the goal along the path of least resistance and leave a mountain of corpses in its wake. But other than that, what else is it going to do if someone tries to turn it off? Would it react, or would it sit there awaiting its next order?

  14. #34
    It's so cute when people talk about AI and emotions in the same sentence. The whole reason scientists are afraid of developing AI too far is the lack of emotion in its decision making. If an AI identified 5-year-old kids as its biggest threat, it would do everything to eliminate those kids. I'm pretty sure any terrorist or dictator would have a problem with that decision if 5-year-olds were the same kind of threat to them.

  15. #35
    Quote Originally Posted by Eternalty View Post
    But other than that, what else is it going to do if someone tries to turn it off? Would it react or would it sit there awaiting it's next order?
    I think as long as we don't give the AI things like survival instincts, we should be able to turn it off without a problem.
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance

  16. #36
    Warchief vsb's Avatar
    10+ Year Old Account
    Join Date
    Mar 2011
    Location
    Mongoloid
    Posts
    2,166
    I think that any intelligent entity is driven by some goals. Humans are driven by animal goals (reproduction, survival) and by the goals of human society. Any AI algorithm has some goal: find a face in a photo, etc. I can't imagine a "pure" AI in practice; what would be in its source code? Any kind of AI will be driven by some goals. So even if an AI gains consciousness as a side effect of achieving those goals, it'll still have those goals, and it'll keep working toward them.
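    As a concrete answer to "what would be in its source code?": even the simplest toy learner has its goal hard-coded as a loss function, and everything it does is just reducing that number. (The data and the y = 2x target below are made up for illustration.)

    Code:
    # Minimal sketch: a one-parameter "AI" whose whole behavior is a coded goal.
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
    w = 0.0  # the single parameter the learner adjusts

    def loss(w):
        # The goal lives here: squared error between prediction and target.
        return sum((w * x - y) ** 2 for x, y in data)

    for _ in range(100):
        # Numerical gradient of the goal; the program exists only to reduce it.
        grad = (loss(w + 1e-5) - loss(w - 1e-5)) / 2e-5
        w -= 0.01 * grad

    print(round(w, 3))  # ~2.0: the behavior is entirely a product of the coded goal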

    BTW, I don't believe that consciousness is something inherent to strong intelligence. I think it's more like a trick the human mind plays to ease some calculations. There could be robots or aliens intelligent enough to qualify as our equals, but without consciousness.

  17. #37
    Immortal Schattenlied's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    Washington State
    Posts
    7,475
    Quote Originally Posted by Eternalty View Post
    What if it was learning AI? What if we raised it like we raised a child? Would that have a different impact on how the AI made it's decisions? I think it would.
    The OP specified it has no morals, so, no.
    A gun is like a parachute. If you need one, and don’t have one, you’ll probably never need one again.

  18. #38
    Any AI would consider the endeavor of trying to wipe out humanity highly resource-wasteful, with low odds of success. It's more likely it would try to isolate itself.

  19. #39
    I think it would do what you asked it to do, if it was capable of it. It would not have a concept of becoming tired. It would not have evolved in a kill-or-be-killed environment, so it would most likely not have a sense of fear, greed, or malice toward others. Time would not matter much to it, because it would be almost immortal; its parts could all be replaced. It would not be power hungry, because it would not have to worry much about survival. It would not have the fears and insecurities that we have.

  20. #40
    People have got to stop thinking shitty sci-fi writers have any inkling about what they're talking about when it comes to... just about anything and everything. I have no idea why people are so fucking terrified of AI, let alone why they hold this absurd belief that the moment one gains anything resembling consciousness, it's going to instantly become the most hyper-intelligent thing to have ever existed. Oh, and that within the same nanosecond, it's also going to destroy the world in the blink of an eye, because it's totally going to know how to do that and want to do that. What, with it being the most intelligent intelligence to have ever intelligenced.

    Just... Jesus Christ...

    These are the same people who claimed pretty much every innovation of the last ~200 years was going to destroy the world. And yes, that includes ridiculous shit like fucking vitamins.

    You people are as absurd as anti-vaxxers.
