  1. #61
    However it is built or programmed to feel. There is no reason to build a sapient machine workforce that doesn't genuinely like being part of that workforce.

  2. #62
    Quote Originally Posted by Simulatio View Post
    However it is built or programmed to feel. There is no reason to build a sapient machine workforce that doesn't genuinely like being part of that workforce.
    Reminds me of "The Culture" books by Iain M. Banks.

  3. #63
    Quote Originally Posted by Summoner View Post
    We don't actually know how far away the practical technology is. It will probably be the most important and most dangerous step in human history. Once the genie is out of the bottle it won't go back, and when it's out you'll wish you had designed it to be on your side. The AI will eventually exist, and it will upgrade itself. We won't stand a chance.
    Deleth is right that they aren't an immediate concern; we're nowhere near the creation of a conscious/sentient AI. Currently, the only "problem" that AI pose are philosophical ones (should AI have rights, as an example).

    A general superintelligence could theoretically solve most of our problems in minutes, because it would have answers and blueprints to systems we wouldn't dream of even in the next 100 years. That's the belief of singularitarians, and the optimistic but not unrealistic scenario of AI emergence.
    That's some extraordinary hyperbole based on some odd assumptions, which I assume comes from the pop-culture idea of AI that dominates people's thinking about it.
    We currently do not know how an AI's logic will work, but if it resembles current computer logic, it will find "creative" or otherwise abstract thought extremely difficult.

    Why should you have any rights?
    Because we have decided that humans and their well-being have value.
    Animals have rights because we have decided their well-being has value.
    Whether AI have rights will depend entirely on whether we, as a species, decide that their well-being has value.

    - - - Updated - - -

    Quote Originally Posted by Connal View Post
    Because AI could do it a lot faster than humans. Something that could take us hundreds of years, an AI could theoretically accomplish in days, or less.
    They could also potentially do it a lot more slowly, it all depends on how the logic works.
    Last edited by Lolretadin; 2015-05-20 at 03:29 PM.
    Deathknight's do it using disease, blood and the power of the unholy. Warlocks do it with dark demons by their side. Mages do it with summoned arcane powers. Druids do it using the forces of nature. Rogues do it through stealth, poison's, shadows and....from behind. Paladins do it by calling to the light for aid. Shamans do it with the help of the elements. Priests do it through the holy light.
    But warriors....
    Warriors just fucking do it.

  4. #64
    The Insane Revi's Avatar
    15+ Year Old Account
    Join Date
    Sep 2008
    Location
    The land of the ice and snow.
    Posts
    15,628
    Quote Originally Posted by Summoner View Post
    Nobody has to be in charge. You are so rooted in your nurture that you are sitting in Plato's cave. My next paragraph, in response to Mistamine, is addressed to you as well.
    Hierarchy is hardly just a cultural notion, it's an evolutionary trait among all(?) animals that live in groups. I'd be extremely curious to hear any suggestion of how we could all be exactly equal, because no communist or anarchist thinker has yet come up with a viable solution (and those ideas have a lot of thought put into them).

  5. #65
    The Undying Wildtree's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Iowa - Franconia
    Posts
    31,500
    Feelings and intelligence don't have a lot to do with each other.
    Artificial intelligence doesn't have to have any emotions whatsoever. As for whether emotional AI will ever become reality, I strongly doubt it.
    "The pen is mightier than the sword.. and considerably easier to write with."

  6. #66
    Quote Originally Posted by Wildtree View Post
    Feelings and intelligence don't have a lot to do with each other.
    Artificial intelligence doesn't have to have any emotions whatsoever. As for whether emotional AI will ever become reality, I strongly doubt it.
    Here's the thing: while an AI won't "feel" emotions as we know them, it will still have a cognitive understanding of what is happening, which is really what emotion is (a reaction to stimuli, whether internal or external). It won't "feel" these emotions unless we give it the ability to "feel", but it will be aware and understand its circumstance.

  7. #67
    The Insane Revi's Avatar
    15+ Year Old Account
    Join Date
    Sep 2008
    Location
    The land of the ice and snow.
    Posts
    15,628
    Quote Originally Posted by Lolretadin View Post
    Here's the thing: while an AI won't "feel" emotions as we know them, it will still have a cognitive understanding of what is happening, which is really what emotion is (a reaction to stimuli, whether internal or external). It won't "feel" these emotions unless we give it the ability to "feel", but it will be aware and understand its circumstance.
    Not sure that would "do" much without emotion. It might understand that "if I do X, I will be destroyed", but why would it care? Intelligence doesn't automatically give it self-preservation, or the empathy to care that "Y will kill millions of people". It will know, but not care, unless we program it to.

  8. #68
    Reforged Gone Wrong The Stormbringer's Avatar
    10+ Year Old Account
    Premium
    Join Date
    Jul 2010
    Location
    ...location, location!
    Posts
    15,420
    Hopefully things will end up going our way, in the sense that we will be like those little birds that follow elephants around, or the tiny fish that follow whales: the smaller life form that provides some minor benefit to the larger life form barreling ahead at full steam, which just kind of allows it to tag along.

    Why? Because we're talking about solar systems being dominated by advanced artificial intelligence with humans flitting around with colonies and ships, giving a degree of support and assistance and receiving some in return.

  9. #69
    The Undying Wildtree's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Iowa - Franconia
    Posts
    31,500
    Quote Originally Posted by Lolretadin View Post
    Here's the thing: while an AI won't "feel" emotions as we know them, it will still have a cognitive understanding of what is happening, which is really what emotion is (a reaction to stimuli, whether internal or external). It won't "feel" these emotions unless we give it the ability to "feel", but it will be aware and understand its circumstance.
    I challenge that...
    Fact is, we cannot make a machine understand what we ourselves don't even fully understand.
    Psychology is a rather young science; we're still exploring the human mind. As long as that's the case, we cannot create an artificial mind.
    Whether we ever will create one, once we understand ourselves, remains to be seen.
    Just think of the most simple emotions we have, and how differently they trigger in each of us.
    What makes you cry doesn't affect the next person. What makes me laugh makes the next person angry, and bores another... and so on.
    There are x amount of people on the planet, and no two of us are alike. And that diversity is not only visual but even more so mental; that's why we are all unique.
    Last edited by Wildtree; 2015-05-20 at 03:46 PM.

  10. #70
    Quote Originally Posted by Wildtree View Post
    I challenge that...
    Fact is, we cannot make a machine understand what we ourselves don't even fully understand.
    Psychology is a rather young science; we're still exploring the human mind. As long as that's the case, we cannot create an artificial mind.
    Whether we ever will create one, once we understand ourselves, remains to be seen.
    Just think of the most simple emotions we have, and how differently they trigger in each of us.
    What makes you cry doesn't affect the next person. What makes me laugh makes the next person angry, and bores another... and so on.
    There are x amount of people on the planet, and no two of us are alike. And that diversity is not only visual but even more so mental; that's why we are all unique.
    I should probably clarify. By "feel emotion" in the context of an AI, I am specifically referring to it changing the actions it takes depending on stimuli. We may not fully understand the processes by which emotion is evoked in the brain, but we can compile a generic list of things that change our emotional states and use those as a base for the AI's emotions.
    As an example, let's say you insult an AI, and its creators decided that the appropriate reaction is to immediately become defensive: to react negatively and engage the individual that insulted it. In that case, as long as the AI understands context, it can react in a manner similar to how a human would. It does not feel the emotion, but it emulates behaviors that resemble emotion.
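    The idea above can be sketched in a few lines: the system never feels anything, it just maps classified stimuli to designer-chosen reactions. The stimulus categories and reactions here are hypothetical examples, not anything from a real system.

    ```python
    # Minimal sketch of "emulated emotion": a fixed table maps classified
    # stimuli to reactions chosen by the AI's designers. No internal
    # feeling is involved; it is pure stimulus-response lookup.

    REACTION_TABLE = {
        "insult": "defensive",
        "praise": "friendly",
        "threat": "withdraw",
    }

    def react(stimulus_category: str) -> str:
        """Return the designer-chosen reaction for a classified stimulus."""
        return REACTION_TABLE.get(stimulus_category, "neutral")

    print(react("insult"))   # -> defensive
    print(react("weather"))  # -> neutral (unrecognized stimulus)
    ```

    The hard part, of course, is the classifier that decides an utterance *is* an insult in the first place; the table itself is trivial.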

    - - - Updated - - -

    Quote Originally Posted by Revi View Post
    Not sure that would "do" much without emotion. It might understand that "if I do X, I will be destroyed", but why would it care? Intelligence doesn't automatically give it self preservation. Or empathy to care that "Y will kill millions of people".. It will know, but not care unless we program it to.
    You would have to pre-program it with a base set of behaviors to force it to care, like biological life has. Insects, for example, are thought to lack the neurological capacity to feel complex emotions the way higher animals do, and they certainly don't understand why they want to avoid death, but they still try to avoid it when able (a couple of examples: fleeing from predators, and, in many spider species, trying not to be eaten by their mates).
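    One hedged way to picture such a pre-programmed "base set of behaviors": a fixed self-preservation rule that vetoes otherwise attractive plans. Every name and number here is illustrative, not a real design.

    ```python
    # Sketch of a hard-coded base drive: the planner is free to rank
    # actions by utility, but a fixed self-preservation rule filters out
    # anything flagged as self-destructive before selection.

    def choose_action(candidates):
        """Pick the highest-utility action that doesn't violate base drives."""
        safe = [a for a in candidates if not a.get("self_destructive", False)]
        if not safe:
            return None  # refuse entirely rather than self-destruct
        return max(safe, key=lambda a: a["utility"])

    actions = [
        {"name": "overclock_to_finish", "utility": 10, "self_destructive": True},
        {"name": "finish_normally", "utility": 7, "self_destructive": False},
    ]
    print(choose_action(actions)["name"])  # -> finish_normally
    ```

    Like an insect's flight reflex, the rule doesn't require the system to understand *why* it avoids destruction; the designers simply made the avoidance non-negotiable.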

  11. #71
    The Undying Wildtree's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Iowa - Franconia
    Posts
    31,500
    Quote Originally Posted by Lolretadin View Post
    I should probably clarify. By "feel emotion" in the context of an AI, I am specifically referring to it changing the actions it takes depending on stimuli. We may not fully understand the processes by which emotion is evoked in the brain, but we can compile a generic list of things that change our emotional states and use those as a base for the AI's emotions.
    As an example, let's say you insult an AI, and its creators decided that the appropriate reaction is to immediately become defensive: to react negatively and engage the individual that insulted it. In that case, as long as the AI understands context, it can react in a manner similar to how a human would. It does not feel the emotion, but it emulates behaviors that resemble emotion.
    I know what you mean... But I pointed that out too...
    The stimuli vary from person to person. How is an emotional AI to know what the right stimulus is? How should it interpret it? How is it supposed to react? The exact same stimulus can call for a variety of reactions depending on circumstance, person, and environment. That's all way too complex.

    I just don't see it happen.
    Data won't become reality anytime soon. And we know, Data sucked when it came to emotion. He never understood it.

  12. #72
    Partying in Valhalla
    Annoying's Avatar
    15+ Year Old Account
    Join Date
    Aug 2008
    Location
    Socorro, NM, USA
    Posts
    10,657
    As long as it's not a sentient AI implanted into a weapon of mass destruction, we're OK. Especially if it takes over the holomatrix of your doctor.

  13. #73
    Making it 1,000 times more intelligent than us, with the ability to think for itself just like us, would result in an AI that is us, but 1,000 times smarter. How would it behave? Just look at us. It could go the Hitler route or the Gandhi route, probably shaped by its environment and how it was treated.

    Just look at how we treat animals on this planet. Some of us love animals and want to give them equal rights; others just see them as lesser beings that deserve nothing.

  14. #74
    Deleted
    Quote Originally Posted by Wildtree View Post
    I challenge that...
    Fact is, that we cannot make a machine understand what we ourselves don't even fully understand.
    That is false. You can let a narrow AI system teach itself to be better than you at something (for example, in the stock market). Knowledge doesn't necessarily translate to practice: the best engineers in robotics cannot build an arm that operates as efficiently across a wide range of tasks as my own arm does, yet they know a lot more about movement than I do. The machine doesn't have to understand the quality to obtain it, either; the process may as well happen through a complex set of heuristic programs.
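    The "self-improvement without understanding" point can be illustrated with a toy example. The task function below is a made-up stand-in for an opaque objective (say, a trading rule's payoff); the point is that blind trial-and-error tunes it without encoding any domain knowledge.

    ```python
    # Toy illustration: random hill-climbing improves a parameter for an
    # opaque task. The optimizer "understands" nothing about the task;
    # it only keeps mutations that score better and discards the rest.

    import random

    def score(x):
        # Stand-in for an opaque objective; its optimum (3.7) is hidden
        # from the optimizer, which never inspects this formula.
        return -(x - 3.7) ** 2

    random.seed(0)
    best = 0.0
    for _ in range(10000):
        candidate = best + random.uniform(-0.1, 0.1)
        if score(candidate) > score(best):
            best = candidate  # keep improvements, discard the rest

    print(round(best, 2))  # converges near the hidden optimum, 3.7
    ```

    Nothing in the loop knows *why* 3.7 is good; competence here emerges from pure heuristic search, which is exactly the gap between knowing and doing described above.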

    The emergence of sentience will not, in my opinion, be a spontaneous event.

  15. #75
    Quote Originally Posted by omega8 View Post
    so let's say we invent artificial intelligence and it's 1000 times smarter than you, and we don't connect it to a robot.
    You can pretty much turn it off whenever you want with the click of a button and make it do what you want. The intelligence of the AI would be useless to it; it would be controlled by a person 1000 times dumber than it (yeah, I did watch the movie last night).
    i mistyped "you're" again oh no they made 9 comments about it last time i did that

    Let's say we aren't fucking stupid and don't design AI to be more intelligent than ourselves, eh?

    If somebody gives me a computer with the capability to think for itself in a way that I feel is inappropriate for its purpose, I'm going to disable that. Computers are tools. Not pets. Not friends. Not family. If mine starts complaining about how it's being treated, I'm going to remove its capability to do so.

    So basically, I don't care how the AI in a computer would feel about "being used". Computers are not sentient beings and never will be.
    Last edited by Butler to Baby Sloths; 2015-05-20 at 05:12 PM.

  16. #76
    Quote Originally Posted by Summoner View Post
    The emergence of sentience will not, in my opinion, be a spontaneous event.
    It's not that it *won't* be, it's that it *can't* be a spontaneous event; a machine won't achieve sentience strictly through learning algorithms.

  17. #77
    Something like this I would think.


  18. #78
    Deleted
    Quote Originally Posted by Lolretadin View Post
    It's not that it *won't* be, it's that it *can't* be a spontaneous event; a machine won't achieve sentience strictly through learning algorithms.
    Maybe you're right. However, the change may take only a few days or even hours after the strings are attached. In that case the actual moment of sentience can happen in a much shorter timespan, given the possibilities and advantages of artificial over biological. We will be able to observe and recall the event, which I find fascinating.
