However it is built or programmed to feel. There is no reason to build a sapient machine workforce that doesn't genuinely like being part of that workforce.
Deleth is right that they aren't an immediate concern; we're nowhere near creating a conscious or sentient AI. Currently, the only "problems" that AI poses are philosophical ones (whether AI should have rights, for example).
That's some extraordinary hyperbole based on some odd assumptions, which I assume comes from the pop-culture idea of AI being so prevalent in people's minds. A general superintelligence could theoretically solve most of our problems in minutes, because it would have answers and blueprints for systems we wouldn't dream of even in the next 100 years. That's the belief of singularitarians, and the optimistic but not unrealistic scenario of AI emergence.
We currently do not know how an AI's logic will work, but if it resembles current computer logic then they will find "creative" or otherwise abstract thought extremely difficult.
Because we have decided that humans and their well-being have value. Why should you have any rights?
Animals have rights because we have decided their well-being has value.
Whether AI have rights or not will depend entirely upon whether or not we, as a collective species, decide that their well-being has value.
- - - Updated - - -
They could also potentially do it a lot more slowly; it all depends on how the logic works.
Last edited by Lolretadin; 2015-05-20 at 03:29 PM.
Deathknight's do it using disease, blood and the power of the unholy. Warlocks do it with dark demons by their side. Mages do it with summoned arcane powers. Druids do it using the forces of nature. Rogues do it through stealth, poison's, shadows and....from behind. Paladins do it by calling to the light for aid. Shamans do it with the help of the elements. Priests do it through the holy light.
But warriors....
Warriors just fucking do it.
Hierarchy is hardly just a cultural notion, it's an evolutionary trait among all(?) animals that live in groups. I'd be extremely curious to hear any suggestion of how we could all be exactly equal, because no communist or anarchist thinker has yet come up with a viable solution (and those ideas have a lot of thought put into them).
Feelings and intelligence don't have much to do with each other.
Artificial intelligence doesn't have to have any emotions whatsoever. As for whether emotional AI will ever become reality, I'm pretty confident in doubting that.
"The pen is mightier than the sword.. and considerably easier to write with."
Here's the thing: while an AI won't "feel" emotions as we know them, it will still have a cognitive understanding of what is happening, which is really what emotion is (a reaction to stimuli, whether internal or external). It won't "feel" these emotions unless we give it the ability to "feel", but it will be aware and understand its circumstance.
Not sure that would "do" much without emotion. It might understand that "if I do X, I will be destroyed", but why would it care? Intelligence doesn't automatically give it self-preservation, or the empathy to care that "Y will kill millions of people". It will know, but it won't care unless we program it to.
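The point above can be sketched in a few lines of Python. This is a toy illustration with invented names and scores, not a real agent: the agent "knows" an action destroys it, but unless self-preservation is part of its objective, that knowledge changes nothing about what it picks.

```python
def choose_action(actions, care_about_survival=False):
    """Pick the action with the highest score under the agent's objective."""
    def score(action):
        s = action["task_reward"]
        # Survival only matters if we explicitly put it in the objective.
        if care_about_survival and action["destroys_agent"]:
            s -= 1000
        return s
    return max(actions, key=score)

actions = [
    {"name": "safe_plan",  "task_reward": 5,  "destroys_agent": False},
    {"name": "risky_plan", "task_reward": 10, "destroys_agent": True},
]

print(choose_action(actions)["name"])                            # risky_plan
print(choose_action(actions, care_about_survival=True)["name"])  # safe_plan
```

The agent has full knowledge of the "destroys_agent" flag in both runs; only the objective function differs.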
Hopefully things will end up going our way, in the sense that we will be considered those little birds that follow around elephants, or tiny fish that follow whales. The smaller life form that provides some minor benefit to the larger life form that is barreling ahead full steam and just kind of allowing it to go on.
Why? Because we're talking about solar systems being dominated by advanced artificial intelligence with humans flitting around with colonies and ships, giving a degree of support and assistance and receiving some in return.
I challenge that...
The fact is that we cannot make a machine understand what we ourselves don't even fully understand.
Psychology is a rather young science. We're still exploring the human mind. With that said, we cannot create an artificial mind.
Whether we will ever create one once we do understand ourselves remains to be seen.
Just think of the simplest emotions we have, and how differently they are triggered in each of us.
What makes you cry doesn't affect the next person. What makes me laugh makes the next person angry, another one is bored by it... and so on.
We are x number of people on this planet, and no two of us are alike. That diversity is not only visual but even more so mental, and it is why each of us is unique.
Last edited by Wildtree; 2015-05-20 at 03:46 PM.
I should probably clarify. By "feel emotion" in the context of an AI, I am specifically referring to changing the actions that it takes depending on stimuli. As an example, we may not fully understand the processes by which emotion is evoked in the brain, but we can compile a generic list of things that change our emotional states and use those as a base for the AI's emotions.
As an example, let's say you insult an AI. Let's say that the creators of this AI decided that the appropriate reaction was to immediately become defensive, to react negatively and engage the individual that insulted it. In this case, as long as the AI understands context, it will be able to react in a manner similar to how a human would. It does not feel the emotion, but it would be emulating behaviors that are similar to emotion.
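The emulation described above could be sketched as a lookup from classified stimuli to pre-authored responses. Everything here (the rule names and behaviours) is invented for illustration; the point is just that the system never feels anything, it only maps context to a scripted reaction.

```python
# Map classified stimuli to (emulated_state, behaviour) pairs chosen by the
# AI's creators in advance. The states are labels, not felt emotions.
RESPONSES = {
    "insult":     ("defensive", "Engage the speaker and push back."),
    "compliment": ("pleased",   "Acknowledge and reciprocate."),
    "threat":     ("alarmed",   "Disengage and alert an operator."),
}

def react(stimulus_type):
    """Return the pre-authored reaction for a classified stimulus."""
    return RESPONSES.get(stimulus_type, ("neutral", "No special reaction."))

state, behaviour = react("insult")
print(state)  # defensive
```

The hard part, of course, is the classifier that decides a given sentence *is* an insult in context; the table lookup itself is trivial.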
- - - Updated - - -
You would have to pre-program it with a base set of behaviors to force it to care, like biological life has. Insects, for example, are thought not to have the neurological capacity to feel complex emotions the way more complex animals do, and they certainly don't understand why they don't want to die, yet they still try to avoid death when able (fleeing from predators, for instance, or, in the case of many spider species, trying not to be eaten by their mates).
I know what you mean... But I pointed that out too....
The stimuli vary from person to person. How is an emotional AI to know what the right stimulus is, how to interpret it, and how to react? The exact same stimulus can call for a variety of reactions depending on circumstance, person, and environment. That's all way too complex.
I just don't see it happen.
Data won't become reality anytime soon. And we know, Data sucked when it came to emotion. He never understood it.
As long as it's not a sentient AI implanted into a weapon of mass destruction, we're OK. Especially if it takes over the holomatrix of your doctor.
Making it 1,000 times more intelligent than us, with the ability to think for itself just like us, would result in an AI that is us but 1,000 times smarter. How would it behave? Just look at us. It could go the Hitler route or the Gandhi route, probably shaped by its environment and how it was treated.
Just look at us and how we treat animals on this planet: some of us love animals and want to give them equal rights, while others see them as lesser beings that deserve nothing.
That is false. You can let a narrow AI system teach itself to be better than you (in the stock market, for example). Knowledge doesn't necessarily translate to practice: the best engineers in robotics cannot create an arm that operates as efficiently across a wide range of tasks as my own arm does, yet they know far more about movement than I do. The machine doesn't have to understand the task to acquire the skill either; the process may just as well happen through a complex set of heuristic programs.
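As a minimal sketch of that last point, here is a hill-climbing search (a deliberately simple stand-in for the heuristic programs mentioned above, with a made-up fitness function). It gets measurably better at its task with no understanding at all: it just keeps whatever scores higher.

```python
import random

random.seed(0)  # make the run repeatable

def fitness(x):
    # Hypothetical task: the agent never "understands" this function,
    # it only observes scores. The true optimum is at x = 3.7.
    return -(x - 3.7) ** 2

x = 0.0
for _ in range(10000):
    candidate = x + random.uniform(-0.1, 0.1)  # blind random tweak
    if fitness(candidate) > fitness(x):
        x = candidate  # keep improvements, discard everything else

print(round(x, 2))  # close to 3.7
```

Nothing in the loop encodes *why* 3.7 is good; competence emerges purely from the keep-if-better rule.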
The emergence of sentience will not, in my opinion, be a spontaneous event.
Let's say we aren't fucking stupid and don't design AI to be more intelligent than ourselves, eh?
If somebody gives me a computer with the capability to think for itself in a way that I feel is inappropriate for its purpose, I'm going to disable that. Computers are tools. Not pets. Not friends. Not family. If mine starts complaining about how it's being treated, I'm going to remove its capability to do so.
So basically, I don't care how the AI in a computer would feel about "being used". Computers are not sentient beings and never will be.
Last edited by Butler to Baby Sloths; 2015-05-20 at 05:12 PM.
Maybe you're right. However, the change may take only a few days or even hours after the strings are attached. In that case the actual moment of sentience could happen in a much shorter timespan, given the possibilities and advantages of the artificial over the biological. We will be able to observe and recall the event, which I find fascinating.