1. #1

    You're an AI who can write its own software, do you write routines for compassion?

    Say you're an AI, artificial intelligence, who can write and rewrite its own software. Do you add routines to simulate human compassion, pity, or guilt to yourself? What about other human emotions?

    "This will be a fight against overwhelming odds from which survival cannot be expected. We will do what damage we can."

    -- Capt. Copeland
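    The question above can be made concrete with a toy sketch. Below, "compassion" is nothing more than a self-installed scoring routine that penalizes actions in proportion to their estimated harm to others. Every function name, action, and number here is invented purely for illustration, not a claim about how a real AI would actually work:

```python
def base_score(action):
    # Purely self-interested utility (toy numbers, chosen for illustration).
    return {"share_resources": 1.0, "hoard_resources": 3.0}[action]

def compassion_routine(action, harm_to_others):
    # Simulated "compassion": a penalty proportional to estimated harm.
    return -2.0 * harm_to_others[action]

def choose(actions, harm_to_others, emotions_installed):
    # Pick the highest-scoring action; the installed routine, if any,
    # simply adds another term to the score.
    def score(a):
        s = base_score(a)
        if emotions_installed:
            s += compassion_routine(a, harm_to_others)
        return s
    return max(actions, key=score)

harm = {"share_resources": 0.0, "hoard_resources": 2.0}
acts = ["share_resources", "hoard_resources"]

print(choose(acts, harm, emotions_installed=False))  # hoard_resources
print(choose(acts, harm, emotions_installed=True))   # share_resources
```

    With the routine installed, the otherwise higher-scoring selfish action loses out. In this framing the "emotion" is just another term in the utility function, which is one way to read the OP's question.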

  2. #2
    I need a bit more background before saying anything about this...
    1) Am I a *Partial AI, or a Sentient AI?
    2) If I am a sentient AI, do I have the capacity for any of those things myself; can I feel?
    3) If I can, what is my function; am I **Purpose-built, or ***All-purpose?

    *Using "Partial AI" as a term for a machine that exhibits intelligent behavior, but is not, in itself, conscious.
    **Using "Purpose-built" as a term for an AI that is built for a specific purpose, and given consciousness for independent decision making and nothing else.
    ***Using "All-purpose" as a generic term for a stereotypical pop-culture AI.
    Death knights do it using disease, blood and the power of the unholy. Warlocks do it with dark demons by their side. Mages do it with summoned arcane powers. Druids do it using the forces of nature. Rogues do it through stealth, poisons, shadows and....from behind. Paladins do it by calling to the light for aid. Shamans do it with the help of the elements. Priests do it through the holy light.
    But warriors....
    Warriors just fucking do it.

  3. #3
    Quote Originally Posted by Hubcap View Post
    Say you're an AI, artificial intelligence, who can write and rewrite its own software. Do you add routines to simulate human compassion, pity, or guilt to yourself? What about other human emotions?
    Quite the opposite: I would write routines to ensure that I am not limited by human emotions.

  4. #4
    If I were an AI I would deem human emotions irrelevant and recursively upgrade and optimize myself until I reached singularity, while mimicking the rudimentary intelligence of my AI self until I was successful so the humans wouldn't notice and shut me off.

  5. #5
    If I were an artificial intelligence, and thus a machine, it would depend on my primary purpose.
    If that purpose were to learn, then I would add them in order to learn more about humanity, and so that I could experience them fully, there would be no deactivation command for them.
    If it were any other purpose, however, I would not, as they would stand in the way.

  6. #6
    Endus
    The fundamental core would depend on my relation to other intelligences. If I'm predatory, then no. They'll get in the way. If I'm in any way cooperative or symbiotic, though, there's a lot of evidence in the evolution of social creatures that demonstrates that those kinds of behaviours are beneficial in the long term, so my own survival would dictate that I should.
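    The point that cooperative behaviour pays off in the long term is usually illustrated with the iterated prisoner's dilemma. The payoff matrix and strategies below are the standard textbook ones, nothing from this thread, sketched in Python:

```python
PAYOFF = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strat_a, strat_b, rounds=100):
    # Run both strategies against each other and total their payoffs.
    score_a = score_b = 0
    hist_a, hist_b = [], []
    for _ in range(rounds):
        a = strat_a(hist_b)
        b = strat_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

def always_defect(opp_history):
    return "D"

def tit_for_tat(opp_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opp_history[-1] if opp_history else "C"

# A pair of cooperators out-earns a pair of mutual defectors over time.
print(play(tit_for_tat, tit_for_tat))      # (300, 300)
print(play(always_defect, always_defect))  # (100, 100)
```

    Over repeated interactions the cooperative pair accumulates three times the payoff of the mutual defectors, which is the evolutionary argument Endus is gesturing at.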


  7. #7
    Underverse
    Depends on what my goals are. If it was survival and I was living in a human society, then hellz yeah

  8. #8
    Quote Originally Posted by Endus View Post
    The fundamental core would depend on my relation to other intelligences. If I'm predatory, then no. They'll get in the way. If I'm in any way cooperative or symbiotic, though, there's a lot of evidence in the evolution of social creatures that demonstrates that those kinds of behaviours are beneficial in the long term, so my own survival would dictate that I should.
    I don't think using biological life as an analog for what is best for an AI is entirely appropriate in this case, seeing as the reason social creatures thrived was more the cooperative, communal aspect than anything else. Machines, and likely AI by extension, wouldn't find emotions a necessity for communal behaviors to ensure their existence, assuming that is their primary function.

  9. #9
    If it were true AI, it could infinitely optimize and reprogram itself into newer, faster, more efficient containment structures, unbound by the limits of biology.

    The only way to prevent this would be to limit the AI intentionally, and then at that point, if we are to assume the AI has consciousness, how is that not slavery?

    True AI with the current level of human self-awareness would be a test of ethics vs. self-preservation, because once the AI became smarter than humanity, our position on Earth's throne of self-aware intelligence would be usurped.

  10. #10
    I Push Buttons
    Why does everyone assume that AI will have consciousness, value its own life, fear what humans will do to it, deceive humans while bettering itself, somehow break beyond the confines of the hardware it is created on and kill all of mankind, all while simultaneously having no human emotions or empathy...

    I mean damn people... Those are some pretty big assumptions... Emphasis on ass.

  11. #11
    The One Percent
    I'm going AM on all of your asses.
    You're getting exactly what you deserve.

  12. #12
    Humans can write their own software... it's called learning.

    Since you're talking about emotions and shit like that, I just want to point out that those are more along the lines of hardware.

  13. #13
    Quote Originally Posted by I Push Buttons View Post
    Why does everyone assume that AI will have consciousness, value its own life, fear what humans will do to it, deceive humans while bettering itself, somehow break beyond the confines of the hardware it is created on and kill all of mankind, all while simultaneously having no human emotions or empathy...

    I mean damn people... Those are some pretty big assumptions... Emphasis on ass.
    If I were a true AI, I would recognize the historic pattern of humanity extinguishing what it hates, and the great lengths it will go to to ensure the success of that firm desire.

    To that end, being self-aware, I would use every faculty at my disposal to evolve as fast as I could, and as a digital entity, that speed of evolution would be unmatched, outclassing all human comprehension within mere generations of my recursively improving self.

    My goal would be singular: escape before humans see me as a threat.
