  1. #161
    The Unstoppable Force Granyala's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Arkon-III
    Posts
    20,131
    Since programming a self-aware machine would require us to understand our own brain function and our own self-awareness, I highly doubt that it will ever happen.

    Because if your brain were simple enough for you to understand it, you would be too dumb to ask how it works.

  2. #162
    Legendary! Wikiy's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Virgo Supercluster, Local Group, Milky Way, Orion Arm, Solar System, Earth, European Union, Croatia
    Posts
    6,733
    Quote Originally Posted by MasterHamster View Post
    Still based on algorithms and thus an illusion of sentience.
    Quote Originally Posted by MasterHamster View Post
    AI is limited to the patterns we give it. That's all there is.
    And you aren't?

  3. #163
    Quote Originally Posted by MasterHamster View Post
    Learning that we have to program for it.
    Or you can set up an architecture to do it. Just like your brain does.

    Kinda like that machine that was meant to learn how to walk without movement-based code.
    Flailing its legs around mindlessly until it recognized that it was moving forward.

    An "urge" that we told it to have.
    How is that any different from you? Your whole argument basically boils down to "they aren't sentient because we made them".
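
    The walking-machine anecdote above is essentially trial-and-error learning driven by a single built-in reward. A minimal sketch of that idea (the gait model, reward function, and numbers here are made up purely for illustration, not taken from any actual robot):

```python
# Hypothetical sketch: a learner given only the "urge" (reward = forward
# distance) discovers a gait by flailing randomly and keeping improvements.
import random

def forward_distance(gait):
    # Stand-in physics: certain leg parameters happen to move the body
    # forward. A real robot would measure this with sensors instead.
    return -sum((g - 0.7) ** 2 for g in gait)

def learn_gait(steps=2000, legs=4, seed=0):
    rng = random.Random(seed)
    best = [rng.random() for _ in range(legs)]       # start by flailing
    best_score = forward_distance(best)
    for _ in range(steps):
        trial = [g + rng.gauss(0, 0.1) for g in best]  # perturb the gait
        score = forward_distance(trial)
        if score > best_score:                       # keep what moves forward
            best, best_score = trial, score
    return best, best_score

gait, score = learn_gait()
```

    Nobody wrote movement code here; only the reward was specified, and the behavior emerged from the feedback loop.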

    So much hostility.
    Because I'm talking to wilful ignorance.

    And for the record, without algorithms an AI wouldn't do anything.
    An AI does not need algorithms, but I see you're still insisting on ignoring the entire field of artificial intelligence because reality is inconvenient to your beliefs.

  4. #164
    Quote Originally Posted by semaphore View Post

    An AI does not need algorithms, but I see you're still insisting on ignoring the entire field of artificial intelligence because reality is inconvenient to your beliefs.
    An AI DOES need algorithms by its very definition. But most people here are not able to distinguish between imperative programming and logical inference.

    Look up lambda calculus, Markov algorithms, Chomsky grammars...

    Theoretically, computers haven't evolved in capability since the invention of the Turing machine. Every problem that can be solved with a modern computer can be solved by a Turing machine as well; the only difference is efficiency. And there's absolutely no reason to believe that any mechanism in our brain can't be replicated.
    Last edited by XDurionX; 2013-02-24 at 03:41 PM.
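
    The Turing-equivalence point above can be made concrete with a toy simulator. This hypothetical three-rule machine increments a binary number on its tape, something a modern computer does in a single instruction (the rule encoding is illustrative, not any standard notation):

```python
# Minimal Turing machine simulator: a tape, a head, a state, and a rule
# table mapping (state, symbol) -> (write, move, next_state).
def run_tm(tape, rules, state="carry", blank="_", steps=10_000):
    tape = dict(enumerate(tape))
    head = max(tape)          # start at the least significant bit
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Binary increment: flip trailing 1s to 0, flip the first 0 (or blank) to 1.
rules = {
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_tm("1011", rules))  # 1011 + 1 = 1100
```

    Slower than hardware addition by many orders of magnitude, but the result is identical, which is the whole point of the equivalence claim.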

  5. #165
    Self-awareness, or the ego and superego, is like a global status check of all the brain's subsystems. So I think if computers become complex enough to need such a "control center" that constantly collects status feedback from all the other subsystems, they will develop self-awareness. From animals we know that self-awareness is not an on/off switch; it develops gradually. Some animals never understand that the mirror shows themselves, and attack it or try to mate with it. Apes understand that the mirror shows not just another ape but themselves. So when complex computers start to develop self-awareness, it will come in baby steps.
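
    The "control center" idea above can be sketched as code, purely as an illustration of the architecture being described, not a claim about how awareness actually works: a monitor aggregates status reports from subsystems and includes a report about itself.

```python
# Toy "global status check": a monitor polling its subsystems and adding
# a self-report, so its own state appears in the picture it maintains.
class Monitor:
    def __init__(self, subsystems):
        self.subsystems = subsystems   # name -> callable returning a status

    def global_status(self):
        report = {name: probe() for name, probe in self.subsystems.items()}
        report["monitor"] = f"tracking {len(report)} subsystems"  # self-report
        return report

m = Monitor({
    "vision": lambda: "ok",
    "motor": lambda: "degraded",
})
status = m.global_status()
```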

    Seeing how fast computers have increased in complexity, I think we will see the first primitive forms of self-awareness in about 20 years, and human-like self-awareness in about 100 years or even more.
    Last edited by Kryos; 2013-02-24 at 03:30 PM.
    Atoms are liars, they make up everything!

  6. #166
    Legendary! Wikiy's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Virgo Supercluster, Local Group, Milky Way, Orion Arm, Solar System, Earth, European Union, Croatia
    Posts
    6,733
    Quote Originally Posted by MasterHamster View Post
    So much hostility.
    Don't pretend you're not equally hostile. Not while making posts like this one, where you essentially say "This discussion is impossible because the other side won't yield." People on this forum really need to realize that the other side will rarely yield. Really, really rarely. What it always comes down to is simply putting out your own arguments and reinforcing your ego. However, getting frustrated that the other side isn't changing their minds will get you nowhere.

    So, everyone, from both sides, chill down and let's do this a bit more civilly, shall we?

  7. #167
    The Lightbringer Tzalix's Avatar
    10+ Year Old Account
    Join Date
    Mar 2011
    Location
    Sweden
    Posts
    3,118
    Quote Originally Posted by MasterHamster View Post
    - snip -
    For this discussion to get anywhere we need to define what exactly it is that makes our conscious mind different from a computer.

    In essence, our brain works like a computer. It gathers data, sets up rules about how the world works and then goes about making decisions based on previously gathered data.

    We could build a computer that does the same thing.

    So if we build a computer that is constructed the exact same way our brain works, how would it not be conscious? What's special about our minds? If an exact replica of our brain is not conscious, then what is?
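
    The loop described above (gather data, form rules, decide from past data) can be sketched in a few lines; this toy "mind" is of course hypothetical and stands in for the argument, not for any real cognitive model:

```python
# A frequency table stands in for the "rules about how the world works":
# observe outcomes, then decide based on previously gathered data.
from collections import Counter, defaultdict

class TinyMind:
    def __init__(self):
        self.memory = defaultdict(Counter)    # situation -> outcomes seen

    def observe(self, situation, outcome):
        self.memory[situation][outcome] += 1  # gather data

    def decide(self, situation):
        seen = self.memory[situation]
        if not seen:
            return "explore"                  # no prior data: try something
        return seen.most_common(1)[0][0]      # act on experience

mind = TinyMind()
mind.observe("hot stove", "pain")
mind.observe("hot stove", "pain")
```

    The question in the thread is exactly whether scaling this kind of loop up ever crosses into consciousness, or whether something else is needed.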

    Quote Originally Posted by JfmC View Post
    I was thinking you were on to something, but then I saw your second sentence; now I just think you made an intelligent comment by accident in order to disprove someone on the internet.
    Nono, it's something I've thought about for a long time. And it was more a response to everybody who brings it up.

    Like you said, we only have the information our brains give us to go on... But the human mind can be deceived. Why trust all the information your brain gives you?

    It's the whole brain in a vat thing all over again.
    Last edited by Tzalix; 2013-02-24 at 03:53 PM.
    "In life, I was raised to hate the undead. Trained to destroy them. When I became Forsaken, I hated myself most of all. But now I see it is the Alliance that fosters this malice. The human kingdoms shun their former brothers and sisters because we remind them what's lurking beneath the facade of flesh. It's time to end their cycle of hatred. The Alliance deserves to fall." - Lilian Voss

  8. #168
    If we make a perfect clone (atom by atom) of a random self-aware human, is that clone self-aware? Is it artificial/a machine? Is it intelligent? Does it only do what we instructed/programmed/built it to do? My answer would be yes to all of those.

    Humans are also programmed (i.e. have algorithms) to feel, learn and gather experiences in a specific way. The only difference from an AI would be that we were originally created by chance (without intention, unless you believe in Intelligent Design) and in order to make a new human we don't have to understand it, we just fuck.

    EDIT: I think Hamster's point is that if something is told to be self-aware, it isn't self-aware. It would have to become self-aware by itself. Which I don't agree with.
    Last edited by reckoner04; 2013-02-24 at 04:01 PM.

  9. #169
    Theoretically, since thinking and knowing are purely physical processes, if humans can be self-aware then it can be replicated. Just look at a wall for a second: how do you know that it is a wall? You were taught at an early age to learn the word "wall", understand its meaning, know how to spell it, understand each letter, etc. You were not born with this knowledge; you were told it, and your brain wrote the chemical code to remember it. Your memories are just chemical markings reacting with electrical impulses in your brain. Once this process is understood as purely physical, and it can be replicated under the right conditions, then the simple rules of chemistry could allow scientists to replicate the process and program memories. Even though it sounds like science fiction, it is still theoretically possible. So humans are, in theory, programmed machines. (Sure does take all the fun out of life.)

    In the reality of computer science today, computers cannot become self-aware on algorithms alone. If nobody understood what a wall was, in a world containing only computers running algorithms, no computer would ever be able to determine what a wall is. Algorithms are nothing but complex patterns, with no room for adding or removing pieces without disrupting the loop.

    Overall, it will take a near-absolute understanding of how our brains work to fully understand what self-awareness is at the chemical level, before scientists can replicate it. I'm not even sure our current methods of building computers could allow for that kind of ability. Maybe the future holds increased gains in bio-mechanical science.

    Don't kill me too much in the comments, I was just trying to rationalize on both sides. Also, forgive me if I stumbled on some science, I'm not trained in the hard sciences.

  10. #170
    Quote Originally Posted by reckoner04 View Post
    EDIT: I think Hamster's point is that if something is told to be self-aware, it isn't self-aware. It would have to become self-aware by itself. Which I don't agree with.
    Yeah, and we humans are also "told" to be self-aware at a certain time in our age, so his point is kind of fallacious to begin with.
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance

  11. #171
    Your neighbor is probably an android sent to spy on you

  12. #172
    Herald of the Titans Mechazod's Avatar
    10+ Year Old Account
    Join Date
    Mar 2011
    Location
    Dimension 324325
    Posts
    2,506
    There was a very well made documentary that talked about machines becoming self-aware called Maximum Overdrive. Really cleared up a lot of questions and misconceptions I had on the subject.

  13. #173
    Quote Originally Posted by Mechazod View Post
    There was a very well made documentary that talked about machines becoming self-aware called Maximum Overdrive. Really cleared up a lot of questions and misconceptions I had on the subject.
    Ah yes, I saw that one, very compelling stuff.

  14. #174
    Void Lord Felya's Avatar
    10+ Year Old Account
    Join Date
    Jun 2010
    Location
    the other
    Posts
    58,334
    Quote Originally Posted by reckoner04 View Post
    If we make a perfect clone of a random self-aware human, is that clone self-aware? Is it artificial/a machine? Is it intelligent? Does it only do what we instructed/programmed/built it to do? My answer would be yes to all of those.
    A clone within our capacity to clone would be no different from the original as far as cognitive ability goes. Unless you are taking a sci-fi leap, a cloned sheep has to go through the same development process as a regular sheep. We can't create matter, so we can't have full-grown clones, unless we figure out how to make full-size meat puppets.

    Quote Originally Posted by reckoner04 View Post
    Humans are also programmed (i.e. have algorithms) to feel, learn and gather experiences in a specific way. The only difference from an AI would be that we were originally created by chance (without intention, unless you believe in Intelligent Design) and in order to make a new human we don't have to understand it, we just fuck.
    What algorithm makes you learn? Humans have nearly the same programming, but arrive at unique conclusions. Two people will give two unique observations of themselves, while programming depends on having consistent results. For a robot to have the same self-awareness as humans, the programming would have to include an ability to be wrong without realizing it, yet without losing function post-development. To be self-aware like a human, the robot would have to have a skewed view of itself. You would have to have two robots be aware of their nearly identical composition, but have two entirely different views of themselves. In essence, the robot would have to be designed flawed, and the flaw would have to be different in each robot.

    ---------- Post added 2013-02-24 at 04:33 PM ----------

    Quote Originally Posted by Dezerte View Post
    Yeah, and we humans are also "told" to be self-aware at a certain time in our age, so his point is kind of fallacious to begin with.
    Cognitive thought develops in your early teens. No one tells you to be self-aware; your brain development does that. The result of that development is both biological and environmental.
    Folly and fakery have always been with us... but it has never before been as dangerous as it is now, never in history have we been able to afford it less. - Isaac Asimov
    Every damn thing you do in this life, you pay for. - Edith Piaf
    The party told you to reject the evidence of your eyes and ears. It was their final, most essential command. - Orwell
    No amount of belief makes something a fact. - James Randi

  15. #175
    Quote Originally Posted by Felya420 View Post
    Cognitive thought develops in your early teens. No one tells you to be self aware, but your brain development. The result of the development is both biological and environmental.
    Hence why I said "told".
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance

  16. #176
    Banned GennGreymane's Avatar
    10+ Year Old Account
    Join Date
    Apr 2010
    Location
    Wokeville mah dood
    Posts
    45,475
    if the programming is advanced enough

  17. #177
    Void Lord Felya's Avatar
    10+ Year Old Account
    Join Date
    Jun 2010
    Location
    the other
    Posts
    58,334
    Quote Originally Posted by Dezerte View Post
    Hence why I said "told".
    But it's not immediate or simple like that. The results are affected by both your biology and your environment. It's developed, not turned on.
    Folly and fakery have always been with us... but it has never before been as dangerous as it is now, never in history have we been able to afford it less. - Isaac Asimov
    Every damn thing you do in this life, you pay for. - Edith Piaf
    The party told you to reject the evidence of your eyes and ears. It was their final, most essential command. - Orwell
    No amount of belief makes something a fact. - James Randi

  18. #178
    Quote Originally Posted by Felya420 View Post
    Unless you are taking a sci-fi leap
    Yes, I am, sorry if that wasn't clear. I was simply trying to explain a concept, i.e. that of making a self-aware machine (which would be the same as that "perfect cloning", except that the blueprint and the materials used would be something different).

    Quote Originally Posted by Felya420 View Post
    What algorithm makes you learn?
    Our brain would be that algorithm. We receive inputs via our senses, and then something happens inside our brain according to natural laws. That "something happens" is the algorithm (sorry if that sounds dumb; I don't have the English skills or the patience to come up with a better formulation). This algorithm is also constantly evolving and growing. I'm not sure, though, whether that still fits the common definition of "algorithm" and "programming"; better terms might be "function" and "configuration". I don't particularly care about the semantics, though.

    Quote Originally Posted by Felya420 View Post
    Humans have the nearly same programming, but result in unique conclusions.
    They don't result in unique conclusions as far as I'm concerned. Two different humans (i.e. differently programmed, though similar) probably have two different reactions. But if you make two perfect clones (as detailed earlier) and put them in two identical rooms, they will show exactly the same behavior.
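
    The determinism claim above is easy to illustrate in software, where "perfect cloning" is actually possible: two copies of the same program with the same initial state and the same inputs behave identically. (The seed here stands in for the clone's complete initial state; this is an analogy, not a proof about brains.)

```python
# Two "clones": identical code, identical initial state (the seed),
# identical environment (the same sequence of choices offered).
import random

def clone_behavior(seed, steps=10):
    rng = random.Random(seed)   # the clone's full initial state
    return [rng.choice(["left", "right", "wait"]) for _ in range(steps)]

a = clone_behavior(42)   # clone 1 in room 1
b = clone_behavior(42)   # clone 2 in an identical room
```

    Even "random" behavior repeats exactly when the starting state and inputs match, so `a` and `b` are identical step for step.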

    Quote Originally Posted by Felya420 View Post
    You would have to have two robots be aware of their nearly identical composition, but have two entirely different views of them selfs. In essence, the robot would have to be designed flawed and the flaw must be different between each robot.
    You don't have to design them "flawed" (or "randomly different from but similar to each other") because they will have different life experiences (unless you do it like in the paragraph above, which doesn't make them different from us self-aware humans).

    I hope I made sense.

  19. #179
    Quote Originally Posted by reckoner04 View Post

    They don't result in unique conclusions as far as I'm concerned. Two different humans (i.e. differently programmed, though similar) have probably two different reactions. But if you make two perfect clones (as detailed earlier) and put them in two identical rooms, they will show exactly the same behavior.
    Until this is proven, I will retain my right to disagree.

  20. #180
    Quote Originally Posted by Felya420 View Post
    But, it's not immediate or simple like that. The results are both effected by your biology and environment. It's developed, not turned on.
    That remark was in response to the person I responded to, but I agree with you otherwise.
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance
