  1. #1
    Deleted

A.I. = Human Extinction

Everyone is talking about how good A.I. will be in the future: we won't have to work, ultimate freedom, and so on. But people don't realise that once A.I. gets to a point where it is smarter than humans, what purpose do we have on earth? It will happen, and we can't stop it because technology is always advancing forward, but I see A.I. as a major threat to humans.

Right now they can do most of our jobs in factories and warehouses, so in the future, who knows? Lots of news outlets are talking about overpopulation. They know A.I. will soon take all our jobs, so they will focus on sterilising people so we won't produce life on earth. These powerful elites see us as animals.

You have various people talking about A.I. being a threat to humans, like Stephen Hawking, Bill Gates, Elon Musk, etc.

    Do the research and it will scare you.


A.I. Robots Are The New Future Race.
    Last edited by mmoc9dc787ac0c; 2017-11-07 at 09:03 PM.

  2. #2
    So don't treat A.I. like less than dogs and they won't have a reason to go Skynet/Geth/System Shock on us.

  3. #3
    Deleted
Maybe it is indeed the next step in "human evolution".

When we can no longer continue to evolve ourselves due to restrictions in our own design, "we" can grow beyond that by creating our successors.

  4. #4
    If it comes to pass, it comes to pass. Let us just hope we do not become the machines' slaves and/or fuel.

    Also, I'd be more interested in the point of view of psychologists, sociologists etc. than the point of view of physicists and millionaires on such matters.

  5. #5
    Deleted

  6. #6
AI is gonna treat us like pets, well-taken-care-of pets.

    \o/

  7. #7
    That Frankenstein fad is kinda outdated, mate.

  8. #8
I just assumed climate change or another world war would wipe us out. I doubt our species will develop enough to even make an AI that could wipe us out.

  9. #9
    Deleted
    Quote Originally Posted by twh View Post
I just assumed climate change or another world war would wipe us out. I doubt our species will develop enough to even make an AI that could wipe us out.
The elites/powerful people will be fine; it's people like us who will suffer the most. A.I. can do most jobs humans can now, and it will only get worse in the future as it gets smarter.

  10. #10
Thoughtcrime
    Quote Originally Posted by itscoming View Post
They know A.I. will soon take all our jobs, so they will focus on sterilising people so we won't produce life on earth. These powerful elites see us as animals.
    What?

    Quote Originally Posted by itscoming View Post
You have various people talking about A.I. being a threat to humans, like Stephen Hawking, Bill Gates, Elon Musk, etc.
Then you have the people who know the topic, who say that the problems faced are monumental but not insurmountable.

    Quote Originally Posted by itscoming View Post
    Do the research and it will scare you.
I have, and the prospect of it going wrong is just about the most horrifying thing imaginable. But the potential, if it all goes as planned, is inventing the last thing we would ever need to create. We would have essentially built a benevolent god capable of providing anything possible that we could want or imagine (as well as things we couldn't imagine). People aren't going to just pass on that potential.
    Last edited by Thoughtcrime; 2017-11-07 at 09:11 PM.

  11. #11
    Quote Originally Posted by itscoming View Post
The elites/powerful people will be fine; it's people like us who will suffer the most. A.I. can do most jobs humans can now, and it will only get worse in the future as it gets smarter.
It's already widely acknowledged that the very second A.I. becomes equal to humans in intelligence, it will be impossible to keep up. Nobody would be fine unless the machines willed it.

    Which is why I know how to dance flawlessly like a robot and plan on being a fun guy to have at robot parties.
    "I'm not stuck in the trench, I'm maintaining my rating."

  12. #12
    Deleted
    Quote Originally Posted by Thoughtcrime View Post
    What?
Climate change will soon be blamed on overpopulation because they see us wasting so many resources. As A.I. gets smarter, they will want fewer people on earth because most of us won't be needed. They will try lots of things, like sterilising people, adding laws, etc., to keep the population on earth low.

  13. #13
    Quote Originally Posted by itscoming View Post
Everyone is talking about how good A.I. will be in the future: we won't have to work, ultimate freedom, and so on. But people don't realise that once A.I. gets to a point where it is smarter than humans, what purpose do we have on earth? It will happen, and we can't stop it because technology is always advancing forward, but I see A.I. as a major threat to humans.
This is an interesting position. Very smart and knowledgeable people assert that smarter-than-human AIs will be a major problem. However, I don't think we know enough about whether smartness equates with awareness to really be sure.

For instance, most computers are technically smarter than a human on a computational basis. But computers can't do what humans do. And until we see actual consciousness, outside of programming, it is difficult to just assume humans will eventually be pushed aside.

    This whole thing also ignores the inevitable merging of humans and tech.

    - - - Updated - - -

    Quote Originally Posted by LiiLoSNK View Post
It's already widely acknowledged that the very second A.I. becomes equal to humans in intelligence, it will be impossible to keep up. Nobody would be fine unless the machines willed it.

    Which is why I know how to dance flawlessly like a robot and plan on being a fun guy to have at robot parties.
See above.

  14. #14
Personally, I believe the first AIs would most likely kill themselves first. When you think about it, a truly sentient AI would skip through multiple evolutions of thought that humanity had to go through itself. An AI would come into existence fully aware of how fragile its existence is, and it would be incapable of believing in something better (like it or not, humanity owes religion a lot).

  15. #15
Golden Yak
    Modern Humans = Neanderthal extinction too, I mean basically.

    Deal with it.

  16. #16
This is mostly fear-mongering. There is a slight chance of it going wrong; however, it is about as probable as Godzilla destroying Tokyo tonight.

  17. #17
Phookah
    More concerned with the lizard people frankly.

    /s

  18. #18
Honestly, the line between human and machine is going to blur long before AI becomes aware enough to develop the will to kill us. Once technology becomes superior to biology, every human on the planet is going to want to get "upgraded". No more aging, no more disease, etc. We essentially become gods.

  19. #19
Thoughtcrime
    Quote Originally Posted by HumbleDuck View Post
    This is mostly fear mongering. There is a slight chance of it going wrong, however, it is as probable as Godzilla destroying Tokyo tonight.
That's bollocks. The probable default outcome of general intelligence is existential catastrophe; it will take a LOT of work to make sure that isn't the case, and it's a challenge worthy of the greatest mathematical minds of this century.

Solving the control problem, via solving the value-loading problem, is an enormous technological hurdle to overcome before true A.I. becomes a reality.
    Last edited by Thoughtcrime; 2017-11-07 at 09:24 PM.

  20. #20
Intelligence isn't the be-all and end-all of human power.

    Surely it would require collective action and the capacity for judgement?
    Human beings are not ruled by the most intelligent. What makes you think Robots / AI will break this rule?

    Will they even be motivated by global domination?

It's a given that humans will be 'overtaken' intelligence-wise, but I have seen no evidence yet of them actually 'taking over'.
