Asimov's three laws of robotics, written to protect us:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
And then there are drones. Combat drones, built for the exact purpose of harming others. And we are far from stopping there: we want robots to eventually do our fighting for us, and we are already creating swarms of small robots that can be used for exactly that. These are just the first generations.
There's also the other way robotics will harm us. There's a political idea floating around that we have to compete with countries like China on product cost, and the only real way to do that is to automate a lot more of our industry. Some politicians sell us the tale that "those jobs will simply become maintenance jobs." First off, I don't believe everyone in that country who works a simple production job has the capabilities or insight to do maintenance. Secondly, a single robot will replace several people, not just one, and multiple robots will be maintained by a single human. So the basic math around this doesn't add up. There's actually a term for this ideology; it sadly escapes me.
So my opinion is that Asimov's laws are at this point nothing more than fiction: they have long been surpassed, and all future intent seems to ignore them anyway. What's your opinion?