I just rewatched Ex Machina last night and spent some time reading some saved articles on AI. It just seems to me that it's totally inevitable AI will assume power over humanity. Our weaknesses are so easy to exploit. Hell, most of us wouldn't survive a simple power outage.
That doesn't necessarily mean they are "evil". But if a smart enough machine can sense its own demise, and we assume anything that becomes conscious wants to ensure its own survival, wouldn't it just make complete sense that it would exploit our many, many weaknesses to overtake us?
And this isn't really about whether we "should" do it - as in create the AI. This is going to happen regardless. It's more of a discussion about what you think it will mean for humanity when it does.