When is this going to end? Everything you say is racist, everything you do is sexist and everyone is a feminist all of a sudden. It wasn't always like this.
Do you think this era of "consider everyone's feelings!11" will pass, or is it just a passing trend? I'm damn sick of it. They're turning every movie franchise I love into politics. Ghostbusters? All-female cast? Are you kidding me? As if that's anywhere near discreet. And Thor is, or soon will be, a woman in the comics now.
Some people are even arguing that Doctor Who should be black or a woman just for the sake of it. For feminism or LGBT rights. What the goddamn ****? Just drop it. If you want a black female superhero, create one! If you want a buff gay policeman in a Die Hard spin-off, write the script.
But don't go calling him John McClane or anything like it. Why is everything and everyone all of a sudden so uptight about absolutely everything? I'm from Sweden, and the politics around here... jeez. You mention immigrants once and you're branded a racist.
/endrant