I took a break from this show for a long time after S3 when it stopped being about zombies.
Does the show still not give a shit about zombies? Last time I checked it was called 'The Walking Dead', but last time I watched, people were treating zombies like a baby that needed its diaper changed.
*Folks walking along*
*Spot a zombie*
*They all look at each other and sigh*
*One of them pulls out a knife*
"I'll do it, I guess"
*Walks over to the zombie casually, grabs it by the collar and just knifes it in the head*
Extremely immersion-breaking. Zombies do not feel like a threat at all.
ARE ZOMBIES STILL THE LEAST THREATENING THING IN TWD?