I'm 28 years old, and I can remember during my childhood when having pride in your country was something American society encouraged. Now anytime I suggest that America is the best country on the planet, I have a ton of liberals telling me that I must be uneducated, ignorant, and stupid. When did this change occur? It must have been within the last ten years or so, because I remember national pride still being pretty big right after 9/11 -- and that wasn't too long ago.