How do people still think America is a good country to live in?
I dislike America. It is the center of idiotic, shallow, naive, fat morons who can't even tell you where they live on a map, but know all of the Kardashian sisters' names.
It is led by idiots who can't fix their own damn economy. And yet there are people who still think America is a good country. Why?