Originally Posted by Themius
There is something that really bugs me about the way the USA is viewed by many foreigners, and by some people in our own country as well.
America gets this negative attitude when it comes to how we behave in the world. We are called the world police and criticized for it; however, where exactly would the world be without America? American hegemony is very likely what has led to our current era of fewer wars and little major global conflict (compared to the past). The world isn't without its issues, but if we weren't so involved, where exactly would the rest of the world be?
We are the world's greatest deterrent, being that we are the world's wealthiest nation, its most powerful, and its only superpower.
When the issues in Libya began, we didn't act immediately; we were urged to by Europe, where many of the same countries had previously complained about our "world police" behaviour.
Frankly, America is the healer: extremely important, underappreciated when doing its job well, and vilified when things go wrong.