Given everything America is doing that the rest of the world disagrees with, we have seen other nations increasingly turn their backs on it.
Is there a breaking point where the majority of nations start refusing to do business with America? Has it already started to happen?
It seems like the country is becoming more and more isolated, content to fight its own internal battles while the rest of the world moves forward.