As an American, I find it curious why Europe is so liberal. Why is that part of the world so left-leaning in its thinking? Now, I'm not saying that is necessarily a bad thing. Whatever your own way of thinking, be it left or right, what do you think makes the countries of Europe so left wing?
One would think that after all the wars over there, they would appreciate liberty, freedom, and core conservative values. Personally, I suspect it's because the moral fiber of communities in Europe has started to decline, or at least shift. Also, the birth rate in some countries over there is less than 2, which means their populations are declining rather than growing. Would this mean people are becoming more self-centered and only want to live for themselves (since they are choosing not to have children)?
I hope this can be a civil discussion and that the posts here stay on topic.