I always read about how the West is the best thing ever because it has "better values", "better traditions", etc.
It usually comes from a very specific type of poster, but that's not really relevant.
First of all, western countries are far from all having the same laws regarding personal liberties and democracy:
- Portugal decriminalized all drugs, the Netherlands tolerates pot, and the USA and France criminalize every single one.
- Abortion in the West is viewed as a fundamental right by some and as murder by others.
- Germany and France have harsh penalties for anyone denying the Holocaust or dressing as a Nazi; the US views it as an extension of free speech.
I could go on forever with that list, and there is more to "the West" than just Europe and the US.
The point is, I don't think there is such a thing as western values, because broad concepts like democracy have very different interpretations depending on who you ask. There are even a few monarchies left in Europe, and they are arguably more democratic than some republics.
What some people call western values are, to me, the consequences of a warless, stable and thriving environment: people just naturally choose to be nicer to each other when they are not operating under stressful constraints like economic pressure.