Everyone seems to despise Florida, and always says, "Well, it's Florida, what do you expect?" And with good reason: a large share of the ridiculous news stories I see come out of Florida. Is there something in the water there, or is it a cultural/socioeconomic thing? Or is it just the laws? What makes Florida the epicenter of absurd news stories?