It's that time of year again and I'm craving sand and sunshine. Usually I just look wistfully at airfares and fantasize about the beach and drinking out of a coconut, but I made a promise to myself to start traveling more, so this time I'm actually looking to book something. Anyone have good recommendations for tropical (or at least warm and sunny) destinations? Hawaii is on the short list.