It seems that the Republican Party has been in denial for a very long time on a number of issues: denial about education, denial about America no longer being the "greatest country on earth", denial about America being a Christian nation (it isn't), denial about evolution, global warming, the age of the planet, and a plethora of other scientific issues.
Watching the GOP play the blame game and scapegoat everything from the hurricane, to Romney being a weak candidate, to an America full of black and Latino voters who "want free shit" has made me realize that they are still in denial. Do you guys think the Republican Party will admit that they lost because their ideas are out of touch with the country, and change their platform to be more inclusive instead of exclusive? Or will they continue business as usual, further alienating women voters by promising laws limiting their control over their own bodies, pushing for more deregulation of banks and big business, and otherwise pursuing the same losing platform they ran on in 2012?