Perhaps it is just me, but I've been noticing a subtle change in neo-con Republicans.
Their classic stance has always been simple: veil the agenda behind a "compassionate" exterior. Lately, however, that veil has been slipping.
Most people who read this site would assume that if the neo-cons and their agendas were exposed for what they actually are, most Americans would reject them, and that would be the end of neo-cons holding public office.
But what if that isn't the case? What if the neo-cons came right out and said what they really think? Would Americans still go along with them? Several items from the past few months come to mind: the prisoner torture scandal, Dick Cheney's "Fuck off," the erosion of civil liberties.
The neo-cons have been taking off the gloves lately and exposing much more than usual, openly insisting that all the ghastly things they believe in are OK. And many Americans are nodding their heads right along with them.
Is this what we've become? Is America a nation of racist idiots? These last few years have really made me question what America is. I'm now actually embarrassed to live here.