Can someone explain to me what happened politically to the US?
How did the US body politic come to produce this paranoid, ideologically driven, seriously right-wing administration, full of hate for the other?
To explain where I'm coming from: I'm a dual national (US/UK) who served in the USAF, travelled around a bit, went to Temple University, and left just after Clinton was elected. The whole BJ impeachment thing had my jaw permanently dropped. I've read Franken and Moore, watched a bit of Fox on cable here (uughh), and it still seems unreal. What happened, and how?
I ask this because my US friends are apolitical, and my dad thinks Bush is a strong leader, so I don't talk politics with him.
I've visited Free Republic, LGF, and other conservative sites, and it's quite amazing. The UK sucks in many ways, but we don't seem to have quite the level of Christian fundamentalism that appears to dominate and stifle discussion over there.
I would really appreciate some input from the other side of the pond, and also your views on whether a change of administration would make much difference to the level of unreasoning hatred that's out there.
And yes, I know, we have Blair.