In the midst of make-over shows, reality TV (what a joke), and an avalanche of advertising, the termites are eating away the foundation of America. Now there's a sentiment most Americans would agree with, but apportioning blame is where the difficulty lies. In much of America the blame is laid at the feet of liberals and free-thinkers (gay lovers, bleeding hearts, professors, pornographers, immigrants, immoral weaklings, etc.), not to mention those possessed by Satan and the minions of Hell. On the other hand, there are those who say the small-minded, bigoted, racist, corporation-driven, non-reality-based Right is at fault. Is either correct?
I happen to be in the latter camp, but does any of it really matter? Inevitably, regardless of what prescription is invoked as an answer, in the end it will all be useless chatter. America, in the form we've known it, is going, going, almost gone. One good look at history provides an obvious clue. The Roman Empire is gone, the Aztecs are gone, the British Empire is gone, the Babylonians are gone, the Assyrians are gone, the Persian Empire is gone, Carthage is gone, Sparta is gone, the Incan Empire is gone, the Zulu Nation is gone, Athens is gone, the Confederate States of America is gone, the Holy Roman Empire is gone, and so on, ad infinitum.
There's a good argument that something slightly different was at the root of each failure, but a good argument is just a moment's diversion from the fact that they are all GONE.