Ever wonder what the limits of religion are in a free society? Now I am not talking about faith in a God, or an orthodoxy of long-established worship. I am talking about the things we "say" are our religion or our interpretation of religion. You see, unlike some other developed countries in the world, we do not recognize "official" religious sanction. That is, we in the U.S. do not say, "yes, the doctrine laid out by Methodists or Baptists is a 'real' set of beliefs held on long-established principles." This is typically read by most Americans as "establishing" religions. So in a real sense, any person can establish any set of principles and claim it is a religion. And what is more, the U.S. has created a marketplace of such interpretations and beliefs to suit nearly any need. Now I am not advocating that we sanction religions. But this situation does lead us to an interesting impasse...
I saw, a day or so ago, an advertisement for an "All White" ministers meeting to be held somewhere in the South. With reporting what it is today, I don't know if this is true or not, but it really doesn't matter much. The fact is that it could be true. In a society that has become as polarized and mean-spirited as ours, there are plenty out there who will fight for their "freedom" to believe what they want. The ploy is to use the rather obvious tactic of claiming this is part of their religious freedom and that no one has the right to suppress it. We have seen this in play plenty over the past year... Contraception, a concept that very few Protestants had a problem with previously, now infringes on "the church's right" to conduct their beliefs as they see fit. Even if that means being intolerant of the different beliefs of others. They want the right to deny others the benefits. More specifically, your "religion" could be to make the U.S. a religiously based legal and political system. One in which tolerance of gays could mean the downfall of society due to the impending wrath of God on the tolerators. So the whole approach seems to be to wrap everything you hate up in a nice religious package and claim that God somehow demands this of you. Making it the "freedom" to deny equal rights to those "other" than yourself, or a religious freedom to demand certain behaviors of others, is very much like some religions, I admit - like the Taliban, for instance. But here in the U.S. we have a peculiar twist on this idea. We allow ANYONE who has an axe to grind to do so behind the veil of religious "freedom." We take the phrase "you're entitled to your beliefs" to a whole new level because of its incredible breadth of intolerance. The KKK a religious organization? Of course! What about other white supremacists? Why not?
If you don't like something - well then, make it your religious interpretation of some religious text, and there you go - "secular" society can't stop you.
Of course, we have been doing this for some time. Scientology - a "religion" completely fabricated on some guy's boat a few dozen years ago. Vampirism - the new "fad religion" directed by Hollywood. And of course, you can always find a few who are willing to follow you - should you decide to hang out a shingle as the latest shaman. But the surprising thing in U.S. religious life (to me anyway) is the role played by branding and marketing. The big power players now cast themselves as Christian. We all know that the average American wants or needs their political leaders to be Christian. So that's what we call them. Even if the belief system being touted has no similarity to Christian doctrine whatsoever. Now, of course, there was a time when the church would have branded this as heresy. But not today. The church has seemingly learned an important lesson over the past years - that is, give the public what it wants and those collection plates will be filled. Maybe this started with the Bakkers (Jim and Tammy) or perhaps the Moral Majority, who knows? But for sure, it is the lesson of the day. Instead of leading the masses to the loving and caring nature of God, rather let their pocketbooks lead you. It is a bit like Christ telling the rich young ruler, "well, it depends on how much you are going to put into the collection as to what I want you to do." This seems to be especially true of dominionism. Sure, some of the words sound like Christian doctrine, but the interpretations are sure different.
Moreover, the game has now turned to hiding some of the more offensive aspects - the heretical beliefs of these systems - from all but the high "priests" of these particular "religions." And once established, they all seem to be turned to political outcomes. As an example, the very notion that our society - in the form of a government - might try to enact anything that helps to alleviate the ills of society, such as poverty or lack of access to health care, has now been cast in terms of religious freedom. There have been many posts with regard to why this might be. But it is still a bit baffling to me. After all, the outcome claimed by these "Christian branded religions" is to meet the aims of a God-centered society. But isn't this what we are all shooting for in a way? Equity, do unto others, peace, freedom for all, not just a few? Naturally, I am asking this rhetorically, because it would seem as though this is NOT what such entities are trying to accomplish. Even the Catholic Church was willing to give up its "social justice" rhetoric for the few pieces of silver it would receive in doing so.
Perhaps not so surprisingly, they are really after what most power worshipers are after - the right to dominate others. The corruption of money and power has come to roost on those who would call themselves Christian - but I maintain it has done so at an awful cost. In giving up their "call to God," they have also given up the morality that helped them to lead. In doing so, many are turning to the likes of Scientology, etc. When a belief system becomes nothing more than attacking what you don't like and denying your own responsibility for the state of the world, well then, what should you expect? There isn't much need for churches except to project the aims of your own greed through politics. This is, in my opinion, a sad end to a noble calling. But as with so many sad ends, it has been brought on by the hypercompetitive race for money and power that American society has become. It has been dished out to us since Reagan that our lot in the world is to rule all, and we have loved the message. With this has come our right to hate, our right to deny justice, our right to seek only the messages we wish to hear.
Of course, I have painted with a broad brush here. There are those who have received a message of benevolence and tolerance. But I maintain that their beliefs are being purged and their voices drowned out by the priests of the new order. Even Mother Teresa has come under fire from the new religion. After all, handing out food to the poor just made them a "dependent society" and was an affront to God, right? And far too many believers stood back and let this happen. Maybe they were unsure? Maybe they didn't want to make waves? But in some way, those who believe allowed the Christian moniker to again become that axe that we love to grind. In a real way, we have become the Satan we all feared.
I have no answers, of course. But I was just wondering if anyone else out there feels as though religion has fallen a little short lately. Are we really going to be OK with being told that voting for Obama is against God's will? Is it really OK to let the pulpits that bear the title of Christian preach against the poor? Is seeking to provide health care to all really an act of Satan? And is it OK to justify the means with lies and misleading hyperbole just to achieve the end result of a government for the wealthy? This really isn't a question of politicians; it is a question of belief. For those of us who are believers, do we really think that God will not hold us accountable for the deception that is today's religion?