After watching the State of the Union and all the responses, I find that I am sick and tired of hearing about 'governing'. Every news show, every soundbite in the media talks about 'We were elected, and now we just have to govern,' or some such nonsense. When exactly did this change occur in all the capitals around this nation, such that our representatives now think of themselves as our governors?
Yes, back during the Constitutional Convention one can read of some using the term--usually the Federalists like Hamilton, I have to point out--but not nearly to the extent that one hears it today. When I hear the term 'govern', I think of kings, dictators, despots, commissars, czars (is there something to this, given our current penchant for dubbing un-elected bureaucrats with this title?), and totalitarians in general. Not my congressman or senator--or my president. They may be leaders, but they do not govern me. Do they govern you? I think a large number of Americans would grudgingly say 'yes'.
When did we accept the idea that our representatives govern us? I think that point in time would be the point we lost the true concept of being Americans. In that America, the people are defined as the repository of all power, and we delegated some of it--through our various constitutions--to government. We elect representatives to ensure that our power is exercised in accordance with our wishes, not in a way that violates the limits we placed on that power or infringes on another American's fundamental rights. Our representatives are elected to ensure that government--the State--doesn't step beyond its delegated authority and doesn't use even those few delegated powers to infringe on another person's fundamental rights.
Does that sound like they are governing the American people? Not to me. If they govern anything at all, our representatives govern only the State. In this sense, they are our watchdogs, intended to keep the government in line and keep our individual rights secure from both government and other individuals--not to govern us.
So when did we become the 'governed'? I think that is the point at which we ceased being actual Americans and became the shadow that we are today. From this perspective, to 'get it back', as so many have called for--the Tea Party, constitutionalists, conservatives--we first need to get back the idea that we govern ourselves. We need to teach our elected officials that they don't govern us; they represent us. Their primary mission is to protect us from the State and its un-elected bureaucrats.
This is a huge difference, and we all must relearn it. In America, we govern ourselves.