Well, it's “election season” once again – but really, is it ever not election season? Haven't we gotten to the point where every initiative, every policy, every program, every bill, every “task force”, every “investigation”, and so on, is aimed at the next election, whether for the presidency or for Congress? The American interest, i.e. the well-being of its citizens, has long since ceased to be a consideration, never mind “realism” (especially when applied to foreign policy) and “sustainability” (a favorite buzz word of the environmentalist faction). Even the environment, AKA “the planet”, takes, at best, second place when forced to compete with more pressing shorter-term considerations of power and money (environmental issues being, at least in theory, aimed beyond the lifetimes of the present generation, whereas power and money are all about “ME”, and succeeding generations can eat you-know-what). The planning horizon of our elected officials extends no further than the next election, and once that is accomplished the planning horizon instantly shifts to the election after that. In other words, the next campaign begins the day after the swearing-in ceremony. The long-term welfare of the citizenry, whether economic or in terms of health (which is, or should be, a subset of “the environment”, which is, in turn, a subset of “the planet”) never enters the minds of our so-called “leaders” from one year to the next, except as talking points – and yet they continue to campaign, and continue, incredibly, to be re-elected to an overwhelming extent. So, the question is – or should be – what sustains this clearly maladaptive, and some would even say suicidal, system, where – if we still have any faith in the electoral system – people persist in voting against their own best interests, and thus reinforce policies that have failed time and time again – sometimes spectacularly?
The old chestnut about doctors is that they “bury their mistakes”, so their mistakes don't get to comment on their competence, or lack thereof, on the Internet. Politicians, on the other hand, are like unto miracle workers. Not only do they not bury their mistakes, they parade them out in public, and in speeches, as successes – and a few people out there dare to speak up and argue, but the vast majority remain silent, whether out of despair or utter apathy. (But they still vote – which seems to mean that many people shuffle into the voting booth in a state of despair or apathy. So then why do they even bother? Read on... )
Well, number one, “voting” has always been styled as the premier privilege of the citizens of a democracy – the one essential thing, the sine qua non. Of course, as we all know, or should know, most of the dictatorships of the 20th Century, and those that remain, had, and have, voting – after all, they were/are “people's republics”, right? The fact that their elections were a total sham was pointed out and ridiculed by the more enlightened, i.e. the American media and politicians – all while ignoring the chicanery that contaminated many of our own elections, both on the national and local level. If democracy in the “people's republics” was an illusion, then our own democracy was, at the very least, flawed, and in some cases corrupted to an extent that rendered it meaningless and absurd in the “democratic” sense. And yet, it seemed that a flawed democracy, with an electoral system that was capable of being severely compromised, was better than nothing. In other words, the illusion of democracy was preferable to the stark realization that, in many cases, it was nothing but a sham and a cruel hoax.
But we still have to ask why, over time, ever since the earliest years of the Republic, the actions of elected officials have tended to contradict the intent of the people who voted them into office. Seek no further than the temptations of power, and the felt need, once power is achieved, to hold onto it at all costs, so the interests of the ordinary citizen are soon trumped by the interests of the rich and powerful, who take over control of elected officials the minute the ballots are counted. And there is nothing the least bit new or unique about this; it goes right back to our fallen human nature, and politicians are, after all, human – all too human much of the time. They seek power, presumably, in order to “do good”, but in no more than the twinkling of an eye their Job One becomes holding onto the power they've been given by a trusting (say gullible, if you like) public.
And is this tendency worse in a “democracy” than it would be in a monarchy, or dictatorship? Well, no, because human nature is what it is – at all times and in all places. And after all, elected officials can, at least in theory, be voted out of office, although it's amazing how seldom this actually occurs (impeachment being even rarer, and successful impeachment being the rarest of all). So they must be doing something right. Right? Or are we totally missing the point when it comes to our vaunted “democratic” system? If people supposedly vote, time and time again, “for their pocketbook”, or for some other concrete reason, and are invariably disappointed, and yet their voting behavior doesn't change... what is actually going on?
So, can we all agree that there's a disconnect here? We have “voters”, and “candidates” who turn into “politicians” but then are compelled to remain candidates until they decide to retire (or are forced to by some means). But as an arrangement, it very seldom pays off, at least for the hapless voter, i.e. the “average Joe”, “citizen”, whatever. It mostly pays off for the people behind the scenes who choose the candidates, support them, and then once they are in office call in their chits. So the notion of elected officials being “the people's choice” is a hoax in many if not most cases. Was it the “people's choice” when these candidates came out of nowhere and somehow ended up on the ballot? No, of course not – they were the product of a very selective and precise vetting process, the main criterion being: Will this person, once elected, remember who his “friends” are, and act accordingly? And the selection process has been developed and fine-tuned to dependably yield results that favor the people who are really in charge. And if an occasional “mistake” occurs – if an elected official decides to throw off the yoke and be their own man (or woman), well, we have ways of dealing with that sort of nonsense. (On this topic, see a previous post, "Mayor Pete was the Wine that was Sold Before its Time", April 25, 2020.)
And once again, and as always, the hapless citizen is left wondering what the hell happened... why they have been exploited and betrayed once again, and for the umpteenth time. (Won't Charlie Brown ever be allowed to kick that damn football?) And yet their faith in the system, if one can call it that, persists – or maybe it's just habit, or, once again, despair.
Let's try and take a closer look at this almost universal phenomenon. Let us, for starters, think about voting, and the voting booth, and what actually happens when someone is face-to-face with that choice – with trembling hand holding a pen or pencil, or poised over a button on a screen or a voting machine lever. This is the real moment of truth. All of the propaganda, advertisements, speeches, broadcasts... the papers, magazines, TV, Internet, radio... fade into nothingness at this point and are replaced by an empty space, a void... but then what rises up to fill that void? It's the narrative.
Narrative – A way of presenting or understanding a situation or series of events that reflects and promotes a particular point of view or set of values.
Or – a particular way of explaining or understanding events;
Or – a story or account of events, experiences, or the like, whether true or fictitious;
Or – an explanation or interpretation of events in accordance with a particular theory, ideology, or point of view.
Note that these definitions (and there are many more along similar lines) tend to reflect the utter subjectivity of narrative, i.e. that it's not factual in the strict sense, but based to a large extent on pre-existing ideas, prejudices, biases, and premises – not to mention emotional needs, which, in my opinion, are actually the main “drivers”. And those emotional needs are based, in turn, on an array of events and influences that culminated in the mindset of that person, on that day, at that time, in that voting booth. For we are, after all, human – which means complex, and unstable, impulsive, reactive... in short, all that the Founding Fathers believed – naively, perhaps – could be overcome if people were only given the opportunity and the proper motivation. They were, if you will, optimistic about human nature. But has their optimism proven to be utter delusion and folly?
Even the simplest person is capable of having “deep thoughts” at times – and the most intellectual and presumably rational and objective person can be the prey of their emotions, which can cancel out all attempts at “realism”. And I'm not claiming that these “irrational” factors come into play in spite of reason, or of “the facts”; it's really the opposite. It's the rarer occasion when reason, and the facts, come into play in spite of what one might call the more primitive, or juvenile, factors. (In the developmental sense they could be termed “pre-reason” or “pre-logic” or “pre-conceptual”.)
One might say that the notion of the average citizen suddenly turning into an adult when they step into the voting booth is no more than wishful thinking. What is every bit as likely is that they will regress, and revert to an earlier stage of their emotional and intellectual development when they are confronted by that moment of truth.
So what, then, is this all-powerful narrative, and why does it have such an iron grip on us when we are faced with what should be an important decision – not only for the individual but as a basis for democracy itself?
To begin at the beginning – our perceptions of the way the world is (call it our “metaphysics” if you like) start being formed at birth, or even before. Babies are, basically, data-gathering machines. They absorb everything that they see, hear, taste, smell, feel – without prejudice or editing. One might say this is the most objective stage of life, simply because we haven't yet learned to select, or filter, the input. We never think, or say, "That does not compute", because at that stage everything computes, which is as it should be. And so a view of the world (their world, of course) is formed – primarily from sensations as opposed to thoughts “about” sensations. Toddlers have a pretty good idea of the way the world is – their world, at least. But they also start getting notions of how they would like the world to be, and the conflict between the two leads to frustration (which helps explain why the "terrible twos" are the way they are). How is this resolved? The way they would like the world to be has to be put on the shelf, after repeated tries and frustrations – but it never goes away. Our “inner child of the past” (the expression comes from a self-help book from some years back) persists, and builds up -- layer by layer -- wishes, hopes, dreams, and frustrations while we lead a parallel life “in the world”. Thus is formed the root of a narrative – the one with the earliest origins – to which we cling (thank you, Barack!) in some form, even if a vestigial form, for life.
And I would say that the great cry of the human person – the real “primal scream” -- is “It's not fair!”, or some variation thereof. And if this attitude, or premise, retains sufficient power into adulthood, it can influence our political thinking, not to mention our relationships, our choice of vocation, and so on. The urge to make the world “work” – to be the way we would like it to be – is compelling, but also infantile in a way. The missing element is willingness to compromise, and – on a more rarefied level – humility. Humility doesn't insist on “compromise”, or negotiation, or bargaining; it is an attitude of willingness to accept things as they are – at least tentatively -- because, who knows, there may be perfectly good reasons for the way things are that we are not privy to – but at the same time to work to make things better, ideally by starting “at home”, with ourselves. (Wasn't there a Beatles song that said, “You say you'll change the constitution, well, you know, we all want to change your head. You tell me it's the institution, well, you know, you'd better free your mind instead” ?)
Many teachers in many religious traditions have taught the value of humility over the millennia. One of the main barriers to developing humility, however, is that residual infantile “It's not fair!” attitude, and this flows quite seamlessly into one's personal politics, and thus collectively into politics in general. For what is any political idea, cause, or movement other than an attempt to make things “right” -- or "fair" -- or "equitable" – to re-fashion the world in the image that we would like to impose on it? There is nothing less humble, or less satisfied, than a political movement. “Conservatives” – the real kind, i.e. the ones who want to keep things just the way they are (vs. the way they were, or are alleged to have been, 100 or more years ago) -- are fighting battles against “change”, especially change for its own sake (which is the most typical variety). If only people would be satisfied with the way things are now! But what if the way things are now is more like the “continuous revolution” of Chairman Mao? Then it's no longer true conservatism, is it? And perhaps in our time there is no such thing, i.e. conservatism has become self-contradicting.
But it seems that we have skipped a few steps here. Go back to the toddler, who is forming a view of the world based on raw data, and not spending much time thinking about it, or interpreting, or conceptualizing. They are, in a way, living an unexamined life, which is perfectly acceptable at that stage. But then along comes speech, and then concepts and ideas, and (hopefully) some beginning of moral sense – and, perhaps most important of all, imagination, which can be defined as wanting that which doesn't even exist, but which we develop as an image or “wish system” by projecting from what is, i.e. from what we know. And at that point the “ought” starts to overpower the “is” – the toddler's alternative world of “like” and “want” starts to accrete names, and concepts, and connections. And this is also the point at which, thanks to language and our conceptual abilities, one is exposed to other people's world view – again, not only the view of things as they are (the metaphysical) but the view of things as they might be, or ought to be (which ultimately turns into the political). (Isn't politics, after all, the art of persuasion – of convincing people that they ought to think differently, or ought to want what they don't presently have – and of showing them the means by which they can supposedly obtain it (starting with the all-hallowed vote)?)
And who are these other people? Our “influencers”, as the current saying goes? Well, parents, for certain... but also other family members, friends, and neighbors – people we know and converse with. But then we have teachers (whose world view may or may not match that of our parents – increasingly the case in our time, since the public schools have fallen prey to “the long march through the institutions”), and books, television, the Internet, and so on – that vast array of sources of information, ideas, facts, fancies, propaganda, threats, rewards – greatly magnified by the communications media in our time.
How often have I heard or read complaints by “boomers” that things were so much simpler when we were kids, and there were only 3 TV channels (two commercial and one "educational"), and school books, and books from the local library, and magazines from news stands, and the radio, and that was pretty much it? The competition for “hearts and minds” was barely a fraction as intense as it is now, and it can be debated whether this is a net improvement or an invitation to confusion and, ultimately, chaos and despair. Can it be that the human organism was only meant to handle so much information – most of it not sense information in the natural or “primitive” sense, but only boiled-down, digitized, distorted remnants? Not even well-developed concepts, but sound bites? Are we not all victims of overload? And if we are, are we equipped with the discernment and ability to do something about it? My observations would seem to indicate that the answer is no, for most people and in most cases. As with so much in the way of technology, our reach exceeds our grasp. We create instantaneous monsters, if you will, but have no idea as to how to control them, and thus become their victims.
So our narratives are born, and at an early age they bifurcate into the “is” and the “ought” – “reality” vs. “wishful thinking”. But hold on. Were all of our influencers, and sources, over the years purveyors of reality? Consider what they were pushing: their world view, yes – but surely more of a hybrid of the two – the way things are and the way they ought to be. After all, stark reality – “just the facts, ma'am” (thank you, Jack Webb) – can become awfully tedious and boring at times. We are, after all, human beings – always restless, always striving, always curious. We are not dogs and cats, or cattle contentedly chewing their cuds in the field. It would be nice to be that existential – that “here and now” – and many of the hippie gurus, taking a cue from Eastern religions, recommended that state of mind as the best, and the one least likely to lead to frustration. And this is not to say that some time spent meditating – being “here and now” – is a bad thing; it's been highly recommended, again by wise men (and women) of many religious and philosophical orientations. But let's admit that plain contentment is a rare thing, and perhaps should remain so. (Or, to put it another way, it is perfectly suitable for the human race to include both contemplatives and “action” persons. In the proper proportions, they complement each other, and in some ultimate sense they couldn't get along without each other.)
For the rest of us, not satisfied with radical contentment or “here and now-ness” (or the cheap imitation induced by drugs and alcohol), we have to deal with our narratives, and each person's core narrative at any given time is likely to be a combination of reality (facts) and wishful thinking (fantasies). I had a chemistry teacher in high school who would always correct any student who said “I think...” He would say “you don't think, you fancy”. But isn't this it in a nutshell? And to the extent that a person is willing to – calmly and deliberately – pry apart the portions of their narrative that are “the way things are” vs. “the way things ought to be”... but wait! What's to keep “the way things are” from being every bit as fantastic (albeit unconsciously) as “the way things ought to be”? Isn't all of our thinking ultimately subjective and illusory? Is objectivity a myth, and don't we use our reasoning powers to, more often than not, simply shuffle various fantasies around and rationalize the ones we prefer on any given day? Or, isn't there at least a continuum of some sort with the “actual” (things I'm pretty certain of, because they're tangible and observable) on one end and the “ideal” (things I know are not the case, but wouldn't it be nice if they were?) on the other?
And – does it even matter, ultimately? If our narrative, or world view, impacts our lives, our decisions, our relationships, on a daily basis, isn't it quibbling to talk about how “realistic” various pieces of it are? Because it's there. Over time, it's arguably the single largest influencer in our lives, or at least in our thinking about our lives.
It's almost like the old question, how can we live knowing that we will die? And there are many answers to that question, needless to say. But then, on a more personal, here-and-now level, how can we live knowing, or suspecting, that our every thought and action is based on, and conditioned by, something that was imposed on us (or, at least, suggested to us) by someone else – and that their every thought and action was, in turn, based on, and conditioned by... and so on. (Perhaps the first and only truly free-thinking human being was some cave man who accidentally stumbled upon language. He was able to make it all up on his own, ex nihilo.)
But narratives are very seldom completely idiosyncratic. There are common, shared elements based on, first, the family, and then on the gradual widening world that each of us experiences – but there are central tendencies and groupings, otherwise there would be no political parties or causes. We would each be trapped in our own pseudo-factual world like patients in the back ward of a mental hospital (before they were all shut down, that is). So yes, the reference group (the one we don't choose, and later on the one, or ones, we do choose) has its agreed-upon narrative, which they promote, expand upon, and, in some cases, endlessly blather about, from the coffee house to the student union to the faculty lounge to Congress to TV and the Internet. (For a sampling of grass-roots narratives, I can recommend nothing better than that table of male senior citizens that one always finds at any McDonald's on a Saturday morning – or any other day of the week, most likely. They talk on endlessly, but it's clear that they share the same narrative, right down to the most minute detail. There is no real debate, in other words – just embellishment, and a friendly competition as to who feels most strongly about a given issue – or who can talk the loudest between gulps of lousy coffee.)
But this is also not to say that narratives are merely shared world views. That's precisely the point. They are unique in the same way fingerprints are – one set to a person, and no two people have the same set. But there are definite narrative clusters, if you will, the same way people naturally cluster into racial, ethnic, religious, and gender groups. After all, a key element in narrative formation is the culture one is born into; a good bit of it is, as I see it, pre-verbal or extra-verbal. People don't generally think much about their particular “culture” as young children; they just take it for granted along with everything else – religion, social class, customs, traditions, etc. In fact, I doubt if very many people, even upon achieving mature adulthood, think about how different – how radically different at times – their narrative is from others'. It's like the old saying, “Fish discover water last”. If you're immersed in something from the very beginnings of perception, you don't have to “discover” it; you're living it. In a very real sense, it is you and you are it. Whatever else we are, or become, builds on that; it's our foundation or groundwork, if you will.
But wait! – you might say. What about rebellious youth? What about the ones who, with every generation, decide that the old folks don't know anything, and they're going out on a quest for The Truth? Not only that, but how about all of these “influencers” who encourage, aid, and abet rebellion – thinking firstly about peer groups, but also teachers, the media, books, and so on? Doesn't that old narrative so carefully cultivated by Mom and Pop get tossed aside?
This might seem to be the case if we just look at the surface, or the symptoms. What I suspect, however, is that youthful rebellion is just that – youthful. The day comes when our youthful rebels will decide that it's just too much work, or the world isn't getting better despite all of their efforts... or that more immediate, material considerations need to be given more weight – you know, things like earning a living, as dull as that might seem. Notice that the ones who come to this realization first are invariably called “sell-outs” by the more hard-core types, but sooner or later pretty much everyone falls by the wayside in some way, except for the rare types like Bernie Sanders.
And it's not that the narratives they grew up with have survived intact, the way so many ancient Chinese customs survived Mao's Cultural Revolution. They may have gotten updated, refined, polished, rendered more “current”, more acceptable, but the roots survive. They are still their parents' children, in other words. A “rebel for life” is a rare thing – it happens, and there are many examples, but most people simply can't, or won't, entirely throw off that baggage – and in fact, there is no good reason for them to do so. They are behaving more like natural human beings than like any idealized “New Soviet Man” or member of “The Master Race”, who are expected to act in a robotic fashion and have no hidden agendas and, preferably, no individual personality at all.
To put it another way – imagine an ancient tree that has survived any number of storms, hurricanes, floods, diseases, injuries, even earthquakes. It's battle-scarred, but it's still the same tree. And narratives – the powerful ones, the ones based on what I call “the eternal verities” – race, religion, ethnic group, customs – have a staying power that transcends generations, and even transcends actual events – history, for example, and “social change” (which does not have the same depth as narratives, despite what activists would like).
If you look at current events, you see various narratives bubbling up – ones that were supposed to be long since done away with, precisely because they are based on the things that have always motivated people, both as individuals and in groups – things like, once again, race, religion, ethnicity – things that defined societies and even entire civilizations in the past, but which are now considered hopelessly atavistic and out of fashion. The question is, can any society, or civilization, survive the lack of these things, even if in implicit form? We seem to be experimenting with that possibility at present, with concepts like “diversity”, which is just another word for deracination, i.e. getting rid of things that actually create diversity. And yes, it's been tried before, with less-than-pleasant consequences, as witness Soviet Russia and Maoist China.
So now we come back, at long last, to poor old Joe Blow, Mr. Average, the plain citizen, etc. lurching into the voting booth. Is he really concerned with the latest polling numbers? Is he concerned with the “record” of the incumbent, or the many promises of the contender? And what about the myriad of “influencers” and talking heads in the media? They've been working full time for months, if not years, to talk Joe Blow into voting a certain way because, after all, he'll be so much better off if he follows their advice.
But no. The narrative rises up, like the monster from an old movie rising out of Tokyo Bay or New York harbor, to claim its own. And it's the narrative that tells him who to vote for, and his decision is based on which candidate seems to represent this narrative – which one is the best match, that is, with the ideas, notions, and “fancies” that Joe Blow has been carrying around with him for years, and most likely decades. (I don't think that there's a political “consultant” alive who could systematically come up with a campaign that would appeal to the collective narratives of millions of voters. The best he can hope for is that his candidate will get lucky and appeal to enough of them to win.)
But why should this be such a hard nut to crack? The problem is that there is nothing simple about any given narrative, and the complexity of narratives in the aggregate is beyond measure. At the very least, we can say that the well-developed narrative has many facets, to be sure – many of them quite personal and biographical, if you will. “I promised my father on his deathbed that I would never vote for [a certain political party].” “My family/friends/associates/co-workers would disown me if I voted for [a certain candidate].” “If I voted for [a certain candidate] it would make me a traitor to my race/ethnic group/religion/gender.” “We (meaning family, ethnic group, social class, religion) always vote for [a given political party].” And so on. (Freud called this the superego. It's that thing that's stuck in your head that makes you do what you'd rather not do, and not do what you'd rather do. It's the “ought” to our “is”, if you will.)
Now, notice how fact-free all of this is? How non-reality based? That's the point. This is the burden, the baggage, that people carry around with them on a daily basis, but it only rises to the surface on Election Day. This is why the pollsters are so often wrong, and hilariously so in some cases, because they (1) go by what people say when they are in a more “factual” or realistic mood; and (2) go by what people say even though they are thinking something quite different (a political poll is not a confessional, after all); and (3) assume that people know their own minds, i.e. don't get cold feet when they step into that voting booth, and substitute the narrative for all the ideas they've been ruminating on up to that point.
Loyalty – a deep-seated desire to remain faithful to all the many factors and influences in one's upbringing – is a powerful force, perhaps the most powerful in fact, since one can argue that without it civilization would never have even developed, and we would still not be living at anything larger than the small tribal level at best. (The smaller the reference group, the easier it is to enforce conformity, which is one reason why systems of central government imposed on a tribal culture tend to fail, and revert to chaos and violence. The village chief has power and commands respect, where the president or dictator in a far-away capital can only rule indirectly and by fear.)
So this would seem to be an answer – not the entire answer, but an important one – to the perennial question of why people vote the way they do. They aren't voting for the candidate or for his/her ideas as much as for a narrative of which that candidate is merely a passing and imperfect representative. And one interesting consequence of this idea is that the candidate may not be fully aware of the extent, or the depth, of the narrative that he or she represents. They may “fancy” that it's all about them, whereas they are, in truth, quite expendable. Or, they may at least imagine that people vote for ideas – preferably for the candidate's ideas. But that's wrong too. The narrative is much deeper than conscious ideas, and in fact (per our discussion of young children) deeper than the language that expresses those ideas. Primitive emotions come to the fore – traumas, pain, fear, frustration, etc. – and we might say that they unjustly influence the process (which is supposed to be so rational, after all), but are they not the things that, over the course of a lifetime, have the most impact on our sense of the world and our place in it – our self-image and self-esteem? Don't they deserve a voice – a say in the matter? Perhaps rationality and realism are overestimated, and are, at best, secondary to these more basic drives – more superficial, more fleeting (witness how political loyalties can often change at the drop of a hat). Perhaps the notion of elections as popularity contests is not as off-base as we would like to believe. Maybe “popularity” is a better expression of the narrative than that which appears in all the polls and surveys. But if people vote that way, then can they really complain when, by the cold, clear light of day, they wind up with “buyer's regret”? But it's still better than the “walk of shame” home from the polling place when they think “I sure hope Uncle Louie (of fond memory) didn't see me voting that way.”
Perhaps, in other words, we should be more willing to accept the results of a narrative-laden election. Why look down on people because of the way they vote – especially their deep motives, of which we know nothing? Democracy is, at least in theory, based on the assumption that people are entitled to vote because they are capable of understanding the issues. But human nature being what it is, perhaps this understanding is likely to be outweighed by something more basic and, yes, more complex. Does this mean that democracy is a fatally flawed system because it's based on an erroneous premise? A better question might be, even if it is flawed, are the alternatives any better? We seem to have long since settled that question in favor of sticking with the system we have. If it should one day crumble under its own weight, it may be for any number of reasons, but one might turn out to be the power of narrative and its ability to create contradictions to the extent that they would prove fatal. For one thing, as cultural distinctions – a key factor in narrative development – dissolve in the so-called “melting pot” (even though it is mythical to some extent), resulting in what E. Michael Jones calls deracination, the glue or connective tissue among narratives may deteriorate, and a kind of centrifugal force may make Americans (but not only them) increasingly isolated and, one might almost say, autistic in their own unique narratives. This may be the thing that, once and for all, exposes “E pluribus unum” as a myth.