Thursday, May 9, 2024

The Power of Narrative

 

Well, it's “election season” once again – but really, is it ever not election season? Haven't we gotten to the point where every initiative, every policy, every program, every bill, every “task force”, every “investigation”, and so on, is aimed at the next election, whether for the presidency or for Congress? The American interest, i.e. the well-being of its citizens, has long since ceased to be a consideration, never mind “realism” (especially when applied to foreign policy) and “sustainability” (a favorite buzzword of the environmentalist faction). Even the environment, AKA “the planet”, takes, at best, second place when forced to compete with more pressing shorter-term considerations of power and money (environmental issues being, at least in theory, aimed beyond the lifetimes of the present generation, whereas power and money are all about “ME”, and succeeding generations can eat you-know-what). The planning horizon of our elected officials extends no further than the next election, and once that is accomplished the planning horizon instantly shifts to the election after that. In other words, the next campaign begins the day after the swearing-in ceremony. The long-term welfare of the citizenry, whether economic or in terms of health (which is, or should be, a subset of “the environment”, which is, in turn, a subset of “the planet”) never enters the minds of our so-called “leaders” from one year to the next, except as talking points – and yet they continue to campaign, and continue, incredibly, to be re-elected to an overwhelming extent. So, the question is – or should be – what sustains this clearly maladaptive, and some would even say suicidal, system, where – if we still have any faith in the electoral system – people persist in voting against their own best interests, and thus reinforce policies that have failed time and time again – sometimes spectacularly?


The old chestnut about doctors is that they “bury their mistakes”, so their mistakes don't get to comment on their competence, or lack thereof, on the Internet. Politicians, on the other hand, are like unto miracle workers. Not only do they not bury their mistakes, they parade them out in public, and in speeches, as successes – and a few people out there dare to speak up and argue, but the vast majority remain silent, whether out of despair or utter apathy. (But they still vote – which seems to mean that many people shuffle into the voting booth in a state of despair or apathy. So then why do they even bother? Read on... )


Well, number one, “voting” has always been styled as the premier privilege of the citizens of a democracy – the one essential thing, the sine qua non. Of course, as we all know, or should know, most of the dictatorships of the 20th Century, and those that remain, had, and have, voting – after all, they were/are “people's republics”, right? The fact that their elections were a total sham was pointed out and ridiculed by the more enlightened, i.e. the American media and politicians – all while ignoring the chicanery that contaminated many of our own elections, both on the national and local level. If democracy in the “people's republics” was an illusion, then our own democracy was, at the very least, flawed, and in some cases corrupted to an extent that rendered it meaningless and absurd in the “democratic” sense. And yet, it seemed that a flawed democracy, with an electoral system that was capable of being severely compromised, was better than nothing. In other words, the illusion of democracy was preferable to the stark realization that, in many cases, it was nothing but a sham and a cruel hoax.


But we still have to ask why, over time, ever since the earliest years of the Republic, the actions of elected officials have tended to contradict the intent of the people who voted them into office. Seek no further than the temptations of power, and the felt need, once power is achieved, to hold onto it at all costs, so the interests of the ordinary citizen are soon trumped by the interests of the rich and powerful, who take over control of elected officials the minute the ballots are counted. And there is nothing the least bit new or unique about this; it goes right back to our fallen human nature, and politicians are, after all, human – all too human much of the time. They seek power, presumably, in order to “do good”, but in no more than the twinkling of an eye their Job One becomes holding onto the power they've been given by a trusting (say gullible, if you like) public.


And is this tendency worse in a “democracy” than it would be in a monarchy, or dictatorship? Well, no, because human nature is what it is – at all times and in all places. And after all, elected officials can, at least in theory, be voted out of office, although it's amazing how seldom this actually occurs (impeachment being even rarer, and successful impeachment being the rarest of all). So they must be doing something right. Right? Or are we totally missing the point when it comes to our vaunted “democratic” system? If people supposedly vote, time and time again, “for their pocketbook”, or for some other concrete reason, and are invariably disappointed, and yet their voting behavior doesn't change... what is actually going on?


So, can we all agree that there's a disconnect here? We have “voters”, and “candidates” who turn into “politicians” but then are compelled to remain candidates until they decide to retire (or are forced to by some means). But as an arrangement, it very seldom pays off, at least for the hapless voter, i.e. the “average Joe”, “citizen”, whatever. It mostly pays off for the people behind the scenes who choose the candidates, support them, and then once they are in office call in their chits. So the notion of elected officials being “the people's choice” is a hoax in many if not most cases. Was it the “people's choice” when these candidates came out of nowhere and somehow ended up on the ballot? No, of course not – they were the product of a very selective and precise vetting process, the main criterion being: Will this person, once elected, remember who his “friends” are, and act accordingly? And the selection process has been developed and fine-tuned to dependably yield results that favor the people who are really in charge. And if an occasional “mistake” occurs – if an elected official decides to throw off the yoke and be their own man (or woman), well, we have ways of dealing with that sort of nonsense.  (On this topic, see a previous post, "Mayor Pete was the Wine that was Sold Before its Time", April 25, 2020.)


And once again, and as always, the hapless citizen is left wondering what the hell happened... why they have been exploited and betrayed once again, and for the umpteenth time. (Won't Charlie Brown ever be allowed to kick that damn football?) And yet their faith in the system, if one can call it that, persists – or maybe it's just habit, or, once again, despair.


Let's try to take a closer look at this almost universal phenomenon. Let us, for starters, think about voting, and the voting booth, and what actually happens when someone is face-to-face with that choice – with trembling hand holding a pen or pencil, or poised over a button on a screen or a voting machine lever. This is the real moment of truth. All of the propaganda, advertisements, speeches, broadcasts... the papers, magazines, TV, Internet, radio... fade into nothingness at this point and are replaced by an empty space, a void... but then what rises up to fill that void? It's the narrative.


Narrative – A way of presenting or understanding a situation or series of events that reflects and promotes a particular point of view or set of values.

Or – a particular way of explaining or understanding events;

Or – a story or account of events, experiences, or the like, whether true or fictitious;

Or – an explanation or interpretation of events in accordance with a particular theory, ideology, or point of view.


Note that these definitions (and there are many more along similar lines) tend to reflect the utter subjectivity of narrative, i.e. that it's not factual in the strict sense, but based to a large extent on pre-existing ideas, prejudices, biases, and premises – not to mention emotional needs, which, in my opinion, are actually the main “drivers”. And those emotional needs are based, in turn, on an array of events and influences that culminated in the mindset of that person, on that day, at that time, in that voting booth. For we are, after all, human – which means complex, and unstable, impulsive, reactive... in short, all that the Founding Fathers believed – naively, perhaps – could be overcome if people were only given the opportunity and the proper motivation. They were, if you will, optimistic about human nature. But has their optimism proven to be utter delusion and folly?


Even the simplest person is capable of having “deep thoughts” at times – and the most intellectual and presumably rational and objective person can be the prey of their emotions, which can cancel out all attempts at “realism”. And I'm not claiming that these “irrational” factors come into play in spite of reason, or of “the facts”; it's really the opposite. It's the rarer occasion when reason, and the facts, come into play in spite of what one might call the more primitive, or juvenile, factors. (In the developmental sense they could be termed “pre-reason” or “pre-logic” or “pre-conceptual”.)


One might say that the notion of the average citizen suddenly turning into an adult when they step into the voting booth is no more than wishful thinking. What is every bit as likely is that they will regress, and revert to an earlier stage of their emotional and intellectual development when they are confronted by that moment of truth.


So what, then, is this all-powerful narrative, and why does it have such an iron grip on us when we are faced with what should be an important decision – not only for the individual but as a basis for democracy itself?


To begin at the beginning – our perceptions of the way the world is (call it our “metaphysics” if you like) start being formed at birth, or even before. Babies are, basically, data-gathering machines. They absorb everything that they see, hear, taste, smell, feel – without prejudice or editing. One might say this is the most objective stage of life, simply because we haven't yet learned to select, or filter, the input. We never think, or say, “That does not compute”, because at that stage everything computes, which is as it should be. And so a view of the world (their world, of course) is formed – primarily from sensations as opposed to thoughts “about” sensations. Toddlers have a pretty good idea of the way the world is – their world, at least. But they also start getting notions of how they would like the world to be, and the conflict between the two leads to frustration (which helps explain why the “terrible twos” are the way they are). How is this resolved? The way they would like the world to be has to be put on the shelf, after repeated tries and frustrations – but it never goes away. Our “inner child of the past” (the expression comes from a self-help book from some years back) persists, and builds up – layer by layer – wishes, hopes, dreams, and frustrations while we lead a parallel life “in the world”. Thus is formed the root of a narrative – the one with the earliest origins – to which we cling (thank you, Barack!) in some form, even if a vestigial form, for life.



And I would say that the great cry of the human person – the real “primal scream” – is “It's not fair!”, or some variation thereof. And if this attitude, or premise, retains sufficient power into adulthood, it can influence our political thinking, not to mention our relationships, our choice of vocation, and so on. The urge to make the world “work” – to be the way we would like it to be – is compelling, but also infantile in a way. The missing element is willingness to compromise, and – on a more rarefied level – humility. Humility doesn't insist on “compromise”, or negotiation, or bargaining; it is an attitude of willingness to accept things as they are – at least tentatively – because, who knows, there may be perfectly good reasons for the way things are that we are not privy to – but at the same time to work to make things better, ideally by starting “at home”, with ourselves. (Wasn't there a Beatles song that said, “You say you'll change the constitution, well, you know, we all want to change your head. You tell me it's the institution, well, you know, you'd better free your mind instead”?)


Many teachers in many religious traditions have taught the value of humility over the millennia. One of the main barriers to developing humility, however, is that residual infantile “It's not fair!” attitude, and this flows quite seamlessly into one's personal politics, and thus collectively into politics in general. For what is any political idea, cause, or movement other than an attempt to make things “right” – or “fair” – or “equitable” – to re-fashion the world in the image that we would like to impose on it? There is nothing less humble, or less satisfied, than a political movement. “Conservatives” – the real kind, i.e. the ones who want to keep things just the way they are (vs. the way they were, or are alleged to have been, 100 or more years ago) – are fighting battles against “change”, especially change for its own sake (which is the most typical variety). If only people would be satisfied with the way things are now! But what if the way things are now is more like the “continuous revolution” of Chairman Mao? Then it's no longer true conservatism, is it? And perhaps in our time there is no such thing, i.e. conservatism has become self-contradicting.



But it seems that we have skipped a few steps here. Go back to the toddler, who is forming a view of the world based on raw data, and not spending much time thinking about it, or interpreting, or conceptualizing. They are, in a way, living an unexamined life, which is perfectly acceptable at that stage. But then along comes speech, and then concepts and ideas, and (hopefully) some beginning of moral sense – and, perhaps most important of all, imagination, which can be defined as wanting that which doesn't even exist, but which we develop as an image or “wish system” by projecting from what is, i.e. from what we know. And at that point the “ought” starts to overpower the “is” – the toddler's alternative world of “like” and “want” starts to accrete names, and concepts, and connections. And this is also the point at which, thanks to language and our conceptual abilities, one is exposed to other people's world view – again, not only the view of things as they are (the metaphysical) but the view of things as they might be, or ought to be (which ultimately turns into the political). (Isn't politics, after all, the art of persuasion – of convincing people that they ought to think differently, or ought to want what they don't presently have – and of showing them the means by which they can supposedly obtain it (starting with the all-hallowed vote)?)  


And who are these other people? Our “influencers”, as the current saying goes? Well, parents, for certain... but also other family members, friends, and neighbors – people we know and converse with. But then we have teachers (whose world view may or may not match that of our parents – increasingly the case in our time, since the public schools have fallen prey to “the long march through the institutions”), and books, television, the Internet, and so on – that vast array of sources of information, ideas, facts, fancies, propaganda, threats, rewards – greatly magnified by the communications media in our time.


How often have I heard or read complaints by “boomers” that things were so much simpler when we were kids, and there were only 3 TV channels (two commercial and one "educational"), and school books, and books from the local library, and magazines from news stands, and the radio, and that was pretty much it? The competition for “hearts and minds” was barely a fraction as intense as it is now, and it can be debated whether this is a net improvement or an invitation to confusion and, ultimately, chaos and despair. Can it be that the human organism was only meant to handle so much information – most of it not sense information in the natural or “primitive” sense, but only boiled-down, digitized, distorted remnants? Not even well-developed concepts, but sound bites? Are we not all victims of overload? And if we are, are we equipped with the discernment and ability to do something about it? My observations would seem to indicate that the answer is no, for most people and in most cases. As with so much in the way of technology, our reach exceeds our grasp. We create instantaneous monsters, if you will, but have no idea as how to control them, and thus become their victims.


So our narratives are born, and at an early age they bifurcate into the “is” and the “ought” – “reality” vs. “wishful thinking”. But hold on. Were all of our influencers, and sources, over the years purveyors of reality? Because what were they pushing? Their world view, but surely more of a hybrid of the two – the way things are and the way they ought to be. After all, stark reality – “just the facts, ma'am” (thank you, Jack Webb) – can become awfully tedious and boring at times. After all, we are human beings – always restless, always striving, always curious. We are not dogs and cats, or cattle contentedly chewing their cuds in the field. It would be nice to be that existential – that “here and now” – and many of the hippie gurus, taking a cue from Eastern religions, recommended that state of mind as the best, and the one least likely to lead to frustration. And this is not to say that some time spent meditating – being “here and now” – is a bad thing; it's been highly recommended, again by wise men (and women) of many religious and philosophical orientations. But let's admit that plain contentment is a rare thing, and perhaps should remain so. (Or, to put it another way, it is perfectly suitable for the human race to include both contemplatives and “action” persons. In the proper proportions, they complement each other, and in some ultimate sense they couldn't get along without each other.)


For the rest of us, not satisfied with radical contentment or “here and now-ness” (or the cheap imitation induced by drugs and alcohol), we have to deal with our narratives, and each person's core narrative at any given time is likely to be a combination of reality (facts) and wishful thinking (fantasies). I had a chemistry teacher in high school who would always correct any student who said “I think...” He would say “you don't think, you fancy”. But isn't this it in a nutshell? And to the extent that a person is willing to – calmly and deliberately – pry apart the portions of their narrative that are “the way things are” vs. “the way things ought to be”... but wait! What's to keep “the way things are” from being every bit as fantastic (albeit unconsciously) as “the way things ought to be”? Isn't all of our thinking ultimately subjective and illusory? Is objectivity a myth, and don't we use our reasoning powers to, more often than not, simply shuffle various fantasies around and rationalize the ones we prefer on any given day?  Or, isn't there at least a continuum of some sort with the “actual” (things I'm pretty certain of, because they're tangible and observable) on one end and the “ideal” (things I know are not the case, but wouldn't it be nice if they were?) on the other?


And – does it even matter, ultimately? If our narrative, or world view, impacts our lives, our decisions, our relationships, on a daily basis, isn't it quibbling to talk about how “realistic” various pieces of it are? Because it's there. Over time, it's arguably the single largest influencer in our lives, or at least in our thinking about our lives.


It's almost like the old question, how can we live knowing that we will die? And there are many answers to that question, needless to say. But then, on a more personal, here-and-now level, how can we live knowing, or suspecting, that our every thought and action is based on, and conditioned by, something that was imposed on us (or, at least, suggested to us) by someone else – and that their every thought and action was, in turn, based on, and conditioned by... and so on. (Perhaps the first and only truly free-thinking human being was some cave man who accidentally stumbled upon language. He was able to make it all up on his own, ex nihilo.)


But narratives are very seldom completely idiosyncratic. There are common, shared elements based on, first, the family, and then on the gradual widening world that each of us experiences – but there are central tendencies and groupings, otherwise there would be no political parties or causes. We would each be trapped in our own pseudo-factual world like patients in the back ward of a mental hospital (before they were all shut down, that is). So yes, the reference group (the one we don't choose, and later on the one, or ones, we do choose) has its agreed-upon narrative, which they promote, expand upon, and, in some cases, endlessly blather about, from the coffee house to the student union to the faculty lounge to Congress to TV and the Internet. (For a sampling of grass-roots narratives, I can recommend nothing better than that table of male senior citizens that one always finds at any McDonald's on a Saturday morning – or any other day of the week, most likely. They talk on endlessly, but it's clear that they share the same narrative, right down to the most minute detail. There is no real debate, in other words – just embellishment, and a friendly competition as to who feels most strongly about a given issue – or who can talk the loudest between gulps of lousy coffee.)


But this is also not to say that narratives are merely shared world views. That's precisely the point. They are unique in the same way fingerprints are – one set to a person, and no two people have the same set. But there are definite narrative clusters, if you will, the same way people naturally cluster into racial, ethnic, religious, and gender groups. After all, a key element in narrative formation is the culture one is born into; a good bit of it is, as I see it, pre-verbal or extra-verbal. People don't generally think much about their particular “culture” as young children; they just take it for granted along with everything else – religion, social class, customs, traditions, etc. In fact, I doubt if very many people, even upon achieving mature adulthood, think about how different – how radically different at times – their narrative is from others'. It's like the old saying, “Fish discover water last”. If you're immersed in something from the very beginnings of perception, you don't have to “discover” it; you're living it. In a very real sense, it is you and you are it. Whatever else we are, or become, builds on that; it's our foundation or groundwork, if you will.


But wait! – you might say. What about rebellious youth? What about the ones who, with every generation, decide that the old folks don't know anything, and they're going out on a quest for The Truth? Not only that, but how about all of these “influencers” who encourage, aid, and abet rebellion – thinking firstly about peer groups, but also teachers, the media, books, and so on? Doesn't that old narrative so carefully cultivated by Mom and Pop get tossed aside?


This might seem to be the case if we just look at the surface, or the symptoms. What I suspect, however, is that youthful rebellion is just that – youthful. The day comes when our youthful rebels will decide that it's just too much work, or the world isn't getting better despite all of their efforts... or that more immediate, material considerations need to be given more weight – you know, things like earning a living, as dull as that might seem. Notice that the ones who come to this realization first are invariably called “sell-outs” by the more hard-core types, but sooner or later pretty much everyone falls by the wayside in some way, except for the rare types like Bernie Sanders.


And it's not that the narratives they grew up with have survived intact, the way so many ancient Chinese customs survived Mao's Cultural Revolution. They may have gotten updated, refined, polished, rendered more “current”, more acceptable, but the roots survive. They are still their parents' children, in other words. A “rebel for life” is a rare thing – it happens, and there are many examples, but most people simply can't, or won't, entirely throw off that baggage – and in fact, there is no good reason for them to do so. They are behaving more like natural human beings than like any idealized “New Soviet Man” or member of “The Master Race”, who are expected to act in a robotic fashion and have no hidden agendas and, preferably, no individual personality at all.


To put it another way – imagine an ancient tree that has survived any number of storms, hurricanes, floods, diseases, injuries, even earthquakes. It's battle-scarred, but it's still the same tree. And narratives – the powerful ones, the ones based on what I call “the eternal verities” – race, religion, ethnic group, customs – have a staying power that transcends generations, and even transcends actual events – history, for example, and “social change” (which does not have the same depth as narratives, despite what activists would like).


If you look at current events, you see various narratives bubbling up – ones that were supposed to be long since done away with, precisely because they are based on the things that have always motivated people, both as individuals and in groups – things like, once again, race, religion, ethnicity – things that defined societies and even entire civilizations in the past, but which are now considered hopelessly atavistic and out of fashion. The question is, can any society, or civilization, survive the lack of these things, even if in implicit form? We seem to be experimenting with that possibility at present, with concepts like “diversity”, which is just another word for deracination, i.e. getting rid of things that actually create diversity. And yes, it's been tried before, with less-than-pleasant consequences, as witness Soviet Russia and Maoist China.


So now we come back, at long last, to poor old Joe Blow, Mr. Average, the plain citizen, etc. lurching into the voting booth. Is he really concerned with the latest polling numbers? Is he concerned with the “record” of the incumbent, or the many promises of the contender? And what about the myriad of “influencers” and talking heads in the media? They've been working full time for months, if not years, to talk Joe Blow into voting a certain way because, after all, he'll be so much better off if he follows their advice.


But no. The narrative rises up, like the monster from an old movie rising out of Tokyo Bay or New York harbor, to claim its own. And it's the narrative that tells him who to vote for, and his decision is based on which candidate seems to represent this narrative – which one is the best match, that is, with ideas, notions, and “fancies” that Joe Blow has been carrying around with him for years, and most likely decades. (I don't think that there's a political “consultant” alive who could systematically come up with a campaign that would appeal to the collective narratives of millions of voters. The best he can hope for is that his candidate will get lucky and appeal to enough of them to win.)


But why should this be such a hard nut to crack? The problem is that there is nothing simple about any given narrative, and the complexity of narratives in the aggregate is beyond measure. At the very least, we can say that the well-developed narrative has many facets, to be sure – many of them quite personal and biographical, if you will. “I promised my father on his deathbed that I would never vote for [a certain political party].” “My family/friends/associates/co-workers would disown me if I voted for [a certain candidate].” “If I voted for [a certain candidate] it would make me a traitor to my race/ethnic group/religion/gender.” “We (meaning family, ethnic group, social class, religion) always vote for [a given political party].” And so on. (Freud called this the superego. It's that thing that's stuck in your head that makes you do what you'd rather not do, and not do what you'd rather do. It's the “ought” to our “is”, if you will.)


Now, notice how fact-free all of this is? How non-reality based? That's the point. This is the burden, the baggage, that people carry around with them on a daily basis, but it only rises to the surface on Election Day.  This is why the pollsters are so often wrong, and hilariously so in some cases, because they (1) go by what people say when they are in a more “factual” or realistic mood; and (2) go by what people say even though they are thinking something quite different (a political poll is not a confessional, after all); and (3) assume that people know their own minds, i.e. don't get cold feet when they step into that voting booth, and substitute the narrative for all the ideas they've been ruminating on up to that point.


Loyalty – a deep-seated desire to remain faithful to all the many factors and influences in one's upbringing – is a powerful force, perhaps the most powerful in fact, since one can argue that without it civilization would never have even developed, and we would still not be living at anything larger than the small tribal level at best. (The smaller the reference group, the easier it is to enforce conformity, which is one reason why systems of central government imposed on a tribal culture tend to fail, and revert to chaos and violence. The village chief has power and commands respect, where the president or dictator in a far-away capital can only rule indirectly and by fear.)


So this would seem to be an answer – not the entire answer, but an important one – to the perennial question of why people vote the way they do. They aren't voting for the candidate or for his/her ideas as much as for a narrative of which that candidate is merely a passing and imperfect representative. And one interesting consequence of this idea is that the candidate may not be fully aware of the extent, or the depth, of the narrative that he or she represents. They may “fancy” that it's all about them, whereas they are, in truth, quite expendable. Or, they may at least imagine that people vote for ideas – preferably for the candidate's ideas. But that's wrong too. The narrative is much deeper than conscious ideas, and in fact (per our discussion of young children) deeper than the language that expresses those ideas. Primitive emotions come to the fore – traumas, pain, fear, frustration, etc. – and we might say that they unjustly influence the process (which is supposed to be so rational, after all), but are they not the things that, over the course of a lifetime, have the most impact on our sense of the world and our place in it – our self-image and self-esteem? Don't they deserve a voice – a say in the matter? Perhaps rationality and realism are overestimated, and are, at best, secondary to these more basic drives – more superficial, more fleeting (witness how political loyalties can often change at the drop of a hat). Perhaps the notion of elections as popularity contests is not as off-base as we would like to believe. Maybe “popularity” is a better expression of the narrative than that which appears in all the polls and surveys. But if people vote that way, then can they really complain when, by the cold, clear light of day, they wind up with “buyer's regret”? But it's still better than the “walk of shame” home from the polling place when they think “I sure hope Uncle Louie (of fond memory) didn't see me voting that way.”


Perhaps, in other words, we should be more willing to accept the results of a narrative-laden election. Why look down on people because of the way they vote – especially their deep motives, of which we know nothing? Democracy is, at least in theory, based on the assumption that people are entitled to vote because they are capable of understanding the issues. But human nature being what it is, perhaps this understanding is likely to be outweighed by something more basic and, yes, more complex. Does this mean that democracy is a fatally flawed system because it's based on an erroneous premise? A better question might be, even if it is flawed, are the alternatives any better? We seem to have long since settled that question in favor of sticking with the system we have. If it should one day crumble under its own weight, it may be for any number of reasons, but one might turn out to be the power of narrative and its ability to create contradictions to the extent that they would prove fatal. For one thing, as cultural distinctions – a key factor in narrative development – dissolve in the so-called “melting pot” (even though it is mythical to some extent), resulting in what E. Michael Jones calls deracination, the glue or connective tissue among narratives may deteriorate, and a kind of centrifugal force may make Americans (but not only them) increasingly isolated and, one might almost say, autistic in their own unique narratives. This may be the thing that, once and for all, exposes “E pluribus unum” as a myth.


















Saturday, January 27, 2024

Why are the Democrats Supporting Nikki Haley and not Trump?


Question du jour – why are the Democrats (which includes the mainstream media) supporting Nikki Haley? I mean, they expect to win in November, right?  So why do they care who the Republican nominee is? Some of it can be attributed to TDS (Trump Derangement Syndrome), which will always be with us... and of course they don't want Trump to have the “honor” of being nominated (for the 3rd time) by the Republicans, even though they fear and despise the “MAGA terrorists”, i.e. Trump supporters within the Republican party (and those not in the party as well).


It's clear that they aren't playing the long game here. They're getting their jollies by piling on Trump and whoever consents to be his running mate (“I pity the fool...”), but think about this. If Haley winds up being the nominee – and the Dems are doing everything in their power to make certain that Trump can't be nominated – and then loses, the Republican party will survive (if only in the usual minority status). That is, the mainstream Republicans – the “acceptable opposition”, the ones who are always happy to “cross the aisle” and be second-class citizens to the Dems – will take over once and for all, with Trump and the MAGA crowd finally sent into exile and relegated to the ash heap of history.


That would be a perfectly acceptable outcome for the Democrats. Having the Republicans as a perpetual and obsequious minority – which they have been for much of the time in recent years – would feel like business as usual, and the so-called two-party system would survive, at least in theory.


But the Democrats don't want a two-party system – not really. In their heart of hearts, they want a one-party system on the Soviet model, i.e. no opposition at all, not even the acceptable kind. Nothing but unanimous votes in Congress (and eventually one TV network, one radio network, one newspaper (OK, the Soviets had 2)... not to mention, no elections!). So what is the best way to make this happen, or at least to get a head start? It would be to throw Haley under the bus and allow the Republicans to nominate Trump, and then see to it that he loses, at which point the Republicans (meaning all of them, even including the RINOs) could be declared dead and buried along with their MAGA minority.


Now, you might say that Trump and the Republicans lost in 2020, but recovered – and this in spite of the fact that he was the sitting president at the time, and it's rare for a sitting president to be defeated for a second term. But the Democrats and their allies in the media and elsewhere will, by November, have had 4 more years to not only continue to brand Trump as Hitler Incarnate, but also to brand his followers as terrorists and put many of them in jail (and him as well, perhaps) – this process being well under way right now, and proceeding at warp speed. “Our very democracy is at stake!” – cry the mainstream media with one voice.


So the contrasts are much more stark now than in 2020 or 2016 – and this is mainly because the Democrats and the media have declared this to be the case. So the strategy of supporting Haley makes sense in the short run, but in the long run the Dems would be better off if Trump ran again and lost, because from then on the Republicans would be required to hang their heads in shame (for “putting us through this again”) and be paraded around wearing dunce caps by the Red Guard, and be reduced to a bunch of vaporous ghosts (think of 100 Mitch McConnells) wandering aimlessly around Washington while the Democrats establish a people's republic.


(PS – the Dems show no signs of wanting to push Biden aside (or Harris either), despite rumors to that effect. A president who is content to follow orders and read, even if haltingly, from scripts, and a vice president who is satisfied with a portfolio of sinecures, is exactly what they want; it has worked for three years, and it will work for one plus four more.)


(And BTW, Nikki Haley is playing her own game here. She's staying in the race, at least in part (in my opinion) because she expects the Dems/media/courts to take Trump out well before the election, at which point she'll be the last, um, person standing. That's the short game. The medium game would be for her to save a lot of time and money by dropping out now – or at least appearing to – and then wait for Trump to be neutralized, at which point she can come back on stage and save the day like Mighty Mouse.)


It's going to be very interesting to see how these various games intersect over the next few months. In fact, the mainstream (non-MAGA) Republicans may even decide to nominate Trump (assuming he hasn't been disqualified) for the same reason that the Dems would favor this – to ensure his defeat, and thus the resounding defeat of the MAGA wing, leaving the mainstream unchallenged in their slouch toward obscurity. I'm not sure if they're capable of this kind of subtlety – they are called “the stupid party”, after all – but it would certainly win them friends on the other side – or let's say enemies disguised as friends.


Thursday, December 28, 2023

2024 -- Another annus horribilis?


2024 is looming, like one of those hurricanes out in the Atlantic that's not yet causing much damage, but just wait until it reaches land! I don't want to be just another alarmist (the field is much too crowded already), but I'm afraid that the Republican convention next year may make the Democratic convention of 1968 look like a tea party. (And the Republicans won't have a Mayor Daley to back them up – and I can't imagine the Milwaukee police department will be much help, since they've probably fallen prey to defunding and other forms of demoralizing and neutering.)


This is, of course, predicated on (1) Trump not being in jail at that point; (2) Trump still being in the race (or, Trump being in jail but still being in the race – hey, it could happen!); (3) The mainstream Republicans not having succeeded in keeping him out of the primaries; and (4) The mainstream Republicans accepting primary results that favor Trump, rather than declaring them null and void and going to a caucus, AKA “smoke-filled room”, system.


Note that the Colorado Supreme Court has already barred Trump from both the primaries and the general election, and they are likely to be followed by many other state supreme courts across the land – and in the Northern Mariana Islands, Guam, etc. (How exciting it is to keep an ex-president from running for president!  And just about anyone can play!) – but especially in states with high population levels (all you need is the West Coast and the Northeast). While Trump's base is justifiably outraged by this – as are a handful of commentators on Fox News – the Republican mainstream is strangely silent on the matter. Perhaps it's because they're glad to have someone else do the dirty work for them so they won't get in trouble with Trump's base, and/or they see it as an example of how easy it is to keep someone out of the primaries, as in “Hey, why didn't we think of that?” (Actually, they did, when it came to Ron Paul.) (OTOH, RFK Jr. has been subjected to a total media blackout, probably because, like Ron Paul, he has a lot of good ideas. But they can't make fun of him because of the family name – unlike Ross Perot, who at least had amusing ears.) To put it another way – I suspect that much of the Republican mainstream is secretly celebrating this, oblivious to the fact that if it can happen to Trump it can happen to any of them as well.


You can see the run-up increasing in intensity on a daily basis, primarily in the mainstream media but also in statements by Biden lackeys, certain academicians, certain “entertainers”... all reading from the same sheet of talking points, of which #1 is always “Trump is Hitler” (not “will be Hitler”, note, but he's already Hitler, in some kind of mysterious reincarnation phenomenon). The Ministry of Propaganda aside, what the Fox News folks call “lawfare” is also well underway, and is merely a seamless continuation of the impeachments while Trump was in office – with many of the same people calling the shots as during Trump's administration. Of course the “bloody shirt” that is constantly waved in the air is January 6 – a date that will live in infamy! – but it's far from the only weapon in their arsenal (heck, even the Russia collusion hoax is still alive and well in the fever dreams of many of them).


But behind it all – the thinly-concealed threat, if you will – is the very real possibility that the troops are already being organized to show up in force at primaries and at the convention – and yes, I mean the same folks who did all the burning and pillaging and vandalism back in 2020 (and who continue to do so at selected locations just to keep in practice). And this goes way beyond the time-honored “rent-a-mob” technique on the local level (often, depending on the issue, with Jesse Jackson and/or Al Sharpton parachuting in to add spice to the mix).  As in 2020, these so-called anarchists (totalitarians in disguise, I mean) will arrive from all over the country, brought in by plane, train, bus, and automobile, and with pockets full of cash from their billionaire sponsors, who – recalling a phrase from the war in Vietnam – believe that it's necessary to destroy the country in order to save it.


So what it really amounts to is a protection racket of sorts (remember the “long hot summer” threats of times past?). Keep Trump on the primary ballots and this is what will happen – and just try nominating him and putting him on the national ballot! Cities will burn! And the mainstream Republicans, ever the gentlemen (and gentlewomen), will, I expect, bow to mob rule and disown that troublemaker – i.e. Trump – once and for all, rather than just being passive-aggressive about it the way they were during his administration. And we'll wind up with some garden-variety neocon who won't ruffle the Democrats' feathers – Nikki Haley* being in the lead for that role at this point (and please note she's getting support from some Democrats simply for being the anti-Trump). And then, in turn, if the Republicans come up with another uninspiring, ho-hum candidate, that person will lose the election to Uncle Joe or whoever the Democrats have called up from the bench to replace him. (And – highly likely – the Trump base will simply sit out the election as a form of protest, thus giving Uncle Joe even more of a mandate than he would have had otherwise.)


So – bottom line – the protection racket will have worked. And no, it's not democracy or even a pale semblance thereof; it's strictly mob rule of the kind that can be found in many “banana republics” and other pseudo-democracies across the globe. But if this is what we've come to, well... some will call it karma, others will say it's the way empires decline and fall, and many of the citizenry – thoroughly demoralized already – will just shrug and say (or think) “Eh, what do you expect?” Faith in government, anyone? I'm afraid that's already extinct at this point. Rule of law? The Colorado Supreme Court certainly doesn't have any use for it. There's just enough residual faith for some people to think that voting might actually make a difference; the rest of us are either cynical, or pessimistic, or just plain realistic – and if you can tell me the difference these days, please let me know.


* This just in – she failed to denounce slavery! Looks like the establishment has already administered the kill shot.

Tuesday, July 18, 2023

From Global Pillage to Global Village

 

The perennial debate when it comes to “empire” is: Who benefits? But before we deal with that question we have to distinguish between the two major types of empire, what I will call the expansion type vs. the overseas type. The expansion type is as old as human history – in fact, in a way it is human history, in that so much of what we know of ancient civilizations consists of their wars of conquest. (No one ever writes about, or memorializes, peace – too boring! The ancient inscriptions, steles, obelisks, etc. were overwhelmingly devoted to military campaigns – victories – conquests. (I have yet to hear of one commemorating a defeat.)) And this was all about expansion – enlarging an area of control (by a given race, ethnic group, tribe, etc.) beyond its current borders. And the motivation? Sometimes it was all about simply winning – conquest for its own sake. What king or emperor wouldn't want to expand his area of control? But it could also be about resources – arable land, timber, access to waterways, acquisition of slaves (conquered peoples), trade routes, minerals – even the need for a “buffer zone” between one empire and another, i.e. take over a given piece of territory but not make it an “official” part of the empire, just maintain it as a protectorate and a first line of defense against whatever's on the other side. (Ukraine, anyone? This is exactly what Putin is up to.)


And, of course, there is just plain old glory – being famous and celebrated far and wide – having a large chapter in the history books, etc. “The Sun never sets on the British Empire” – remember that? It was actually true within living memory. If we can “plant our flag” far and wide (and even on the Moon!) that makes us conquerors – winners – superior in every way.


But this introduces the second type of empire, which is relatively recent and which can be traced to the discovery of America. And that is the overseas empire, which is, to a significant degree, based on, and energized by, trade. But “trade” is a relatively peaceful enterprise, so it has to be backed up by strength – military certainly, but economic and diplomatic as well. I mean, think about it, what's the first thing that happened when the European powers started to colonize the Americas? Trade – followed fairly closely (in some cases) by missionaries. And then the powers had to get together and agree to keep their hands off each other's stuff, i.e. colonies (which pretty much worked most of the time, except when the colonies became spoils of war).


And what is trade? It's trading something of less value (to one party) for something of more value (to the same party) – and ideally, both sides of the trade realize a benefit, or profit. “Free trade” – the ideal of all good libertarians – is a deal from which both profit. Another way of putting it is that if a given trade raises the standard of living, or quality of life, for each party then it was a good trade.


But how much of the “trade” between the European powers and their colonies can be described as “free”? In other words, what did the colonies get out of it? In the worst cases, no material benefits but plenty of exploitation and slavery. In the more moderate cases, certain benefits, but you can be sure that the colonizers always came out better, bottom line-wise, than the colonized.


But here we have to make a distinction. When we say “colonizers” whom are we speaking of? The on-the-ground traders? The ship owners? The merchants back in the home country? The governments or rulers of said home country? It kind of depends on whom, or what, we're referring to. To oversimplify a bit, if it doesn't pay, it won't be done – which means that if someone back home isn't making a bundle from the colonial trade, said trade will come to an end (or never be initiated).


As usual, follow the money. Who got rich from the colonial trade – from, let's say, the conquest of America right up to World War II? The merchants, certainly – and the privileged few who managed to get their products sent back in the other direction. And if we say “the merchants” we are also saying the politicians, and even the ruling class, because they are dependent, to a greater or lesser extent, on the largesse of the merchant class, who – among other things – help them to remain in power.


But how about “the people” – the “man on the street” – the ordinary Joe? Were they better off living in a country that was a colonial power than in one that wasn't? One could make a “trickle-down” argument here – or, the crumbs from a rich man's table are better than nothing. But that would be to ignore the costs (both hidden and obvious) of empire. Number one, as I've said – trade is all well and good, but it's always backed up by military might. And who, pray tell, is in the military? The sons of the ruling elite? Very seldom. More likely, the average Joe who is either drafted into the military or who sees it as preferable to his other prospects (if any). So his blood may very well be shed in order to expand, consolidate, and maintain the empire – with very little in return except, as always, for a few memories of valor and heroism – a few “rusty medals”, if you will.


And is it worth it, to him? Well, the “common folk” of any country or empire are typically much more patriotic, if in a somewhat naive way, than the ruling elite, who tend to be self-serving and cynical. When Joe Snuffy shows off his medals to the folks back home, he's expressing a deep feeling of pride and patriotism, even if the jaded politicians who sent him over to some hell-hole on the other side of the world couldn't care less. Was he exploited? Hell, yes. Was he “cannon fodder”? Ditto. But as a “rite of passage”, military service in time of war has no peer. The guys who come home in body bags don't vote. And this is, sadly, the lot of fallen mankind and his various societies from time immemorial. The rulers have one set of values, and the common people have another, and ne'er the twain shall meet. And all of the “consciousness raising” on the part of antiwar activists is of no avail, as long as the people insist on clinging to their images and delusions (which are, of course, programmed into their brains by the ruling elite).


(When things eventually boil down to human nature, which is intractable, it may be time to turn around and walk away. But I would like to expand on the topic a bit more.)


So – the second type of empire – the “overseas empire” – really began in earnest with the discovery, and conquest, of the Americas. All of a sudden a European nation could flex its muscles without having to challenge, or even offend, its neighbors – and, by the way, sustain little or no damage or even inconvenience on the home front. Just take over a huge chunk of North, Central, or South America! Nothing to it! But at the same time, note, much the same was happening in Africa, Southern Asia, and East Asia. The European powers had become empire-happy, and any place that offered the least resistance found itself forcibly colonized (if not conquered in the strict sense). And again, it was about trade, first and foremost – but also about glory, and power, and being a major player on the world stage. And the point is that it was always a profit-making enterprise, at least for the ruling elite – and a net loss in blood and treasure (think increased taxation to support the whole thing) for the common folk.


And this, by the way, continues right up to the present day! There is nothing ancient, or merely “historical” about this. It's going on even as we speak.


Of course, there is a certain feeling of quaintness about some overseas empires of old. The Germans had one, right up to World War I. The Italians... the Portuguese... the Belgians... the Dutch... and so on. Eventually, it boiled down to the British and French, and that's when things started to change. All of a sudden the benefits of the traditional-style empire came under scrutiny – not only who profits (we always knew that), but do they even profit any longer? And then you had the curious phenomenon of what's called “self-determination”, and it started to catch on, big time, after World War II. Countries that had been consigned to abject slavery and servitude – especially in sub-Saharan Africa – started getting funny ideas about independence. And a lot of the “credit”, if you will, for this, goes to the international communist movement, and their agents from Soviet Russia and Maoist China (throw in Cuba if you like). They talked a lot about “freedom”, “liberation”, and self-determination, all of which was designed to conceal the actual agenda, which was simply a new and different kind of slavery – slavery not to another nation but to an idea. And, I might add, to create a new ruling elite (“Meet the new boss, same as the old boss”). But to people who had been under the boot of one or more European powers for, in many cases, centuries, this was music to their ears. So we had uprisings in India, Algeria, the Congo, Vietnam, and so on – not to mention uprisings against the ruling elite in Latin America, where liberation had already arrived once with Simon Bolivar. (Time for another revolution! Latin America became notorious for this after World War II – almost as if it were a national pastime.)


But what was it, really? Throwing off the colonial yoke, or boot – certainly. Rebelling against exploitation and the racism which usually accompanied it? Absolutely. Assertion of political ideas and ideals which had no precedent in the “primitive” tribal culture? That too. (It was always the “intellectuals” of any given country – typically products of the Sorbonne – who spearheaded these movements.)


But... why was it always communism and never capitalism? Why was the red flag always being waved? Because they saw capitalism as part of the problem – as the economic model of their oppressors (“Yankee go home!”). Communism, on the other hand, was a new, fresh breath of freedom – never mind what it meant to the hapless citizens of the Soviet Union. (And quite frankly, maybe the lot of the average citizen of the USSR looked pretty good compared to the lot of the average “coolie” in one of the European colonies.) (The hackneyed term “it's all relative” comes into play here, and in this case it really is all relative.)


So if there is a mass movement in post-WWII history, it's the breaking free of the former colonies from the former colonial powers. And with the exception of France with Algeria and Vietnam, said powers were, by and large, remarkably docile and accepting of the situation, as if they could see that the time had come. There were struggles, of course – quite violent at times (India being an example, and the Congo) -- but the handwriting was on the wall. Suddenly the satisfying status quo had turned into a burden. The colonial empires were turning out to be more trouble than they were worth, so they were broken up – sometimes peacefully, sometimes not – but broken up nonetheless, with very few pieces remaining.


And too, on the home front, people started to question not only the wisdom but the moral validity of overseas empires – of coercing people of a wide range of races, ethnicities, religions, etc. into fitting into the “colony” mode. We speak – to this day – of the “Third World”, but are they truly inferior? Second-class citizens at best? Perhaps this is what the “diversity” movement is all about – not only on the domestic front, but the global front as well.


Of course part of this has to do with the admission – a tough pill to swallow! – that our “values” are not only not shared by much of the world, but that much of the world isn't even interested – and in some cases despises our “values”, and considers us fools for adhering to them. (This attitude seems especially prevalent in the Muslim world.) And doesn't this fly right in the face of our most basic, founding ideas – that the “American way” is not only good for us, but is good (or should be) for the world at large? One of the basic – I'll call it myths – of the American founding is that our values, as expressed in our founding documents, are universal, i.e. that they are valid above and beyond any accidental considerations of race, ethnicity, religion, etc. Any speech by any politician from 1776 on has this as its conceptual underpinning.


But what if it's not true? What if it really is “all relative” – to what I call the eternal verities, i.e. race, ethnicity, and religion? (And gender as well, for that matter.) What if religion, for example, is a more basic, deeper, and profound aspect of a given people's world view than what's in our founding documents? I don't think we have, yet, fully come to terms with this possibility. We're still convinced that “the American way of life”, and “democracy”, are universal values, and there are none higher. And note that our foreign policy is ultimately based on this – and backed up by military might whenever and wherever needed. Yes – all our blood and treasure is spent trying to convince the rest of the world of this one simple idea – so obvious to us, but so foreign and even perplexing to most of the rest of the world. And we find this highly offensive, and spare no expense to convince them (by persuasion or otherwise) that we're right and they're wrong. (And George W. Bush asks “Why do they hate us?”)


But is that the end of the story? Hardly. The colonial model is alive and well, but it has morphed into a new, different – more efficient – form in our time. It's no longer about large numbers of troops stationed in the colony – that pretty much ended with Vietnam. So it's not about overt brute force as much as economic and political colonization – and for this to work we have to, basically, bribe the rulers of any given country in order to secure their cooperation, while at the same time overtly “respecting” the “independence” of the country in question. And at the same time we have to coordinate with international organizations like the World Bank and the International Monetary Fund, because they have their own agendas – their own empires, if you will (I leave out the U.N. because it's basically become the court eunuch of the planet). And the goals? Basically the same as always – “trade”, which means exploitation to a greater or lesser degree, and political cooperation, i.e. don't get too friendly with any communists who might be lurking about, and keep any rebels and insurrectionists at bay (with the help of our military, if needed – but usually on a covert basis).


So the plunder continues – and it appears that sub-Saharan Africa is the most prominent example. How does the man on the street in Africa benefit from his government's “cooperation” with America (you know, the dictator who used to stash his bribes in Swiss banks, although maybe the Cayman Islands are the hiding place of choice now)? In many cases, enslavement on the same level, or nearly so, as in days of old when the colonial powers were issuing stamps with the name of his country on them. Or, at the very least, questionable benefits or a break-even situation where they're neither better off nor worse off for our involvement. And behind it all is – shocking, I admit – a kind of newly-minted racism on the international scale – as if to say, well, technically these people aren't inferior to the white race (PC check-off), but they really aren't ready for full self-determination (AKA “democracy”) as yet, so we're going to help them along. Help them in the usual way, that is – by supporting home-grown tyrants and doing battle with insurgents and rebels (who may be closer to “the people” than the tyrants are). (Any wonder why we actually have troops stationed in places like the Central African Republic, that most Americans don't even know exist? Here's your answer.)


So yes, the more colorful and stylish colonial era is long gone – as are the glories of the British, French, Spanish, etc. empires. The King of England is no longer the King of India. And so forth. But the Third World is still there, and it is still among the “done-to” as opposed to the “doers-to” (that would be us, sorry to say), although some countries are struggling, with mixed success, to overcome their Third World status – India comes to mind.


But wait! There's more. (And I'm not talking about steak knives.) A funny thing happened, just in the last few years. The denizens of the Third, AKA exploited, done-to, World started catching on – not to their sorry lot, which they've been aware of for generations, but to the fact that they could escape. Escape, that is, on foot or by boat or airplane (or surfboard, for all I know) from their ill-starred native land to – guess where? Yes! To the very land of their oppressors, their exploiters – the gold mountain, the promised land. Irony much? And yet it's happening before our very eyes on a daily basis. And all it took, really, was a bit of consciousness raising – perhaps not intentional so much as the overwhelming influence of news and entertainment media. These folks didn't all of a sudden acquire the resources with which to buy plane tickets, or boat tickets, or to pay smugglers – all they did was realize that it was possible. So now the world (literally) is pouring across our southern border and there's no political will to stop it – because... well, maybe it's some kind of guilt. Maybe it's the feeling that our karma is catching up with us. Maybe we genuinely feel that letting the world in the door will improve our lives in some way, or at least give us more respect. At any rate, it's happening, and all the quibbling about costs vs. benefits won't stem the tide. It is, arguably, one of the most significant human migrations in modern times (excepting war refugees, even though some of the current migrants are in that category as well as the economic one).


And what about the people who are paying the price for all this – in violence, competition for jobs, clashes of cultures, “no-go” zones in large cities, infrastructure costs, social programs, opportunity costs (dealing with refugees vs. improving or even maintaining the standard of living), etc.? Well, they don't count, as our politicians and their media facilitators tell us on a daily basis. Much better to be “compassionate” and “welcoming”, and so on, than to try and preserve what's left of the culture most of us grew up with and always assumed would last indefinitely. Because, after all, anyone with those outmoded ideas is, by definition, a racist/fascist/you name it. There is no more comfortable “majority”; what we have is a majority of minorities. Diversity is not a goal or ideal, but a fact.


But again – as always – who pays the price? The ruling elite in their gated communities and Martha's Vineyard mansions? The corporations in their blue-tinted towers? Surely you jest. It's the average Joe, the man on the street – the “deplorables” – who are seeing their way of life crumbling, their world view challenged, their welfare threatened, their prospects narrowing or vanishing. But how many of them connect the dots, i.e. from this to the politicians whom they persist in voting into, or keeping in, office? Very few – because, again, the propaganda machine is permanently set on “anyone who questions any of this is a racist, fascist, etc. and deserves to be shunned”.


The world is being remade before our eyes, and it's – oddly enough – the “little people” from elsewhere on the planet who are doing it – the residents of the Global Village. The formerly dispossessed, done-to, exploited, bottom-rung people have become, in the aggregate, our “influencers” and tastemakers. They are voting, and have already taken over in many parts of the country. They own the streets, and are taking over the airwaves as well. (To become a stranger in a strange land – the one I was born in – is a bit disorienting. It appears that if I ever belonged somewhere, I now belong nowhere, and am only in the way.)


But is this truly something new under the Sun? Well, mass human migrations are as old as human history, and in fact older. When it comes to world history, instability seems to be the rule – which is why it's kind of hilarious when those in charge try to impose arbitrary borders on, basically, borderless groups of people, as happened in the Middle East, Africa, and elsewhere. There are no more “no man's lands” – everything is on Google Maps, as if to say “This is the way the world is, and this is the way it's going to stay, and if you don't like it you can just leave.” But human nature – especially as expressed in societies, races, large numbers – has no interest in that sort of ossification. We are migratory creatures, after all. If we didn't come from somewhere else, we had an ancestor who did. So yes, this concept of “Native American”, or “native” anything, misses the point. Does anyone have a “right” to be where they are? I think the most we can say in this regard is that there is a “right of conquest”. If someone, at some point, took possession of a given piece of land, and is able to defend it, and their descendants are able to defend it, then that comes as close as anything to being a “right”, and being entitled to protection by the government. But if that government, or regime, should change, or if waves of “aliens” descend on that place, then all bets are off. Then we are back in a more primitive time, a Mad Max world, where everything has to be defended at all times, and nothing can be taken for granted. And this is where our so-called “leaders” seem to be taking us – into an age which is anarchistic in some respects but totalitarian in others. Property rights are in jeopardy, but the rules for proper behavior – and proper thinking – are more stringent than ever.
In this sense, we come to resemble, more and more each day, those “Third World” peoples from whom we had always thought we were maintaining a comfortable distance – except that they are now here, and we are becoming them.


Wednesday, November 16, 2022

Trump 2.0? Eh... not likely

OK folks, time for a reality check. Trump says he's running for president in 2024. Fine. Presumably he'll be running as a Republican. Fine. (I guess he could run as an independent – he might even get on the ticket!) But consider a few of the hurdles he will have to face. He can stage all the rallies he wants, but when it comes to “debates”, guess what – it's the party that decides who gets to participate, and the Republicans could simply refuse to let him in the door. That's number one. Then we have the primaries. Did you know that there is no requirement for primaries? It's not in the Constitution, or anywhere else. The party can decide to have a primary, or it can just skip primaries altogether and go right to the convention. Then there's the small matter of convention delegates and how they're selected. The state committees can simply refuse to send any pro-Trump delegates. And then, in the wildly improbable event that Trump wins a plurality of votes in the convention, those votes can simply be declared null and void, and the convention can become “brokered” (formerly known as “the smoke-filled room”).

Now this, of course, is all predicated on the premise that high-ranking Republicans have... um... you know, those particular masculine anatomical parts, which, as they have demonstrated over and over again, they do not possess. But really now – does anyone actually expect Orange Man to rise again from the depths, like Godzilla, and take over the Republican Party again? Or for them to allow it to happen? This is, of course, the recurring nightmare of the Republican mainstream – not of the Democrats, note, despite all their wailing! They know it's a lost cause, but it's more fun to pretend it's not. (They've gotten so used to running the Fear Machine that they can't resist using it on themselves.) So all of the hand waving, running in circles, and nervous breakdowns in both parties are no more than theater (but if it keeps the MSM busy for 2 years that could be a good thing). The Republicans have had enough of show biz. They'll nominate some gray nonentity who will be certain to lose to Joe Biden (even if the latter is ruling from an oxygen tent at that point), and thus be able to return to their comfort zone of powerlessness.
(But – BTW – don't think that BLM and Antifa are going to take this sitting down. They are primed and ready for the next fight. Expect them to show up in force at any Trump event until this quixotic candidacy is terminated, either voluntarily or by force.)

Monday, September 12, 2022

Autism and Asperger Syndrome


The question arose as to whether one could, or should, label a certain individual "autistic". Here are my thoughts on the matter.


Autism vs. Asperger Syndrome


I think this reflects an unfortunate problem with terminology. This is nothing new with the medical profession, which is always redefining ailments, sometimes for good reasons based on research and clinical observations, but sometimes with an agenda – typically having to do with things like research funding, medical insurance, certifications (of doctors, hospitals, medical schools), etc. – even politics. Everyone wants to “belong” – to be part of the “in group” – and medical professionals, being only human, are no different.


Autism:


It wasn't all that long ago (as recently as the 1960s, and maybe more recently) that “autism” described a well-known set of symptoms and conditions. It was typically diagnosed at an early age (pre-school or even infancy), and found more in boys than in girls for some reason (I don't think they've figured out that part of it yet – it probably has to do with differences in brain and neurological structures). Typical symptoms included inability to relate emotionally (and therefore socially) to others, including one's own parents... no signs of affection... minimal or no verbal communication... low threshold for over-stimulation (by lights, sounds, other people, activities, etc.)... what verbalization there was tended to be “flat”, i.e. uninflected or monotone... a tendency toward repetitive activity (concentrating on one thing for hours at a time)... physically passive in some cases, in other cases a tendency toward rapid, random and unfocused movements... basically just out of contact, in their own world much or all of the time. (Paradoxically, while not showing signs of obvious affection, some autistics can be physically “clingy”, which I take to be based on a need for contact comfort – after all, if you don't understand the world and it doesn't understand you, some sort of physical comfort and security can be good.)


And this was – as one might imagine – a pretty easy condition to spot. The problem came not with diagnosis but with notions as to causality. For a long time, blame was placed on “cold, uncaring, non-nurturing” mothers – this has been debunked, fortunately, but it caused a lot of stress and heartache in many families. (If anything, there might have been some degree of causality in the other direction, i.e. the mother of an autistic child might have distanced herself to some degree as a matter of emotional defense, as if to say “if the child doesn't care about me (or anyone else) why should I care, or pretend to care, about him?” Thus, a way of avoiding or lessening chronic emotional stress and frustration.)


In terms of relating to the world, autistic people typically showed little or no competence, and therefore could never be left to their own devices for long, and certainly could never have been expected to live independently or make a living. So they always had to be cared for by others – and since they were incapable of showing much appreciation for that care, it could be a cause of frustration on the part of the caregivers.


But here's an interesting part. Some autistic individuals showed remarkable talents in certain very narrowly defined areas – especially music, and particularly piano playing. They could do things like hear a piece played on the radio or a record, and reproduce it perfectly on the piano after just one hearing. Some were also very good at certain mathematical operations, figuring out calendar dates, counting by just glancing at an array of objects, etc. – all having to do with numbers, you'll notice. Numbers in the basic sense, not concepts or theories or models, just plain numbers and things that had a mathematical basis. They may also show remarkable abilities in memorization – things like sequences of cards, phone books, train schedules, etc. So in that sense they (some, but not all) had extraordinary abilities in a very limited area, but when it came to everyday things not so much (being unable to dress themselves or perform any but the most rudimentary personal care actions, e.g.).


So this was the picture when it came to autism and autistic individuals – easy to spot, well-defined set of symptoms, incapable of independent living, and so on. And as to treatment, the best bet was always to find things that they would respond to, that would “wake them up”, so to speak – and let them spend time with those things, and not worry about the rest. And the condition, however it came about, was not amenable to cure – it was a fixed condition, basically, which would persist throughout adulthood.


Asperger Syndrome:


Now – somewhere along the line, someone decided that that substantial group of people who were, among other things, socially awkward, “shy”, over-sensitive to sounds and light, who avoided crowds (and other people in general, in some cases), who enjoyed finely-detailed activities and could concentrate on them for long periods of time, who tended to be socially isolated or prefer the company of others like themselves, who tended to be uncommunicative or, on the other extreme, talk people's ears off about some very narrow topic, who could be somewhat OCD – and so on – had a “syndrome” called Asperger Syndrome.


Now, this was all well and good, in that it, for one thing, provided a basis for understanding that there were people who were simply “that way”, and that while intensive therapy or interventions weren't generally called for, certain kinds of support and, if you will, “benign tolerance” would make life easier for everyone. The danger, however, was that once you define something as a “syndrome”, you are, by implication, saying that a person isn't “right”, or that they're handicapped in some way, or need help, etc. In other words, they're no longer on the same spectrum as “normal” people but need to be given special attention (which should be positive, but which can also be negative). On the plus side, Asperger “types” can be relieved of the burden of thinking that something is seriously wrong with them, or that it's their fault, or if only they'd get their act together, etc. And in the social sense, Asperger types can form interest groups of various sorts without feeling like a bunch of geeks and losers.


So it's a mixed bag, but overall I'd say the definition of the syndrome has had beneficial effects. It enables people with the syndrome to feel better about themselves, to pursue their interests and emphasize their strengths without feeling like underachievers in other respects... and it enables other people to accept them as they are, and likewise appreciate their strengths and talents, and be willing to overlook areas in which they aren't quite up to par.


The Bad Marriage Between the Two


Everything could have been fine at this point, except that someone – over-functioning in the “syndrome” and terminology department – decided that, because of the observable similarities in symptoms (some, but not all – and certainly not in severity) between autism and Asperger's, they had to be lumped together on a “spectrum”, which became known as the “Autism Spectrum”. So, number one, they're taking a rare subset of people (autistic) and grouping them with a not-at-all-rare subset (Asperger's) and, in effect, calling them all autistic. What sorts of motivations went into this? Well, for one thing, there's the simple matter of money, i.e. funding for research, treatment, therapy, etc. – not to mention health insurance. There was always money in autism, because it was rightly considered a serious condition – but there was little or no money in Asperger's, other than the opportunity to sell books. But lump them together and call it autism, and the money starts to flow. (This may sound a bit cynical, but the extent to which “science” can be tempted by money has been demonstrated many times over the years – and more than ever in these times, with obsessions like “climate change”, gender fluidity, etc.)


Secondly, there's a political, or let's say social, angle to it all, the notion being that autistic people, and their parents and caretakers, won't feel so bad about their situation if they now feel more “mainstreamed”, and therefore accepted. If the truly autistic were a small minority before, they can now feel like members of – still a minority, but a substantial one.


(One could ask, terminology-wise, whether rather than coming up with the “autism spectrum”, they couldn't have just called autism “high-level Asperger's”. It would have made no less sense, but the political and social impact would have been less.)


Plus, there's a pretty good chance that most truly autistic people don't care one way or the other what “spectrum” they're on; some of them don't care about much of anything at all. But the much larger number of people who are Asperger's types, and who know it, and now find themselves on the “autism spectrum”? I can't imagine that's very good for their morale or self-esteem. But we're talking politics here, right? So non-preferred groups always have to make sacrifices, like it or not, in order to benefit preferred groups. (And the fact that this is all about naming, and nothing else, makes it especially cruel and unjust. Terminology can change overnight, and someone who is “sick” one day can be declared “well” or “normal” the next, and vice versa.)


But is it true that autism and Asperger's are similar? Well, yes – in terms of the types of symptoms, but certainly not in degree – and also not in terms of the nuances, or fine points. And also not in terms of the variety of symptoms that might be exhibited by any one individual – Asperger's types have a much more varied repertoire, if you will, within the bounds of that syndrome, whereas true autistics are much more limited. Overall, you can point to social issues, attention factors, mathematically-based interests, responses to the environment, preferred vs. non-preferred activities, and so on. But in terms of self-care, ability to operate in society, ability to earn a living, and so on, it's a world of difference, and it does no one any favors to pretend that it's nothing more than a matter of degree. Plus, one can point to many examples of high-achieving individuals – world-class achievers, in fact – in things like math, physics, music, chess... even the performing arts... and also art, engineering, computing and automation (a veritable den of Asperger's types), and so on. Many have risen to the top of their field. Can the same be said of the truly autistic? No. Some have made contributions – Temple Grandin comes to mind – but this is exceptional. (There's a history of what have been called “idiot savants”, or “calculating boys” who can perform remarkable math operations in their heads with amazing speed – and the chances are those have been largely autistic individuals. The question in those cases was always, given that they have amazing talent in one specific area, is there anything else they can do well, or do at all? And the answer was frequently no. All their brainpower was focused on one thing.)


I also suspect – although exactly how one would measure this is a good question – that if you arrayed all the Asperger's types and the truly autistic along the same scale, you'd get a gradually downward-sloping curve starting at the low end (next to the “normal” population), and there would eventually be a gap, followed by a “bump” or miniature bell curve representing the truly autistic (with their own spectrum, although much narrower than the Asperger's spectrum). In other words, you would find few if any cases where a person was part-Asperger's and part autistic – and I think this would reflect significant differences in brain physiology.