Saturday, October 24, 2015

I Second the Emoticon


It has become a truism, a stereotype, and a source of widespread regret – the picture of a crowd of 20- and early 30-somethings, each staring at their own personal screen, of whatever size, and apparently oblivious to the world around them. They are engaged, certainly – but with a world mediated by electrons, or perhaps consisting entirely of electrons, as opposed to the here and now. And stories abound of traffic and public transportation accidents – some fatal – attributed to “texting” or some other untimely use of electronic gadgetry.

And this does, in fact, seem to be the status quo, as any stroll down a city street, stop at a diner, or pause outside a public school or college will verify. There is no denying that this is how society – at least of the youthful variety – presents itself in our time. But the question is, what does it mean? The conventional wisdom on the matter is that it represents increased isolation – egotism – me-ism... and, in a sense, is a kind of societal autism – vast hordes of people wrapped up in their own private worlds and blissfully ignorant of all else.

And yet, is it really about private worlds? If so, why is the technology referred to as “social media”? It seems that the folks in question are, in a sense, more connected than ever – with the entire world, in fact. Their reach exceeds their grasp, certainly... but can they really be accused of personal isolationism? We all laugh (silently, by and large) at the fat guy in his mom's basement, surrounded by mountains of pizza crusts, who spends all of his waking hours on the Internet. Well, he is physically isolated; that much is certain. And in terms of direct social contacts – the pizza delivery guy hardly counts (even assuming that he's the one who answers the doorbell, and not his mom). And yet in a sense he is reaching out to the world (even to fantasy worlds); he is engaged, he is interacting... and, perhaps most importantly in this age of anonymous violence, he is causing no harm.

Try this on for size. What would this guy have been doing 20 or 30 years ago? (And don't tell me this type didn't exist back then; they always have and they always will.) Would he have been down at the local Elks Club, enjoying brewskis with his lodge brothers? Playing poker? Coaching a Little League team? Making sandwiches at a homeless shelter? My answer is: No. He would have been just as physically isolated back then as he is now, but without even the Internet as an outlet. He would have been more passive – watching TV for hours each day – or, possibly, engaged in some solitary hobby (or for the truly brave and daring, ham radio). Would he have been happier then, or is he happier now? Who can judge? Happiness is, after all, a subjective thing, and, unfortunately, it's highly contingent on how we compare our situation with that of others. One advantage – if you will – of social isolation is that it removes any basis for comparisons of this type. And when someone retreats into their own world (or a world created by others that they have claimed a small part of) they may wind up being, and feeling, quite reinforced for that decision. They may become, as some psychiatrists have speculated, quite happy – ecstatic, even – in that world; the world the rest of us live in, and from which we derive our values, including self-valuation, scarcely exists for them.

Now, this may be an extreme case (but not by much). All I'm saying is that “social media”, while paradoxical in many ways, may not be having the isolating effect they are accused of having by the more – shall we say – naturally extroverted media types. Extroverts -- “party animals” -- just don't get it. They will use the Internet, and social media, as tools, but as far as turning into addicts, no way – they far prefer the flesh-and-blood companionship of other human beings. And this is, in fact, the personality type that has been dominant throughout history, for millennia – ever since “history” began, and probably before. It is only in the last generation, quite literally, that we have experienced the “revenge of the nerds”, where the geeks seem to have taken over the world, or at least large portions of it – Bill Gates being the prime example and the “god above all gods” in that portion of the world where humans interact with electrons (said portion growing larger with each passing day).

And yet, there have always been species of humanity other than the dominant type, and society has generally had a way of accommodating them and using them to its advantage. Wise management dictates that people's strengths should be reinforced more than their weaknesses are punished. And you will not make loners into social butterflies by depriving them of electronic gadgetry and social media. What we are seeing, in my opinion, is actually a kind of blossoming effect, where people who had little or no way of connecting with others without experiencing extreme discomfort can now do so. (Or, as someone once said, an imaginary playmate is better than no playmate at all.) What this means is that our “shy” or “asocial” types (by traditional standards) can now venture out into broad daylight – with fear and trembling, for certain, but they carry with them an indispensable tool – a crutch, perhaps, but it nonetheless constitutes a kind of umbilical cord... lifeline... “bubble”. They can venture into dark corners and be less afraid, because they have their own personal help line.

But why is this? Because it's not obvious. Why would carrying some electronic gadget into the wild and threatening real world make one feel safer? It's not as though one can call 911 and report feelings of low self-worth. What it is – it seems to me – is a kind of longing for company, and companionship, and belonging, and even affiliation (as in “I'm a member of _____”), but that longing has always been thwarted by the down side – having to deal with social ambiguities and the messiness of personal relationships and interaction. Power games, status games, people saying things they don't believe – these are all very confusing and disorienting to certain types of our fellow human beings (call them “Autism Spectrum” types or whatever, but they are who they are). (I've often felt that “hell on earth” for these folks has to be the “cocktail party”, where it's all about social dominance, status, small talk, game playing, and who can talk the loudest... and nothing even remotely genuine.)

So what is accomplished by opting for “social media” over real society? You get to sort out the good from the bad – the pluses from the minuses. You get to connect with like minds, or even soul mates, with minimal risk. (It's so much easier to exit an Internet page than to exit a party!) You get some of the satisfactions – though certainly not all – of social contact and interaction, while minimizing the damages (to yourself, and possibly to others as well).

So it seems to me that the “bottom line” of social media has to be considered on the plus side, because it does expand opportunities for vast numbers of our fellow citizens. Now, having said that, it's also true that for the, let's say, “marginal” types, who are perfectly capable of interacting in traditional ways, social media can become an easy out – a kind of handy escape route with which to avoid responsibility and to achieve emotional isolation. An emoticon is a sterile, shorthand substitute for an elaborate array of facial expressions, body language, and verbal expression – so yes, it can be a convenience for people in a hurry, but it can also be a crutch for lazy people, and yes, it can be habit-forming. It's a miniaturized form of emotional isolation – the ability to express a feeling, or a pseudo-feeling, or a feeling that you think other people think you ought to have but you don't. And – most importantly of all, perhaps – it protects you from feedback – from contradiction... from getting a message that your feelings might be foolish or “wrong”.

And the emoticon is just one of countless tools, appendages, and garnishes that enhance the appeal of the social media. There is also the appeal of total anonymity – available in some applications, not in others. So in a sense, these social media have created, or carved out, a new life style and a new demographic – a silent minority, if you will, who have finally found a voice. And yes, they may shape behavior in some respects – rewarding actions that are compatible with the social media world and punishing others, so that the participants become, in a sense, creatures (if not creations) of the social media. (But aren't we all creatures of technology, communications, and information to some extent? No sense picking out one group and accusing them of being any more passive than the rest of us.)

But as far as causing a major upheaval in the distribution of personality types – no. I don't believe this phenomenon can turn sociable people into isolated wallflowers, hunched over a tiny screen at a corner table in Starbucks. What it may do is expand their options – introduce more (previously unknown) levels of interaction into their lives – and how can this be bad? And it may even be a shield, of sorts, against some unpleasant realization – like, how ultimately dull and uninteresting your date is, and what on earth are you going to do for the rest of the evening? The answer is, you whip out your respective gadgets and all is well. (And we've all seen this on any number of occasions; no sense denying it.)

Any discussion of this type – where a technological revolution of some sort is met with ambivalence – has to at least include the question, would you go back? Would you be willing to wave a magic wand and make it all go away, then have to deal with the consequences? It would take a hard-core Luddite to answer this in the affirmative when it came to social media – and I, for one, am not about to do it.

Tuesday, October 13, 2015

Ayn Rand, Call on Line One


Who hasn't indulged in fantasies that begin with the words, “If I were president...” -- or dictator, tyrant, whatever? Who hasn't made a list of things they would do on Day One of their presidency? Well, I don't know about you, but my first order of business would be to dissolve, disband, cancel, and basically demolish the worst piles of bureaucratic doo-doo in Washington, and at the top of the list – and, let's face it, there is plenty of competition for the top spot – but really, can there be any doubt? The top of the list would have to be none other than HUD – the Department of Housing and Urban Development. It is a child of LBJ's “Great Society” -- and, by the way, where is that Great Society nowadays? Have you seen it lately? I certainly haven't. I think it died somewhere in Detroit – or maybe Chicago, Baltimore, or even Washington itself. It was, of course, a liberal pipe dream – the notion that if you throw enough money in the general direction of a problem, that problem will go away... and the problem in question in this case was “urban blight”, “inner cities”, “decay”, and so on – all of which had powerful racial implications. The reflexive assumption in the glory days of “urban renewal” was that all you had to do was tear down the “ghetto”, build “projects” or subsidized housing where it stood, and... well, this was not part of the original vision, but it's the way things turned out – move all the surplus population out to suburban “developments”. (Urban renewal always results in surplus people, simply because planned developments, even of the high-rise type, wind up holding a lot fewer people per acre than the old, ramshackle ghetto tenements. So the surplus have to be warehoused in outlying areas.) And to see how that all turned out, I offer four simple words: Prince George's County, Maryland.

See, the conceit at that time was that “ghetto” -- AKA “black neighborhoods” -- automatically equaled “misery”. What the liberal think-tankers and urban planners failed to see was that these neighborhoods, as scruffy and edgy as they were, were nonetheless home to coherent communities – to cultures – not of the lily-white kind, but of a kind that, in many ways, met the needs of those who lived there. Tear them down, and build sterile apartment developments (both low- and high-rise) in their place, and you not only fragment the communities and the culture, but you create something much worse in their place – you, in effect, warehouse blacks in new ghettos, which turn out to be much more violent, dysfunctional, and drug-ridden than the old ones because they totally lack psychological, cultural, and physical roots. You create what are, in effect, prisons without walls (except for psychological ones).

And there is nothing theoretical about this – it happened, time and time again, in cities across the country, and rare were the ones that managed to successfully resist the Godzilla-like rampage of the urban renewalists. The bottom line, when thinking about outfits like HUD, is: If you seek its monument, look around – in cities like Baltimore, Washington, D.C., St. Louis, Detroit, Cleveland, and so on. But as always in politics, the rhetoric overcame the reality, and once people realized that their birthright had been confiscated and replaced with a mess of pottage, it was too late.

This is all by way of background. Objectively, HUD is the most ill-conceived, most totally failed, wasteful, and fraud-prone of all government agencies. It has failed not only the country and the taxpayers, but the very people it was supposedly designed to serve. It has been instrumental – perhaps essential – in creating a permanent socio-economic underclass mired in poverty, drugs, and violence. It needs to be laid waste, and its mass of bureaucrats driven from the seats of power – by which I mean, throw them out of D.C. and close and lock the gates behind them. And no “bumping” -- i.e. moving into slots in other agencies based on seniority. Out! Period.

But! On rare occasions, a glimmer of wisdom emanates from even the darkest corners of the collective state. A recent “discovery” -- not that the information wasn't already there for all to see -- concerns people living in subsidized housing whose income exceeds the level required to qualify for said housing. In other words, they earn too much to qualify for this particular variety of handout. And of course the initial response from HUD was to say that, well, you can't just throw people out of public housing when their household income rises above a certain point, because “evicting them could destabilize their progress toward self-sufficiency.” In other words, they might be earning decent money now, but who knows how predictable or stable that is? They might wind up back at poverty level any minute, for any number of reasons. We have to be sure that this higher income level is for real, and not just a momentary phenomenon – like, someone won a lottery or something.

Well, isn't this true of a lot of people? I mean... a person may be middle-class today, but then lose their job, or make a bad investment, or lose a lawsuit, and then they find themselves in the strange new world of poverty, and instantly qualify for subsidized housing. In which case, why not provide subsidized housing to everyone, just to ensure that no one falls through the safety net? You see where this is going.

Well, as it happens, HUD reversed itself on this issue, and is now “urging public housing authorities across the country to kick out tenants who make too much money to qualify for government subsidies” -- which, I guess, means that said authorities have to get hold of everyone's tax return... or at least the returns of the ones who drive a Jaguar to work every day. (And by the way, who earns close to $500,000 a year and yet is satisfied to remain living in subsidized housing? Who are these people?)

But the point about “destabilizing their progress toward self-sufficiency” -- well, that's pure bureaucratic-speak. A non-liberal (a species that does not exist within the confines of HUD) would call it “punishing achievement”, and it's an interesting point. If one accepts that it's the government's job to establish, and guarantee, a certain minimal standard of living – including housing – then the question shifts to one of who should qualify. And we all know (or should) that household income is far from static – it can go up or down, quite drastically at times, depending on the various fortunes of those who contribute to it. And aside from the stability question, do we really want to punish people for doing better? This question comes up all the time in the more general discussion of welfare, of course – the notion that we not only “punish” people by forcing them to seek employment, but also punish them if they should happen to land a decently-paying job – to the point where many in the dependent class (who are not as stupid as their liberal overlords would like to think) have figured out that they're better off not working. (As usual, liberal social policy is predicated on the premise that the recipients of government – i.e., taxpayer – charity are stupid. And while this may occasionally be the case, it's much more likely that they display a steep learning curve when it comes to gaming the system. If there are any “chumps” in our society, it's not ghetto dwellers, but wage earners who persist in voting for politicians who intend to squeeze as much cash out of them as possible and turn it over to someone else.)
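To put rough numbers on the “better off not working” claim, here is a toy sketch in Python. The income limit and subsidy value below are invented purely for illustration (they are not actual HUD figures, and real programs are more complicated), but they show how an all-or-nothing eligibility cutoff can leave a household poorer for earning more:

    # Toy illustration of the "benefits cliff" described above.
    # All numbers are invented for the example; they are NOT actual
    # HUD income limits or subsidy formulas.

    INCOME_LIMIT = 30000   # hypothetical annual income cutoff for eligibility
    SUBSIDY = 9000         # hypothetical annual value of the housing subsidy

    def effective_income(earned: int) -> int:
        """Earned income plus subsidy; the subsidy is lost all at once
        when earnings cross the cutoff (no gradual phase-out)."""
        return earned + SUBSIDY if earned <= INCOME_LIMIT else earned

    for earned in (28000, 30000, 32000, 36000):
        print(f"earn ${earned:,} -> effective ${effective_income(earned):,}")

    # earn $28,000 -> effective $37,000
    # earn $30,000 -> effective $39,000
    # earn $32,000 -> effective $32,000   (a $2,000 raise that costs $7,000)
    # earn $36,000 -> effective $36,000

Under these made-up numbers, a household that moves from $30,000 to $32,000 in earnings ends up $7,000 worse off, and doesn't break even until earnings pass $39,000. That, in miniature, is the calculation the “dependent class” is accused of having mastered.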

Let me tell you a little story that has some bearing on the matter. During my time with the feds, there was a program by which a minority- or woman-owned business would receive preferential treatment when it came to contracting. It was not so much a matter of lowering standards as of eliminating, or reducing, some of the bureaucratic red tape that characterizes the government contracting process – identify a bidder as minority- or woman-owned, and they wound up on a fast track. “Other things being equal”, they would land a contract more readily than other bidders – and, of course, there were plenty of ways to game the system, but that's not the point I want to make.

In one particular case, we started using a woman-owned business for certain types of support functions; they were a new outfit, and no one knew at the time whether they would work out. (They were, in fact, one of those proverbial “kitchen table” outfits started and staffed, at least partly, by housewives.) Well, as it turned out, they were very good at what they did – so good, in fact, that we kept renewing and expanding their contract year after year, until... and this is where the self-defeating nature of the bureaucracy comes in. Because of the terms of the contract, they became too successful – too big, too stable, too much cash flow. It was no longer enough to be woman-owned; they had passed the milestone at which that no longer mattered. So from then on they had to compete on an equal footing with everyone else – which would still have been OK, because they were very good. But, lo and behold, along came a new cohort of minority- and woman-owned businesses, all hungry, all anxious to establish a spot at the government trough, and guess what, the formerly-disadvantaged outfit started to lose bids to the new, still-needy ones – and some of those were, quite frankly, wildly incompetent and staffed by idiots. (It was at this point that we realized how lucky we had been with the first outfit.) So, basically, we had to start accepting sub-standard work in the name of compassion – and this, I would say, typifies government operations at all levels. (And one might say, well, but isn't every government program a jobs program, after all? And where does quality of work fit into that? The answer is, it doesn't. But on any given day it's more pleasant to be working with competent people than with clueless ones. And it's more satisfying to turn out, or at least oversee, a good product than a mess. But this would be -- I hasten to add -- an atypical attitude within the bureaucracy.)

So you see, preferences... set-asides... quotas... they giveth, and they taketh away. And quality of work is no object. As I've said before, every government program is a jobs program – no exceptions! But within that iron rule, some people wind up more qualified for jobs than others, and it has very little to do with experience, competence, or anything else other than being in the right place at the right time (and the right gender, race, ethnic group, and size – and, soon, sexual preference).

So to get back to HUD – I actually understand their point about not punishing achievement, even though they'll never call it that. Someone down there on 7th Street SW has a brain that functions to the extent of questioning why people should be rewarded for non-activity but punished for self-improvement. And yet, overall, the government juggernaut, which is designed to provide jobs, will scarcely be swayed by this rare insight.