The End of Linguistics:

Taking the Language Back from Nature, and from Linguists

There's only one thing that everyone knows about language: that it's a living, growing thing. It seems particularly unfortunate, then, that the notion should be false. This metaphor may once have served some useful purpose; today it is a noxious cloud whose effect is to stifle rational discussion of language. It is heard whenever A questions a usage of B's: someone, usually C, will counter the criticism by reciting the one thing everyone knows, and with that remark, reason flies out the window. You raise your eyebrows at his use of reticent to mean reluctant? You think him ignorant for using disinterested to mean uninterested? You groan because he speaks of running the gauntlet? I tell you in response that language is a living, growing thing; thus I refute pedantry, and carry the day!

So enthralled are we by this metaphor that it is almost painful, like massaging back to life a leg that's fallen asleep, to force oneself to recognize that every point it asserts is false. Language is not living, not growing, and not a thing; it is a vast system of social habits and conventions, inherited from our forebears, and showing every sign of being an artifact rather than an organic growth. Certainly languages, or at least aspects of them, exhibit some changes over time; but then, so do the Dow Jones Industrial Average, the barometer, and the Oregon coastline, none of which we characterize as living or growing. And the most impressive changes observed in any language (loss of inflected forms, of moods, and of tenses) are in the direction of simplification and economy, not of the enlargement, elaboration, and proliferation we call growth. The changes exhibited by our language in historical times make it plain that it changes only when it is changed by us, its users, and never of its own accord, since it has no accord. It is, in fact, inert; we cannot call it dead, primarily because it never lived, but also because we have, rather tellingly, reserved dead language for one whose native speakers are dead.

It is just the vivacity of its users that makes the attribution to inert language of life and growth so nearly irresistible; languages seem to have many of the marks of living, growing things because they are bound up so closely with beings who are indeed alive and growing. At one moment in history, it's true, languages did seem to be possessed of lives of their own, and not to be simply reflections of our needs and desires; they seemed to follow their own developmental laws, independent of our wishes and efforts. But that historical moment, which will be discussed later, has passed, and with it any excuse for succumbing to the Fallacy of Linguistic Autonomy. Over the last few centuries, we humans have begun to take charge of our languages just as we took charge of our food supply in going from plant gathering to agriculture, and as we will shortly be taking charge of our genetic makeup. A modern language changes when we change it, and the metaphor that makes it autonomous only obscures our real task, which is to consider just how and why we do so.

By far the most common kind of change we make to our language today is an addition to its vocabulary, made for the entirely innocent reason that a new creature has been observed in Eden, and Adam is called on to perform again his onomastic function. There may be occasional objections to the particular token Adam chooses, but no one objects in principle to the coining of new words for new things. And apart from this unexceptionable and linguistically trivial kind of change, innovations in language today are due almost always to Simple Ignorance, Social Climbing, Semantic Inflation, Group Solidarity, or Journalistic Convenience.

•  Simple Ignorance is a mistaken but entirely excusable reason for changing the language; at least I hope it is excusable, because I have been guilty of it, and fear that I may be guilty of it again. For some years I thought there was such a thing as infrared (rhymes with impaired) light, because I had seen infra-red printed without the hyphen, and failed to recognize it. For years I gave full value to the syllable -is- in names like Salisbury. And so on; the list of my errors is a long one. (I intrude my personal failings into this discussion because I would have no one think that Simple Ignorance is a bludgeon that I have created to beat others with.)

Most of the time Simple Ignorance can be corrected without arousing hostility in the Simply Ignorant; if the correction is made tactfully, most of the S.I. will accept it without resentment, sometimes even gratefully. But sometimes the would-be benefactor is told that he is a pedant, a reactionary, or some other kind of villain, for presuming to correct a free-born, native speaker of the language. And while he is no villain, it is true that he is sometimes a hero come too late.

It is probably too late, for example, to correct the widespread notion that ilk means type or class, or that a cohort is a buddy, partner, sidekick, or colleague. But if we give up on these points, it is not a triumph of the new and improved over the old and obsolete, nor even a replacement of the old by something just as good; it is simply loss, and a capitulation to entropy or force majeure. The new usage does not even offer the stimulation of novelty, because its users are not aware that it is new; we have all lost, and no one has won. Two kinds of price will have to be paid for this surrender: we will have lost good words for our own later use, and we will have lost yet another small connection to the literature of the past. Losing ilk, we will not know how to convey briefly that we are speaking not of just any Campbell, but of the Campbell, nor will we understand some passages of Walter Scott's; nor will we be able, having lost cohort, to picture what the onslaught of the Assyrians must have looked like. And all we will have in return is one more loose synonym for type or class or kind or category: not much of a testimonial to the worth of a living, growing language.

•  Social Climbing is a less innocent but still forgivable cause of language change. Into this category fall all the usages that we adopt in the hope that they will cause others to think more highly of us: the fancy, half-understood word; the misspelled or mispronounced French and Latin tags that are supposed to make us sound worldly or learned; the misapplied scientific or technical term that is meant to suggest that we are adepts in one of the modern black arts like electronics or programming; the adoption by whites of the vocabulary and speech patterns of inner-city African Americans in the hope of being taken for street-smart dudes; the insider's term that makes it clear that we are savvy, in the know, wised up, witting, clued in, and otherwise among the cognoscenti.

These variants of Social Climbing differ in the amount of damage they do. It probably costs the community nothing if some members, hoping to be taken for movie-industry dealmakers, talk of taking a meeting (or even a meet), or doing lunch, but there is a price to be paid when our fellow citizens talk of making a quantum jump when they mean a very large, significant jump: that practice will cause some to misunderstand the phrase when they see it correctly used, and their misunderstanding could have harmful consequences. (And there is probably no such thing as a perfectly harmless misunderstanding; given the right circumstances, any error can prove disastrous.) When the Social Climbing is downward, a yielding to nostalgie de la boue, it can have even more serious consequences. When Blacks hear whites try to "talk Black," they may feel contempt for those whites, or feel resentment at having what they thought was their private property co-opted (see Group Solidarity, below), or feel rage because they think they are being mocked.

•  Semantic Inflation is the response of the hurried or unimaginative to the facts that figures of speech, like all artifacts, wear out and lose effectiveness over time; that countless voices are competing for public attention, making it necessary, apparently, to scream if one would be heard; and that the substance of what we have to say is often a stale cake that needs tarting up with the most piquant possible frosting. Recognizing these facts, many language users are constantly in search of verbal novelties and new levels of hyperbole to give their utterances some chance at winning our attention, with often ludicrous results.

This is the quest that has so many routinely calling a pretty dress awesome, a becoming hair-do devastating, and a fetching hat divine; that uses genocide for any random massacre, or even for social and political events that involve no deaths at all; that calls a university-town neighborhood known for fine restaurants and fancy groceries the gourmet ghetto. It has produced a generation that has never seen wicked, sinful, depraved, outrageous, or decadent in print except on dessert menus, and might be surprised to learn that they once played other roles. The last word in this direction (one hopes) is the claim made by the half-defiant, half-terrified user of a threadbare figure of speech that it's not a figure at all: "I literally exploded with anger!" says this pathetic seeker of our attention and regard. One bad effect of this is to make the speaker or writer ridiculous; another, more serious for the rest of us, is to rob us of the use of the inflated terms for their older purposes.

There are a few variants or sub-genres of Semantic Inflation that are worth mention before we pass on: one is the turning of what was a merely factual term into a reproach; another, the mirror image of the first, is the dimming of what was a euphemism into a merely factual term. As an illustration of the first, consider the career of stereotype. Once merely a synonym of cliché, it has become a condemnation of any attempt to generalize about any human group, particularly one felt to be in need of protection, with a meaning lying somewhere between unwarranted generalization and false accusation. This represents something close to a complete reversal in meaning: where stereotype once meant a characterization that, although boring, was probably true, it now means one that is perhaps provocative, but false. Here, for example, is a contributor to an Internet discussion group, objecting to a description of librarians as harsh enforcers, within their domains, of the rule of silence: "News flash! The grim librarian is a stereotype." No explicit claim that it is a false stereotype; no need, since stereotype now includes, indeed consists mainly of, the notion of falsity.

The opposite type of change (call it Semantic Deflation), in which a once effective euphemism loses its euphemistic powers and becomes merely the standard term for the unpleasantness it once tried to shield us from, is illustrated by the career of misunderstanding. Would-be spreaders of oil on troubled waters have called so many conflicts misunderstandings where the combatants had a perfect understanding of each other's intentions (as when two dogs fight over a bone) that the term has lost all of its placatory power, and is now just the ordinary term for a quarrel.


•  Group Solidarity is deliberate language change, made for the purpose of affirming our identification with or support of some group at odds with mainstream society. Does that society say homosexual? We say gay. Do you speak of women? We insist on womyn. Do we suspect that people call us, in private, niggers? Then we will call ourselves that in public. And so on. These language changes are political acts; whether one sympathizes with them or not, they are not primarily linguistic developments, any more than throwing a bomb is primarily an exercise in ballistics.

•  Journalistic Convenience is the pressure exerted on the written language (and indirectly on the spoken form) by the exigencies of print, particularly of page and column width. Why do we all now say gay rather than homosexual? Partly to show that we are not bigots, of course; but largely because we have become accustomed to seeing the former in our daily paper. What a blessing for page-layout editors and compositors to have a three-letter word rather than a ten-letter one! But perhaps a mixed blessing for the rest of us. When they see the story head "A raps B," experienced readers know that they are probably going to read merely that A has criticized B, or just refused to endorse B's position unreservedly; but it takes a while to learn that most print organs carry headlines that fit the space more than the story.

All these, of course, are changes merely to vocabulary or to minute and localized points of usage, and utterly trivial in the eyes of linguistic science: mere froth on the surface of the ocean of language. But these are the changes that exercise the public, including linguistic scientists when they are acting simply as citizens and controversialists; these are the ones that cause debate and conflict. Really deep language change, such as a drift from agglutinative to analytic syntax, is, like geological change, something that very few people other than scholars are even aware of, let alone passionate about.

So much for a thumbnail sketch of what I have called the Fallacy of Linguistic Autonomy and the chief modes of modern language change; more needs to be said of them, but this will do for now. The point to note is that none of these modes of language change has anything to do with laws of linguistic science, any more than having one's horoscope cast has to do with laws of astronomical science.

Another prevalent fallacy about language, the Fallacy of Pedantic Persecution, springs from a myth of the kind that we freedom-loving Americans are especially susceptible to: the myth according to which the language and its users are born free and naturally creative, but are everywhere in the clutches of pettifogging, narrow-minded, legalistic, reactionary, and mean-spirited pedants, sworn to thwart aspiring young writers, to blight literature with oppressive and irrelevant rules, and generally to take the bloom off the linguistic and literary roses.

All this superstructure of nonsense rests, at least in part, on the underlying delusion that language itself is a living, growing thing, so this fallacy is not strictly a fundamental one, but derivative of the Linguistic Autonomy fallacy. But it is so rampant a growth that merely killing its root will leave it still in place for a long time to come, living on the vegetable equivalent of stored fat; it must be attacked in its own right, treated as if it were an independently false axiom of the system. And, as noted earlier, the effort necessary to grasp inwardly that language is merely an aspect of human behavior, not an independent entity (that it has no nature, no destiny, no desires, no "genius," no yearning to be free) is so great that we need all the help we can get to make it.

In order to preserve their delusions on this point, would-be linguistic liberators have to fantasize that we are beleaguered by schoolmarms threatening to crack our knuckles with a hickory stick if we split an infinitive, use a double negative, or commit any of the other solecisms that 19th-century teachers were so alert for, and which, no doubt, a few 21st-century ones still are. It is also necessary to cherish the illusion that there are many potentially great writers who have been crushed and silenced by the strictures of these pedants, and that we have been deprived of many a masterpiece by such oppression. The best short retort to these fevered imaginings is that of Flannery O'Connor:

Everywhere I go I'm asked if the university stifles writers. My opinion is that they don't stifle enough of them. There's many a bestseller that could have been prevented by a good teacher.1

Unfortunately there are not enough good teachers, and those angry rebels who would free language from the shackles of the pedants are almost always suffering the aftereffects of an early bout with bad teaching. (Not that the teaching need be bad to arouse student hostility: there is frequently a conflict between the young writer, who generally wants to write stories about himself and his feelings, and the teacher, the possibly very good teacher, who is more interested in teaching him the mechanics of good writing than in giving him an opportunity to "express himself.")

Some of the poor wretches whose job it is to try to turn anti-literate adolescents into tolerable writers fall back, in their desperation, on rules as a way of breaking their students of at least their worst practices. They surely know that no body of rules will ever turn a bad writer into a good one, but the sheer number of poorly prepared students they have to cope with forces them to reach for the crudest and nearest of tools. So they tell their charges not to use the passive voice, or not to end a sentence with a preposition, or not to start a sentence with and, and so on. These rules, even the ones for which there is some rational basis, do not help the poorer students, and infuriate or depress the potentially good ones. And some of these potentially good ones will go on to build a skyscraper of resentment against all rules and all authority on the foundation of the frustration they experienced at school.

These schoolchild grievances remain smoldering in many adult minds, and need only a whiff of oxygen to flare up even after many years. When a short article of mine that incidentally said a good word for rules was published a few years ago in The Atlantic, a number of its readers demanded that I specify just what rules I had in mind, or, assuming that I must be championing the very rules that they had been tormented with as schoolchildren, simply expressed scorn and outrage that anyone could defend such foolishness or wickedness. These critics missed the point, or rather two points: first, it had never been my intention (or duty) in that article to put forth specific rules (its argument was that linguists had no special authority in matters of usage; my remarks about rules were merely obiter dicta), and second, I had in fact given a rather broad hint as to the kind of rules I had in mind when I there recommended Graves' and Hodge's The Reader Over Your Shoulder. But my glancing reference to the notion of rules, to judge by the mail the article provoked, exercised my readers far more than all the rest of my points taken together, controversial though they were.

As if to show that nothing can help one who has started with false assumptions, even the Oxford English Dictionary has been pressed into service to support false conclusions, giving us the Fallacy of the OED. A recurrent scene from the language-usage wars: A decries a usage by a contemporary, B; C rushes to B's rescue by citing the OED to show that the usage is to be found in Beowulf, Chaucer, Shakespeare, and Browning.

The Oxford English Dictionary has been called, and in many ways is, the greatest dictionary ever compiled of any language. But if we ever needed to be shown that the devil can quote scripture, the uses to which this great dictionary is regularly put would do it. Many of its users assume that as the "greatest of dictionaries," the OED is the greatest of authorities on usage, that is, on how the language should be used. In fact, its greatness and its real usefulness involve its complete renunciation of any claim to authority in that sense. It excels simply as a record of where the language has been; its distinction lies in the fact that it records the earliest use its contributors have found of every sense of every word; that, and nothing more. In doing so, it provides the best available answer to questions of the form "When did cattle start meaning just bovine animals?" and "When did road surfaces start being called macadam?"; it is no good at all for questions of the form "Is it OK to use infer and imply interchangeably?" or "Is it ever permissible to use a plural pronoun with a singular antecedent?"

All the OED can tell you in answer to questions of the latter type is "Well, Shakespeare did it once," or "Browning did it regularly; see The Bishop Orders His Tomb at St. Praxed's for the earliest example." These statements, while offering possibly interesting information, are not answers to the questions that were asked; all that these non-answers do is replace one set of questions with another: "Does Shakespeare's use of such-and-such a construction authorize me to use it now?" and "What does Browning's use of that locution have to do with my situation?" And the answer is that Shakespeare's or Browning's use of some construction or locution has little if anything to tell us about usage today, and for two reasons, either of which would suffice.

First, the authors to whom appeal is being made lived and wrote some time ago, and much has changed since their day: spelling and syntax have been regularized and stabilized by printing, near-universal literacy, public schooling, word-processing programs with spell-checking features, and all the other standardizing institutions and technologies that have in many ways fixed the language. We have decided for virtually every word in the language which of numerous ways of spelling it is correct, and we enforce that decision: in the schools with grades, and afterwards by privately downgrading and, sometimes, publicly humiliating those who use a wrong form (former Vice President Quayle may have lost his chance at the presidency of the United States by misspelling potato). We are not quite so settled in matters of grammar and usage, but nearly so; settled enough, at least, to make appeal to the practice of writers of as little as a century ago futile and irrelevant. In short, it simply doesn't matter, for purposes of resolving a dispute over usage today, how Chaucer, or even Dickens, said it.

(There is one way in which the practice of yesterday's writers can and should be a consideration in our disputes over current usage: it is good for us to know Shakespeare's usages, that is, to have them in our passive vocabularies, not because his use of them makes them acceptable by today's standards, but because we want to be able to continue to read him with ease. Our delight in Shakespeare is such that those of his usages that would be called errors if employed today are usages that liberally educated moderns should know, but also know enough not to use in their own writing. There is a great difference between those who want to use the OED's Shakespearean citations to defend dubious modern usages, and those who want to use them to read Shakespeare.)

Second, the use to which Shakespeare and Browning (these names are used as examples only, of course) put the language is usually very distant from that to which you are putting it, unless you too are writing poetic drama or dramatic poetry. The use to which language is put in poetry, romances, belles lettres (De Quincey's Literature of Power) is radically different from that in news stories, magazine articles, business reports, scholarly papers, and other non-fiction (De Quincey's Literature of Knowledge); the difference is as great as that between Bernini's use of stone and a mason's use of stone. Even if Shakespeare were our contemporary, then, his use of language, as an art medium, is so different from that of a journalist or academic writer that citing his practice in defense of theirs is ludicrous.

Another error to which OED idolaters are prone is that of taking the date of a word's earliest citation in that work as the date of its acceptance into common usage, when it may well be simply the date on which some adventurous innovator tried it out, with no one seconding his use for many years after. On the basis of this error, the word in question is then claimed to have been "in use" far longer, or earlier, than the facts warrant, and is deemed to have been a full citizen of the English language when in reality it was still generally unknown, and regarded by most who did know it as an illegal alien (I owe this point to a suggestion by Jacques Barzun).

But despite its irrelevance, the OED nevertheless plays a big part in the running battle that has long been going on between the "prescriptivists" and the "descriptivists" (as the jingle goes), because the former sometimes succumb to the temptation to support their positions on questions of usage by citing what they suppose to be historical facts. And when they do, they sometimes get those facts wrong, and thus allow the descriptivists to triumphantly point out their errors, and make them look silly. The prescriptivist arguing for the preservation of the distinction most good writers of today observe between disinterested and uninterested may, for example, carelessly claim that the distinction has always been observed by good writers; at this, the descriptivist dives into his OED, and emerges to announce gleefully that the prescriptivist is wrong, as indeed he is. The prescriptivists whose pretensions are thus exposed deserve the embarrassment they suffer on such occasions, but that exposure does not refute their position on usage, since that position is not based on linguistic history. In short, being able to write "OED" at the end of one's argument is not like being able to write "QED."

Finally in this cavalcade of error, consider the Fallacy of Linguistic Nihilism. In matters of language usage, as in those of religion, morality, and governance, the theme of our time is one of revolt against authority. No one, contemporary man is resolved, is going to tell him what to do: not church, king, parent, community, school, or books. His sovereign will alone will determine what he does and thinks, and as for what his will will will, and why, why, that is no one's business but his, or its, or... well, whatever. I am not concerned here with the effect this doctrine has on other aspects of human life, but its effect on speech and writing has been what one would expect of a determination to saw off the branch on which one sits.

It has made us simultaneously arrogant and servile: we want to be the final judge of how we say our say, but we also want to be correct; we want to do as we please, but we also want our audience to both understand and respect us. So we demand of our dictionary, for example, that while not daring to tell us explicitly what is correct and incorrect, it should nevertheless guide us to the usage that will achieve for us our double goal of being understood and respected. Rather like our demand that our parents should not restrict us in any way, but should nevertheless protect us from danger, hardship, and embarrassment.

In questions of language usage, the demand that we shall eat our cake, have it, and not grow fat, is especially difficult to meet, because even those of us who are most dedicated to pure willfulness sometimes need to use words that will mean the same thing to others that they do to us. And at least on such occasions we find ourselves wanting to know what is correct (the absence of quotation marks around the preceding word is not an oversight), and we look for the dictionary that we know is around somewhere. But when we find it, our problems are compounded.

Until well into the twentieth century, dictionaries, grammars, and other such guides and reference books were openly and even aggressively prescriptive: they made no bones about telling their readers "This is right" and "This is wrong." Starting about mid-century, however, these authorities lost their nerve, and for the most part have stopped telling us what is right and what is wrong in our speech and writing. The cause of their surrender of authority was twofold: students of scientific linguistics often were successful in undermining their pretensions to exact knowledge (and with them, their self-confidence), and champions of the common man protested any attempt to lay down rules on matters of usage to free-born citizens.

The result is that many books that innocent readers still regard as authoritative are merely archives: they no longer profess to offer expert opinions and judgments; they merely record the phenomena. Their authors read the books and articles we write, listen to the speech they hear us utter, and record our usage as closely as possible in their own books, passing no judgment on any of it. So when the innocent reader turns to one of these works for a rule, or at least a guide to what he should say or write, he is not consulting an expert; he is merely looking into a mirror. Sometimes readers realize that they are being denied guidance, and complain; more often they don't, and take the presence of some locution in such a book for approval. Thus, they may look into some modern reference books and find that they seem to say that imply and infer mean much the same thing, or that disinterested is simply a variant form of uninterested, and the like. The readers are mistaken; the compilers of these books are merely saying that they have observed people writing and talking as if those things were true, and that they, the compilers, are not presumptuous enough to call such things wrong.

The immediate problem caused by this abdication of authority on the part of the authors and compilers of such works (an abdication largely unnoticed by the general public) is that naive readers often think that their questions have been answered when they have in fact only been thrown back at them: they call into the wilderness, "Is it OK to use imply for infer?", and echo answers, "...use imply for infer." Fun, but not informative. The longer-term problem is that caused by all abdication of authority in the absence of a legitimate successor: it does not mean that we are now free; it means only that a new authority, probably less legitimate and less restrained, will now take over.

Someone or something is going to determine the way we speak and write; someone is going to make decisions on usage; there is no such thing as a moratorium or neutral zone in language usage any more than in politics. We cannot defer usage decisions while waiting for linguists or anyone else to perform further research or arrive at consensus; we cannot invent for ourselves a language and a set of rules for using it. We can only choose among authorities, and only among those we know of. And if the authorities on usage are not to be the best writers of the recent past and present, and the critics and teachers with whom we study them, who are they to be? Television personalities? Rock stars? Gangbangers? Funeral directors? Gossip columnists? Telemarketing consultants? Or perhaps just Sir Echo, telling us soothingly that whatever we say is fine, just fine?

The linguistic libertine, who wants all the advantages of absolute freedom plus all those of the coddled child, invariably falls into the hands of one or more of the above-listed charlatans, just as children whose parents have refrained from teaching them a code of behavior, so as to allow them to choose one for themselves when they reach their majority, do not in fact remain in moral limbo until then, but join the local gang and adopt its code. When a dog is left masterless, it does not live independently, but joins the local pack of feral dogs, and submits to its alpha male.

All these falsehoods and errors, and more, victimize not merely ordinary citizens, but the professional students of language, the linguists. Indeed, the linguists may be even more in thrall to these errors than are the laity.

The scholarly study of language seems to have begun (we are dependent here on such texts and inscriptions as happen to have survived) millennia ago, reaching a very high level by the fourth century BC in the work of the Sanskrit grammarian Panini in ancient India. Linguistics in the modern sense, however (linguistics not as grammatical scholarship in the service of religious ritual and the preservation of sacred texts, but as a claimant to the status of science), dates from the late eighteenth century. A convenient starting point is the recognition in 1786 by Sir William Jones of the relationships among Latin, Greek, and Sanskrit that led to the idea of what we now call the Indo-European family of languages. That recognition inspired a century-long effort on the part of linguistic scholars to find other such relationships and families, and that effort bore so much fruit that nowadays the marvel is to find a language that, like Basque, seems to have no relatives.

These triumphs of learning and imagination, in turn, seemed to establish the claim of linguists to be practicing a science: a discipline dedicated to elucidating the laws that govern an order of nature. And so it seemed, for several exciting decades in the latter half of the nineteenth century and the first half of the twentieth, that linguistics, now founded on rock, was taking its place among the true sciences, and could soon be expected to produce the kind of rigorous and illuminating results that the others produced. But although departments and journals of linguistics have sprung up everywhere, the results achieved by what seemed in the early decades of the twentieth century a promising science have been disappointing. No findings even remotely comparable to the triumphs of nineteenth-century linguistics have been made, no great new principles have been formulated, no epoch-making discoveries have been announced.

Linguists have been very busy, but their busyness seems to be about an increasing number of increasingly divergent topics: some are in effect anthropologists, gathering linguistic data from remote peoples, and compiling dictionaries and grammars of languages spoken by small and isolated tribes; some study the minutiae of the grammars of their own or other contemporary languages; some try to find "language universals" among seemingly unrelated tongues; some, following Chomsky, try to find "deep structure" behind language's façade; some create maps depicting the boundaries of various dialectal usages and pronunciations; some try to decipher the remaining fragments of languages that were dead before Troy fell; some study the way in which children acquire language competence, or explore the brain sites that seem to be associated with language acquisition and use; some even study the possible linguistic significance of sounds made by whales and apes, or try to teach these animals the rudiments of human language.

All of these investigations and projects have their interest and value, but they give little promise of converging toward a comprehensive and unified theory of language. The only thing that these studies have in common is that they all deal with language in one way or another, and this, it seems clear, is not enough to unify them and make them all part of a recognizable discipline. There is no human activity that language is not part of; to take everything language-related for one's domain is to take on too much, to cast one's net too wide. In declaring no aspect of language alien to itself, linguistics has shown itself admirably catholic and generous, but undermined its hopes of being a science.

In another way, though, linguistics is too narrow and selective; while speech and writing are important parts of almost every human activity, they are the whole of very few. Human life is shot through with language as good beef is marbled with fat, and as no edible cut can be without fat, nor composed wholly of it (despite the reported tastes of the Sprat family), so no humanistic study can either ignore language or deal with it alone. Linguists have tried to carve from reality a cut consisting of all and only the fat, without regard to where the animal's joints are, or the edibility of the resulting cut. It is as if a group of scholars decided to study all and only the things humans do with their left hands, and then claimed for Sinistrics the status of a science.

A lesser but still serious problem linguists face is one that besets all the human sciences: they are studying creatures who are increasingly aware of being studied, and whose behavior increasingly reflects that knowledge, and subverts those studies. We, the tens of millions who have been to college, all own dictionaries, and sometimes consult them; we buy books on usage and style, try to develop our vocabularies, and read such writers as William Safire. And while we manipulate and toy with language (with no apology for our ignorance of the goals and methods of linguistic science), linguists try desperately to discern behind all this noise the intrinsic laws of language that they hope and believe are there. One can't help sympathizing with them, however misguided one thinks them, just as one might sympathize with would-be herders of cats.

These I believe to be the reasons why linguistics in the twentieth century came up with no triumphs comparable to those of the nineteenth, nor seems likely to in the twenty-first. And if they are, the prospects for a science of linguistics are very dim. The language-stabilizing forces and the five types of human language innovation noted at the outset will have their way, and any underlying purely linguistic laws, assuming they existed, would continue to be as much drowned out by the resultant tumult as would Kepler's laws if every back-yard amateur astronomer could manipulate planetary orbits to suit his notions of elegance. The conditions under which linguistic laws (again, assuming they existed) could be observed are vanishing, if not already gone; it may seem to the linguists that the inmates have taken over the asylum, but it is merely that the guinea pigs are taking over the laboratory. Even in the remotest parts of the earth, it is getting very hard to find naive subjects to observe; nearer home we have wiseguys who are so perverse as to try to turn the tables on the linguists, and feed them "observations" fabricated just in order to mislead or embarrass them.

What the future holds, then, for linguists on the one hand, and lay usage arbiters on the other, is divorce, with the latter getting custody of the language. And not only will linguists lose their bid for recognition as usage experts, they will lose their status as practitioners of an autonomous discipline; linguistics itself will be broken up, and its fragments annexed by its neighboring departments. Some linguistic projects and investigations will be absorbed by anthropology, some by archaeology, and some by ethology; probably the most fruitful projects will be taken over by departments of cognitive development and neuroscience. The good work being done by many linguists at present will go on, but under different auspices, and within different scholarly contexts.

Questions of usage (judgments as to how we should write and speak today) will be recognized as lying within the purview of the general educated public, with philosophers, literary critics, and poets perhaps seen as leaders. We, the new usage arbiters, may occasionally turn for assistance to the findings of what is now called linguistics, if we judge such information to be relevant to our own objectives, but if we do, we will be looking not for judicial rulings, but for expert testimony on technical points, whose value we will assess by our own lights. We will of course have access to the OED and to the works of Saussure and Humboldt and Jakobson and Chomsky and all the rest, but we will regard these great figures as expert witnesses only, and witnesses whose testimony is only occasionally to the point. And in the hands of its most skillful users rather than in those of its academic observers, the language will take on the dignity and efficiency of a tool that is being shaped and wielded by its proper masters.

For one thing, we should see an end to the use of the "living, growing language" fallacy as an excuse for misuse of the language: meaning not the solecisms that allegedly trouble schoolteachers, but that truly dangerous abuse, the building of prejudices into the language so as to shut down criticism before it can even raise its voice. The archetypal example of this stratagem is progressive, which is now generally accepted, even by conservatives, as the proper name of a loosely defined cluster of marxisant views, and tacitly awards those views the twin crowns of virtue and inevitability. For such a linguistic manipulation, designed to bypass critical thinking by building a desired conclusion into the language itself, I propose the term mindshunt, meaning "a term so tendentious that one cannot use it without at least seeming to share the attitude of the term's coiner." (The mindshunt was a well-known stratagem at least as long ago as the middle of the 19th century; here is James Fitzjames Stephen, writing in 1862: "Men have an all but incurable propensity to try to prejudge all the great questions which interest them by stamping their prejudices upon their language.")

The full story of what awaits us when linguistics is absorbed into its neighboring disciplines, as geography has been, and when language change is understood to be not something that hidden laws impose upon us, but something that we bring about ourselves, can only dimly be seen from where we are. But one thing is clear even now: it will bring greater clarity, coherence, and honesty to our cultural life, as is always the result when a process swathed in mystery is brought into the light, and we begin to understand and take responsibility for our own actions.


Article Footnotes

1 Flannery O'Connor, "The Nature and Aim of Fiction," in Mystery and Manners: Occasional Prose, ed. Sally and Robert Fitzgerald (New York: Farrar, Straus & Giroux, 1969), pp. 84-85.