
The Case Against the Grammar Scolds

The lexicographer Kory Stamper’s book, “Word by Word: The Secret Life of Dictionaries,” is an eloquent defense of a “live and let live” approach to English.

The Atlantic


Photo by Louis du Mont / Getty.

These are boom times for linguistic pedantry. Never before have there been more outlets for opinionated humans to commiserate about the absurdities of “irregardless” or the impropriety of “impact”-as-a-verb or the aggressive affront to civil society that is the existence of the word “moist.” This is an age that found Bryan Henderson, Wikipedia editor and empowered peeve-haver, taking all the instances he could find of the phrase “is comprised of,” within the vast online encyclopedia, and replacing them with “is composed of” or “consists of”—more than 40,000 word-swaps, in all. It’s an age, too, that found Lynne Truss, author of the usage polemic Eats, Shoots & Leaves, garnering plaudits and book sales by offering readers rallying cries like this one: “If you still persist in writing, ‘Good food at it’s best,’ you deserve to be struck by lightning, hacked up on the spot and buried in an unmarked grave.”

The vitriol is ironic—and, yes, I do mean ironic, Alanis-wise and otherwise—and not merely because it puts the pedants in a precarious place, karmically. (The subtitle of Eats, Shoots & Leaves, one might point out, itself contains a usage error: The Zero Tolerance Approach to Punctuation might more properly be written as “The Zero-Tolerance Approach.”) The irony is broader: To engage in such peevery, playful or otherwise, is also to ignore the long, chaotic, and deeply creative history of the English language. It is to assume that someone’s adherence to the moment’s current rules of usage is a signifier of that person’s education and worth. It is to assume, on the flip side, that to violate those rules is also to commit a very particular kind of violence against English and, by extension, its many speakers.

Well, you know what they say about assumptions. Jane Austen, it turns out, employed the possessive “it’s” in her writing, and still managed to die a relatively dignified death. Thomas Jefferson used such an “it’s,” too. So did Abraham Lincoln, those four score and many more years ago. Even David Foster Wallace, master of contemporary English and self-proclaimed linguistic “snoot,” committed the ultimate usage sin of the committed usage snob: He used “literally,” yep, figuratively.

Was Wallace wrong, or merely prescient? Was he disrupting the English language, or, you know, disrupting it? As Kory Stamper, a longtime lexicographer at Merriam-Webster, suggests: both. And, more to the point, the senses of disruption here can’t be meaningfully distinguished from each other, when it comes to the underlying Darwinism that guides English diction. Stamper’s excellent book, Word by Word: The Secret Life of Dictionaries, is, like a dictionary itself, a composite affair: It’s a memoir that is also an explanation of the work that writing a dictionary entails—and, in both of those things, an erudite and loving and occasionally profane history of the English language.

Between the lines, though, Word by Word is something broader still: It’s a cheerful and thoughtful rebuke of the cult of the grammar scolds. It’s a spirited defense of the messiness and experimentation that allows a language to thrive as a tool for human communication. Contemporary English is what it is, Stamper suggests, not just because the islands of Britain happened, across the distance of history, to have been conquered by speakers of Latin and German and French; English is also English because Shakespeare appreciated a good fart joke, and because Lewis Carroll found the words invented by the time his century came along to be lacking, and because, in 2014, a 16-year-old named Peaches Monroee looked at her image in a car mirror and decided that the best way to describe her perfectly styled eyebrows was “on fleek.” English, like any other language, is a geopolitical phenomenon that evolves by way of individual genius. The scolds are offering top-down rebukes about a language that changes from the bottom up.


And, so, dictionaries. There’s a common assumption, Stamper notes, that such reference volumes—whether they exist in print or online, whether they’re branded Merriam-Webster or American Heritage or Oxford English—operate with grand, hushed Authority, their bulky contents the final words on our current words. Under this view of things, Stamper writes, the dictionary operates as “some great guardian of the English language,” a book that ensures, in its very bookishness, that “the language is thus protected, kept right, pure, good.”

It’s a view that is both rosy and wrong. Dictionaries are human-written documents, with all the subjectivity and fallibility that such production-side origins will entail. (Merriam-Webster’s version happens to be produced not, as one might assume, within a Gothic library, dank with stuffy history, but rather in a beige-toned office, its cubicles outfitted in typical corporate chic, that is located in the post-industrial city of Springfield, Massachusetts.)

Dictionaries, Stamper argues, are the result of art and craft and, above all, dull and dutiful labor. Lexicographers are researchers who pay intense and obsessive attention not just to the rise of new words, but also to the new shadings of old ones. This requires that they be at once passionate about, and objective toward, the vagaries of English usage. They must put aside their individual feelings about “less”-versus-“fewer” and “whether”-versus-“if” and “literally”-as-“figuratively” and simply record the living language as it is being used by their fellow language-users. This requires a type of realism when it comes to both usage and diction. The lexicographer might certainly prefer a world in which “genocide” and “cunt” and “hate crime” need not be enshrined in the dictionary; when the task is to provide a description of a language, though—and, by extension, a description of a world that becomes communal in large part through language—there those words are. And, of course, there they must be.  

And that is because—here Stamper, a wry and charming correspondent, is at her most insistent—dictionaries exist, she argues, “merely to record the language as people use it.” They are, in lexicographical terms, descriptive rather than prescriptive: They describe the language as it is deployed, rather than prescribe how it should be. Dictionaries, in this vision, allow for a kind of linguistic libertarianism: One can follow their conventions, or not. One can disrupt, or not. English is a notoriously irrational language, bound occasionally by sensible rules of grammar but more often by arbitrary norms that have sprung up, proven useful, and then become—a word that has been a popular lookup in Merriam-Webster of late—normalized.

Which is to say that, as Stamper puts it, “your high-school English teachers lied to you.” The rules of English—the words it has created, the grammar it has codified—are “rules” only insofar as speakers and writers find it useful to abide by them. The rise of “y’all” and the decline of “whom” and the reclamation of “queer” and the many, many terms that arose from digital discourse to find relevance in IRL communication: these are innovations that emerged in spite of the norms of American English, and, then, that stuck around precisely because of them. And they are terms that, once they proved their momentary staying power, were dutifully added to the record by the logo-loggers at Merriam-Webster and its fellow dictionaries.

“Good grammar,” however, remains a powerful, and persistent, and troublingly political, idea. It is to some extent a necessary one, sure, lest we have linguistic anarchy. But it can also be weaponized. It can be used not just to connect people, but also to sort them and categorize them and otherwise divide them. Stamper notes that African-American Vernacular English, AAVE, is sometimes interpreted as less than, rather than merely slightly distinct from, other dialects of English. And she suggests the many ways that pointing out other people’s usage errors can function not merely as a nerdy sport, but also as an exercise in elitism.

She also suggests how intimately snobbery itself—grammar, seen as a sign of education and social worth—has been bound up in the overall history of the English dictionary. While “grammar” as we tend to think of it today—eight parts of speech, tidily (and sometimes frustratingly) modular in nature—was first codified in the 2nd century B.C. (and based on, as so many other linguistic edicts would be, ancient Latin and Greek), English, that messy mishmash of a language, wasn’t standardized until the 15th century. It didn’t become the bureaucratic language of England itself until 1417, when Henry V ordered the first bureaucratic record to be made in Britain’s newish vernacular.

The earliest English usage guides—the precursors to dictionaries as we know them today—sprang up as English wealth was expanding from the aristocracy to the merchant class, and doubled as guides to etiquette. They were part of the same prescriptivist impulse that motivates many of the grammar scolds of the 21st century: Daniel Defoe wanted to establish an English “academy” that would inject “purity and propriety of style” into an English language newly legitimized as a literary medium; Jonathan Swift and John Dryden entertained similar aspirations to create early forms of the governmental language protections that France and Spain enacted in attempts to keep their languages “pure.”

The men’s desires for English, which arose from their love for English, were a response to their historical moment: In an age of newfound social and economic mobility, good grammar—good, patriotic English grammar—became seen as a sign not just of one’s education, but of one’s social standing and worth. In 1762, Robert Lowth, the bishop of London, published A Short Introduction to English Grammar: With Critical Notes. Lowth’s preface declared that “it is with reason expected of every person of a liberal education, and it is indispensably required of everyone who undertakes to inform or entertain the public, that he should be able to express himself with propriety and accuracy.” His manual was one of many guides that were sold, Stamper notes, under the general assumption that “good manners, good morality, and good grammar all go hand in hand.”

It’s an idea that remains with us, embedded in Lynne Truss’s attack on the use of “it’s” as a possessive pronoun, and in the tagline of the popular Facebook group The Official Grammar Police (“I am judging you’re your grammar”). It’s an idea that lingers, as well, in the businessman-turned-self-appointed-grammarian N.M. Gwynne’s argument, made in his 2013 book Gwynne’s Grammar: The Ultimate Introduction to Grammar and the Writing of Good English, that good grammar is a means not just to social respectability, but to happiness itself. (“In summary of the proof: Grammar is the science of using words rightly, leading to thinking rightly, leading to deciding rightly, without which—as both common sense and experience show—happiness is impossible. Therefore, happiness depends at least partly on good grammar.”)

These notions of the normality—and, indeed, the morality—of “good English” complicate Stamper’s own professional humility. Language may be a tool of human communication, and dictionaries may simply record the shape of those tools as they exist within a particular moment in time; but the reality, as Stamper well knows, is much more complex than her simple observe-record-define formula might suggest. The history of the dictionary is not merely a history of English being defended, but also a history of dictionaries being debated: What should dictionaries do, actually? Should they prescribe English usage? Or should they simply describe it?


This is the paradox of the English dictionary, and it is one that quietly infuses Stamper’s delightful history of that product: A dictionary is a determinedly apolitical document that will always be, to some extent, determinedly politicized. Its pages, whether tangible or digital, will always offer an awkward interplay between the normal and the normative. Samuel Johnson, the ur-lexicographer (and a laborer who argued that dictionary-writing required “neither the light of learning, nor the activity of genius,” but merely “the proper toil of artless industry”), nonetheless resisted including Americanisms in A Dictionary of the English Language; Noah Webster, for reasons both opposite and identical, resisted including British conventions in his American dictionary.

In 2003, Merriam-Webster, using the typical lexicographical method—observe words in the wild; exhaustively document evidence of new usages; update words’ entries accordingly—updated its definition for “marriage” to acknowledge that same-sex couples were entering that institution. The new definition provoked a write-in campaign to Merriam-Webster, accusing the dictionary of politics-via-lexicon. The anger was extremely predictable. Because—and this is the other thing a history of dictionaries will make abundantly clear—words are not merely words, and not merely tools. They are intimate. They are extensions of ourselves. They are one of the few immediate ways we have to take that small piece of reality that is ours—the mind, the self, the soul, choose whichever word in the dictionary seems most apt to you—and offer it to other people.

Word by Word, as it happens, enters the scene during a time of anxiety about those words—in American English, in particular. “Fake news” and “alternative facts” and the moment’s general epistemic panics come with some other panicky questions: Do words, in the end, matter? Can they be used, still, to share information and change minds? Language has long been a weapon, and a front, in the American culture wars (see: marriage, n., the state of being united as spouses in a consensual and contractual relationship recognized by law). Now, in a very real sense, it is also the thing being fought for. Facts, meaning, a culture enabled by shared truths—these are all, currently, at stake. Dictionaries (and Merriam-Webster, in particular) are doing their own part in all this. They are, in bids for continued relevance, operating more and more as quasi-journalistic outlets. They are functioning not merely as reference works or as histories, but also as soft suggestions that the ground beneath us can still be “common.”

It’s hard to read Stamper’s eloquent love letter to letters themselves and not come away convinced that dictionaries, stodgy and nimble and proud and humble and old and new, are something else, too: metaphors. For linguistic utilitarianism, for the productive interplay between the amateur and the professional, for the creative communal energies that will tend to drive the most thriving cultures. And for the notion that progress has a way of ignoring whatever rules might be enacted to prevent it. “It’s” becomes “its,” and “you’re” becomes “your” becomes “ur,” and lightning fires have not, as yet, burned it all down. And taco, sari, woke, multicultural, love—there they all are, in the dictionary, not just because they are useful vessels of human connection, but also because, in the dictionary, inclusion is the default setting. Diction is more useful when it is varied. Grammar is more fun when it allows for liberties to be taken with it. Language is more effective when it is quirky, and experimental, and above all welcoming—whatever, and irregardless of what, the scolds might have to say about it.

Megan Garber is a staff writer at The Atlantic, where she covers culture.


This post originally appeared on The Atlantic and was published March 16, 2017. This article is republished here with permission.
