
Why Futurism Has a Cultural Blindspot

We predicted cell phones, but not women in the workplace.

Nautilus



Illustration by Robin Davey.

In early 1999, during halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items “dumb.”

Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History. A headline from The Onion, he notes, sums it up: “Newly unearthed time capsule just full of useless old crap.” Time capsules, after all, exude a kind of pathos: They show us that the future was not quite as advanced as we thought it would be, nor did it come as quickly. The past, meanwhile, turns out not to be as radically distinct as we thought.

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology. We don’t have the personal flying cars we predicted we would have. Coal, notes the historian David Edgerton in his book The Shock of the Old, was a bigger source of power at the dawn of the 21st century than in sooty 1900; steam was more significant in 1900 than in 1800.


But when it comes to culture, we tend to believe not that the future will be very different from the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,”1 people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.”

In one experimental example, people were asked how much they would pay to see their current favorite band perform 10 years from now; others were asked how much they would pay now to see the band that had been their favorite 10 years ago. “Participants,” the authors reported, “substantially overpaid for a future opportunity to indulge a current preference.” They called it the “end of history illusion”; people believed they had reached some “watershed moment” in which they had become their authentic selves.2 Francis Fukuyama’s 1989 essay, “The End of History?” made a similar argument for Western liberal democracy as a kind of endpoint of societal evolution.

This over- and under-predicting is embedded into how we conceive of the future. “Futurology is almost always wrong,” the historian Judith Flanders suggested to me, “because it rarely takes into account behavioral changes.” And, she says, we look at the wrong things: “Transport to work, rather than the shape of work; technology itself, rather than how our behavior is changed by the very changes that technology brings.” It turns out that predicting who we will be is harder than predicting what we will be able to do.


Old School: Although Amazon is experimenting with hi-tech delivery options like drones, many of its “same-day” products continue to be delivered by bicycle. Photo from Gibson Pictures.

* * *

Like the hungry person who orders more food at dinner than they will ultimately want—to use an example from Loewenstein and colleagues—forecasters have a tendency to take something that is (in the language of behavioral economics) salient today, and assume that it will play an outsized role in the future. And what is most salient today? It is that which is novel, “disruptive,” and easily fathomed: new technology.

As the theorist Nassim Nicholas Taleb writes in Antifragile, “we notice what varies and changes more than what plays a larger role but doesn’t change. We rely more on water than on cell phones, but because water does not change and cell phones do, we are prone to thinking that cell phones play a larger role than they do.”

The result is that we begin to wonder how life was possible before some technology came along. But as the economist Robert Fogel famously noted, if the railroad had not been invented, we would have done almost as well, in terms of economic output, with ships and canals.3 Or we assume that modern technology was wonderfully preordained instead of, as it often is, an accident. Instagram began life as a Foursquare-style check-in app called Burbn, with photos an afterthought (photos on your phone, is that a thing?). Texting, meanwhile, started out as a diagnostic channel for short test messages—because who would prefer fumbling through tiny alphanumeric buttons to simply talking?

Transportation seems to be a particular poster child of fevered futurist speculation, bearing a disproportionate load of this deferred wish fulfillment (perhaps because we simply find daily travel painful, a reminder that the word shares its root with “travail”). The lament for the perpetually forestalled flying car centers on childlike wishes (why can’t I have this now?) and ignores massive externalities like aerial traffic jams and fatality rates likely to be higher than those of terrestrial driving.

The “self-driving car,” it is promised, will radically reshape the way we live. That promise forgets that, throughout history, humans have largely endeavored to keep their daily travel time within a stable bound.4 “Travelators,” or moving walkways, were supposed to transform urban mobility; nowadays, when they actually work, they move (standing) people in airports at a slower-than-walking speed. In considering the future of transportation, it is worth keeping in mind that, today, we mostly move around thanks to old technology. As Amazon experiments with aerial drone delivery, its “same day” products are being moved through New York City thanks to that 19th-century killer app: the bicycle.

Edgerton notes that the “innovation-centric” worldview—those sexy devices that “changed the world”—runs not merely to the future, but also to the past. “The horse,” he writes, “made a greater contribution to Nazi conquest than the V2.” We notice what was invented more than what was actually used.


In the same way that our focus on recent innovations causes people to overemphasize their importance, to see them as hastening a radically transformed future—like Google Glass was supposed to—the backward look is distorted so that technologies are rendered prematurely obsolete. The prescience of near-future speculations like Blade Runner comes less from uncannily predicting future technologies (it shows computer identification of voices, but Bell Labs was working on spectrographic analysis of human voices in the 1940s5) than from anticipating that new and old will be jarringly intermingled. Films that depict uniformly futuristic worlds are subtly unconvincing—much like historical period films in which cars on the street are all perfect specimens (because those are the only ones that have survived). Dirt and ruin are as much a part of the future as they are of the past.

People in the innovation-obsessed present tend to overstate the impact of technology not only in the future, but also the present. We tend to imagine we are living in a world that could scarcely have been imagined a few decades ago. It is not uncommon to read assertions like: “Someone would have been unable at the beginning of the 20th century to even dream of what transportation would look like a half a century later.”6 And yet zeppelins were flying in 1900; a year before, in New York City, the first pedestrian had already been killed by an automobile. Was the notion of air travel, or the thought that the car was going to change life on the street, really so beyond envisioning—or is it merely the chauvinism of the present, peering with faint condescension at our hopelessly primitive predecessors?

“When we think of information technology we forget about postal systems, the telegraph, the telephone, radio, and television,” writes Edgerton. “When we celebrate on-line shopping, the mail order catalogue goes missing.” To read, for instance, that the film The Net boldly anticipated online pizza delivery decades ahead of its arrival7 is to ignore the question of how much of an advance it really is: Using an electronic communication medium to order a real-time, customizable pizza has been going on since the 1960s. And when I took a subway to a café to write this article and electronically transmit it to a distant editor, I was doing something I could have done in New York City in the 1920s, using that same subway, the Roosevelt Brothers coffee shop, and the telegram, albeit less efficiently. (Whether all that efficiency has helped me personally, or just made me work more for declining wages, is an open question.) We expect more change than actually happens in the future because we imagine our lives have changed more than they actually have.

* * *

In her book The Making of Home, Judith Flanders describes an offhand reference by the diarist Samuel Pepys, in 1662, to something called a “spitting sheet.” She speculates this was a sheet affixed to a wall near a spittoon to protect wall coverings from an errant spitter. It is an example of what she calls “invisible furniture.” We all know what a spittoon is. And yet, as they scarcely register in literature and are rarely depicted in art, it is easy to overlook just how commonplace the act of spitting was, even in polite society.

Flanders notes that the United States actually used to regulate where spitting was allowed on trains, in stations, and on platforms. A 1917 conference of boards of health, held in Washington, D.C., mandated that “an adequate supply of cuspidors shall be provided” in train cars. Today, both the word “cuspidor” (meaning spittoon) and the object itself have virtually vanished (though Supreme Court Justices still get one). Their disappearance is not because some technology went obsolete. It is because our behavior has changed.

While the technological past and future appear more different from the present than they actually are, it is the cultural differences across time that catch us by surprise. Working as a historical consultant on the video game Assassin’s Creed, Flanders had to constantly remind writers to cut the word “cheers” from the script, because, she told me, “people didn’t use that word until the 20th century.” The writers wanted to know what they did say. “They had huge trouble wrapping their heads around the idea that mostly people didn’t say anything. Giving some form of salutation before you drink is so normal to them, it’s actually hard to accept that for centuries people didn’t feel the need.”


The historian Lawrence Samuel has called social progress the “Achilles heel” of futurism.8 He argues that people forget the injunction of the historian and philosopher Arnold Toynbee: Ideas, not technology, have driven the biggest historical changes. When technology changes people, it is often not in the ways one might expect: Mobile technology, for example, did not augur the “death of distance,” but actually strengthened the power of urbanism. The washing machine freed women from labor, and, as the social psychologists Nina Hansen and Tom Postmes note, could have sparked a revolution in gender roles and relations. But, “instead of fueling feminism,” they write, “technology adoption (at least in the first instance) enabled the emergence of the new role of housewife: middle-class women did not take advantage of the freed-up time ... to rebel against structures or even to capitalize on their independence.” Instead, the authors argue, the women simply assumed the jobs once held by their servants.

Take away the object from the historical view, and you lose sight of the historical behavior. Projecting the future often presents a similar problem: The object is foregrounded, while the behavioral impact is occluded. The “Jetsons idea” of jetpacking and meals in a pill missed what actually has changed: The notion of a stable career, or the social ritual of lunch.

One futurist noted that a 1960s film of the “office of the future” made on-par technological predictions (fax machines and the like), but had a glaring omission: The office had no women.9 Self-driving car images of the 1950s showed families playing board games as their tail-finned cars whisked down the highways. Now, more than 60 years later, we suspect the automated car will simply allow for the expansion of productive time, and hence working hours. The self-driving car has, in a sense, always been a given. But modern culture hasn’t.

* * *

Why is cultural change so hard to predict? For one, we have long tended to forget that it does change. Status quo bias reigns. “Until recently, culture explained why things stayed the same, not why they changed,” notes the sociologist Kieran Healy. “Understood as a monolithic block of passively internalized norms transmitted by socialization and canonized by tradition, culture was normally seen as inhibiting individuals.”10

And when culture does change, the precipitating events can be surprisingly random and small. As the writer Charles Duhigg describes in The Power of Habit, one of the landmark events in the evolution of gay rights in the U.S. was a change, by the Library of Congress, from classifying books about the gay movement as “Abnormal Sexual Relations, Including Sexual Crimes,” to “Homosexuality, Lesbianism—Gay Liberation, Homophile Movement.” This seemingly minor change, much touted by activists, helped pave the way for other, larger changes (a year later, the American Psychiatric Association stopped defining homosexuality as a mental illness). He quotes an organizational psychologist: “Small wins do not combine in a neat, serial form, with each step being a demonstrable step closer to some predetermined goal.”

We might say the same about the future.

Tom Vanderbilt writes on design, technology, science, and culture, among other subjects.

References

  1. Loewenstein, G., O’Donoghue, T., & Rabin, M. Projection bias in predicting future utility. The Quarterly Journal of Economics 118, 1209-1248 (2003).
  2. Quoidbach, J., Gilbert, D.T., & Wilson, T.D. The end of history illusion. Science 339, 96-98 (2013).
  3. Fogel, R.W. Railroads and American Economic Growth: Essays in Econometric History. The Johns Hopkins University Press, Baltimore, MD (1964).
  4. Marchetti, C. Anthropological invariants in travel behavior. Technological Forecasting and Social Change 47, 75-88 (1994).
  5. Solan, L.M. & Tiersma, P.M. Speaking of Crime: The Language of Criminal Justice. The University of Chicago Press, Chicago, IL (2010).
  6. Rodrigue, J.P. The Geography of Transport Systems. Routledge, London (2013).
  7. Crugnale, J. How cyberthriller ‘The Net’ predicted the future of the Web. The Kernel (2015).
  8. Samuel, L.R. Future: A Recent History. University of Texas Press, Austin, TX (2009).
  9. Eveleth, R. Why aren’t there more women futurists? The Atlantic (2015).
  10. Healy, K. Social change: mechanisms and metaphors. Working Papers (1998).


This post originally appeared on Nautilus and was published October 11, 2018. This article is republished here with permission.
