
Amusing Ourselves to Death Summary
Society & Politics

This microbook is a summary/original review based on the book: Amusing Ourselves to Death: Public Discourse in the Age of Show Business

ISBN: 014303653X

Summary

In 1949, George Orwell published his most famous work, “Nineteen Eighty-Four.” A disturbing dystopian novel, the book satirized the dehumanizing trends of the postwar world and presented humankind with a bleak vision of the future, which Orwell envisioned as one of propaganda and surveillance, one of “a boot stamping on a human face – forever.” When the actual year 1984 came and Orwell’s prophecy didn’t, “thoughtful Americans sang softly in praise of themselves.” Back then, nobody doubted that the subject of “Nineteen Eighty-Four” was communism and particularly the Soviet Union, the “evil empire” of Ronald Reagan’s memorable phrase. The West, in other words, hadn’t been visited by dystopian nightmares; the roots of liberal democracy had held.

American media theorist and cultural critic Neil Postman wasn’t so sure – and he made that known at the 1984 Frankfurt Book Fair, as a participant in a panel on “Nineteen Eighty-Four” and the accuracy of the book’s prophecies. In his talk, Postman argued that whereas Orwell’s dark vision may have become reality only for the Soviets, the contemporary Western world was still living in a dystopia, one that was better reflected by another novel, older than Orwell’s but every bit as chilling: Aldous Huxley’s “Brave New World.” Based on this talk, “Amusing Ourselves to Death” is “about the possibility that Huxley, not Orwell, was right.” Get ready to discover why that’s even more frightening.

Two dystopian visions: George Orwell vs. Aldous Huxley

In almost all despairing discussions about the future, people seem to enjoy alluding to two famous dystopian novels: George Orwell’s “1984” and Aldous Huxley’s “Brave New World.” The two authors’ dystopian visions are so often conflated that you might have already heard someone saying “we’re heading toward a brave new world” not long after remarking how Orwellian everything seems or how omniscient and soul-piercing the eye of Big Brother has become thanks to the advancements of technology and social networking services. In reality, however, the futures imagined by Orwell and Huxley could not be more different from each other. Not only did the two authors not prophesy the same thing – their prophecies conflict and are even mutually exclusive.

First and foremost, whereas Orwell assumed that humanity would be overwhelmed by “an externally imposed oppression,” Huxley thought that “people will come to love their oppression, to adore the technologies that undo their capacities to think.” Secondly, whereas Orwell anticipated Ray Bradbury by a few years in fearing the harmful effects of censorship on free speech and progress, Huxley was afraid that there would be no reason for governments to ban books in the future because nobody would bother reading them. Next, whereas Orwell warned against the advent of powerful people who would be able to deprive us of information, Huxley feared the imminent arrival of those who would give us so much information “that we would be reduced to passivity and egoism.” Orwell, in other words, feared that truth would be concealed from us. Huxley, on the other hand, feared that the truth “would be drowned in a sea of irrelevance.”

Orwell also feared that humans would one day become slaves, unable to break free from a captive culture. In contrast, Huxley feared that humans might one day become just too free and, paradoxically, get trapped in the inconsequential goings-on of a global, trivialized culture. In fact, in “Brave New World Revisited” – a nonfiction exploration of the themes of his most famous novel – Huxley explicitly remarked that all those civil libertarians and rationalists who are ever on the alert to oppose tyranny somehow fail “to take into account man’s almost infinite appetite for distractions.” In the same book, Huxley argued that whereas in Orwell’s “Nineteen Eighty-Four” people are controlled by inflicting pain, in “Brave New World,” they are controlled by inflicting pleasure. “In short,” Postman memorably concludes, “Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.” Well, it already has.

Whatever happens in Vegas… happens in the United States as well

In Postman’s opinion, at different times in the history of the United States, different cities have been “the focal point of a radiating American spirit.” For example, in the late 18th century, it was Boston that was “the center of a political radicalism that ignited a shot heard round the world.” In a way, during the struggle for American independence, even Virginians became Bostonians at heart. The spirit is best embodied in a recognizable statue: the Lexington Minuteman monument.

Soon after, all of America symbolically became New York, “the melting-pot” for the “wretched refuse” of the world. The figurative embodiment of this spirit was, obviously, the Statue of Liberty. In the early 20th century, Chicago, “the city of big shoulders and heavy winds, came to symbolize the industrial energy and dynamism of America.” According to Postman, the statue personifying this period hasn’t been built, but should be one of a hog butcher – an appropriate “reminder of the time when America was railroads, cattle, steel mills and entrepreneurial adventures.”

And what about the city that represents the American spirit – and even the global spirit – today? It’s, of course, none other than Las Vegas, Nevada, “a city entirely devoted to the idea of entertainment,” a city that loudly proclaims “the spirit of a culture in which all public discourse increasingly takes the form of entertainment.” Its symbol, appropriately, is “a thirty-foot-high cardboard picture of a slot machine and a chorus girl,” and the very fact that this is our current reality spells doom for humanity – maybe not in that Orwellian boot-on-the-face manner, but certainly in the Eliotian going-down-with-a-whimper way. We’ve become, to borrow Eliot’s title, a waste land.

“Our politics, religion, news, athletics, education and commerce,” ominously comments Postman, “have been transformed into congenial adjuncts of show business, largely without protest or even much popular notice. The result is that we are a people on the verge of amusing ourselves to death.” How did we get here? Well, they say that the only way to understand the present is by knowing the past. So, allow us to paint, with Postman, a quick sketch of the devolution of American culture – and its unguided transformation from one rooted in words to one entirely image-based.

Word-based culture: bestseller pamphlets and lengthy speeches

Back in 1776, Thomas Paine published “Common Sense.” The book sold about 100,000 copies in just three months and almost half a million by the end of the decade. And do you know how many people lived in the United States according to its first census, in 1790? A little less than four million! In other words, roughly one in eight Americans bought and – judging by the fervent debates of the time – read a sermon-like 50-page pamphlet advocating American independence from Great Britain at a time when the idea wasn’t yet given serious intellectual consideration. The only books that are as popular today are cookbooks and epic fantasy novels. But even those are read by far fewer Americans, as a percentage of the total population.

Things remained much the same over the course of the 19th century, during which American culture became word-centered to the point that many intellectuals of the era casually talked in a style that, to today’s readers, would seem pretty much indistinguishable from writing. Just for illustrative purposes, allow us to take you back to the middle of the century. On October 16, 1854, two future presidential candidates – Abraham Lincoln and Stephen A. Douglas – were scheduled to speak in Peoria, Illinois, before a fairly large, cheering crowd. Douglas spoke first, delivering a three-hour address, and then Lincoln, as previously agreed, stepped up to the podium to respond. However, he didn’t – at least not immediately.

Rather, Lincoln proposed that the audience go home, have dinner and then return, because he estimated that he would probably need at least as much time as Douglas to present his counterarguments. In reality, he needed an hour more than his opponent. Postman’s amazement doesn’t stop at the speakers and their ability to speak at such length, largely extemporaneously, on matters as weighty as slavery. “What kind of audience was this?” he asks in awe. “Who were these people who could so cheerfully accommodate themselves to seven hours of oratory?”

The most fascinating part of the story is that neither Douglas nor Lincoln was a presidential candidate in 1854; in fact, neither was even a candidate for the Senate back then. Lincoln, in particular, was an obscure lawyer and Congressional hopeful. And yet, a large number of people chose to spend their entire day listening to his speech. Nowadays, even a 20-minute oral defense of a dissertation seems boring and long to most people. What have we become? And how?

Image-based culture and the devolution of discourse

At the time Lincoln was beginning to climb the political ladder, a businessman named Cyrus West Field was raising money to construct the first transatlantic telegraph cable. Philosopher and naturalist Henry David Thoreau wasn’t a fan of the idea, explicitly doubting the benefits of – and even the need for – such an endeavor. “We are in great haste to construct a magnetic telegraph from Maine to Texas, but Maine and Texas, it may be, have nothing important to communicate,” he wrote in “Walden,” just a few months before Lincoln and Douglas held the first of their hours-long debates.

“We are eager to tunnel under the Atlantic and bring the old world some weeks nearer to the new,” Thoreau went on, “but perchance the first news that will leak through into the broad, flapping American ear will be that Princess Adelaide has the whooping cough.” Even though, to modern ears, Thoreau might sound like a Luddite here, he proved more than right and farseeing on this last point. Namely, he correctly understood that better connectivity doesn’t lead to better communication – quite the opposite: it can cancel communication out.

“The telegraph,” remarks Postman, “made a three-pronged attack on typography’s definition of discourse, introducing on a large scale irrelevance, impotence, and incoherence,” which he deems “the three demons of discourse.” In retrospect, the telegraph was merely the first blow, the one that began the Word’s descent from the throne. The second blow, quite deadly, was dealt by photography at the turn of the century. The third one, the fatal one, came with the advent of television. As soon as the technology of reproducing images became available, advertisers realized that an image is indeed worth a thousand words and that, if properly taken out of context, it can bear the weight of a million more. In that realization, show business was born and Las Vegas was first envisioned.

As images took over culture, so did trivial entertainment. At the time “Amusing Ourselves to Death” was first published, the President of the United States was Ronald Reagan, a former Hollywood movie actor. Postman saw that not only as a symptom of the banalization of intellectualism, but also as a warning that the Huxleyan brave new world was edging nearer. Even smart people, he remarked back in 1984, seem to like stars better than scholars; and more than serious intellectual endeavors, they seem to like entertainment and amusement. So, why should we be shocked that pop stars become presidents, when we ourselves are more interested in people’s faces than in their words?

Word-centered vs. image-centered cultures

To understand the devolution of culture even better, you also need to consider the fact that, before the advent of telegraphy and television, “the printed word had a monopoly on both attention and intellect, there being no other means, besides the oral tradition, to have access to public knowledge.” Consequently, public figures – even the most famous ones – were almost exclusively recognized by way of their written words, not by their looks and not even by their rhetorical skills.

“It is quite likely that most of the first fifteen presidents of the United States,” observes Postman, “would not have been recognized had they passed the average citizen in the street. This would have been the case as well of the great lawyers, ministers, and scientists of that era.” In other words, to think about these great individuals meant, by definition, to think about the books they had written, “to judge them by their public positions, their arguments, and their knowledge as codified in the printed word.”

In contrast, the first thing that comes to our minds when we think of the presidents that came after – Nixon, Kennedy, Reagan (or, to be more contemporary, Obama, Trump, Biden) – is an image, a picture of their face, and most likely one carefully selected to present the respective candidate in the best possible light. Even worse, the same holds true for much smarter people than politicians, such as scientists and inventors. Think of Albert Einstein, for example. You can certainly envisage his face (and you probably do at this moment). But can you see, in your mind’s eye, the covers of his books or explain what’s in them? Of course not. Of his words, almost nothing comes to mind.

That’s not a slight difference, because it affects thinking itself. To think in images means to think in parcels, to think in boxed compartments. The very fact that images pack more information than words means they make it quite difficult for one to think about the details. An image, by design, conceals the pixels that comprise it. A sentence, on the other hand, would be unintelligible unless the words that compose it are comprehensible on their own. Word-centered cultures raise more critical individuals because they make it possible for people to see more. Conversely, image-centered cultures are fertile fields for sowers of prejudice and discrimination. Unfortunately, word-centered cultures are a thing of the past. Image-centered cultures represent both our present – and the future.

The twilight of intellectualism and the three commandments of modern television

The average American spends four to five hours a day watching television. Add to that the hours they spend watching YouTube videos, and Postman’s verdict on our middlebrow age sounds even more damning. Real intellectuals, he says, aren’t produced by the passive absorbing of images, but through a careful and rather dull study of words and numbers and equations. This is not because all videos are inherently bad for education, but because videos that are entertaining stop being educational after a certain point. Put in simpler terms, if you’re studying a subject properly, after some time, the process will stop being engaging and entertaining. That’s simply how learning curves work: the more you know, the more effort you’ll need to make the next step.

“Education philosophers,” Postman explains, “have assumed that becoming acculturated is difficult because it necessarily involves the imposition of restraints. They have argued that there must be a sequence to learning, that perseverance and a certain measure of perspiration are indispensable, that individual pleasures must frequently be submerged in the interests of group cohesion, and that learning to be critical and to think conceptually and rigorously do not come easily to the young but are hard-fought victories.” The philosophy of the education offered by TV programs and YouTube is completely different. It promises fun and amusement, pleasure and leisure. And it operates under the following three commandments:

  1. Thou shalt have no prerequisites. In other words: no prior knowledge is necessary. “There must not be even a hint that learning is hierarchical, that it is an edifice constructed on a foundation,” remarks Postman. “The learner must be allowed to enter at any point without prejudice.”
  2. Thou shalt induce no perplexity. The more complicated a TV program is, the fewer viewers it has. That means that only simple TV programs get financed. “It is assumed that any information, story or idea can be made immediately accessible, since the contentment, not the growth, of the learner is paramount,” explains Postman.
  3. Thou shalt avoid exposition like the ten plagues visited upon Egypt. This one may sound a bit strange, but it is still a pretty straightforward commandment. Let us rephrase it: evade “arguments, hypotheses, discussions, reasons, and refutations” at all costs; even better, sacrifice them at the altar of the narrative. After all, whereas arguments and refutations must be unpacked, stories can be easily packed into boxed compartments. So, try to conduct teaching through them. Combine it, when possible, with dynamic stock images and inspiring music. 

Bread, circuses, gladiators, Soma – and YouTube

Whether you’re watching an Apple commercial or Carl Sagan’s “Cosmos,” whether you’re showing your children a “Sesame Street” episode or reminding yourself of “Star Trek’s” best seasons, never forget that all of these programs are primarily interested in seducing you, in capturing your attention by way of powerful images and stories that need no prior knowledge to be gobbled down at once. Unfortunately, the difference between a TV commercial, a PBS documentary and a binge-worthy Netflix show is not one of type, but one of degree.

In Rome, the rulers used gladiator fights to superficially appease the public whenever they needed it to remain subservient. Juvenal, a Roman poet active in the late first century, even gave a name to this diabolical practice: panem et circenses, “bread and circuses.” Analogously, in “Brave New World,” the World State keeps the citizens of London peaceful by constantly feeding them a calming, happiness-inducing drug called soma. The citizens don’t rebel against anything because they are too distracted to notice there are reasons to rebel. Just like in Rome. In both cases, by satisfying the most immediate and base appetites of ordinary people, the rich and the powerful gave themselves immense space to do whatever they wanted.

Even though at first glance this might sound too dystopian to come true, it’s pretty much the reality we’ve all been living in for a while. YouTube videos and social networking sites have become for us what gladiator fights were for the Roman masses: mere distractions, annoyingly persistent disturbances, pure entertainment. Along the way, we’ve gone a step further, as we’ve managed to transform our entire reality into a reality show, one so good that – just like Jim Carrey in “The Truman Show” – we’ve stopped being aware of it.

Unfortunately, that’s how a Huxleyan dystopia operates: while rotten on the inside, it’s so perfect on the outside that it keeps people from looking any deeper. It’s easy to rebel against an Orwellian totalitarian state, because the enemy is always on the other side – there are boots on human faces and tanks on the streets. In a Huxleyan dystopia, you become your own worst enemy. To resist the powers that be means to resist your own weaknesses and temptations. And to rebel against them means to renounce your own comfort.

A counterintuitive call – for worse TV!

Television and YouTube videos are so bad for society precisely because they look so good to individuals. Think of the news, for one. Why is it always presented by beautiful hosts and announcers? Why are there flashy graphics and soundbites and dramatic music that crescendos right before the presenter utters the first word? Moreover, why is most news related in a story-like fashion? Why is it interspersed with commercials and announcements of other programs? Does it really make sense to hear tragic news about a school shooting between a McDonald’s commercial and a trailer for tonight’s debate?

Speaking of debates, what has happened to public ones? They have been dying ever since newspapers began presenting them as boxing matches, the relevant question being “who will knock down whom?” People don’t win debates anymore – they simply destroy and dismantle their opponents. The victor in a modern debate is never the one with the facts and the arguments, but the one with the jokes and adages, the one who can be described as the better orator. Plato and Aristotle, the twin columns upon which the temple of Western culture was raised, despised such debates, criticizing all those itinerant intellectuals who employed rhetoric to achieve their purposes, and whose main objective wasn’t the truth but to impress or persuade an audience by using the right words and sentences at the right time. Plato and Aristotle called them sophists and couldn’t understand how anyone in love with the truth would ever ask for money to share it with other people.

Unfortunately, today’s world is a world of sophists. Think of supposed modern intellectuals such as Jordan Peterson or Slavoj Žižek. Are they really scholars, or are they just contemporary sophists? After all, they seem to think the way boxers think – in terms of victories and earned money – and strive to retire undefeated. Moreover, if there’s one thing intellectuals should abhor, it is certainly staging shows for the public. Peterson and Žižek relish the act. So do many supposedly religious people, which is why there are 24/7 religious TV channels and thousands of self-proclaimed YouTube preachers. Why? What’s so sacred about the very media that have contributed to the creation of a global society that habitually glorifies show-biz scandals, violence, and sex? Nothing, of course. Well, except for the money, that is.

The good news is that, according to Neil Postman, there is a way out of it all; but it’s counterintuitive. Namely – bad TV. “Television,” concludes Postman, “serves us most usefully when presenting junk-entertainment; it serves us most ill when it co-opts serious modes of discourse – news, politics, science, education, commerce, religion – and turns them into entertainment packages. We would all be better off if television got worse, not better.” Want examples? Postman has them: “The A-Team” and “Cheers,” he says, are no threat to our public health. However, TV programs such as “60 Minutes,” “Eye-Witness News” and “Sesame Street” are. You know their modern correlatives, so there is no need to list them here. There is a need, however, to remind you that it’s not enough to know them – you must also stop watching them. Otherwise, we may well end up amusing ourselves to death.

Final notes

“Amusing Ourselves to Death” is an exquisite book. It is not only well written and well argued, but also illuminating, insightful and immensely powerful. Unfortunately, it is also amazingly far-sighted in its predictions.

“As a fervent evangelist of the age of Hollywood,” wrote Camille Paglia, recently and quite repentantly, “I publicly opposed Neil Postman’s dark picture of our media-saturated future. But time has proved Postman right. He accurately foresaw that the young would inherit a frantically all-consuming media culture of glitz, gossip, and greed.”

Read his words to discover why you must rise up against the temptations of this media culture and why the future of the world depends on your current discomfort. Then renounce your comfort. Because soon it may become too late – for your grandchildren.

12min tip

If you want to change the world, start by turning off your TV and the notifications on your phone. Then, as soon as you feel comfortable, delete all of your social media accounts. Finally, head to the library. To paraphrase Angelica Schuyler, rather than a revolution, we need a revelation. Work!

Who wrote the book?

Neil Postman was an American media theorist and cultural critic. A lifelong educator, he opposed the introduction of personal computers in schools and became one of the pioneers of media ecology theory, the study of how modern technology…