Memes, Clichés, Repetition, Terrible Truths, and the Danger of Boredom
“We’re bored. We’re all bored now. But has it ever occurred to you, Wally, that the process that creates this boredom that we see in the world now may very well be a self-perpetuating, unconscious form of brainwashing, created by a world totalitarian government based on money? And that all of this is much more dangerous than one thinks? And it’s not just a question of individual survival, Wally, but that somebody who’s bored is asleep, and somebody who’s asleep will not say no?” — Andre Gregory in “My Dinner with Andre” (1981)
I try not to lecture my students. When I was a high schooler (back in the 12th Century), my life seemed like a never-ending series of lectures. “Okay, Boomer,” I would have said, had that meme existed at the time. (The time was actually circa 1991. Public Enemy and Ministry and Beavis and Butt-Head were the soundtrack.) I got told how to study, how to write, how to live, how to think, what to wear, what not to say, and how not to dream.
Today I’m an anarchist Freirean PotO consciousness-raising revolutionary “each one teach one” educator. I structure my classes around authentic revelation linked to long-term self-interest. The last thing I want to do is waste my students’ time as mine was so often wasted. At the same time, I know that repetition can be useful for retention. I tell my students that everything I know stays in my head because I read it/heard it/saw it 20 times. But I also know that in the 21st century, repetition is a cardinal sin against the multitasking mind.
People turn off the instant they encounter something they’ve already encountered. I’ve seen that ad already — change the channel. I already saw that news alert; move on. We’ve been trained to believe that when we experience something the first time, we take everything we need from it. This is most starkly evident in the classroom (at least, that’s where I encounter it most often), but it shows up everywhere.
There are exceptions to this rule, obviously. We listen to our favorite songs over and over. We binge-watch beloved TV shows straight through for a third or fourth or ninth time. (My faves are The Simpsons and Futurama and 30 Rock and The Office and Parks & Rec.) Also obviously, our society reacts best to things that are familiar. Hollywood re-hashes themes and characters and scenarios we’ve seen a billion times. Experimental avant-garde music doesn’t make it onto the radio. Our political discourse mostly repeats accepted truths.
US culture, therefore, is an odd mix of thrill-seekers who live in bubbles. We seek out the fresh, the new, the different, the unique, the unusual, while never venturing too far from the mainstream. Being bored is bad, but being boring is worse.
Memery Über Alles
21st-Century internet culture has brought an acceleration of these trends, with instant messaging, always-on connectivity, and simple tools for creation + markup. Platforms like Twitter, Reddit, Instagram, and TikTok have incubated a constant — and infinite — stream of images, jokes, puns, video clips, factoids, and bite-size commentaries. At the risk of oversimplification, we’ll refer to these as “memes”. (That term has a variety of specific definitions, but it will suit our purposes here. I hope.)
In 2020, memes are the ubiquitous de facto form of online communication. Flame wars are mostly dead (hey look I memed Princess Bride), discussion forums are dying, and the only thing worth doing in comment threads is bashing the fasc (antifameme) or triggering libz (altrightmeme). It’s all memes, all the time.
This isn’t necessarily a bad thing. Memes are funny. They reward decoding (and encoding, and translation) skills. They are easily shared, and convey large amounts of information in a small space. Memes amalgamate politics, language, advertising, history, pedagogy, sociology, and pop culture.
More to the point, memes are quickly becoming pop culture, and a big part of it. When 30 Rock goofed on the 10-second sitcom Makin’ It Happen in 2007, Fey et al. were serving part parody, part prophecy. Bill Watterson had hit that nail on the head 18 years earlier with this stripmeme, and then later decided he was off by nine seconds.
A few years ago I was shocked to hear a student announce “I don’t watch movies”. I had long abandoned my dream of universal conversations about books in the classroom (a ~40% market share of students knowing Harry Potter notwithstanding), but this was new. Now Star Wars and MCU texts were litterae non grata. How, I Stand-And-Deliver-via-South-Park-memed, do I reeech these keeeds? Oh well, I thought, it’s just one keeed.
But now it’s a constant. It’s lots of keeeds. At least one student in each class doesn’t bother with feature films. They watch YouTube. They chase the ‘gram. They [???]. (When I ask what they actually do when they’re on break or “hanging out with friends”, they shrug.) They absorb culture in tiny nibbles, not 90-minute movie meals — and certainly not five-hour novels.
This is the logical post-postmodern corollary of Jerry Mander’s 1978 hypothesis, written on the eyeballs of all people with decent internet connections, especially young people. (Cf. also Gleick 2000.) It’s not a new language, but it’s a new semantic grammar. Texts fold into themselves and meld into shitposts. Referents appear from nowhere and vanish into nothingness. Feedback loops stay closed and tourists need not apply. The Rise of Skywalker apparently has a line at the start that references the video game Fortnite, but I — a lifelong video game addict — have no idea what it is or what it means. (I get my revenge in class by using the word “fortnight” to mean 14 days. Cf. also 30 Rock’s “one glorious fortnight”.)
And because these memes fly so quickly and so constantly, the prime directive is: Avoid repetition. Reposts get downvoted to hell on Reddit. Stale jokes don’t get retweeted. If biting rhymes is verboten in hip-hop then biting memes (or even recycling the humorous contents of another’s meme) is a technical foul online. Accidental repetition is even worse, because it proves that you’re not keeping up with the mindstream. Okay, boomer? Got all that?
Problem #1: The Shallows
The accele-meme-eration of online discourse has two serious downsides. One is the oft-discussed (but not yet well-researched) danger of The Shallows. A mountain of anecdotal evidence suggests that people who spend lots of time consuming quick bits of info online are unprepared for deeper thinking, longer texts, and complicated ideas. This pattern is especially worrying for the minds of young people, which are now being formed in an intellectual ecosphere of metatexts, hyperlinks, and infinite memery. It seems like I encounter more and more students every year who have serious difficulty approaching things that require deep thought or more than one page of reading.
On the other hand, this complaining is part of a pattern that has existed since the beginning of recorded time. In 375 BCE, Plato referred to “the old quarrel between philosophy and poetry”, and warned against the latter’s emphasis on emotion. “Such poetry,” he said, “mustn’t be taken seriously as a serious thing laying hold of truth.” When novels first appeared in the 17th and 18th centuries, moral crusaders warned about the apocalyptic dangers they posed, especially to women. One of these crusaders, James Fordyce, wrote in 1766 that “she who can bear to peruse [novels] must in her soul be a prostitute, let her reputation in life be what it will”. So maybe the hand-wringing about these “memin’ kids today” is just like that scene in The Breakfast Club.
Meanwhile, my students can locate and absorb information — sometimes using impressive cognition tools — when digging through the noise in search of signals. What took me two hours in the 1990s with the Reader’s Guide to Periodical Literature at my local library now takes a student two minutes. Some of them are experts at using Snopes, Wikipedia, Google, and PolitiFact to separate the informational wheat from the chaff. When I meme [citation needed], they bring receipts.
So maybe the internet is making us all dumber and maybe not. It’s probably making some of us dumber and some of us smarter. (Or more accurately: It’s making us all dumber in some ways, and smarter in others.) It’s hard to tell, and the research is only just beginning. “Experts” on one side are clutching their pearls with hysterical alarm; “experts” on the other side are exploding with enthusiasm for the megaminds we’re all becoming. I’m curious, but I feel like the only thing we can really do is continue to urge kids to read, and tell them: “If you don’t like books, you haven’t found the right books yet.” But there’s another problem, one I don’t know how to approach.
Problem #2: Terrible Truths
In his 2005 commencement address at Kenyon College, David Foster Wallace promised repeatedly that he was not offering “some finger-wagging Dr Laura sermon”. This echoed the lyrics of NWA members on the 1990 anti-gang-violence track “We’re All in the Same Gang” from the West Coast Rap All-Stars supergroup: “Yo, we’re not here to preach because we’re not ministers / We’re telling like it is ’cause Ren and Dre is like sinister”. Nothing could be worse than an adult telling young people what to do, even during a graduation speech or anti-violence rap song. (It’s worth noting that both Dre and Wallace were guilty of violence toward women, so their words have a wretched context.) Besides, they’ve heard it all before, so why bother?
In the same address, however, Wallace explains the value of the cliché as a tool to emphasize “a great and terrible truth”. After reminding his audience that worship of self and stuff will bring ruin, he acknowledges his lack of fresh material:
On one level, we all know this stuff already. It’s been codified as myths, proverbs, clichés, epigrams, parables; the skeleton of every great story. The whole trick is keeping the truth up front in daily consciousness.
This is the second problem of the accele-meme-eration of our culture: If people shut off instantly when things are repeated, then how do we keep these terrible truths up front in daily consciousness? Must we constantly repackage these truths inside different Advice Animals, to make the contents seem fresh?
It’s a special problem for educators, especially teachers like me, who have six different classes of ~25 different students all constantly rotating through the room. If I repeat the phrase “be here now” (perhaps the most urgent of all the great and terrible truths, especially amid a paroxysm of teenage anxiety and depression and self-harm) over and over and over and over and over and over, each kid has still only heard it once. If they — like me — need to hear a thing 20 times to lodge it firmly in their mind grapes, then I need to say it 120 times. But on a block schedule, we only meet 90 times a semester. To complicate it all, I sometimes have the same student in two different classes. Some unfortunate souls must endure three hours with me every other day.
The kids aren’t shy about alerting the world to their boredom when I start reviewing stuff. And I don’t blame them! It gets dull being Aaron in Primer, lip-synching trivia and going through the same conversations again and again. But it must be even worse to endure my logorrhea more than once. On the other hand, isn’t it their responsibility to inject something new into these discussions? Even Abe threw his shoe at the mini-blinds. Sensitive dependence on initial conditions, yo, like Malcolm said in Jurassic Park.
Meanwhile, the pedagogical literature is clear about the need to repeat: We’re supposed to preview, provide essential questions, gather exit cards, and review next time. Does this clash with the 21st Century fetish for non-repetition? Of course. Is there a way out of it? Of course not.
So we educators (and parents too, I assume) are trapped in a paradox, just as we always have been. Do we risk boring young people with our terrible truths, or do we throw away our shot by letting a teachable moment pass? The only difference is the speed and frequency of the antibodies in our body politic. If the television screen is “the retina of the mind’s eye”, then what is the cell phone screen? And what, then, is reality? How do we use it for good, instead of just making money?
The answers might be too boring to explore.