
Skip the Intro?


Au Hasard Balthazar (1966)

by Justin E. H. Smith

On Time, Storytelling, and the New “Content” Industry

1. Don’t skip the intro

Is there any bit of popular philosophical wisdom more useless than the pseudo-Epictetian injunction to “live every day as if it were your last”? If today were my last, I certainly would not have just impulse-ordered an introductory grammar of Lithuanian. Much of what I do each day, in fact, is premised on the expectation that I will continue to do a little bit more of it the day after, and then the day after that, until I accomplish what is intrinsically a massively multi-day project. If I’ve only got one more day to do my stuff, well, the projects I reserve for that special day are hardly going to be the same ones (Lithuanian, Travis-style thumb-picking) by which I project myself, by which I throw myself towards the future. If today were my last day, I might still find time to churn out a quick ‘stack (no more than 5000 words) thanking you all for your loyal readership. But the noun-declension systems of the Baltic languages would probably be postponed for another life.

2. What is poetry?

My friend Jeff Dolven has ventured (in recording an upcoming podcast) the idea that poetry is the art form that strives toward instantaneity. In a poem, even if it might take a few stanzas or even a massive epic to establish this desired quality of all-at-once-ness, the work differs from a prose work in that it does not transmit its most basic animating idea through the temporal unfolding of a narrative, but rather seeks to establish the eternity of the characters or kinds it evokes. This remains the case even when there is a significant narrative dimension of the poem, as generally in epic: Odysseus travels and does many things along the way, but the epithets that attach to him, the meter and other formalities, bring him ever back to the same, to what he always was.

3. The eternal tense and the single panel

There is a peculiar rendering in the King James translation of the Bible that appears as an attempt to capture in English something like an absent “eternal tense”: “Before Abraham was,” the English has Jesus saying, “I am” (John 8:58). Why is this not rendered in the sense of the pluperfect, of the more-past-than-thou, “Before Abraham was, I had already been”, or “Before Abraham was, I was already being”? Because, presumably, having been implies the possibility of ceasing to be; being implies only ever more of the same. Only a person who is also divine is in a position to experiment with tense in this way: being divine and human at once, he is also eternal and temporal at once, and has to find a way of speaking that accommodates both the sequence of events that constitute his life, the birth that places his life within a succession of lives, as well as the unchanging fact of his nature.

One of the more profound moments in Family Circus cartoonist Bil Keane’s oeuvre is a single-panel image of one of the boys looking up at the North Star like some li’l suburban magus (if I recall correctly), and declaring: “God invented time to prevent everything from happening at once.” Part of the unintentional grace of this piece of American commercial-folk art is that, even as it recycles in its caption a form of simplified Augustinian-Leibnizian temporal idealism, the thought conveyed in the cartoon plays out, precisely, in a single panel, that strange “poetic” variant on the quintessentially prosaic form of sequential cartoon boxes that emerged with the Katzenjammer Kids. For our boy in the panel, everything really does happen at once — there is nothing to the left or to the right of him, no before or after. The cartoon is an instance of its own idea, like Malevich’s Black Square.

In this light it is perhaps more than mere incongruity-humor that makes the internet-era randomized mash-up of Nietzsche Family Circus work so well: the philosopher and the cartoonist are equally preoccupied with the eternal recurrence of the same, and the ways in which eternal repetition may be reduced to an instant in thought.

4. Collapse

If my own life sometimes feels “poetic” to me, it is because, while indeed this is a life that has been embroiled in its share of common human events (chicken pox, parallel-parking scrapes, hit a deer once…), these events often seem not so much to constitute the life, as rather to give it a pretext for meditating on what is unchanging in it. In moments of lucidity, moments that appear to me to “disconceal” something about the order of reality and my place in it, I tend to apprehend my life as a single panel. This is particularly so when I am thinking about the passage of time and the cultural change it measures, about faded fashions, and experience the “pain of homegoing” we call nostalgia.

It sounds reckless to say this, but I have a distinct memory of clearly knowing, circa 1982, that I was living in “the 1980s”. It’s not just that I knew what calendar year it was and how to place that year in its decade; I mean I knew what the 1980s themselves were, their eternal nature and meaning. We were only a year or two into the decade and that meaning had not yet formed, but I distinctly recall being in the back seat of a friend’s dad’s new Honda Accord. The dad couldn’t figure out how to work the new digital stereo, to pop out the cassette and switch to FM radio. The kid razzed his elder: “Get with the ‘80s, man!” I laughed at his skill and delight in razzing, a veritable prophecy of Bart Simpson to come, but I was also suddenly stunned by the utter reality of this abstract force, “the ‘80s”, that my friend had just invoked by name. As I laughed I pictured skinny ties, bangs pouring diagonally over one eye, Reaganomics, keytars, pastel triangles, the whole delirious scene, all envisioned as if through a Hipstamatic filter on early-eighties mall-developed photographic prints. I pictured all of these things as if their effigies had already been made and propped up in the permanent exhibition of the imaginary museum of shared historical memory.

One possible way of accounting for this strange temporal “collapse” is to hypothesize that it was never really 1982 to begin with, or that it was never only 1982, and that my idea of what 1982 was all about was shaped in part by the idea of it that would only slowly come into consciousness over the subsequent years. I am not asserting that this is the true nature of reality (I’ll leave that to Augustine and Leibniz), but am only saying that there seems to me an important connection between the experience of life in a poetic key and the experience of time crunching down to an instant. To experience the world as structured poetically is to apprehend it all as happening all at once, as collapsing into a single panel.

5. Embellishments

I’ve been reflecting these last days on something I strangely had never noticed before: the strict impossibility of recording your thoughts as they are happening. You just can’t do it, since the very instant you take your thoughts themselves as the object of your attention, you are no longer simply “having thoughts”. If I were accurately recording my thoughts when I set out to record my thoughts, what I would in fact be recording is an account of the effort to pay attention to them, and this would necessarily be discontinuous with whatever I was thinking before I began the effort. In order to truly record your thoughts you would have to be able to split your consciousness into two tracks, and have the one take note of the content of the other. Instead we rely on the next best option available: we stay in the only track we’ve got, and we retrieve thoughts from memory. But memory is slapdash — thrown together to make sense of things, extracted from the flow just to help us get our bearings through narration.

If all recording is recording of memory, and all memory is really just consciousness seeking to fix into place the sort of stable moments that may be seen as constituting a life, it follows that everything we say about our inner lives is fiction, and conscious embellishments —the dad was, it now seems to me, struggling to eject a cassette tape of Pat Benatar’s 1982 album, Get Nervous, only to grow more annoyed when, having popped it out, Midnight Star’s “Freak-A-Zoid” came on the radio— seem scarcely more dishonest than the unconscious ones.

Storytelling, the narrative art, has for most of human history been a communally shared tekhné for orienting the individual lives of a community’s members by placing them in a shared cosmological and historical frame. In this frame, embellishments, even up to the appearance of talking animals (see G. E. Lessing’s 1759 On the Use of Animals in Fables) or the apotheosis of historically real men into gods (in the Sakha Olonkho, the Mongolian name of Genghis Khan appears, but as a divinity rather than as a warlord), may be made without any threat to the register of truth that the narration aims to transmit. In order to establish this register of truth, typically a sort of incantation is made at the beginning, marking a shift to “another time”. This incantation in effect makes all of the temporally unfolding events the protagonists endure into a sort of single-panel artwork, which is to say into poetry.

6. Turned inside out

How do we enter into the ritual time outside of time in which narrative unfolds? One common way is through the evocation of shared “historical” representations of how time began, of the origins of the succession of moments that characterizes our temporal reality: the spirits that moved over the water in Genesis, the cosmogonical preludes that initiate many epic recitations around the world. “My lady Mother Earth,” one Olonkho legend begins, telling of the early moments of the cosmos, “Was still the size of a grey squirrel’s claw / Spreading out and stretching, / Generating and growing, / Like the suede of the ear / Of a two-year-old doe / Turned inside out”.

Even the formulaic markers that initiate non-epic lore-sharing in many languages, the equivalents of “Once upon a time…”, evoking a period in which either the human will had not yet been subjugated to natural forces (“Back in the days when it was still of some use to make wishes,” as the German has it), or in which the laws of logic had not yet become fixed (“There was and there was not,” they say in Lithuanian), seem to be striving towards an eternal-tense plane from which the events subsequently narrated are understood somehow to descend. Even if the bulk of the events related in epic after the initial cosmogonical framing describe the lives of mortals, bound by the law of the excluded middle and all sorts of other natural constraints, still the framing itself places these mortals along a continuum with the divine and unchanging.

A cosmogonical revue, with the relevant foundational events passing in quick succession —a flash, some flagellates, dinosaurs, another flash, ape-men, farm implements, &c.— is a familiar convention in middle-brow entertainments passing as high-brow, most notably Terrence Malick’s The Tree of Life (2011). As a critic pointed out a decade ago, Malick’s opening sequence, from the Big Bang to T. Rex to Brad Pitt, is in fact a dilation upon the maximally panned-out “universal” shot that begins films dating back to the silent era, eventually fixed most iconically in the various renderings of the Universal Pictures logo.

Whether frozen into a logo or stretched out over several images, this convention of “beginning at the beginning” serves to place the human-level story that follows within a fixed and maximally widescale cosmic frame. It is this same frame that the untrained student seeks to establish when, writing a paper on some narrow topic like steel tariffs or Brexit, she feels compelled to declare, “Since the dawn of time…”. G. W. Leibniz himself begins his study of the noble Guelf and Este families’ respective medieval genealogies with an account of the first formation of the Earth and the congealing of the continents out of lava.

These are literally clichés: stock images or incantations that help to move us into a different register of language and that prime us for receptivity to the narrative art, whether of the “fictional” or the “non-fictional” variety. Modern literature and film have sought to banish cliché in favor of “real talk”. Martin Amis declared “war” on cliché. But real talk is useless as a catalyst for transit into another temporal dimension. Literature that has given up on the incantatory function of language has freed itself of cliché, perhaps, but in so doing has simply blended into the “real world” whose richness it was aspiring to reflect.

The problem here is that part of the responsibility for the world’s richness was in any case up to us, and depended on our powers of description, on our attention to the worldly objects of our choosing. This isn’t an anti-realist point, but rather an expression of what I’ve taken to calling “selective realism”. The world is real, but it’s still up to you to pick out the parts of it you wish to acknowledge, and to ignore those you don’t. And here arts and entertainment, by their single-minded obsession over the past century or more with real talk, with sounding the way people sound when they aren’t telling stories, have done a terribly poor job of choosing the objects of reality that they hold up and illuminate.

And now, as the already narrow sliver of reality we care about in our human life on earth gets vacuumed up by a single clever and voracious machine of our own invention, arts and literature are proving little able to do anything more than to mirror back the algorithmic drone logic on which this new machine works.

7. The opposite of poetry

It should be clear by now that I think the new “content” monopoly portends a dark future indeed for the narrative arts. Nowhere is this clearer to me than in the “Skip Intro” option that a viewer is given at the start of a new episode of any made-for-binge-watching series. To skip the intro is not just to “get on with things”, as an innocent viewer might imagine. It is rather to opt out of the transition into ritual time that storytelling requires, and to permit us to misapprehend the nature of the material with which we are engaging. I believe this is an inevitable rather than a merely unfortunate consequence of the reconception of the narrative art as “content”: the hooks and incentives that keep you attentive to it need be no different from the non-narrative gratification mechanisms built into video games and social media.

Social media is pure temporality unstructured by narrative. You move from stimuli to lulls to stimuli, from lulls to lulz and back again, but there is no development or progression or conceivable sense of possible completion. In this sense, social media is also the opposite of poetry, again conceived as the condensing of an idea, perhaps an idea with some narrative dimension to it, into a single instant or point. When we zoom out from the stimuli embedded in “content”, whether the likes of social media or the plot conventions of made-for-binging television, what we see is a uniform mass, a homoeomery, something that has neither the atemporal perfection of poetry nor the purposive temporal structure of prose, but only the unstructured lapsing of time — pure purposeless chronophagy.

8. “The Writers”

Tragically, throughout the collapse of narrative art into content production over the past years, human beings, some of whom are too young to have any serious experience of self-contained narrative artworks, continue to seek narrative cohesion for their own lives from entertainment sources that see them entirely as content consumers and not at all as moral subjects. This process is signaled memetically for example in the figure of the college freshman, sitting in a new dorm, saying or thinking something to the effect that “this show’s new season is just beginning”, where “the show”, pitiably, is the show of the kid’s own small life.

It has become common, in fact, for the sort of people who would never commit to the reality of the fates or the gods, to joke instead about the fiction of “the writers” who shape our lived reality. For example it was often said of 2020 that “the writers really lost the plot this year”. This habit is more revealing than it might appear. Other than the formidable Dick Van Dyke Show decades ago, and a few other small exceptions, up until the last few years the public does not seem to have been generally aware that TV shows had writers at all. When Homer Simpson said something funny in 1995, it was Homer himself who said it, or may as well have been for all we cared to think about the matter. Now by contrast it is common in online chatter about TV shows to hear speculation and authoritative assertions about who was in the writers’ room for a particular episode of a particular show, what that individual writer’s sensibilities are, and so on.

This follows a general trend toward greater competence, or at least the display of an appearance of greater competence, among the online public in domains that would previously have been circumscribed by vocationally determined insider sociolects. Thus not long ago it seems everyone was saying, of certain anticipated high-profile court cases, that the facts would “come out in discovery”, even though most who said this were not themselves attorneys. And who has not heard an aspiring business whiz say, of a niece’s lemonade stand or a spouse’s homemade-jewelry project on Etsy: “Problem is, it doesn’t scale”?

There was a time when we were not generally expected to be up on such things as legal discovery and market scalability. But social media have leveled the walls that once kept the métiers peacefully separated off from one another, and now everyone works, and does not work, everywhere. Among other things, we are all in the entertainment industry now, and good workers, even the ones working on precarious terms (signaling parasocial affinity for programs and characters, in exchange for a potentially monetizable micromeasure of social influence), will know to keep close tabs on the writers’ rooms.

We are thus at a doubly curious juncture, where TV writers have been elevated in the popular imagination to the role of ersatz demiurges, and at the same time the writers themselves have largely shirked the traditional quasi-divine role fulfilled by any storyteller or practitioner of narrative art, shifting instead to the work of mere content production. In this work, any narrative dimension in the final product is, strictly speaking, vestigial, a holdover from an earlier era of narrative art with which the new content only pretends to be continuous. Narrative adorns content in the same way faves adorn a tweet; both function only to maximize “user engagement”, and if either appears to be tailored to the specific affective or aesthetic expectations of the “user”, this is only an appearance.

If this assessment sounds bleak or cynical, consider Amazon’s recent acquisition of MGM for $8.45 billion. Jeff Bezos now holds the rights to numerous treasures of twentieth-century American entertainment, not least Albert R. Broccoli’s almost boutique-style James Bond films with their iconic, mythos-incanting musical opening numbers. Bezos has explicitly stated his intention to “reimagine and redevelop that I.P. [sic] for the 21st century.” On the surface, his idea of what a “good plot” looks like would seem to make twenty-first century content scarcely different from the most archaic and deep-rooted elements of myth and lore. Thus he thinks there needs to be a heroic protagonist, a compelling antagonist, moral choices, civilizational high stakes, humor, betrayal, violence…

“I know what it takes to make a great show,” Bezos has confidently said. “This should not be that hard. All of these iconic shows have these basic things in common.” The problem is that Bezos’s purpose in returning to a quasi-Proppian schema of all possible storytelling is not at all to revive the incantatory power of cliché to move us into the ritual time of storytelling. It is rather to streamline and dynamicize the finished product, exactly as if it were shipping times Bezos were seeking to perfect, rather than the timing of a hero’s escape from a pit of conventional quicksand.

And so the college freshman imagining her life as a show seems doubly sad: she turns to the closest thing we have to new narrative art in order to frame her own life and make it meaningful, but the primary instances our culture yields up to her to help with this framing are in fact only meaningless content being passed off as narrative art. It is no wonder, then, that what she will likely end up doing, after the passing and briefly stimulating thought of life itself as a TV show, is to go back to doomscrolling and vain name-checking until sleep takes over.

9. Just like us!

It is not that imagining your own life narratologically is inherently stunting or harmful. We have long made sense of our own small actions on the temporal plane by conceiving them as shadows or approximations of what the heroes of legends do on the eternal plane. And this continues into the era of cinematic mass entertainment: if I can think of the Big Bang as pushing by inexorable teleology towards the eventual appearance of Brad Pitt, then I might be able to understand myself and orient myself in the world along similar cosmic coordinates. The problems arise when the heroes of the legends, the saints and ancestors, and even the Golden Age movie stars, get sucked down onto the temporal plane and come to be conceived parasocially as being “Just Like Us!” —to use the favored cliché of Us Magazine—, as people with whom we might have been bidirectionally amicable in another possible world, but whom we are almost as happy to imagine asymmetrically as our friends in this one.

The paparazzi who furnished Us with the images by which it documented its claim were the harbingers of a now totalizing image of reality, in which every fictional character, every actor who plays a fictional character, every consumer of the content in which this character figures, adheres to a uniform set of norms and styles of expression. These are dictated by social media, and now even fictional characters must follow them in a rear-guard manner. These characters do not belong to a different plane of reality at all, let alone a higher one. They are after-echoes, just as the content of the courses we teach at the university or the program at the symphony is also shaped by the downstream flow of social media.

The freshman who shares the meme about how her being a freshman feels like the central conceit of some imaginary show is, then, getting something right: the new entertainment industry, and the people whose attention this industry seeks to hold captive, have converged in their parallel evolutionary paths upon exactly the same “feel”: the feel of abandonment, of pointless addiction, of desire for more, without any idea of what eventual fulfillment of that desire would even look like.

10. Iterations vs. remakes

We all know, or ought to know, that movie stars are unexceptional human beings. It is not even clear that they can do the one thing for which they are purportedly famous better than any randomly chosen person could. Whenever I watch a film in which the director makes the “experimental” choice to use non-professionals in starring roles (Carlos Reygadas’s Japón of 2002, Dmitriï Davydov’s Тыалга кытыастар кутаа of 2016 (sometimes identified by the English Bonfire and by the Russian Костер на ветру)), I am struck by how perfectly competent these commoners are — how believable they are as commoners!

I would even note that in what I have sometimes said is my favorite film, when I am called upon to pick a favorite —Robert Bresson’s Au hasard Balthazar of 1966— the lead role is played by a donkey, and that donkey is more compelling and believable, in its enactment of key scenes from the life of Christ, than any human being cast by Martin Scorsese or Mel Gibson ever was. Bresson’s donkey-Christ is compelling mostly because his literally artless acting is in the service of parables already framing and structuring our lived reality — the myths that structure a culture can be variously instantiated, and even a weak signal from the mythical plane is enough to imbue any new retelling of them with the full power of their original iteration.

In the narrative art, retellings draw their power from the original form of the story. In the production of content, by contrast, remakes seek to overcome the original by recasting a mothballed story no one cares about any more —or even worse, a story told in another language— with suitably contemporary design, effects, and idiom. An era of iterations is one that values and knows what to do with the narrative art and its power to structure reality; an era of remakes is one that can’t tell narrative art from content, and that suffers accordingly from a general dearth of artistic resources through which to impose meaningful narrative order on one’s own life.

A formal device whereby we circle back to the beginning and give all eleven sections of this circuitous essay the appearance of unity and all-at-once-ness

I have said that while the commandment to live each day as if it were your last sounds wise, it neglects the fact that life requires a presumptive projection into the future that in part shapes what we do on the present day. This projection is “prosaic”, to the extent that it requires an image of life as a temporal unfolding or progression. Such prosaic projection can overlap in the same person with a “poetic” apprehension of one’s own projects past and present as, in some sense, already complete, already condensed from a divine point-of-view into a single-panel epitome of who you are and what you’re all about. I am not defending a “marks-and-traces” determinism —according to which there are traces in our souls of everything that will ever happen to us— as the correct account of reality, but only trying to discern the poetic state of soul of someone who might find such an account attractive.

As with life, so too is most narrative art an overlapping of poetry and prose: it typically relates a sequence of events, but it also establishes its all-at-once-ness by the repetition of certain formal constraints, and particularly by the evocation of these constraints at the outset. No such constraints are recognized on the content-model of entertainment, which, again, follows the logic of social media rather than of cinema or even of golden-age television.

You might insist that the extent of the rupture is being exaggerated here, that the sort of elements Bezos identifies as constituting “a good series” are not so different from what you would find in The Lone Ranger or some other cliffhanger-dependent weekly entertainment of the 1950s. But this is like comparing Woolworth’s and Amazon. The true function of The Lone Ranger might well have been to sell Alka-Seltzer or cigarettes, but the system was so far from being optimized for maximum extraction of the desired resource (capital, attention) that there was still plenty of time to simply tell the story and to allow a viewer to move in a weekly cycle into high-story mode and back out again.

You remove that pacing, leaving yourself with nothing but pure optimized plot and no limit to a viewer’s dosage of it, and you find that the plot quickly sublates into pure undifferentiated content, impossible to distinguish from the plotless stimulations of the feed. The TV show becomes indistinguishable from the social-media timeline where everyone is busily speculating about the balance of humors in the show’s writers’ room.

In our new intensified stage of the “prosaic age” Walter Benjamin already described a century ago, we have finally obtained “pure prose”: temporal sequences that don’t lead anywhere, and that are entirely unframed by the eternal.


About the Author

Justin E. H. Smith is an author and professor of philosophy in the Department of History and Philosophy of Science at the University of Paris. His book The Internet Is Not What You Think It Is will appear in 2021 from Princeton University Press.

Publication Rights

This essay was first published in Justin E. H. Smith’s Hinternet. Republished with permission.
