My So-Called Opinions


I.

Critics of the millennial generation, of which I am a member, consistently use terms like “apathetic,” “lazy” and “narcissistic” to explain our tendency to be less civically and politically engaged. But what these critics seem to be missing is that many millennials are plagued not so much by apathy as by indecision. And it’s not surprising: Pluralism has been a large influence on our upbringing. While we applaud pluralism’s benefits, widespread enthusiasm has overwhelmed desperately needed criticism of its side effects.

By “pluralism,” I mean a cultural recognition of difference: individuals of varying race, gender, religious affiliation, politics and sexual preference, all exalted as equal. In recent decades, pluralism has come to be an ethical injunction, one that calls for people to peacefully accept and embrace, not simply tolerate, differences among individuals. Distinct from the free-for-all of relativism, pluralism encourages us (in concept) to support our own convictions while also upholding an “energetic engagement with diversity,” as Harvard’s Pluralism Project suggested in 1991. Today, paeans to pluralism continue to sound throughout the halls of American universities, private institutions, left-leaning households and influential political circles.

However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.

Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which was unavailable to humans for all of prior history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”

These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.

For instance, a student in one of my English classes was chastened for his preference for Shakespeare over the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.

I am not trying to tackle this challenge as a whole or to indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?

II.

In 2004, the French sociologist of science Bruno Latour intimated that the skeptical attitude which rebuffs claims to absolute knowledge might have had a deleterious effect on the younger generation: “Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.” Latour identified a condition that resonates: Our tenuous claims to truth have not simply been learned in university classrooms or in reading theoretical texts but reinforced by the decentralized authority of the Internet. While trying to form our fundamental convictions in this dizzying digital and intellectual global landscape, some of us are finding it increasingly difficult to embrace qualitative judgments.

Matters of taste in music, art and fashion, for example, can become a source of anxiety and hesitation. While clickable ways of “liking” abound on the Internet, personalized avowals of taste often seem treacherous today. Admittedly, many millennials (and nonmillennials) might feel comfortable simply saying, “I like what I like,” but some of us find ourselves reeling in the face of choice. To affirm a preference for rap over classical music, for instance, implicates the well-meaning millennial in a web of judgments far beyond his control. For the millennial generation, as a result, confident expressions of taste have become more challenging, as aesthetic preference is subjected to relentless scrutiny.

Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.

This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.

It seems especially odd because in our “postcritical” age, as the critic Hal Foster termed it, a diffusion of critical authority has elevated voices across a multitude of Internet platforms. With Facebook, Twitter and the blogosphere, everyone can be a critic. But for all the strident young voices heard across social media, there are so many more of us who abstain from being openly critical: Every judgment or critique has its weakness, making criticism seem dangerous at worst and impotent at best.

This narrative runs counter to the one that has been popularized in the press about the indefatigable verbiage of blog-hungry millennials, but it is a crucial one. The proliferation of voices has made most of them seem valueless and wholly interchangeable, even for important topics. To use social media to publicly weigh in on polarized debates, from the death of Trayvon Martin to the Supreme Court’s striking down of the Defense of Marriage Act, seems to do nothing more than provide fodder for those who would attack us. This haunts many of us when we are eager to spill ink on an issue of personal importance but find the page to be always already oversaturated.

III.

Perhaps most crucially, the pluralistic climate has confused stances on moral judgment. Even though “difference” has historically been used, according to the philosopher Cornel West, as a “justification for degradation and a justification for subordination,” we millennials labor to relish those differences and distances separating individuals, exalting difference at all costs.

We anxiously avoid casting moral judgment. With absolute truths elusive, what claim do we have to insist that our moral positions are better than those of someone from a different nation or culture?

Consider the challenge we might face when confronted with videos from the popular youth-oriented news outlet Vice. Here, viewers can watch videos of communities, from across the globe, participating in a host of culturally specific activities, ranging from excessive forms of eating to ritual violence to bestiality. While the greater Western culture may denounce these acts, a substantial millennial constituency would hesitate to condemn them, in the interest of embracing “difference.”

We millennials often seek refuge from the pluralist storm in that crawlspace provided by the expression “I don’t know.” It shelters the speaking-subject, whose utterances are magically made protean and porous. But this fancy footwork will buy us only so much time. We most certainly do not wish to remain crippled by indecision and hope to one day boldly stake out our own claims, without trepidation.

Zachary Fine

Zachary Fine is a junior at the New York University Gallatin School of Individualized Study.