
Among other, graver things, I will remember the summer of 2018 as the summer during which we tried to stop using plastic straws. This was largely at the behest of the #StopSucking campaign, spearheaded by actor-slash-environmentalist Adrian Grenier, previously best known for his starring role in HBO’s Entourage. A poignant viral video of a sea turtle having a straw surgically removed from its snout made the internet rounds, and, like that, straws were the new six-pack rings. Savvy to the fact that profit margins wouldn’t bear the strain of demanding that patrons sip directly out of glasses, restaurants and bars catering to the socially conscious found various workarounds. Some places would give you metal straws, opaque and inflexible in a way that immediately made you suspicious about whether or not they could ever possibly be sanitized. Others opted for waxed paper straws, which, in spite of possessing a certain old-timey charm, backfired in all of the ways that a piece of paper submerged in liquid might be expected to fail. Sitting at a café in Brooklyn in August, I desperately raced to down a sixteen-ounce (plastic) cup of iced coffee before a festively striped paper straw dissolved into it completely. No one talked about the cups, which were still just regular commodities; it was only abstaining from straws that would earn you Adrian Grenier’s ruggedly handsome moral benediction. A few days later, at the upstate wedding I was in town to attend, the bride vowed to give up straws as a 21st-century act of honoring and obeisance.

Shortly afterwards, I flew from Chicago to Seattle to visit my parents, who were spending the summer at our cabin on Lopez Island in the San Juan archipelago. The islands dot the Strait of Juan de Fuca, which traces the northwest corner of the contiguous United States and connects the Salish Sea to the Pacific Ocean. To get there you drive north from the city for a couple of hours, through the Skagit Valley at the base of the Cascade foothills, until you get to a former logging town called Anacortes; then you wait in line to catch the ferry. The crossing only takes forty minutes, but once you’re there it feels as if you’ve gone much further: the island is mostly evergreen forest and pastureland, with a year-round population of about 2,500 (not counting the summer people) and not a single stoplight.

When I arrived my parents and I acted out a practiced family routine: spirited and slightly uncivilized political debate over dinner (the positions argued for being left, lefter and leftest), followed by dessert in front of PBS NewsHour, with important personal updates exchanged rapid-fire during pauses in Judy Woodruff’s reportage. There was a lot to catch up on: our geriatric cat’s arthritis seemed to be improving in the warm weather, Costco’s Kirkland Signature house brand had started making a kombucha that they were selling in bulk, and, most important of all, my parents had recently attended a wake for a baby orca.

The story of Tahlequah’s calf had held sway over the Pacific Northwest since she had given birth on July 24th, garnering the same kind of frenzied media attention as the royal wedding had in May. In this case, however, the narrative was unequivocally sad. The calf, a female, had died soon after birth. Had she lived, she would have been the first surviving calf born to the pod of orcas native to the San Juans in three years. Shortly after the birth, the mother whale (labeled J35 but known colloquially by both researchers and the general public as “Tahlequah”) began swimming with the corpse, keeping it afloat by alternately pushing it with her nose, tucking it under one flipper, and balancing it on her head at the surface of the water. Whenever she lost hold of the calf’s decaying body, she dove down to the seafloor to retrieve it. Carrying a deceased calf for a few hours or days is a typical mourning behavior for orcas, but this case was extreme: what media outlets began referring to as “Tahlequah’s tour of grief” continued for seventeen days and a thousand miles, all near the coastline of the San Juan Islands.

The orcas are already celebrities in the Pacific Northwest, where their local public profile vastly outstrips the likes of Adrian Grenier, but the story about Tahlequah’s calf gave the resident pod a brief window onto a wider audience. The narrative was a ready-made melodrama: a mother driven to the brink of suicide out of overwhelming grief for her dead child. It was as if nature had produced a Lifetime Original Movie. Tahlequah’s act of mourning provoked an upwelling of empathetic responses, particularly among women who had lost children themselves. Today.com ran an article with the subhead “I think every baby-loss parent I have ever met relates to Tahlequah.” In December, Washington’s governor, Jay Inslee, who was rumored to be considering a 2020 presidential run, proposed a $1.1 billion budget package aimed at orca recovery.

In the terms of conservation biology, orca whales are charismatic megafauna, meaning big animals that are capable of generating particular interest among human beings. The term was originally introduced as a tag for describing “flagship species,” or the kinds of animals—pandas, elephants, snow leopards—that elicit the kind of emotional response that causes people to give money to a conservation campaign. It was only after the phrase was coined that researchers tried retroactively to nail down the precise nature of this human response and to specify exactly what it means for an animal to be “charismatic.” One study in this vein proposed that animal species be evaluated in terms of the following six traits: rare, endangered, beautiful, cute, impressive and dangerous. While some of these have measurable components, for the most part they are matters of subjective taste. Along these lines, other research paradigms have included even more abstract aesthetic qualities such as “grace” and “majesty.” In rough-and-ready terms, much of animal charisma is bound up with what we can see and what we like looking at.

The sociologist Max Weber defined charisma as “a certain quality of an individual by virtue of which he is considered extraordinary and treated as endowed with supernatural, superhuman, or at least specifically exceptional powers or qualities. These are such as are not accessible to the ordinary person, but are regarded as of divine origin or as exemplary.” The term originated in Biblical Greek, where it referred to leaders who had been favored with divine grace, or charis. In the modern context, we associate it more frequently with cult leaders and people who are good at professional networking. The thread linking all these cases together is the aspect of being deemed outstanding in a way that holds sway over others.

In this light, animal charisma can be construed as a similar form of bias towards what human beings find exemplary, but in the nonhuman world. It is an artifact of the way that we continuously draw distinctions between the ordinary and the extraordinary, and accord special attention and deference to the people and things that stand out to us.

It’s tempting to characterize this way of construing the natural world as a process of anthropomorphizing nature, of attributing human characteristics to the inhuman. But this isn’t quite right. The charismatic point of view is more accurately understood as being “anthropocentric” in a very literal sense: it presents an image of the world from a human perspective, a panorama of how things look to us. While we do care about some natural things because they’re similar to us, we also pay attention to things for a host of other reasons—because they are useful, confusing, trendy, pleasing or particularly disgusting, to name just a few.

Charismatic megafauna are useful for fundraising for specific causes, but less effective in the face of large-scale systemic problems like climate change. It is becoming a truism to remark on the way that the attempt to understand the totalizing character of the climate crisis—the sheer enormity of its reach and impending consequences—often seems to result in indifference and inaction. In response, authors and researchers intent on spurring people to action have employed various motivational tactics.

Works of popular journalism about climate change typically underscore numbers and facts, pairing a litany of statistics with a jeremiad against the failed government policies that lurk behind the numbers. The aim seems to be to convince deniers using evidence backed up by the gold standard of scientific objectivity, while, at the same time, rallying the already faithful by reminding us how long we’ve gone without fixing anything. These pieces instill a uniform feeling of dread in the reader, operating from the apparently unquestioned premise that nebulous anxiety is what the situation demands.

The summer of 2018 yielded an exemplary instance of the genre in Nathaniel Rich’s issue-length essay “Losing Earth,” published in the New York Times Magazine. The article examines what Rich identifies as a crucial ten-year span from 1979 to 1989, during which the climate crisis first fully arrived on the American political scene, and our government first cognizantly failed to do anything to forestall it. Although the historical narrative is well researched and presented, Rich ultimately finds himself running up against the confines of his approach to his subject matter. The reader gets the impression that what Rich finds truly fascinating is less the story itself than the deeper question it raises. That is: How can it be that, even when all the outcomes are understood in advance, human beings continually pick their individual short-term pleasure and gain over their long-term collective survival?

Faced with this question, Rich has a tendency to fall back on a set of gnomic truisms about a perverse and undefined x-factor that he refers to as “human nature.” In one such passage he writes:

We can trust the technology and the economics. It’s harder to trust human nature. Keeping the planet to two degrees of warming, let alone 1.5 degrees, would require transformative action. It will take more than good works and voluntary commitments; it will take a revolution. But in order to become a revolutionary, you need first to suffer.

And again in the concluding sentences of the article:

Human nature has brought us to this place; perhaps human nature will one day bring us through. Rational argument has failed in a rout. Let irrational optimism have a turn. It is also human nature, after all, to hope.

These interludes come (perhaps inadvertently) close to giving the impression that exactly the thing we desperately want to avoid—the elimination of human life as we know it—might in fact be the only viable solution to the problem. If only we could become as “trustworthy” as our technology and our economic models; if only we could behave in accordance with “rational argument”; if, in short, we could just elide the frailties endemic to being human and simply be good, then we could get everything straightened out in short order. Cue laughter from all the generations of dead philosophers in the first circle of Dante’s hell.

Recent academic debates about climate change have challenged this framework, in part by challenging the assumption that there is any such thing as “human nature,” or that “humanity” is the relevant category for debates about climate change at all. Instead, they often point to political economy.

Take Swedish scholar Andreas Malm, who favors the idea that capital rather than humanity writ large is the cause of anthropogenic climate change. In a 2015 piece for Jacobin, Malm criticized thinkers who engage in “undifferentiated collective self-flagellation” while appealing “to the general population of consumers to mend their ways.” For Malm and other ecological Marxists—for example, Alyssa Battistoni, who accused Rich of missing the point of his own story in her response piece, “How Not to Talk about Climate Change,” also published in Jacobin—responsibility for the climate crisis lies with the fossil-fuel industry and its various neoliberal enablers in the public and private sectors. According to this view, the focus on human nature exacerbates and obfuscates the main problem: “Species-thinking on climate change only induces paralysis,” Malm writes. “If everyone is to blame, then no one is.”

It is undeniable that fossil-fuel usage is one of the primary causes of climate change. It is less obvious, however, that replacing the general category of “human nature” with the more historically specific but only marginally less general category of “capital” does much to improve our insight into how we might inspire actual change. Being tasked with overthrowing our global economic system does not, on the whole, seem much easier than changing “human nature.” And, as many have noted, it’s not ultimately clear that the two concepts can be disentangled at all.

Taking up the angle of economic critique permits one particular form of human life to stand in for human life more broadly construed, and it allows for thinking in the optimistic, future-oriented terms of revolution instead of in the more pessimistic, culturally taboo terms of suicide and death.

But this shift of focus away from humanity glosses over one of the overlooked boons of the climate crisis: the clear view it offers of human finitude. Human nature, properly filled out with content, is perhaps not an entirely useless category after all. Climate change, if we can make ourselves pay attention to it, compels us to acknowledge that we as a species are limited, both perspectivally and temporally. Someday, maybe sooner and maybe later, human life as a whole will come to an end. Some time before that happens, all of us alive now will die. And in the time that remains (to borrow some millenarian watchwords), we can either learn to operate within the confines of our limitations—or not.

But while climate change telegraphs the fact of our death to us, it still leaves us with an open question about how to live. It may be, as Heidegger wanted to have it, that there is something more authentic about human life when we contemplate our finitude and come face to face with the primordial anxiety that lies behind the meaningless rote performance of our quotidian routines. But for most of us, even on good days, that’s a bit much. Perhaps we should instead be, as women’s magazines would have it, a little kinder to ourselves.

There is an intellectual distaste for ideology and mass culture that recoils from the facile character of phenomena like Adrian Grenier’s straw campaign and Tahlequah’s tour of grief. The fact remains that stories like these motivate people to act. It may not be dignified to change your coffee order because a hot celebrity recommends it, and it may be narcissistic to work to preserve animal habitats because one whale in particular reminds you of your struggle to conceive—but that doesn’t mean these impulses are worthless. Weber uses the term “divinity” in his description of charisma. The term sounds hopelessly old-fashioned, but perhaps the divine is in fact what the situation demands. Not the all-powerful, transcendent God from the Judeo-Christian tradition, but rather some minor spirits of the everyday variety. The kind of idols that aren’t beneath showing up to us in the guise of animals or celebrities.

There is a dominant cultural narrative that conceives of nature as having been “enchanted”—another appropriation from Weber—before the advent of modern natural science. According to this story, a naïve animism reigned over an idyllic world populated by household gods and monsters until science came along and revealed things for what they truly are. But science is also a human activity: regardless of how useful or destructive it is, it operates within the perspectives of the human researchers who undertake it. If past societies lived in better harmony with nature it isn’t because they saw it more or less truly; rather, they had a more functional grasp on their illusions, a better way of living with the charisma of the natural world.

Acknowledging this doesn’t mean embracing a full-scale back-to-the-land movement; it simply means refining and reinventing ways of seeing from a specifically human point of view. We have never stopped enchanting, venerating and mythologizing things; we’ve just gotten worse at it. And this is something we can change; something difficult, but not impossible.

Pacific Northwesterners, particularly in the islands, are known to be susceptible to a kind of spiritualism that is often dismissed as “New Agey.” At the wake, as my parents told it, groups of people gathered on beaches on both Lopez and the neighboring San Juan Island. On Lopez, a member of the local Swinomish Indian tribe led chants and songs. To conclude, they held a seventeen-minute silent vigil: one minute for each day Tahlequah had carried the calf. Writing this in Chicago, it sounds like a frivolous hippie ritual, but in the San Juans it makes sense. The situation is different there: nature still often seems to have the upper hand over civilization. It presses in from all sides, and the vastness of the ocean, mountains and forests makes the manmade things look small, sheepish and temporary.

Once on a summer crossing to Lopez, the ferry came to a halt in the middle of its route. We had intersected two of the orca pods swimming together, and they had the right of way. I walked out on to the deck in time to see the whales—I believe there were at least thirty—leaping and playing all around the ferry in the late evening light. Everyone was smiling; the whales were unmistakably recognizable as a natural expression of joy.

Photo credits: Matt Bins (CC BY / Flickr) ; Marco Vech (CC BY / Flickr); United Nations Photo (CC BY / Flickr)
