
Is there any way to intervene usefully or meaningfully in public debate, in what the extremely online Twitter users are with gleeful irony calling the “discourse” of the present moment?

It has come to seem to me recently that this present moment must be to language something like what the Industrial Revolution was to textiles. A writer who works on the old system of production can spend days crafting a sentence, putting what feels like a worthy idea into language, only to find, once finished, that the internet has already produced countless sentences that are more or less just like it, even if these lack the same artisanal origin story that we imagine gives writing its soul. There is, it seems to me, no more place for writers and thinkers in our future than, since the nineteenth century, there has been for weavers.

This predicament is not confined to politics, and in fact engulfs all domains of human social existence. But it perhaps crystallizes most refractively in the case of politics, so we may as well start there.

There are memes circulating that are known as “bingo cards,” in which each square is filled with a typical statement or trait of a person who belongs to a given constituency, a mouth-breathing mom’s-basement-dwelling Reddit-using Men’s Rights Activist, for example, or, say, an unctuous white male ally of POC feminism. The idea is that within this grid there is an exhaustive and as it were a priori tabulation, deduced like Kant’s categories of the understanding, of all the possible moves a member of one of these groups might make, and whenever the poor sap tries to state his considered view, his opponent need only pull out the table and point to the corresponding box, thus revealing to him that it is not actually a considered view at all, but only an algorithmically predictable bit of output from the particular program he is running. The sap is sapped of his subjectivity, of his belief that he, properly speaking, has views at all.

Who has not found themselves thrust into the uncomfortable position just described, of being told that what we thought were our considered beliefs are in fact something else entirely? I know I have been on many occasions: to be honest, this happens more or less every time I open my newsfeed and look at what my peers are discoursing about. For example, I admire Adolph Reed, Jr., a great deal; I believe he is largely correct about the political and economic function of “diversity” as an institutional desideratum in American society in recent decades; and I believe, or used to believe, that I had come to view Reed’s work in this way as a result of having read it and reflected on it, and of having found it good and sound.

But then, not so long ago, I happened to come across this from an American academic I know through social media: “For a certain kind of white male leftist,” my acquaintance wrote, “Reed makes a very convenient ally.” What would be a fitting response to such exposure as this? Should I stop agreeing with Reed? Easier said than done. It is never easy to change one’s beliefs by an exercise of will alone. But it would be will alone, and not intellect, that would do the work of belief change in this case: the will, namely, to trade in the algorithm that I’m running for one that, I’ve recently learned when checking in on the discourse, is preferred among my peers.

Another example: I have read that Tinder users agree that one should “swipe left” (i.e., reject) on any prospective mate or hookup who proclaims a fondness for, among other writers, Kurt Vonnegut, Ernest Hemingway or William S. Burroughs. I couldn’t care less about the first two of these, but Burroughs is very important to me. He played a vital role in shaping how I see the world (Cities of the Red Night, in particular), and I would want any person with whom I spend much time communicating to know this. I believe I have good reasons for valuing him, and would be happy to talk about these reasons.

I experience my love of Burroughs as singular and irreducible, but I am given to know, when I check in on the discourse, that I only feel this way because I am running a bad algorithm. And the result is that a part of me—the weak and immature part—no longer wants the overarching “You may also like…” function that now governs and mediates our experience of culture and ideas to serve up “Adolph Reed” or “William S. Burroughs” among its suggestions, any more than I want Spotify to suggest, on the basis of my playlist history, that I might next enjoy a number by Smash Mouth. If the function pulls up something bad, it must be because what preceded it is bad. I must therefore have bad taste, stupid politics; I must only like what I like because I’m a dupe.

But something’s wrong here. Burroughs does not in fact entail the others, and the strong and mature part of the self—that is to say the part that resists the forces that would beat all human subjectivity down into an algorithm—knows this. But human subjects are vanishingly small beneath the tsunami of likes, views, clicks and other metrics that is currently transforming selves into financialized vectors of data. This financialization is complete, one might suppose, when the algorithms make the leap from machines originally meant only to assist human subjects, into the way these human subjects constitute themselves and think about themselves, their tastes and values, and their relations with others.

I admit I am having trouble at present differentiating between the perennial fogeyism that could always be expected of people who make it to my age (I’m 46), and the identification of a true revolutionary shift in human history. But when I check in on the discourse, and I witness people only slightly younger than myself earnestly discussing the merits of action-hero movies that as far as I can tell were generated by AI, or at least by so much market data as to be practically machine-spawned, I honestly think I must be going insane.

Spider-Man: Into the Spider-Verse looks essentially the same to me as these videos that have been appearing on YouTube using copyright-unrestricted lullabies and computer graphics designed to hold the attention of infants. “Johnny Johnny Yes Papa,” for example, now has countless variations online, some of which have received over a billion hits, some of which appear to be parodies, and some of which appear to have been produced without any human input, properly speaking, at all. It is one thing to target infants with material that presumes no well-constituted human subject as its viewer; it is quite another when thirty-somethings with Ph.D.s are content to debate the merits of the Marvel vs. the DC Comics universe or whatever. If I were an algorithm, and I encountered an adult human happily watching Spider-Man, I would greet that human with a “You may also like…” offer to next watch “Johnny Johnny Yes Papa” on a ten-hour loop. That is how worthless and stunting I think this particular genre of cultural production is.

Professional sport has long been ahead of the curve in depriving those involved in it of their complete human subjecthood, and it should not be surprising that FIFA and NFL and similar operations are producing for viewers the one thing even more stupid and dehumanizing than Hollywood’s recent bet-hedging entertainments. Is there any human spirit more reduced than that of an athlete in a post-game interview? The rules of the game positively prohibit him from doing anything more than reaffirming that he should like to win and should not like to lose, that he has done his best or that he could have tried harder; meanwhile, the managers and financiers and denizens of betting halls are reading him up and down, not as a subject with thoughts and desires at all, but as a package of averages, a bundle of stats. This process of deprivation was famously (and to much applause) accelerated in recent decades when new methods of mathematical modeling were applied in managerial strategies for team selection and game play. In even more recent years the tech companies’ transformation of individuals into data sets has effectively moneyballed the entirety of human social reality.

I see this financially driven destruction of human subjecthood as the culmination, and the turning inward and back upon ourselves, of a centuries-long process of slow mastery of the objects of our creation as they move through the natural environment. The first vessels to cross oceans simply set out as singular physical entities, as wood in water. But by the age of global colonialism and trade, ships were not just physical constructions. They were now insured by complicated actuarial determinations and economic commitments among men in the ships’ places of origin, and these operations, though they left no physical mark on the individual ship that set out to sea, nonetheless altered the way ships in general moved through the sea, the care the captain took to avoid wrecks, to log unfamiliar occurrences, to follow procedure in the case of accidents.

It seems this transformation, from physical object to vector of data, is a general and oft-repeated process in the history of technology, where new inventions begin in an early experimental phase in which they are treated and behave as singular individual things, but then evolve into vectors in a diffuse and regimented system as the technology advances and becomes standardized. In the early history of aviation, airplanes were just airplanes, and each time a plane landed or crashed was a singular event. Today, I am told by airline-industry insiders, if you are a billionaire interested in starting your own airline, it is far easier to procure leases for actual physical airplanes, than it is to obtain approval for a new flight route. Making the individual thing fly is not a problem; inserting it into the system of flight, getting its data relayed to the ATC towers and to flightaware.com, is. When I first began to drive, cars, too, were individual things; now, when on occasion I rent a car, and the company’s GPS follows me wherever I go, and the contract binds me to not drive outside of the agreed-upon zone, and assures me that if the car breaks down the company’s roadside service will come and replace it, I am struck by how ontologically secondary the car itself is to the system of driving.

The transformation of planes and cars from individual things into vectors of data in a vastly larger system has obvious advantages, safety foremost among them. An airplane is now protected by countless layers of abstraction, by its own sort of invisible bubble wrap, a technology descended from the first insurance policies placed on ships in the golden age of commerce.

This wrapping makes it possible for rational people (I am not one of them) to worry not about singular cataclysms, but rather about systematic problems that generally result in mere inconveniences, such as multi-plane backups on the runway. It is not surprising, in a historical moment in which such structural breakdowns are easily perceived as injustices, as occasions to ask to speak with a proverbial manager, that in more straightforwardly political matters people should spend more time worrying about structural violence than about violence: more time worrying about microaggressions or the emotional strain of having to listen to someone whose opinion does not entirely conform to their own, than about violence properly speaking, the blows that come down on individual heads like waves striking individual ships or individual birds getting stuck in individual jet engines on take-off.

Someone who thinks about their place in the world in terms of the structural violence inflicted on them as they move through it is thinking of themselves, among other things, in structural terms, which is to say, again among other things, not as subjects. This gutting of our human subjecthood is currently being stoked and exacerbated, and integrated into a causal loop with, the financial incentives of the tech companies. People are now speaking in a way that results directly from the recent moneyballing of all of human existence. They are speaking, that is, algorithmically rather than subjectively, and at this point it is not only the extremely online who are showing the symptoms of this transformation. They are only the vanguard, but, as with vocal fry and other linguistic phenomena, their tics and habits spread soon enough to the inept and the elderly, to the oblivious normies who continue to proclaim that they “don’t like reading on screens,” or they “prefer an old-fashioned book or newspaper,” as if that were going to stop history from happening.

I have a book coming out soon, called Irrationality, which attempts to articulate some of these ideas (though some only came to me after submission of the final manuscript). I am struck by how much, at this point, what we still call “books” are no longer physical objects so much as they are multi-platform campaigns in which the physical object is only a sort of promotional tie-in. I have found myself coming away from discussions with my good PR people feeling vaguely guilty that I do not have enough followers on Twitter (five thousand is the cut-off, I think) to be considered an “influencer,” or even just a “micro-influencer,” and feeling dismayed to learn that part of what is involved in launching a book like this into the world is strategizing over how to catch the attention of a true influencer, for a retweet or some other metrically meaningful shout-out. You would be a fool to think that it is the argument of the book, the carefully crafted sentences themselves, that are doing the influencing.

And yet for me to try to insert myself into the metrics-driven system would be a performative contradiction, since the book itself is an extended philippic against this system. And so what do I do? I play along, as best I can, until I start to feel ashamed of myself. I contradict myself.

Am I just a disgruntled preterite, who couldn’t play the new game, couldn’t gain the following that would have made me believe in this new system’s fundamental justice? I know that ten years ago I was very optimistic about the potential of new media to help advance creative expression and the life of ideas, and I acknowledge that I have done much cultivation of my own voice online. Perhaps it is not the medium that I should blame for my present disappointment, but rather the limitations of my own voice. I admit I consider this possibility frequently. Perhaps the book isn’t even that good. Perhaps there is a bingo card out there already anticipating everything I say in it. Perhaps silence is the only fitting response to the present moment, just as it would have been fitting to put down one’s needle and thread when the first industrial looms were installed, and to do something—anything—else to maintain one’s dignity as an artisan.

I often think of an essay I read a while ago by a prize-winning photojournalist who had tracked down Pol Pot deep in Cambodia, had taken pictures of him, spoken with him, conveyed this historical figure’s own guilty and complicated and monstrous human subjectivity to readers. The essay was about the recent difficulty this journalist had been having paying his bills. He noted that his teenage niece, I believe it was, had racked up many millions more views on Instagram, of a selfie of her doing a duck-face, than his own pictures of Pol Pot would ever get. She was an influencer, poised to receive corporate sponsorship for her selfies, not because any human agent ever deemed that they were good or worthy, just as no human agent ever deemed “Johnny Johnny Yes Papa” good or worthy, but only because their metrics signaled potential for financialization.

My own book may be crap, but I am certain, when such an imbalance in profitability as the one I have just described emerges, between photojournalism and selfies, that it is all over. This is not a critical judgment. I am not saying that the photos of Pol Pot are good and the selfies are bad. I am saying that the one reveals a subject and the other reveals an algorithm, and that when everything in our society is driven and sustained in existence by the latter, it is all over.

What to do, then? Some of us are just so constituted as to not have quietism as an option. There are ways of going off the grid, evading the metrics, if only partially. One can retreat into craft projects of various sorts, make one’s own paper and write on it with fountain pens, perhaps get a typewriter at an antiques store and write visual poetry with it by rotating the paper and pounding the keys with varying degrees of force. But it is hard not to see this sort of endeavor as only the more twee end of the normie’s prideful declaration that he still has a paper subscription to the Times. It changes nothing.

As we enter our new technological serfdom, and along with liberal democracy we lose the individual human subject that has been built up slowly over the centuries as a locus of real value, we will be repeatedly made to know, by the iron rule of the metrics, that our creative choices and inclinations change nothing. Creative work will likely take on, for many, a mystical character, where it is carried out not from any belief in its power to influence the world as it is at present, as it may remain for millennia to come, but as a simple act of faith, as something that must be done, to misquote Tertullian, because it is absurd.

Human beings are absurd, or, which is nearly the same thing, irrational, in a way that algorithms are not, and it was this basic difference between these two sorts of entity that initially made us think we could harness the latter for the improvement of the lives of the former. Following a science-fiction plot too classic to be believed, our creation is now coming back to devour our souls.

This essay was first published on Justin E. H. Smith’s blog.


  • fishfry

    Great piece. Another datapoint is the current field of artificial intelligence (AI). In the 1970s, AI researchers tried to give the computer a model of reality. A tree has branches, a branch has leaves. The computer was expected to reason about the world from the facts about the world that we taught it. That approach has been abandoned. Now the “machine learning” algorithms simply interrogate a huge corpus of data. Reality is nothing; the algorithm is everything.

  • Al de Baran

    “Some of us are just so constituted as to not have quietism as an option.” Then, to be blunt, that is your problem. Try to re-constitute, because from my perspective any other solution is not merely puerile (or “twee”), but leads to madness (and not the good kind).

  • Bjorn Merker

    It is all self-inflicted. What on earth are you doing on Twitter, Facebook, and other social media, “checking in on the discourse”, as you put it? There is no financializing conspiracy afoot, only a lot of intellectually lazy users with bad taste who are stupid enough to sign up for these vapid media platforms. Why rub shoulders with them? And not doing so does not consign you to handmade paper and a typewriter. What is wrong with e-mail?

  • Tedd

    Justin, I know this isn’t a complete answer to the problems you pose, but I’d like to introduce you to what I call neo-Amishism: defining, for yourself, what constitutes quality of life and then choosing among the available technologies the ones that are compatible with that definition. That’s how, for example, I decided not to join Facebook. Facebook was simply not compatible with what I consider a high-quality use of my online time. I’m not holding that out as a “correct” judgement about Facebook, only as correct for me. While perhaps not a solution to the problems you pose, I suspect that if the “neo-Amish” approach were adopted by more people, we would regain some of our agency as subjects.

  • Len Yabloko

    I am sorry, but your creative despair reminds me of medieval alchemists accusing pharmacists – with their ready formulas for cures – of destroying the spirit of magic. Creativity does not stop just because someone invented a quick recipe to achieve basic effects previously only achievable by magic. All that happened with humans supposedly turning into algorithms is the peeling off of yet another layer of opaqueness covering the innate simplicity of form or systemic structure. The mystery does not disappear – it merely unravels. Individuality moves from linguistic forms to algorithmic ones; from the game of language to the computer game. And yes – there are casualties, but only temporary ones, while the new craft or art is refined. AI is a new tool, not a new soulless overlord, or Matrix, or whatever menace it projects in its shadow. It should be mastered, not feared.

  • jason kennedy

    Great piece. Then again, I am 46 years old, too, and W S Burroughs is a similarly irreducible part of my make-up. My only solution to the situation you describe has been to abandon the ‘discourse’, and watch silently as it swallows everything. For example, I used to occasionally read the Paris Review, but its editor was MeToo’d, and replaced with a woman who immediately began commissioning articles seeking to eliminate specific artists from the cultural sphere on the basis of, ultimately, their deviancy from a set of contemporary cultural norms developed and propagated by a self-selecting cultural elite. Here is an example: https://www.theparisreview.org/blog/2018/12/03/rethinking-schiele/

  • Alex Austin

    “What isn’t data will be. If it won’t be, it’s data now.”–Geo Maglio, Norval Corp, Los Angeles.

  • Alex Austin

    I would wager that the responders who downplayed Justin Smith’s concerns would agree that for the earth’s climate a tipping point exists. Beyond that point the earth will be irreversibly changed for the worse, perhaps uninhabitable. Why is it so implausible that human nature has a similar tipping point? I don’t think the comparison of alchemists to pharmacists adequately reflects current pressure on essential human values, as well argued in Smith’s essay.

  • John Stoner

    Algorithms have their own absurdity. I would recommend the excellent book ‘Weapons of Math Destruction.’

  • Leopard

    Of course, Burroughs and Gysin’s Third Mind cut-up approach to writing is itself an algorithm. The reduction of subjectivity – the further reduction of subjectivity – to a category of capitalism for economic gain is the problem underlying what you’ve said. Quite well, I might add (as if that mattered, as you say).

  • Matthew Beaver

    somebody once told me the world was gonna roll me
