I’m an art critic. Most of my writing is on my website, the Manhattan Art Review. Probably the most distinctive feature of the site, and certainly the most divisive, is the “Kritic’s Korner” section, which uses a rating system of one to five stars. I originally intended the section to be an afterthought: quick reviews with the rating acting as a shorthand for my reaction. Five stars is as good as it gets (at the time of writing I’ve given ten five-star ratings out of roughly eight hundred reviews, and six of those have been for historical shows), four is an unconditional success, three is indifferent, two is an unconditional failure, and one star signifies something I found personally offensive. But I quickly realized that my habits were more suited to going to galleries every week than to working regularly on longer pieces, that there weren’t very many shows I wanted to write about at length, and that a regular stream of blithe, off-the-cuff reviews would attract more attention than intermittent longer essays. I’ve ended up writing ten or more reviews every week more or less consistently since November of 2019, minus the COVID-19 lockdowns and a couple of summer breaks.
All of this could sound like a rather obvious format to anyone familiar with Letterboxd, but it’s a disruption to the prevailing norms of art writing. One big reason that art criticism has always been a comparatively marginal practice—putting aside for now the special difficulties of writing about visual objects—is that there’s no market for the kind of quality-based reviews that have long proliferated for other kinds of cultural objects. Film, music, food and book critics write for a general public that can be swayed to spend their money one way or another, whereas the general public cannot afford to buy the art that is written about in Artforum. Critical discourse and consensus do have some limited correlation with the art market, but a good review generating a lot of foot traffic for a show is not at all guaranteed to generate income for artists and galleries—and, broadly speaking, participants in the art market mostly see critics as a threat to their investments. There’s no clear economic reason for art criticism that is not glorified public relations to exist, and so it barely does. But while art is an extreme case in this regard, it’s also a leading indicator: as a defender and judge of quality, the critic is an endangered species in many industries these days. This wasn’t always the case.
Addison could make quite a thing of it. Imagine how snide and vicious he could get and still tell nothing but the truth.
—Eve Harrington, All About Eve
For much of the twentieth century the critic constituted a compelling, if semi-sinister, literary stereotype. The critic was the decadent cynic who, having long since dissipated their capacity for artistic pleasure, used their rhetorical skill to manipulate popular taste for personal gain or idle sadism. These characters range from stock figures to outright parody: All About Eve’s Addison DeWitt, Basil Valentine in The Recognitions, even Statler and Waldorf on The Muppet Show. In common usage the critic and the cynic are nearly interchangeable terms: by one definition, courtesy of Merriam-Webster, a critic is “one given to harsh or captious judgment,” while a cynic is “a faultfinding captious critic, especially one who believes that human conduct is motivated wholly by self-interest.” The difference is mainly that the cynic believes selfishness is inescapable. That may seem a small distinction, but it already contains within it the existential question that any critic must put to themselves: Is criticism nothing more than sophistry motivated by self-interest? Or does the critic have a role to play in helping us make “better” judgments about art?
Naturally, such a question is fallacious, as if the matter could be settled by a straightforward yes or no. For example, in the above quote, Addison DeWitt, by proxy of Eve, is manipulating another character, Karen, to go along with his scheme to make Eve a star. In threatening to blackmail Karen, he threatens to tell the truth about her misdeeds in his column. DeWitt is a malicious character, but he is not a hack. There is never any suggestion he is anything less than an actual expert on theater. It is, in fact, the thoroughness of his expertise that enables his corruption. After all, although DeWitt connives to make Eve a success, he does so because he knew she was worthy of stardom in the first place. That’s what makes it a great movie; Eve dethrones her idol Margo Channing by a calculated betrayal, but the result is not a complete injustice. The theater is simply a den of snakes, and backstabbing is the law of the land. This image of the critic may seem less than flattering, but at least it concedes that the critic’s social standing, however misused, is grounded in the possession of perceptual skills that are of cultural value. At present, even that allowance is no longer certain.
As Hegel defines it: “Thinking is, indeed, essentially the negation of that which is immediately before us.”
—Herbert Marcuse, “A Note on Dialectic”
Today the mere suggestion that some things are better than others, particularly in the arts, is met with confusion and hostility. The insistence that there is no reason not to “let people enjoy things” reigns, as if evaluation itself can be nothing but an act of antisocial pretension. There is, admittedly, a fragment of truth in this. I know very well the dangers that criticism can pose to enjoyment: I was born a pathological overthinker, neurotic and hard to please. For years I nursed vague artistic aspirations, but it turns out that obsessively thinking about art is a bad way to become an artist. Thinking about a movie or a piece of music while it plays is a mental digression, a self-awareness of the act of experiencing that pulls one out of the act of seeing and listening. Conversely, making art is an activity. Artists think, of course, but thinking of what to draw and drawing are two different things. Someone can stare at a canvas all day, thinking about painting, but they’re only a painter if they put some paint on it; whether it’s any good is a question that comes later.
Being stuck in thought negates engagement and enjoyment, so it’s natural that we approve of art as the product of courage and creativity and distrust criticism as so much foul-tempered grumbling. This criticism of criticism inevitably emphasizes that art is subjective, which, according to experience, it is. No two people will have the same exact experience of a work of art. However, to treat art as completely subjective represses the role that thinking plays in our subjective experience, and in particular the process of judgment (which is part of our experience). Once we make any judgment at all we are aspiring to be objective, or at least correct, to the best of our knowledge. This objectivity may not be fully achievable, but if we are to think critically, or at all, the attempt is necessary. It is plainly impossible to approach the world without making judgments: anything from choosing friends you can trust to picking out a ripe orange requires a differentiation of qualities we learn to recognize through experience. Art and media are no different. A toddler will tend to prefer The Very Hungry Caterpillar to Moby-Dick, subjectively, but a twenty-year-old should be able to discern that the latter is an objectively better work of literature, even if they may not go so far as to agree that Herman Melville is better than Harry Potter.
Of course, today’s twenty-year-old is certainly less likely to read Moby-Dick, especially if no one has ever made the case to them that it’s a better or more important book than others that are more accessible and “fun.” This underscores the subtext of “letting people enjoy things.” Refusing negative criticism is not only an instinctive rejection of negativity itself, but also a preemptive defense against the notion that anything strange, antique or otherwise difficult may be of more value than what is familiar, popular or easy.
Another implication of subjective absolutism, alongside the insistence that no one should feel guilty for their “guilty pleasures,” is that these guilty pleasures are the only real pleasures. This means that no one actually likes philosophy, or movies with subtitles, or boring old books like Moby-Dick. The upshot is that, culturally speaking, the arts have been demoted to the level of “media,” recreational content intended for unreflective consumption. What more could one desire than another Marvel spectacle?
Just how vacuous the formal objection to subjective relativity is, can be seen in the particular field of the latter, that of aesthetic judgments. Anyone who, drawing on the strength of his precise reaction to a work of art, has ever subjected himself in earnest to its discipline, to its immanent formal law, the compulsion of its structure, will find that objections to the merely subjective quality of his experience vanish like a pitiful illusion: and every step that he takes, by virtue of his highly subjective innervation, towards the heart of the matter, has incomparably greater force than the comprehensive and fully backed-up analyses of such things as “style,” whose claims to scientific status are made at the expense of such experience.
—Theodor Adorno, Minima Moralia
It seems unimaginable to us now that Adorno levied his cranky sneer at all of cinema and jazz as inherently vapid commodities, but we have the benefit of hindsight for a future that was impossible for him to predict. Such are the hazards of attempting a critical theory. For someone who worked on a theory of aesthetics from 1956 to 1969, the last years of his life, Adorno was remarkably indifferent to everything that had happened in art after World War II. This relativity of his own perspective to the passage of time is, however, directly relevant to what it means to make serious critical judgments. You either die young and hip or live long enough to become out of touch. There’s no point to living in fear that one’s judgments will age poorly; it’s certain that most of them will. In Adorno’s case, his judgments appeared pessimistic at the end of the twentieth century, but it seems that society has caught up with him in the twenty-first.
For literature is like love in La Rochefoucauld: no one would ever have experienced it if he had not first read about it in books. We require an education in literature as in the sentiments in order to discover that what we assumed—with the complicity of our teachers—was nature is in fact culture, that what was given is no more than a way of taking. And we must learn, when we take, the cost of our participation, or else we shall pay much more. We shall pay our capacity to read at all.
—Richard Howard, preface to S/Z
I’m well aware that liking Adorno qualifies me as an insufferable crank, but I contend that we are in dire need of more insufferable cranks. As it stands, society has no criterion for intellectual maturity beyond turning seventeen and being allowed unaccompanied into an R-rated movie. At that point, the whole of experience is considered revealed to us. This is, in a word, stupid.
I tried to read Ulysses when I was seventeen and made it one hundred pages in before giving up. I tried again last year and read the book from cover to cover (though most of the action in “Oxen of the Sun” still escaped me). Being thirty-three years old had very little intrinsic impact on how I read the book; the important thing was that I’d spent the intervening sixteen years developing as a reader. I also bought three different editions of Ulysses, a volume of annotations, a book of literary essays on each chapter and a very complicated guide advising hundreds of corrections of typographic errors in each edition. I even spent a couple of months distracted from the book itself because I developed an obsession with the blow-by-blow records of John Kidd and Hans Walter Gabler’s “Joyce Wars.” Some may balk at the idea of going to so much effort, but the effort is its own reward. Even though Ulysses is one of the best books I’ve ever read, the pleasure of reading it is but one particular enjoyment within the perpetual appreciation of literature, which itself is an enrichment of one’s relationship to language, thought and life. As Richard Howard puts it, literature, sentiment, enjoyment, even love, require an education.
And this brings us, finally, to the process of becoming a critic, what that means, what it is. As I mentioned, I was drawn to the arts in high school, mostly music. Like (I assume) many teenagers, I intuited an authenticity and a sprawling sense of possibility in artistic expression that I didn’t find elsewhere. I had liked science fiction and video games as vehicles for fantasy and escape, but by the end of my teens I’d begun looking for something deeper in music, film and literature. In retrospect I was driven even then by a critical impulse. I started with the lowest-common-denominator “best of” lists: The 100 Greatest Novels of the 20th Century, IMDb’s Top 250, Rate Your Music’s Top 100. Aside from the few unimpeachable classics, these lists featured works favored by middling taste, but they were invaluable for giving me a framework grounded in a historical consensus of what constituted great art, something apart from an unreflective consumption of the pop-cultural treadmill’s newest products. With that foundation, I was free to sift through history and see what stuck. This is, crucially, not a merely subjective process. Rather, it’s a dialectic between subjective enjoyment and the abstract notion of great art, in whatever form, that is mediated by the work itself. Teenagers, for all their wide-eyed idealism and thirst for mind-blowing novelty, are both entirely ignorant and unbearably pretentious. I wouldn’t have known about A Love Supreme if it wasn’t the number-one album of all time on Rate Your Music when I was sixteen, but by the next year I recall insisting to a friend that John Coltrane was greater than Charlie Parker because I had read somewhere that Coltrane would practice scales for twelve hours a day. The friend, who had some conventional jazz education, was skeptical, as he should have been, because aside from the fact that Parker practiced just as much, I didn’t know what the hell I was talking about. 
Like most teenagers, I had no ability to make judgments outside of parroting opinions from elsewhere, no sense of the overarching history of the genre or even much ability to listen actively and make my own observations. My affectation of an opinion was laughably overstated, but that’s how one gets started developing an actual point of view.
This is how artistic self-education works today; a young person wants to develop their taste, but they have no framework for judgment except their own meager experience and mostly uninsightful information from the internet. On the one hand, deferring to experience is necessarily infantilizing to someone who has so far only consumed media for children; the mainstreaming of “nerd culture” is the result of society at large attempting to prolong childhood indefinitely. On the other hand, constructing one’s personality around consensus notions of high art is scarcely better, leading to another marginal subculture: the dreaded Reddit music geeks who believe that Radiohead is the greatest band of all time because their albums have the highest aggregated average ratings, or the self-proclaimed film buffs whose understanding of cinema never developed beyond Taxi Driver and Fight Club, rhapsodizing in YouTube vlogs on the profundities of a crane shot or a certain cut ad infinitum. These are two sides of the same coin, an ignorant versus a pretentious philistinism. The only means to bridge them is intelligence. The concept is as fraught as any, but by invoking it I don’t mean any kind of inherent superiority or quantifiable IQ. I mean rather the emergence of sensibility, sensitivity and a distinct personal taste, which are indistinguishable from the slow development of intellectual maturity.
Intelligence does not emerge by rote, or else pretentiousness would be enough. The value of the arts is the capacity to teach intelligence by learning to perceive intelligence, which is itself the content of art: the expression of perceptivity in whatever form. To return to my own education, I ravenously absorbed as much music, film and literature as I could, mostly superficially. My desire to expose myself to everything I was led to believe was good outstripped my patience with the slow development of my ability to understand, let alone enjoy, much of it. This wasn’t an ideal process, and it is probably what doomed me to criticism. It killed whatever intuitive, creative feeling I had (not much, I suspect) in favor of an abstract, detached tendency to think about and analyze art. At the same time, that labor created a discipline and taste for expending effort that eventually became second nature.
There is no happiness other than that of intelligence. I think people like Rousseau, Montaigne, Diderot all attained it.
—Marguerite Duras, “I Thought Often…”
From all this exertion a sort of sense begins to emerge, if not quite a logic. Kerouac is thrilling to teenagers, so, having been thrilled by his writing, you follow his example and seek out his favorite authors, other Beats, Buddhist texts. None of it matches the hormonal rush of On the Road, but some of it is enjoyable, some impenetrable but suggestive, some boring or off-putting. The process repeats, finding an artist whose sensibility seems appealing and who suggests other points of discovery. But a good critic can also be invaluable here, because such recommendations are part of their job. Unlike the compilers of the Top 100 lists, a critic is defined by their taste instead of any consensus or canon, although these are not completely separable. The point is not for the reader to fully subjugate their own judgment to the critic’s, but to recognize and respect that the critic’s sensibility represents some understanding of the scope of their subject, albeit in a contingent, individualized way. Take the first critical voice I really liked: the music critic Robert Christgau, best known for his tenure at the Village Voice from 1969 to 2006. Christgau is probably my most direct critical “influence,” even though I disagree with plenty of his opinions. What compelled me was his ability to articulate more in a couple of pithy sentences than I was used to reading in thousands of words of Pitchfork reviews.
Christgau was able to convey so much so economically, I suspect now, because of his ability to make judgments that were both coherent and convincing, according to standards he had developed in dialogue with a broader canon. The common music-writing formats of pop-culture hagiography, technical and aural descriptions, the “trip report” method of talking about ambient music that was popular on Blogspot in 2010, heinous creative-writing attempts from Tiny Mix Tapes—none of it serves to articulate whether a song is good or not, or how it compares to other songs that might appear superficially similar. Ironically, the line of Christgau’s that has stuck with me the most is one I’m not entirely sure he wrote; I’ve never been able to find it, but it sounds so much like him that I can’t believe that it was anyone else. The line is: “Sonic Youth is the least vocally gifted band since the Grateful Dead.” It’s a mundane statement in retrospect, and maybe not even that witty, but it’s a good example of what he impressed upon me as a college student: a mature perspective on popular music that synthesized a conception of the subject as a whole and allowed for those leaps of logic and reference that I found exciting. The value of these judgments is not in their being absolutely right or wrong, but in the way they crystallize the critic’s sensibility. As the titles of Christgau’s Consumer Guide books attest, they also offer the reader an informed suggestion of how they might be advised to spend money and attention in the process of building their own taste.
You couch so much of this in terms of your individual response and the individual creativity of the people making the music—what I say is that all art, even arty-art, high art, which is really the kind of art you’re interested in, whether you like it or not, is dependent on a social context. And if the social context dries up, so does the art.
—Robert Christgau, from a conversation with Gerard Cosloy and Joe Levy in SPIN, 1989
Now, I write art criticism, but I haven’t yet talked about visual art. It doesn’t dominate when I think about art in general, probably because I’ve spent less of my life thinking about it. I was only introduced to art in a more than historical sense in my mid-twenties, so I never had a direct immersion in the art world, which is the only way to really engage with it, until I was already writing about it. I’ve probably learned more about contemporary art in the past three-odd years, since I started writing the Manhattan Art Review, than I knew going into it.
The discipline of art, even as a viewer, is rarefied and particular to itself. It’s hard to access, for one thing. Even if you’re a dedicated gallery- and museumgoer living in one of the few major cities where visual art is readily available, standing in front of specific artworks is still only a small part of one’s relationship to art. What occupies the rest of the relationship is thinking about it: flipping through catalogs and reading biographies, essays, criticism, art theory and artists’ writings. All this is part of the process of refining one’s thoughts about art at a remove from the art itself, and therefore part of the preparation for viewing. It’s in this sense that visual art is particularly tied to criticism; judgment and discrimination are essential to putting the experience of art in motion. It’s rare for an art lover not to engage critically with art at some level.
The disappointment of bad art is its inability to be anything more than what was expected, whereas one of the greatest pleasures of art—and one of the few well suited to the critic—is when it proves to be more than what was suggested by your preconceptions or by the small photo you could make out on your phone. In this sense, looking at art and, by extension, criticism is a realization of that dialectic between subjective experience and the formal criteria of judgment. A critic who is hardened against having their perspective altered by experience has become staid and dogmatic (as has often, perhaps inevitably, been the fate of opinionated critics like Adorno, Clement Greenberg and Michael Fried), but a critic with no knowledge of art has no means of thinking about their experience in the first place.
Art criticism is not a document of experience: this green wowed me; that box filled me with awe; that figure reminded me of my mother; I cried. Those experiences are as singular and impossible to translate as art itself. Criticism is, rather, the documentation of thinking about art, and particularly about an artwork’s success or failure. A critical judgment may age well or poorly, but the value of these judgments is not in whether they are right or wrong. After all, judgments are never objectively true for all time. Artistic reputations have long risen and fallen in ways that seem ridiculous to us now: Bach was obscure in Beethoven’s day; Piero della Francesca was incompatible with the Victorian sensibilities of Ruskin’s generation, just as Jacques-Louis David and Ingres are off-putting to mine.
A critic’s sensibilities should not be held to a standard of infallibility but to their internal coherence, which is necessary for the value of criticism: to be eloquent and perceptive, to convey with intelligence the value the critic sees in art. Good writing about art serves to elevate and enrich the experience of good art and to clarify the inadequacies of bad art, to put words to the nonverbal aesthetic language that the critic has built. More particularly, a critic’s recognition of artistic quality does not simply put art into words but brings new qualities into being. The subjectivity of art extends beyond the artist’s own intentions, so a critic can discover new ways of seeing art in their criticism in the same way that artists find new ways of seeing the world in their art. This is easiest to conceive of regarding art history: we can see now how della Francesca’s geometric rigor foreshadows tendencies that would be taken up five hundred years later with minimalism, and Monet probably would have been bewildered by Greenberg’s writings on his paintings scarcely thirty years after his death. On less grand but more useful terms, artistic quality is never given; it has to be found, fought for and defended. This is the critic’s fight.
For [Daniel Joseph Martinez’s] 2022 new work, he photographed himself in the (prosthetically enhanced) guise of five pop-cultural “post-human” antiheroes including Frankenstein, Count Dracula and the Alien Bounty Hunter from “The X-Files.” But what makes the piece gripping is a statement that accompanies the images, a scathing indictment of the human race as the earth’s “ultimate invasive species,” one that’s about to self-destruct and take every other living thing down with it.
—Holland Cotter, “A Whitney Biennial of Shadow and Light,” New York Times, 2022
Criticism may be unpopular with the public these days, but what’s even more disturbing is how it has fallen out of fashion with critics themselves. The above is taken from Times art critic Holland Cotter’s indicative review of one of the most indicative art exhibitions in recent memory, the 2022 Whitney Biennial. From Cotter’s perspective, any work that gestures toward a political moral is considered a success as art for invoking that moral. So Daniel Joseph Martinez takes some self-portraits in monster makeup and we are told it is “gripping” because the artist calls them a commentary on human destructiveness. This art presents a platitude that the New York Times considers good, therefore the art is good. Cotter is certainly making a judgment. But the circular logic he employs to justify that judgment negates art itself in favor of generic sentiments, denying engagement with the particular qualities of an artwork that make artistic judgments meaningful in the first place.
The problem is not political art or “wokeness” as such, but rather the way that treating activist slogans as sufficient criteria for good art—and any artist who peddles those slogans as an adequately accomplished artist—dismantles the function of art: the struggle toward expression, to eloquently articulate qualities that are beautiful, emotive or otherwise engaging. The problem is with a way of seeing that reduces art to a resolved formula, when in fact it is precisely the opposite. Art is actually always about insufficiency, the personal desire to achieve something greater than what is possible, to capture the universal in the particular. Just as objectivity is an elusive ideal of thought, art aspires to an impossible, singular finality, whether in a painting of an apple in all its appleness, an assemblage that fully resembles nothing but itself, or an abstraction that shows the face of God. These goals are unrealizable as absolutes, but nevertheless artistic quality comes from this aspiration. Art cannot be “the sublime,” but it can be sublime, just as a painting cannot be an apple, but it can suggest appleness.
The assertion of art’s insufficiency may seem out of left field, but it points back to the concept of learning and growth that I’ve been attempting to flesh out. Another term for this could be cultivation, as in becoming “cultured,” which underscores its social dimension. Individuals are, of course, implicated in their sociocultural context, which informs and directs their own development. The things we learn from, new things and old, artworks, historical information and so on, are cultural products that are not determined by our individual relationship to them. Subjective intuition, too, plays a role in the instinctual attraction and aversion to specific things, but in the same way that no one becomes a master chef without instruction, these outside influences are the primary means of our cultivation.
Moreover, as one can gather from pseudo-Dionysius, good radiates itself and existence. So what makes a thing good is what makes it radiate. Now radiating is a kind of activity, and substances act by way of powers; so things are called good because of something added to their substance. Being good then adds something real to existing…
—Thomas Aquinas, Quaestiones Disputatae de Veritate, 21.1
RUTE MERK – XP – TARA DOWNS – *
A woman in an Arc’teryx jacket holding the Wikipedia logo, two paintings of matcha lattes, fruit, flowers, etc., all painted in an imitation of PS2 graphics that doesn’t look impressive or appealing regardless of the technique required, but I’m not under the impression it’s particularly demanding either. Oh, she’s collaborated with Balenciaga? Say no more, I knew this pretentious, gutless, commodified rehash of Berlin from ten years ago reminded me of something.
—Manhattan Art Review, May 2023
In the contemporary context, becoming cultured requires a resistance to the prevailing culture, and could ironically be considered countercultural. Nevertheless the pursuit remains necessary, and perhaps even unavoidable, because it is intrinsic to our nature. Cultivation is the growth into a distinct individuality by means of culture, an understanding of oneself and the world that always seeks to become fuller and more encompassing, a knowledge of life, an intelligence. This aspiration reaches toward an absolute, an omniscience that is both desired by and denied to humans: something I might call God if I were religious, but that for our purposes we can call the good. This good is something we can only put ourselves in service to. Good art, by extension, is good by its achievement of the good, a channeling of an external sense of life into an artwork. Good criticism seeks to recognize this good in art as much as it can.
The process of learning to discern what separates the truly good from the seemingly good, and the failed attempts at the good from the irredeemably bad, does not follow rules. It cannot be learned like a scientific formula. In a society that has seen a universal decline of the cultural institutions that should exist to edify it, many will not even know that it is a process, or that there is any point in subjecting oneself to it. In this context it becomes doubly important, for the sake of culture as well as for one’s own good, to judge the differences between good and bad, the real, authentic and profound versus the shallow, the crassly commercial and the uninspired.
My own work with the Manhattan Art Review aims to do this, attempting as best I can to discriminate between artworks that possess good qualities and those that lack them. Even though some might deem it unsophisticated or needlessly provocative, the rating system in Kritic’s Korner is central to this project. Writing about art can have any number of objectives, but lurking behind any analysis is the question of judgment. Most contemporary art writing uses interpretation as a way of sidestepping the problem of quality, but interpretations are impossible to take seriously if the art itself is bad. A critic who avoids evaluation may have a less contentious body of work; perhaps they will protect themselves from ever saying anything that will sound embarrassing to future generations. The cost is that they won’t be able to help their readers learn how to judge art or to understand it, which are in essence the same thing. In my judgments of particular art shows I convey my understanding of art, or of good art, which I can only hope is of some use to others interested in developing their own taste. The goal of the Review has been to provide a quixotic counterweight to the prevailing conventions of art and commentary about art, which might otherwise, in their greed, indifference and literal-minded sloganeering, counsel cynicism.
When I was younger and wanted to be an artist, my awareness of mature art made my immature attempts at creativity intolerable to me. Unlike young artists who unselfconsciously develop a sense for working before gradually refining it, I couldn’t stand the idea of not making good art immediately, so I resolved to try to understand art as best I could before I started to make it. My method didn’t pan out artistically, but I learned that thinking about art is its own discipline. After years of putting self-education above all else, I came into my own kind of maturity as an observer. Far from implying any lofty or profound genius, what I mean by maturity is only a familiarity gained through experience with the cycles of art and life, learning to slow one’s overbearing urge to consume everything while developing routines whereby the effort to, say, read philosophy ceases to be a chore and becomes a pleasure. Likewise, the task of learning about art only deepens with experience; instead of scouring the archives in an effort to find an objectively correct position, the point becomes expanding the understanding of an artist’s work as you become familiar with them, which widens your conception of art generally. Whether or not the criticism I write now is of any lasting value is not for me to say, but personally my engagement with art is its own reward, a self-perpetuating, inexhaustible exercise of effort and attention, which in the end is, I suspect, as close as I am able to come to the satisfactions of making art.
Artists never complete a single, perfect artwork, and a single work never instigates an absolute transcendence in viewers. We may aspire toward this quasi-theological ideal, but art only has the ability to suggest the sublime. The real sustenance of the artistic is the scope of experience it provides, the cumulative sense of growth and cultivation of ourselves through art, a tendency toward a good that we can never capture but only assist in radiating itself and existence.
Art credit: (1) Rafael Delacruz, Don’t sleep while we explain, oil and cochineal on canvas, 77½ × 144 in., 2022; (2) Rafael Delacruz, Human Audio Sponge, oil and cochineal on canvas, 27¼ × 30⅛ in., 2022. Courtesy of the artist and Mitchell-Innes & Nash.