Fascination with the relationship between knowledge and power never dies. In just about every intellectual tradition, in essentially every documented era, the topic bristles through the canon, though the European tradition appears especially fixated upon it. Are the two domains compatible? Are they even distinct? Does the latter corrupt the former? While the philosophical treatments are perhaps the most systematic, it is literary representations that I find most alluring.
Take Victor Frankenstein, the hero (antihero?) of Mary Shelley’s 1818 novel, who sought to fuse the dreams of mystical alchemist sages with the humdrum empiricism of the lab tech. The result, predictably for a moral fable, was the destruction of the creator, the creation and everything in between. The subtitle of Shelley’s novel was “The Modern Prometheus.” This Greek mythological namesake brought fire down to the humans despite a direct prohibition from Zeus against any such meddling. (It did not end well for Prometheus: eagles, livers.) But of all the cautionary tales about the perils of the human temptation to wisdom there is one I keep coming back to: that of the late medieval scholar Faust, who, upon finding the life of the scholar a little tedious, ends up selling his soul to Mephistopheles for greater knowledge and power. When the story is represented and re-represented by the likes of Christopher Marlowe, Johann Wolfgang von Goethe, Charles Gounod, Mikhail Bulgakov, Thomas Mann and countless others, it is always a tragedy.
Of course, these are all fictions fixated on the problem of human (or, in the case of Prometheus, Titan) hubris abutting against the godhead. In our secular age—one is repeatedly told that it is indeed a secular age—we are supposedly beyond the superstitious fear of new knowledge and the dangers it might wreak upon the world. There’s a lot to argue against in that nugget of conventional wisdom, but I focus here on a key facet of the Faust story and its literary kin: that corruption in discovery hobbles true wisdom.
The prism of Faust appears again and again in commentaries about science during and after World War II, beginning with the Manhattan Project to produce nuclear weapons in the United States and the V-2 ballistic missile enterprise in Hitler’s Germany. To be sure, the Faustian language did not appear instantly, but starting in the late 1960s, as popular critique of the thermonuclear standoff intensified, it became too common even to catalog. The conceptual pairing is overdetermined: here we have the most rarefied of intellectual disciplines (physics often stands in as representative of all sciences) confronting the most diabolical of powerful entities, the nuclearized military-industrial complex. Cash was poured liberally upon the scientists, knowledge flowed back to their military paymasters, and all it cost was the former’s souls and a planet thrust into precarity. Faust is ready for his close-up.
Except that is not exactly how it happened—not for most scientists, either in the United States or, perhaps more surprisingly, in the Soviet Union. The vast majority of scientists engaged in an exchange, to be sure, but it wasn’t precisely Faustian. Science was then, and remains now, an incredibly expensive activity. In the twentieth century the public funds of the nation-state seemed the most obvious source of this lifeblood, and the military is often the most generous of potential government patrons. Although there was military-sponsored research into gunpowder and naval artillery before World War I, spending really took off as the century progressed, and in the Cold War the scale of defense research was truly massive: a 1998 estimate put the price of just the U.S. side of the nuclear arms race from 1940 to 1996 at $5.8 trillion in 1996 dollars. (The numbers for military support today are still pretty large, but in the last three decades commercial firms have in aggregate come closer to outspending the generals, a point I will return to later.) The military got its weapons from this trade, but the scientists ended up with something as well, and for most of them it wasn’t a tattered conscience. What they got was a great deal of science.
●
If you are looking for a science-and-the-military poster child molded to the Faustian template, J. Robert Oppenheimer seems the perfect candidate (even if his biographers opted for the analogy with Prometheus). Raised in a wealthy family in New York City according to the secular humanist principles of the Ethical Culture School, he sailed through Harvard College under a punishing course load, learning Sanskrit along the way, all seemingly without breaking a sweat, and then popped over to Europe to pursue a doctorate in physics. His brilliance widely acknowledged, he took teaching positions at Caltech and UC Berkeley upon his return, gracing Northern and Southern California alike. He acquired a bevy of admiring graduate students and dabbled in Communist discussion groups. It was the 1930s.
Then came the war, and Oppie’s devil’s-bargain moment. He was tapped by General Leslie R. Groves, who helmed the Manhattan Project to develop a nuclear weapon, to be the scientific director of the bomb-design facility in Los Alamos, New Mexico. Security agents noticed his prior pink politicking and flagged the file, but Groves overruled them. If Oppenheimer had any qualms subordinating his knowledge to power, he did not show it. Not only did he expertly recruit and manage a whole flotilla of superb researchers, but he was even willing to accede to Groves’s suggestion that all scientists wear uniforms and submit to military discipline. (The recruited physicists talked him out of it, and civvies were the norm.) He was very, very good at this job, and the Ordnance Division designed both a uranium gun-type Little Boy and a plutonium implosion-type Fat Man bomb by July 1945, in time to devastate the cities of Hiroshima and Nagasaki in early August.
Later, after Oppenheimer stepped down from Los Alamos and became Director of the Institute for Advanced Study in Princeton, New Jersey, his story took a different turn. Before the end of the war he had been behind almost every important closed door related to atomic bombs, and there he remained as the postwar curdled into Cold War, ruffling more than a few feathers along the way. He was not afraid to contradict the military on scientific grounds. He advocated against developing a nuclear-powered aircraft, despite the Air Force’s intense lobbying, and he was at first opposed to a crash program to develop a hydrogen bomb in 1949. This last contrarian position—combined with security services dredging up his attendance at various Berkeley salons, some guilt by association and a host of resentments of the icy superciliousness that was his general affect—proved sufficient for the Atomic Energy Commission to strip him of his security clearance in 1954. He grew gaunter and more spectral and, as the years advanced, was increasingly given to oracular statements about morality and science. The most famous of these, from a segment in a 1965 television documentary describing the first nuclear test on July 16, 1945, invoked his undergraduate days studying the Bhagavad Gita: “I am become Death, the destroyer of worlds.” Two years later he succumbed to cancer.
The tragic components in this story have attracted such expert impresarios as Peter Sellars and John Adams in the opera Doctor Atomic and Christopher Nolan with his cinematic extravaganza, currently in production (with Cillian Murphy as Oppie). And Oppenheimer wasn’t the only one with qualms. Robert R. Wilson was an up-and-coming physicist, Berkeley-trained and Princeton-postdoc’d, before joining the team at Los Alamos in 1943. When he heard about the devastation of Hiroshima, he felt ready to vomit. He devoted himself to scientists’ groups against the nuclear arms race, and eventually became the first director of the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, insisting on advancing civilian science free of military influence. Geophysicist Merle Tuve, who had done his own share of wartime work, tore himself apart with anxiety at the world that he and his colleagues had helped build. He became increasingly shrill as he turned down invitations to work on fully funded military projects, refusing to “spend a third or half of my time on work” that would end up “wrecking most of what is valuable in my life.” He urged others to resist “this disfiguring disease” of military support.
These are rather familiar stories, even if you have never heard of Tuve. They fit the Faust model perfectly: working with power corrupts the knowledge-maker. The scientist risks losing his (in these stories, it is basically always “his”) soul. The rub, though, is that these stories are pretty rare. On the one hand, you have the Oppenheimers. On the other, you have no-less-emblematic counterparts like John von Neumann or John A. Wheeler or Edward Teller, eager to develop hydrogen weapons, nuclear mining or whatever else the military brass dreamed up. Even morally upstanding voices of reason such as Hans Bethe consulted for defense without significant qualms. He was joined by thousands upon thousands of others who labored over transistors for fighter jets, ballistic missile fuels, refinements of nuclear warheads and many other military topics. Why did they do it?
Well, why wouldn’t they? First of all, many of those military topics were interesting. Studying the reentry of warheads into the atmosphere was a way to study the ionized plasma that gathered around them. Some of the most advanced programming of the day was figuring out how to model thermonuclear reactions on computers that were classified and not for sale to the public. Even Oppenheimer flipped on the H-bomb when a “technically sweet” solution to triggering fusion was presented to him.
But perhaps more important for the vast majority of would-be Fausts who uncomplainingly went about their work was that they had projects of their own, without any obvious military application, that the Pentagon was quite willing to fund in the 1950s and 1960s. This was often called “basic research.” As the dominant science policy of the day had it, you needed to fund basic research because you never knew what might come of it. Before the discovery of fission in 1939, the major use of uranium was for making a bright yellow pigment; nobody would have supported bombarding it with neutrons for the sake of a practical benefit—and look how that turned out for the military. Graduate fellowships from the Atomic Energy Commission (a civilian agency with a not-so-subtle military edge) and multiple outfits in the Department of Defense swelled the ranks of physics Ph.D.s in the first postwar decades to Brobdingnagian scales. Most of the scientists who took military money either had no reservations about doing weapons work or were not engaged in any obviously weapons-related work while on the military dime. The Department of Defense was willing to fund physicists, and their physics, just in case.
It was a good deal, and not just for the scientists. The results included mass production of penicillin, microwave ovens, GPS and, of course, the internet—a project to develop survivable communications for a nuclear exchange that was primarily used, before the creation of the World Wide Web, to enable scientists across the United States to talk to each other. Instead of science serving military ends, it looked a lot like the military was serving science’s.
●
In the Soviet Union during the same period the tune was not identical, but it harmonized. The dominant feature of USSR science policy was the monopoly of state funding, without the additional (usually smaller) support American science enjoyed from the private sector or philanthropies. The state was the only port of call whether you were working on nuclear propulsion for submarines or the migratory habits of the Siberian crane. In many sectors of Soviet science, the fact that there was only one patron resulted in significant differences in how the science system was structured compared to other industrialized countries. But with regard to military science, it yielded some homologies to the superpower across the Iron Curtain.
The Soviet Union budgeted a lot of money for science. It’s hard to find an exact analog to U.S. data, and GDP figures can be difficult to estimate—and the currency figures are not convertible—but we can say that in 1990, after several years of economic slowdown, the Soviet state was still spending 2.9 percent of its GDP on science, just about the percentage the U.S. spends today from all funding sources. Some of those rubles reached the universities (which were largely confined to teaching and did little research) and the sprawling institutes of the Academy of Sciences (where much fundamental research was located). In fiscal terms, however, this was the proverbial tip of the iceberg. Seven percent of the Soviet budget for research and development went to the State Committee of Higher and Secondary Education of the USSR to support about six hundred thousand researchers; an additional 6.5 percent went to the Academy of Sciences with its 125,000 researchers. The other 87 percent (plus or minus) went to the eight hundred thousand researchers who worked in what were called “branch institutes,” which did research for industry and especially the military (the boundaries were not always sharp). Branch scientific research was much better funded than open academic research. Historians still do not know many of the details, since much of that work was classified then and remains so today. We do know that much of what in the U.S. would have been considered “basic” as well as applied research took place in secret cities—known only by their post-office box number and removed from all maps—dedicated to a particular industry: Arzamas-16 (today Sarov) for bomb design, Chelyabinsk-65 (today Ozersk) for plutonium production and so on. The military was the most stable bank account for science, and military research attracted some of the brightest minds.
The Soviet Fausts got other privileges too. They might not have been permitted to publish all their work, but they certainly could write up the unclassified parts. Sequestered in secret cities, these scientists often had limited ability to travel around the USSR and even less to journey abroad, but they enjoyed other advantages: within the cities’ fences, controls on intellectual expression and on access to reading materials were substantially relaxed. Scientists outside the secret cities were vulnerable to ideological control to a much greater degree. Notoriously, Mendelian genetics was suppressed in the Soviet Union from 1948 to 1965. This research was still funded by the branch sector—under the agricultural ministries—but since it seemed to have less connection to the military, it was harder to resist opportunists who sought to impose ideological orthodoxy. But physicists, even outside the cities, were able to coax the party-state into deflecting similar attempted incursions into their field; meanwhile, some of the unemployed geneticists were folded into physics institutes as “radiation biologists,” earning some military shielding. The aegis of physicists’ military protection contributed heavily to the restoration of genetics in 1965. The dictator of biology who led the charge against genetics, Trofim Lysenko, was dethroned from his control of the field after an investigation initiated by former weapons scientists who had retired from the defense sector into the cushy confines of the Academy of Sciences. Military scientists weren’t utterly untouchable, but they were rather hard to touch.
One of those anti-Lysenko academicians was Andrei Sakharov, who in the late 1960s began a campaign against the nuclear arms race and for human rights that yielded the 1975 Nobel Peace Prize, house arrest, hunger strikes and the clearest moral conscience against the Soviet regime through to his death in 1989 at age 68. Sakharov gained his knowledge about the destructive effects of thermonuclear weapons firsthand: he had designed the first Soviet hydrogen bomb. From the outside he might seem to be an Oppenheimer figure, a Soviet Faust, but that doesn’t quite suit the facts. No less opposed to the arms race than Oppenheimer became, Sakharov did not invoke Sanskrit scripture to dramatize his personal guilt; he took a more forward-looking stance. In an essay addressed to “the leadership of our country and all its citizens as well as all people of goodwill throughout the world” that was published widely outside the USSR in 1968, he pushed for freedom of expression and disarmament through “open, frank discussion under conditions of publicity.” And that is what cost him all the state support from which he had previously benefited. Sakharov’s inspiring moral clarity was present even before his open break, and it was nurtured by the environment of the secret cities. Certainly the aura of protection enabled him to speak more freely even outside those cities for over a decade—until he protested the Soviet invasion of Afghanistan and was arrested in 1980.
Sakharov was exceptional in the Soviet context no less than Oppenheimer was in the American one. Yet there were other physicists who found their heterodox political views shielded by doing military research. It was only by being willing to work on the atomic program that the Soviet theoretical physicist Lev Landau got sprung from prison (where he’d been interned for expressing anti-Soviet views at the height of Stalin’s Terror—that he wasn’t executed is a minor miracle), and it was such utility to the military, too, that shielded Nobel Prize-winning physicist Petr Kapitsa from reprisals when he objected to the way the secret police chief Lavrentii Beria was running that same program. These individuals expressed contrarian views, and the military made sure they did not come to harm. Hundreds of thousands of others uncomplainingly labored for the military, just as contentedly as their American counterparts. The work was interesting, it was funded, and they were patriotic. There was a bargain made between scientists and the military, but it was not a Faustian one as typically understood. The military provided the funds, and the scientists got a chance to do science, as well as some other benefits. Along this axis, the Soviet and the American cases looked remarkably similar.
●
When the Soviet Union dissolved, so did the lavish funding that had sustained the world’s largest scientific system. But in post-Soviet Russia (and to some degree other successor states), working for the military is still the surest way to fund interesting research, whether basic or applied. It no longer protects you the way it did under Stalin, but then there is less to protect against. (That is, until the Russian invasion of Ukraine in late February 2022 and the revival of many repressive state tactics in the weeks to follow; the implications for the many Russian scientists who are speaking out against the war are as yet unclear.)
In the United States, the ostensibly Faustian bargain unraveled a bit earlier. In 1969, in reaction to the escalating catastrophe of the Vietnam War, Senator Mike Mansfield (D-MT, if you are keeping track) pushed through an amendment to the 1970 Military Authorization Act, prohibiting the Department of Defense from using any of its funds “to carry out any research project or study unless such project or study has a direct and apparent relationship to a specific military function.” His goal was to starve the military, but that was not how things worked out: the Mansfield amendment drove military-sponsored research strongly toward applied science but did not reduce the size of military-sponsored science. To some degree, the generals at the Pentagon welcomed the new funding restrictions—studies from the prior decade had indicated a disappointing return on investment in military-sponsored R&D, measured in terms of actually useful scientific gizmos. Mansfield obligated them to concentrate on military applications in precisely the manner they had already begun to plan for.
Basic research did get some money back through a significant expansion in federal allocations to civilian outlets like the National Science Foundation. The funds still came with strings, of course—just different ones. With the work shorn of the secrecy typical of military R&D, congresspeople could now dive into what scientists were doing. And dive they did. Senator William Proxmire (D-WI) had a field day with this arrangement, awarding “Golden Fleece Awards” every month from 1975 to 1988 for research that he deemed frivolous, silly or useless. Mockery was the new golden chain. Scientists had transformed from Faust into a character in a Punch and Judy show.
The other option was the private sector, whose share of domestic spending on research and development rose from 33 to 71 percent between 1960 and 2019—an expanding share of a ballooning pot of funds. (Federal support across this period dropped from 65 to 21 percent, and the share of federal funding devoted to the military shrank by nearly half.) While members of the public might be leery of military-sponsored research for good reasons—such as secrecy—this tilt toward privatization that has accompanied the booms in biotechnology and computation often associated with Silicon Valley or Big Pharma (though there are many other sponsors) also has its Faustian elements. The military is, at least in principle, subordinate to civilian oversight, so there are mechanisms to pry information out of its clutches; this is far less true for the private sector, where trade secrets are proprietary and democratic control is more fraught. And what happens to basic research, once benignly tolerated by the military, in an age of IPOs and shareholder value? The Faust framing not only fails to represent most of the science of the past seventy years, but also distracts from the structural dilemmas of science funding. The military may not have been the ideal patron, but it was reliable and broadly tolerant. Absent a civilian commitment to consistently valuing science and scientists for their own sake, the post-Cold War era has shown us the alternatives to the military Mephistopheles: in the United States, the vicissitudes of populist grandstanding or corporate monopolization of research; in Russia, vanishing funds and a radical atrophying of the research sector. A devil’s bargain.
●
This essay is part of our issue 27 symposium, “What is the military for?”
Art credit: Evan Hume. Equivalents, 2020. Past in Progress, 2020. All images courtesy of the artist.