Defending Cold War Science
Peace Corps Volunteers work with a water-well drilling team in Chad to provide clean water to the community, 1968
by Audra J. Wolfe
Last November I sat in a hotel ballroom surrounded by fellow historians of science as a baffling (to me, anyway) exchange unfolded over the legitimacy of the term “Cold War Social Science.” The occasion was a roundtable discussion at the History of Science Society’s annual meeting on a new book, bearing that very title, edited by Mark Solovey and Hamilton Cravens. Having just written my own book about science and the Cold War, I watched with growing alarm as colleagues spelled out their objections: decolonization, various rights movements, the triumph of neoliberalism, pre-existing strains of social scientific thinking — surely each of these influenced the postwar social sciences as much as the conflict between Communism and capitalism?
But why, I thought, is this a problem? The book under discussion focused on the social sciences, mostly in the United States, as conducted between approximately 1948 and 1963 — a period that most historians would agree coincides with the first half of the Cold War. Scholars throw around phrases like “Victorian science” all the time. And more to the point, decolonization, civil rights, women’s rights and the rise of American conservatism aren’t exactly unrelated to the Cold War. So what exactly were the critics objecting to?
Soon enough, it clicked: this wasn’t a conversation about the past; it was a conversation about the present. Specifically, it was about the culpability of individual social scientists — including colleagues and mentors of people in the room (and possibly some attendees themselves) — in producing work that was either sponsored by or proved useful to American defense and intelligence operations. When Solovey and Cravens said, “Cold War,” their interlocutors heard “military-industrial complex.” They heard judgment.
Nearly a quarter-century after the collapse of the Soviet Union, scholarly conversations in the United States about the Cold War still slip easily into attributions of heroism and blame. Who signed Faustian bargains? Who spoke truth to power? The temptation exists in any number of fields, including the history of art, film and public intellectualism, but nowhere is the normative pull stronger than in the history of science, where scholars continue to trade barbs over what historian of physics Paul Forman famously referred to as the “distorting” effects of military funding on scientific research agendas. Forman and others who have advanced this line of argument have a point: the effects of military funding on American academic and even corporate research were pervasive and pernicious during the Cold War. By some estimates, more than three-quarters of all U.S. federal investment in scientific research in the late 1940s and early 1950s came from a single military research agency, the Office of Naval Research. Even when patrons let their clients operate at arm’s length, most of these funds came with strings attached.
At the same time, the accounts of science in the Cold War that historians have produced by “following the money” increasingly feel claustrophobic. They limit the discussion to areas of specific interest to the military and intelligence establishments, omitting vast areas of scientific research, practice and policy that contributed to U.S. national prestige and identity. Tracing ties between academic offices and military patrons may tell us something about how the Cold War changed the practice of science, but it can’t answer the underlying question of why high-ranking government officials thought that funding for science — especially funding for so-called “pure” science — would be the thing that would help the United States win the Cold War, or why members of the general public tended to agree.
But insisting on categorizing Cold War-era projects (scientific or otherwise) as military or civilian, covert or overt, dirty or clean carries another cost. Blaming “the military” for certain characteristics of U.S. Cold War-era science is akin to blaming soldiers for the conduct of a war, or nuclear engineers for U.S. nuclear policy. The idea seems to be that by carefully identifying, tracing and condemning ties to the defense industry, the rest of science might somehow be excused for its contributions to American nationalism in the Cold War period. From this perspective, the term “Cold War science” becomes an accusation to level at certain researchers as a means to salvage “not-Cold War science.” Thus very different groups of scholars have rejected the term: those who hold (or held) some connection to work supported by the military-industrial complex object to its attribution of blame, while critics of U.S. national policy during the Cold War find the term, if anything, too exonerating.
Some fields have been more willing than others to embrace a more nuanced version of the recent past. One of the most vibrant areas in contemporary U.S. diplomatic history, for instance, looks at the history of cultural diplomacy — efforts to draw the non-aligned world closer into the U.S. orbit by promoting the American way of life, which might include anything from jazz and consumerism to labor relations and freedom of religion. The most famous of these efforts, the cultural extravaganzas produced by the Congress for Cultural Freedom, were covertly funded by the Central Intelligence Agency. But, as the historian Hugh Wilford makes clear in The Mighty Wurlitzer, his recent history of covert cultural diplomacy, not only did the CIA take a hands-off approach to most of these activities; some of them really were the efforts of individuals acting as free agents, of men and women who felt the pull of ideology strongly enough to embark on missions of private diplomacy. This emphasis on ideological commitments comes across even in more traditional works of diplomatic history. Odd Arne Westad’s magisterial Global Cold War makes abundantly clear that both U.S. and Soviet leaders actually believed the things they were saying about Communism and capitalism, and that those beliefs, at least in part, drove foreign policy decisions.
Narrow definitions of “Cold War science” don’t get us very far in making sense of Cold War ideology. What would the history of postwar science look like if you took seriously the idea of the Cold War as a total war — an event that affected all aspects of American life? This was the question I wanted to tackle when I started writing Competing with the Soviets: Science, Technology, and the State in Cold War America. I wanted to know what would happen if you combined the obvious stories about the intersection of science and the Cold War (the nuclear arms race, the military-industrial complex, anti-Communism, secrecy, the space race, and so on) with less-obvious episodes relevant to the ideological struggle between the so-called Free World and the Soviet bloc (the rise of scientometrics, social scientific theories of race relations, and the origins of the biotech industry). Would it be possible to find something, aside from chronology and cash, that connects these stories?
In short, yes.
There is something special about the role of science in the Cold War, something that goes beyond funding patterns or even displays of technological might. Science played a unique role in maintaining and projecting state power throughout the postwar era, and this special role derived directly from the ideological conflict we now think of as the Cold War. Science and technology have, of course, always contributed to state power. In the Italian Renaissance, patrons requested that natural philosophers supply them with telescopes and astrolabes; two centuries later, the imperial governments of Spain, France and Great Britain sent crews of naturalists to evaluate the commercial potential of the plants, animals, and minerals in their conquered lands. Even so, this relationship underwent a fundamental change in the years immediately following World War II. Scientific achievement, in the form of the bomb, radar, and the proximity fuse, had apparently won the war for the Allies; it would presumably be the critical factor in deciding the Cold War as well. For all their differences, leaders in both the Soviet Union and the United States agreed that science, and scientists, were critical weapons in the international battle for hearts and minds.
That both American and Soviet leaders embraced science as a tool for international military superiority — the driving force behind high-tech weapons and surveillance — is perhaps not so surprising in an era dominated by the shadow of the atomic bomb. More intriguing is the extent to which both nations touted different definitions of science itself. Communist leaders trumpeted the accomplishments of centrally planned, results-driven Soviet science and technology in transforming agricultural economies into industrial powerhouses. Soviet science held no place for mere theoretical or abstract work. Legitimate scientific investigations needed to have some practical purpose in improving the lives of the people. This is not to say that Soviet scientists stopped conducting so-called basic research, but the more successful among them learned to describe it in Communist code.
In contrast, the United States offered the very structure of science — supposedly open, international, and free from government interference — as a beacon of freedom to citizens of the world. The decision to place such defense-related agencies as the Atomic Energy Commission and the National Aeronautics and Space Administration under civilian, rather than military, aegis was at least partially about demonstrating the United States’ commitment to open science to the rest of the world. In the U.S., this idea of “open science” sat uneasily next to the reality of a research infrastructure that was largely backed by state — particularly military — interests. In the fun-house logic of Cold War global politics, American expressions of dedication to scientific freedom and international cooperation were simultaneously sincere and chauvinistic.
Though I (mostly) avoided the phrase in my book, I ultimately find “Cold War science” (or Cold War social science, or Cold War physics, etc.) useful as an invocation of the nearly superhuman powers political leaders granted science in the postwar period. The phrase can invoke something more specific than chronology, especially when it refers to actions that relate to state policy. But it need not be considered an accusation. The uncomfortable fact of writing histories of the Cold War is that both the United States and the Soviet Union engaged in activities that were good, bad, and morally neutral. Surely one of the brighter lights of postwar American history was the belief that science, generously funded and left free from political interference, could be a force for both international peace and domestic prosperity: in short, a force for the public good. The Peace Corps, Head Start, and the War on Cancer were just as much products of “Cold War science” as were Apollo and the arms race.
The judgment of history is fickle, but from our contemporary perspective — the only moment from which we can ever write — it seems clear enough that the United States won the Cold War. It did so both through the embrace of the idealistic concepts of freedom, democracy and self-determination, and through campaigns of military, paramilitary and economic violence. The past is a complicated place. When historians object to “Cold War anything,” they are objecting to these more troubling aspects of U.S. national conduct, both at home and abroad. But recognizing the inherent contradictions in historical actors’ behavior can be politically liberating, a necessary first step in uncovering a usable past. In the case of “Cold War science,” casting the net in the widest possible terms may help us to articulate concepts of the public good that depend less on demonstrating the superiority of the American way of life, and more on fostering a truly collaborative, international approach to knowledge of the natural world.
Mark Solovey and Hamilton Cravens, Cold War Social Science: Knowledge Production, Liberal Democracy, and Human Nature (New York: Palgrave Macmillan, 2012).
Paul Forman, “Behind Quantum Electronics: National Security as Basis for Physical Research in the United States, 1940–1960,” Historical Studies in the Physical and Biological Sciences 18 (1987): 149–229.
Hugh Wilford, The Mighty Wurlitzer: How the CIA Played America (Cambridge: Harvard University Press, 2008).
Odd Arne Westad, The Global Cold War: Third World Interventions and the Making of Our Times (New York: Cambridge University Press, 2007).
About the Author:
Audra J. Wolfe is a writer, editor, and historian based in Philadelphia. Her research interests include the history of science, the history of the Cold War, and cultural diplomacy. She holds a Ph.D. in the history and sociology of science from the University of Pennsylvania. From 2007 to 2009 she was the executive producer of Distillations, a podcast on the past, present, and future of chemistry. She is the author of Competing with the Soviets: Science, Technology, and the State in Cold War America (Johns Hopkins University Press, 2013). Her current book-in-progress looks at the role of science in diplomacy. You can follow her on Twitter @ColdWarScience.