Introduction
Kenneth Branagh’s 2022 film of Agatha Christie’s Death on the Nile opens with a sequence not in the novel, plunging the Belgian detective Hercule Poirot into an earlier World War I incarnation as a soldier.[1] He is not just any soldier, however, but one of pronounced observational and deductive skills.[2] In this sequence, we witness Poirot’s scientific augury: he reads the flight of birds as they navigate winds specific to a particular time of day in a Belgian microclimate near the squadron’s entrenched position. An order from the Belgian military command center tells the soldiers to postpone a planned attack on the proximate German positions because of inclement weather, particularly a wind direction that negates a planned gas attack. Poirot’s avian observations, however, lead him to an interpretation of meteorological conditions counter to the weather report issued by a command center situated far from the front and predicting only large-scale conditions. He suggests to his captain that they focus instead on a smaller-scale effect of the winds revealed in the bird flight, an effect invisible to central command’s large-scale prediction of weather disadvantageous for attack.
This shift allows the Belgian forces a narrow window to launch a surprise gas offensive with breezes briefly at their backs and not in their faces: these dynamic systems have a temporal duration that changes the spatial contours of the land, and thus military strategy, if only for a brief moment. Because the German enemy is likely armed with a similar weather report from its own general command, also situated miles away, and is thus under no apparent threat of imminent engagement, the Belgians hold a momentary strategic advantage of meteorological surprise. Poirot’s divination of bird flight reveals secrets of prediction that eluded both armies’ blunter and more generalized meteorological modeling. The birds trafficked not between the earth and the gods, as in antiquity, but between the earth and the aleatory (or random) properties assigned to aeolian forces: the protean shifting of the wind in contained space that offers the revelations of microclimates. Following the satirically pronounced powers of deduction exhibited by Edgar Allan Poe’s Dupin, Poirot works the system of prediction and information against itself, converting the signal of “accurate” predictive weather information into the “noise” of literal misdirection in the form of blowback.[3]
The short opening sequence in the Branagh film introduces the primary issues of this article: the transformation of the aleatory into stability and predictability as a key element of military strategy, the conversion of noise to signal for one’s own advantage while others still interpret it as noise, and the role of the micro (microclimates, microspatialities, microtemporalities) and its scalability in manipulating large-scale dynamic events. Wind becomes a medium of military agency past and present through which the aleatory is converted to friend, not foe.
Weather is a central actor in military history and strategy. Tales of snow-defeated campaigns, gale-swamped ships and sailors at sea, transport trapped in mud, and unseasonable deluges drowning the cleverest battlefield maneuvers abound in military literature and memory. Of the elements, perhaps none is more mercurial than wind, the very embodiment of capricious nature. Turbulence has long been the bête noire of physics and no less a villain for military planners. However, insight into the complexities and mechanics of micro-effects within the large-scale dynamic phenomena of which they are a part allowed military research to exercise one of its standard strategic moves: converting disadvantage to advantage, even if only for a very short period. The military thus came to understand weather from a different perspective, one constituted by a slow evolution from impediment to empowerment.
Although the insight that micro-effects on larger dynamic systems could serve as a means to scale and predict thermodynamic phenomena had roots in ancient philosophy, it emerged more fully during nineteenth-century research in physics and meteorology into the nature of dynamic systems. This discovery accompanied, and was abetted by, the turn in physics from a mechanistic to a probabilistic universe, which allowed the aleatory nature of the world to be leveraged. It also prefigures the import of system hermeneutics and interpretation as key to military operational technics in relation to the battlefield environment from the Cold War to the present. A consistent logic of microslicing dynamic environmental systems connects the different historical periods of military weather intervention: a windowing procedure for finding a signal, possibly an exploitable one, in the noise of air or water flow.
The temporalities invoked in this article are long but connected by the logics and technics found in the conversion of temporal phenomena to spatial units for storage, prediction, and manipulation. These are extracted as micro-units to potentially affect the larger dynamic system. This repetitive and generative logic of time-space configuration involved consistent refinement and application: from atomism in antiquity, to dynamic systems research in the nineteenth and twentieth centuries, to Cold War military research and contemporary military applications. The longer trajectories find entry points through specific technoscientific military phenomena: the influence of microclimates and meteorological modeling in World War I, atomic weapons development and the emergence of ecological research, and the contemporary application of very short-term low-atmospheric turbulence in twenty-first-century urban warfare.
These three moments provide specific anchors for the longer interwoven tales leading to them. In each section that follows, the logics and technics that allow the exploitation of the micro within these dynamic systems invoke and trace these different temporal trajectories to tell an overarching narrative about military strategy past and present as embedded within philosophy, science, and technology.
Military Weather and Atomic Ecologies: From Prediction to Weaponization
“Ecological catastrophes are only terrifying for civilians. For the military, they are a simulation of chaos, an opportunity for the state of warfare, which is all the more autonomous as the political state dies out.”—Paul Virilio (1986, 65–66), Popular Defense and Ecological Struggles
The ineluctable connection between the development of nuclear/atomic energy as a reality and environmental study as simultaneously a fear and an opportunity sprang essentially from the same moment in 1945 “when a white spark flashed” (Galway Kinnell). In her stellar study of South Pacific islands such as Bikini Atoll as labs for both weapons and ecological experimentation, Elizabeth DeLoughrey observes that “few [scholars] have traced the close relationship between the rise of the Age of Ecology and the Atomic Age, the multi-constitutive relationship between radioactive materialism and the study of the environment” (2012, 167).[4] She cites Donald Worster’s book Nature’s Economy as a vital starting point for understanding the co-constitutive relationship between atomic energy, weaponry, and testing and the emergence of ecology as a field of study at US universities, and she quotes Worster on the foundational moment that forged this union: “The Age of Ecology began on the desert outside Alamogordo, New Mexico on July 16, 1945, with a dazzling fireball of light and a swelling mushroom cloud of radioactive gases” (Worster 1994, 339; also see Sloterdijk 2009 on a related set of issues).
The title of Worster’s book owes something to Martin Heidegger’s argument that technological culture casts nature as standing reserve for the materialist extractive economies and environmental technics deployed to divide the world in Cold War geopolitics. DeLoughrey examines Cold War military intervention into the earth’s environments and understands the emergent study of ecology as a site of Manichean struggle between destruction and preservation with the environment itself as the battleground. This article does not concentrate on the military’s influence on ecological thinking, as such, but rather on how the environment and weather merely add to the military’s sources for potential weaponization. In this case, military planning approaches ecology, especially meteorology and wind, as a medium of potential strategic superiority with the destruction or preservation of that medium, from this Department of Defense perspective, as a secondary or a tertiary issue.
The connections between military and civilian meteorology are deep and difficult to disentangle. Fronts, as a meteorological metaphor, come from the military’s lexicon, and it was after all a repurposed telegraph and communications network from the US Civil War that initiated the first national weather information system. From antiquity until essentially World War II, the military usually considered weather an obstacle to achieving its goals. The case still holds, of course, and is often played out tragically in the seasonal display of armaments and conflict.[5] In Weather and War, John Fuller of the US Air Force Weather Service elegantly unfolds a plethora of meteorological effects on battle plans and events. One paragraph details numerous examples of winds and gales wreaking havoc in military plans for conflicts on land and sea, from a fifth-century BC account by Herodotus to a thirteenth-century “divine wind” driving Mongol invasions from the shores of Japan to Elizabethan naval battles between the English and the Spanish.[6]
The history Fuller details is built on the narrative arc of improving meteorological prediction and modeling, resulting in a correspondingly influential role in military decision-making, from the date and site of the D-Day invasion to the exact timing for the atomic bombs to be dropped on Hiroshima and Nagasaki. These atomic targets might have been chosen for their absence of military engagement to better understand, chart, and document the effects of the bombs (chilling laboratory geographic tabulae rasae, as it were), but the specific day and timing were determined by the military’s meteorology: the time-space configuration for nuclear conflagration in anger was planned by humans but determined by the meteorological gods found in predictive modeling.[7] All of which leads us back to DeLoughrey and nuclear testing in the Bikini Atoll and elsewhere: the dream of pristine and isolated, though clearly inhabited, environments (so essential to extractive enclosures of settler colonialism) for producing “laboratory conditions” beneficial to military testing purposes. This, in turn, suggests even bigger dreams in the heady Cold War moment of military technology unbound to turn the environment and the meteorological conditions in which it exists into a medium of global military gain.
The focus on the environment in the atomic era initially began in an instrumental fashion for the military in that weather, and especially wind patterns, exerted profound influence on the modeling of fallout directions and projected death tolls. Quickly, the models fired imaginaries surrounding the potential of the environment as not solely a recipient or victim of nuclear weapons, but instead as a malleable medium and therefore an agent of military leverage: the unharnessed atom yielding the potential to weaponize wind and weather. Weather modeling transformed into weather molding, shaping natural phenomena into productive delivery systems of military planning and thus an essential element in future scenario speculations.
Some examples of this enthusiastic embrace of geo-engineering for military advantage include nightmare scenarios in which the enemy could similarly use such innovations: speculation about the Soviet capacity to render Florida an arctic landscape or to create a permanent drought in the farm belt of the Midwest. The Department of Defense (DoD) hatched a scheme in 1949 to examine the possibility of exploding nuclear bombs deep underwater to reshape the ocean floor and thus affect ocean currents and the various climates they influence (Marzec 2015, 34–35). The technophilic outpouring of military control that so characterizes the Cold War epistemic shift is not limited to the weather, of course, but also includes explorations of geological, chemical, oceanographic, biological, and botanical warfare.[8] A Virilian trajectory of dromoscopy and technicity plays out in these schemes, from the harrowing to the hilarious, that have so demarcated our world for the past eighty years.
The seemingly inevitable and inexorable trajectory of military ecological and meteorological control eventually hit some guardrails, or at least the attempt to construct them, with a UN treaty: the Convention on the Prohibition of Military or Any Other Hostile Use of Environmental Modification Techniques. This international agreement prohibits the deployment of environmental modification techniques that could have widespread, long-lasting, or severe effects. It was ratified by the United States in 1980; one can hope for some blushes at the moment of signing the official document, considering that the United States had made Southeast Asian jungles a site of industrial-scale reconfiguration.[9] In spite of this treaty’s good intentions, weather hacking and control, as initiated in the 1940s, remains a viable arm of DoD research and development. Examples include controlling floods and droughts to undermine enemy economies, changing the color of the ice caps to reflect rather than absorb the sun’s rays, and creating artificial lightning to generate very low frequency radio waves in order to map underground structures.[10]
The role that weather-modification plays in military plans is rather baldly articulated in the executive summary of the research paper titled “Weather as a Force Multiplier: Owning the Weather in 2025.” Written in the 1990s, this document displays an unbridled technophilic vision of converting meteorological impediments into full force superiority. The executive summary is worth quoting at length:
In 2025, US aerospace forces can “own the weather” by capitalizing on emerging technologies and focusing development of those technologies to war-fighting applications. Such a capability offers the warfighter tools to shape the battlespace in ways never before possible. It provides opportunities to impact operations across the full spectrum of conflict and is pertinent to all possible futures. The purpose of this paper is to outline a strategy for the use of a future weather-modification system to achieve military objectives rather than to provide a detailed technical road map.
A high-risk, high-reward endeavor, weather-modification offers a dilemma not unlike the splitting of the atom. While some segments of society will always be reluctant to examine controversial issues such as weather-modification, the tremendous military capabilities that could result from this field are ignored at our own peril. From enhancing friendly operations or disrupting those of the enemy via small-scale tailoring of natural weather patterns to complete dominance of global communications and counterspace control, weather-modification offers the warfighter a wide-range of possible options to defeat or coerce an adversary. Some of the potential capabilities a weather-modification system could provide to a war-fighting commander in chief (CINC) are listed [elsewhere in the document].
Technology advancements in five major areas are necessary for an integrated weather-modification capability: (1) advanced nonlinear modeling techniques, (2) computational capability, (3) information gathering and transmission, (4) a global sensor array, and (5) weather intervention techniques. Some intervention tools exist today and others may be developed and refined in the future. (House et al. 1996, vi)
Presented as a research paper to the planning group Air Force 2025, this document reiterates a key element of military thought from the onset of military philosophy: the strategic desire to transform the aleatory and random into stability and predictability. It is not by accident that the authors of this paper, trumpeting the research necessary to make the weather yet another part of the US military’s arsenal, begin their aspirational thesis by invoking the once impossible task of splitting the atom. Not only was that gamble “high-risk, high-reward” and controversial; it also, they argue (rightly), reconstituted physics, the natural world, warfare, and geopolitics. Again moving in atomistic scalar fashion, they extol the benefits of weather-modification, from “small-scale tailoring of natural weather patterns” to complete “dominance” of the weather as the medium for global communications and spatial control, achieved through adverse weather conditions rather than in spite of them. Clearly a speculative document intended to whip up a technologically driven fervor among the military brass and in the US Congress, and thus to open the government’s coffers for more whiz-bang technological wizardry, it also follows the logics of atomistic thinking for realizing and enacting control and agency over that which is currently an impediment. This pattern of thought, grounded in the pattern recognition that marks the signal-noise distinction, is a familiar one for the document’s readers, and the authors know the power of this rhetorical move.
Wind: The Noisy Channel
“The history of climate science needs to be seen, then, as part of a history of scaling: the process of mediating between different systems of measurement, formal and informal, designed to apply to different slices of the phenomenal world, in order to arrive at a common standard for proportionality.”—Deborah Coen (2018, 16), Climate in Motion
“How strange that a-tom in Greek means the same as in-dividuum in Latin: unsplittable. The inventors of these words knew neither nuclear fission nor schizophrenia. Whence the modern compulsion to split into ever smaller parts, to split off entire parts of the personality from the ancient being once thought indivisible…”—Christa Wolf (1989, 29), Accident: A Day’s News
The release of atomic energy in the form of weapons is a signature moment of scaling from micro to macro, from atoms to atomistic logics, as deployed in the furtherance of probabilistic models of natural and human activity. The larger trajectory, from a physics of immutable laws to one of randomness and chance that eventually yields forms of control through the manipulation of small phenomena to effect large-scale change, reflects a persistent appeal of the micro as a site of agency: one that runs throughout Western thought and emerges in the mid-nineteenth century in uniquely powerful and contemporary ways.
In the 1860s, as the United States devolved into what is arguably the first technologically modern war, the philosopher and semiotician Charles S. Peirce was employed by the US Coast Survey office, for which he conducted measurements of coastal change and erosion. His job consisted of measuring and observing natural forces, as well as building bespoke scientific instruments to measure them (Hacking 1990, 202–3). Standing on the Eastern Seaboard and watching the gravitational pull of tides and waves, Peirce increasingly saw the pull of the twentieth century and its interest in indeterminacy and probability play out on the fractal-mottled beaches of the New England coast. What Peirce perceived in coastal formations and erosions and ocean wave patterns was a gradual receding of a deterministic universe run by immutable laws. As Hacking argues, the turn to chance as the actual immutable of the world reflects a reaction against a two-century-long fascination with determinism and universal laws in natural philosophy that relegated anomalies to the status of mere epiphenomena (1990, 1–15).
As the nineteenth century drew to a close, thinkers such as Peirce began to be persuaded that the aleatory and the random held more explanatory power about the workings of the material world than universal mechanistic laws could offer. In so thinking, they reached back to ancient Greek thought as found in the pre-Socratics Leucippus, Democritus, Zeno, and Pythagoras, and later in Epicurus (rearticulated by Lucretius), to make the case for the random nature of Nature. Each of these strands is built on an atomistic logic that, far from undermining agency in a universe interpreted as chaotic, instead bestows it with potential for further control. Rather like fate, a mechanistic universe negates human agency, while one predicated on chance restores it through the capacity to determine trends and signals in the noise of uncertainty. The emergence of statistics and probability in the late nineteenth century helped elevate indeterminism and the control it afforded. Pattern and signal could be determined not by the ultimate causality of universal laws but by statistical calculation within discrete sets.
In both the deterministic and the probabilistic universes, though, the ancient allure of digitality holds through the sustained microfoundations of mnemotechnics found in letters and numbers. The rise of measurement, modeling, and positivism helped “the imperialism of probability” come to the fore but required a world increasingly reliant upon numerical thought (Hacking 1990, 5). Atomistic or digital thought and technologies repeat the continuing appeal of the micro for certain human cultures. Minima, or indivisible units, provide the world’s building blocks for numerous natural philosophies in antiquity, especially in classical Indian thought such as Buddhism and Jainism, though in very different ways, as well as underpinning Dravidian languages and their metaphysics (Carpenter and Ngaserin 2020). For Greek atomism, these foundational elements of matter and the universe are indestructible but ever reassembled into the world humans occupy and perceive as static reality. Thus, as Peirce was throwing his lot in with the roll of the dice, Bertrand Russell and other analytic philosophers were developing a logical atomism built upon “indivisible facts,” understandable and true in their own right—a limited number of knowable constituent elements of the world. The micro embodies, repeats, and generates a faith in scalability and control.
Atomism in philosophy differs from, but informs, atomistic logic as such beyond the realm of philosophical systems, inasmuch as the emphasis on micromateriality, founded on its indivisibility and capacity for metonymic scaling, exists outside of strictly bounded philosophical systems and finds capacious application elsewhere. Its role in disparate models such as indeterminacy and the aleatory, in contrast with probability and the stochastic, means that atomism provides some counterintuitive connections between apparently different but related phenomena. Francis Galton’s turn to statistical models and biometrics in the late nineteenth century helped elevate both the law of indeterminism and the control it afforded. In the chaos and noise, pattern and signal could be discerned without having to cede to universal determinants: causality yields to statistical calculation. The paradox of indeterminism is resolved in understanding noise as the necessary channel in which a signal can be found (Hacking 1990, 2).
The role of chance and probability in supplanting determinism as the primary form of causality in the world resulted, according to Ian Hacking (1990, 4), in “a quadruple success [that was] metaphysical, epistemological, logical and ethical,” to which we can also add ontological. Probability is, then, the philosophical success story of the first half of the twentieth century. In other words, rather than searching for transcendental causality as the signal above the noise of the world’s chaotic tumult, probability inverted the metamove and found signal in the noise of actors, actions, and being. But “the imperialism of probabilities could only occur as the world itself became numerical” (Hacking 1990, 5), or increasingly numerical. The role of the ancient digital mnemotechnic of numbers in this fundamental shift in interpreting the world helps secure the purchase of measurement, modeling, and positivism.
However, the faith in probability and inductive reasoning does not elide or negate the positing of a semitranscendental causality or trait. The exploration of turbulence and dynamic systems shifts toward a scalar understanding of the relationship between a mechanistic and a probabilistic world, in quasi-dialectical fashion, by identifying patterns present in both micro and deep structures, thus revealing an “orderly disorder” operative in empirical complexities (Hayles 1990). Such a gesture is alluded to by Thomas Kuhn in his article “The Function of Measurement in Modern Physical Science” when he writes, “The road from scientific law to scientific measurement can only rarely be travelled in the reverse. To discern quantitative regularity one must normally know what regularity one is seeking and one’s instruments must be designed accordingly” (Kuhn 1961, 189–90; italics in original). In Kuhn’s formulation, the signal is assumed and precedes the noise in which it is found, and the instruments that find that signal are attuned to do so. The target translates to the bow and quiver before the archer even reaches for them.[11]
Turbulence Prediction: The Synaesthetic Continuity of Cold War Military Seismology
“[The] appropriate application of weather-modification can provide battlespace dominance to a degree never before imagined. In the future, such operations will enhance air and space superiority and provide new options for battlespace shaping and battlespace awareness. ‘The technology is there, waiting for us to pull it all together’ … In 1957, the president’s advisory committee on weather control explicitly recognized the military potential of weather-modification, warning in their report that it could become a more important weapon than the atom bomb.”—Tamzy J. House et al. (1996, 3), “Weather as a Force Multiplier: Owning the Weather in 2025”
The question of how to find a hidden signal in noise that others might miss not only provides a profound insight into the emergence of probability, as delineated by Hacking and others; it also becomes amplified in the turn to dynamic systems and, eventually, chaos theory. This turn complicates the somewhat overly neat temporal transition from a mechanistic to a probabilistic world. The complexity led to the understanding that apparently random systems can and usually do have a deep structure—not an immutable law underpinning them per se but a discernible structure that makes sense of microdynamics and subvariations, which from one position appear wholly chaotic and random and from another form a truly productive and predictive pattern partially obscured by noise (see Hayles 1990, 143–74).
As such, how one models a system proved key to the breakthroughs in studying turbulence from Edward Lorenz to Mitchell Feigenbaum and on to Kenneth Wilson. Wilson’s Nobel Prize–winning research in the simultaneity of “chaos and symmetry” (Hayles 1990, 154) analyzed the transition of a dynamic system from a state of flow to one of turbulence—with turbulence being the military’s biggest obstacle in weaponizing wind. A river’s flow contains microscopic fluctuations that cancel each other out, creating chaos at a micro level but smooth functioning at a larger one. Occasionally microscopic deviations persist, multiply, and magnify macroscopically, resulting in stagnant or violent conditions (Hayles 1990, 154). Borrowing from quantum mechanics, Wilson shifted from an empirical approach that considered the macrophenomenon of turbulence to an analytic approach focused on the micro-elements that recursively scale from miniature versions of forces found in the larger turbulent flow (Hayles 1992, 235–37). His shift followed leads offered a century earlier by Hann’s work on microclimates and by Helmholtz, who observed while hiking in the Swiss Alps that smaller eddies of whirling clouds fed into larger versions of themselves. Wilson also took up insights from Osborne Reynolds, likewise in the 1880s, about scaling dynamic systems, insights revealing that large-scale turbulence could be studied using small-scale models (Coen 2018, 212–13). Thus, a century later, Wilson built on these ideas and interpreted certain quantities of turbulence not as fixed (as when seen from afar) but as variables (when perceived up close). The shift of scale, and the different perspective it allowed, made it possible for previously ignored elements to be measured in their larger repetitive occurrences. The choice of what is measured—that is, determining what the signal is—and what is not measured as noise underscores how disadvantage can strategically convert to its opposite: in this case, how turbulence through microtechnics scales to a form of control.
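Reynolds’s insight admits a compact formal statement, offered here as a minimal sketch of the similarity principle just described (the pipe-flow threshold cited below is a standard textbook value, not a figure drawn from Coen):

```latex
\mathrm{Re} = \frac{UL}{\nu}
```

Here \(U\) is a characteristic flow velocity, \(L\) a characteristic length, and \(\nu\) the kinematic viscosity of the medium. Two flows sharing the same Reynolds number behave alike regardless of their absolute size, which is why a small, fast laboratory model can reproduce the turbulent regime of a large, slow atmospheric flow; in pipe flow, for instance, the transition from smooth to turbulent motion begins near \(\mathrm{Re} \approx 2300\).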
Such efforts have long temporal trajectories, as is the case with all scientific and technological innovation and as already mentioned. World War I generated an intensification of interest in the irregular movement of gases, liquids, and other materials found in dynamic systems, especially as it could be applied to aircraft and submarine performance, as well as to the use of chemical gas (Coen 2018, 217–18). As seen in Poirot’s battlefield decisions in our earlier example, civilian meteorology shifted to a military footing with ease, and the problems of scale (contained in space or in open atmosphere) presented new challenges to researchers, with seemingly more at stake than scientific knowledge for its own sake. The value of scale modeling, emergent prior to the war, took on even greater import in the post–World War I moment.
The military’s interest in dynamic systems then and now is by no means limited to the explicitly fluid forms of air and water but also includes those on and in land: the not-so-solid ground beneath our feet. The land itself is not simply the crust of the earth’s surface but a heaving and vital combination of biomass, phytomass, and geological strata compiled over eons of tectonic alteration and mobility. These elements of our planet that strike our senses as solid and stable actually ebb and flow in constant processes of metamorphosis, as Empedocles, Hesiod, and (through atomism) Lucretius, among others, explored philosophically. And this instability of terra firma leads directly to the conversion of turbulence into military boon.
Landscapes constitute “semipermanent registers of fluid dynamics” (Coen 2018, 218). Thus, the military’s continued interest in dynamic systems resulted not just in military meteorology but also in military seismology, which, especially during the Cold War, became a discipline of intensive investment. Nowhere does this become more apparent than in the pressing need to verify nuclear test ban treaties once covert testing was driven underground, away from the more readily observable sites on land or in the air.[12] A brief diversion into the military’s geoacoustic solution to this verification problem will reveal the exact same logic of windowing a signal out of a vast amount of noise that is now touted as a means for modeling turbulence in the lower part of the atmosphere.
During the height of the Cold War, the capacity to discern the difference between an underground nuclear explosion and natural geological occurrences, such as earthquakes, resided in a particular kind of judgment built into the software of remote sensors. It entailed transferring that discernment to a frequency, or sine wave, for specific interpretation.[13] The viability of nuclear test ban treaties, such as the one ratified in 1963, depended on the capacity for verification and hence on a machinic discernment in the form of a sine wave. To accomplish this feat, a centuries-old mathematical formula was resuscitated and sped up for fast calculation of frequency spectra. The formula has subsequently become perhaps the most ubiquitous algorithm in use today. Known as the fast Fourier transform (FFT), it operates in a host of applications, including but by no means limited to audio signal processing, all forms of scientific imaging, data visualization, pattern recognition, and electronic music. In its essence, Fourier analysis understands space-time configurations as frequency: “anything that takes place in time can be expressed as frequency” (Chua and Rehding 2021, 69; italics in original).
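The mathematics underneath this claim can be stated briefly. The discrete Fourier transform re-expresses \(N\) time-ordered samples \(x_0, \dots, x_{N-1}\) as \(N\) frequency coefficients:

```latex
X_k = \sum_{n=0}^{N-1} x_n \, e^{-2\pi i k n / N}, \qquad k = 0, 1, \dots, N-1
```

Computed directly, this requires on the order of \(N^2\) operations; the FFT’s recursive factorization reduces the cost to the order of \(N \log N\), the speedup that made near-real-time spectral analysis of long seismic records computationally feasible.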
The FFT emerged in a specific moment for a very specific military purpose: remote verification capabilities to detect and discern Soviet nuclear testing underground. The means of detecting atomic tests remotely through seismological time series could help justify ratifying the test ban treaty the Kennedy administration sought. This information could be accessed by various offshore detectors; however, the computing power and time required for such analysis precluded this solution until the mathematician John Tukey reduced the discrete Fourier transform to its faster version: the FFT. To develop a system specifically for the remote detection of underground nuclear explosions, ARPA (the R & D branch of the US military) sought acoustic signatures for earthquakes and explosions in order to distinguish between the two and to verify activity in Siberia, a site of both underground nuclear testing and much tectonic shifting. In such a landscape, the Soviets could aurally camouflage nuclear tests or claim that detected signals had been misinterpreted and were actually natural phenomena rather than military-technological ones. The acoustic sine wave becomes, in essence, the atomic unit of the FFT synaesthetically rendered as visual sign. The spectrum analysis of sound and light frequencies afforded by FFT calculation results in microtemporal units of remotely accessed geoacoustics converted into sine wave signatures that distinguish one signal from another in the noisy channel of the earth’s crust. The medium of the earth’s crust, like that of the wind, poses channel-switching potentials for the frequency detection capabilities of the FFT for military information and deployment.
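To make the windowing logic concrete, the following Python sketch applies it to synthetic traces. Everything here is invented for illustration: the signals, the five-second window, and the toy high-to-low frequency discriminant bear no relation to the actual discrimination criteria of Cold War seismology.

```python
import numpy as np

# Illustrative sketch only: synthetic traces, not real seismograms, and a
# toy discriminant, not actual verification criteria. All names and
# parameters here are hypothetical.

FS = 100.0                       # sampling rate (Hz), typical of seismic records
t = np.arange(0, 60, 1 / FS)     # sixty seconds of simulated ground motion

rng = np.random.default_rng(0)
noise = 0.2 * rng.standard_normal(t.size)

# An "earthquake-like" trace: low-frequency energy, slow rise and decay.
quake = np.sin(2 * np.pi * 1.0 * t) * np.exp(-(((t - 30) / 15) ** 2)) + noise

# An "explosion-like" trace: a short, impulsive, broadband burst.
envelope = np.exp(-(((t - 30) / 0.5) ** 2))
blast = 5 * envelope * rng.standard_normal(t.size) + noise

def windowed_spectrum(x, fs, win_s=5.0):
    """Slice the trace into short windows and average their FFT magnitudes:
    the 'windowing' of a long noisy channel into micro-units."""
    n = int(win_s * fs)
    segments = [x[i:i + n] * np.hanning(n) for i in range(0, len(x) - n, n)]
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    mean_mag = np.mean([np.abs(np.fft.rfft(s)) for s in segments], axis=0)
    return freqs, mean_mag

def high_to_low_ratio(x, fs, split_hz=5.0):
    """Toy discriminant: impulsive explosions spread energy into higher
    frequencies than slow tectonic ruptures do."""
    freqs, mag = windowed_spectrum(x, fs)
    return mag[freqs >= split_hz].sum() / mag[freqs < split_hz].sum()

print(f"earthquake-like high/low energy ratio: {high_to_low_ratio(quake, FS):.2f}")
print(f"explosion-like  high/low energy ratio: {high_to_low_ratio(blast, FS):.2f}")
```

The point is the procedure rather than the numbers: slicing a long noisy record into short windows, transforming each into the frequency domain, and asking where the energy sits is precisely the signal-from-noise windowing described above.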
The algorithm plays a significant part in the testing and application of the Lattice-Boltzmann wind modeling system used for turbulence prediction. More interesting for us here, though, is that both the FFT application for sine wave windowing and the Lattice-Boltzmann method display the same logic of framing small units of large noisy channels to find the signal needed to alter or use the dynamic system of which it is a part. For this specific application, the military uses FFT technology and logic to model and predict wind turbulence in the planetary boundary layer, the lowest layer of the atmosphere. At this altitude, wind is altered by trees, buildings, and other vertical phenomena disruptive to the horizontal flow of air. Essentially microslicing vertical parts of this horizontal dynamic system in order to predict turbulence that might affect battlefield conditions and possibilities means that the soldier in the field becomes an updated version of Poirot, converting microphenomena into a controlled variable.[14] Field soldiers are equipped with handheld gear capable of phenomenal computational power and can thus access Lattice-Boltzmann modeling processes that exploit the underlying compression of complicated forces the FFT made quickly calculable. This method predicts dynamic fluid behavior on a very small scale, slicing it out of the larger complex flow and removing a small portion of the turbulent atmospheric forces (see ARL Public Affairs 2018). As with the FFT, the Lattice-Boltzmann model eschews the differential equations of turbulence modeling by plucking spatially contained segments out of the air continuum, reducing computational calculation and time. The window for sound signatures that proved beneficial for remote geoacoustic sensing appears here as the window for a turbulence microsignal within the larger chaotic environment.
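The core of the method can likewise be sketched. Below is a minimal single-relaxation-time lattice-Boltzmann loop in Python on the standard D2Q9 lattice; it illustrates the general technique only, since the Army Research Laboratory’s operational model is not public and the grid size, relaxation time, and initial shear flow here are invented for demonstration.

```python
import numpy as np

# Minimal single-relaxation-time (BGK) lattice-Boltzmann sketch on a D2Q9
# lattice with periodic boundaries. Illustration of the general technique
# only: the grid size, relaxation time, and initial shear flow are invented.

NX, NY, TAU = 200, 80, 0.6      # lattice size; TAU sets the fluid viscosity

# The nine discrete velocities of the D2Q9 lattice and their weights.
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)

def equilibrium(rho, ux, uy):
    """Maxwell-Boltzmann equilibrium truncated to second order in velocity."""
    feq = np.empty((9, NY, NX))
    usq = ux ** 2 + uy ** 2
    for i in range(9):
        eu = e[i, 0] * ux + e[i, 1] * uy
        feq[i] = w[i] * rho * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * usq)
    return feq

# A gentle horizontal shear layer with a tiny vertical perturbation: the
# kind of micro-deviation that can persist, multiply, and magnify.
rho = np.ones((NY, NX))
y = np.linspace(0, 1, NY)[:, None]
ux = 0.05 * np.tanh((y - 0.5) * 20) * np.ones((NY, NX))
uy = 0.001 * np.sin(2 * np.pi * np.arange(NX) / NX) * np.ones((NY, NX))
f = equilibrium(rho, ux, uy)

for step in range(1000):
    # Collision: each population relaxes toward its local equilibrium.
    rho = f.sum(axis=0)
    ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / TAU
    # Streaming: each population hops one site along its lattice velocity.
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], e[i, 0], axis=1), e[i, 1], axis=0)

print("mean kinetic energy:", float((0.5 * rho * (ux ** 2 + uy ** 2)).mean()))
```

What matters for the argument is the locality: each lattice site updates from its immediate neighbors through simple collision and streaming steps, with no global differential equations to integrate, which is why the method compresses so well onto the handheld computational hardware described above.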
Paul Virilio’s concept of dromology, or increased speed as the primary goal and teleology of technology (1986), underscores the battlefield advantage this turbulence modeling affords, an advantage resulting from the computational speed gained by microslicing complex forces deemed unpredictable at a larger scale. This affordance is not available to those whose meteorological accounts of current and near-future conditions discern only the noise of airborne turbulence and not the signal embedded within it. This, in turn, becomes the condition of possibility for military meteorological advantage. In the same way that Poirot turns the microclimate of specific spatial conditions into a window of opportunity to avoid the blowback of a gas attack predicted at a larger scale of wind conditions, the Lattice-Boltzmann model does exactly the same for US soldiers in complex urban or natural environments, but without the need for information obtained from observing bird flight. Microcontrol converts noise to signal and aleatoric flux into potential strategic success.
Refrain: “Don’t Follow the Wind”
“Mr. President, I’m not saying we won’t get our hair mussed. But I am saying 10–20 million killed tops. Depending on the breaks.”—General Buck Turgidson in Dr. Strangelove (Kubrick 1964)
“According to what laws and how quickly does radioactivity spread? Best for whom? And would those living in the immediate vicinity of the explosion have a slightly better chance if it were spread by a fair wind? If it were to ascend to the higher strata of the atmosphere and there set off on its journey as an invisible cloud? In my grandmother’s day the word ‘cloud’ conjured up condensed vapour, nothing more…”—Christa Wolf (1989, 9), Accident: A Day’s News
“Radioactivity / Is in the air / For you and me”—Kraftwerk, “Radioactivity”
Off the coast of Bikini Atoll, where the United States tested its first deliverable hydrogen bomb, a small fishing boat from Japan called the Lucky Dragon sailed unwittingly through the detonation fallout. Aikichi Kuboyama, the ship’s radio operator, had through radiation exposure the misfortune of becoming the first fatality of the hydrogen bomb. In so becoming, he joined a long, grisly list of victims proximate to nuclear test sites, from deserts in the American West to Kazakhstan to Algeria to the Australian outback and the South Pacific waters and islands, prior to the advent of the Nuclear Test Ban Treaty as made possible by FFT-generated remote sensing seismology. Kuboyama became a “downwinder,” a designation descended from the blowback victims of World War I gas attacks and civilian collateral damage (see n. 3) and updated for radiation fallout patterns.
“Don’t Follow the Wind,” the title of this brief refrain, originates with an ongoing art exhibition inside the restricted and irradiated Fukushima Exclusion Zone. Opened in 2015 and allowing visitors as of 2022, the exhibition includes the work of twelve international artists who installed site-specific pieces in the abandoned homes of Fukushima residents evacuated immediately after the Fukushima Daiichi Nuclear Power Plant disaster in 2011. The exhibition was curated by the Japanese artist-activist collective Chim↑Pom. These evacuees are also downwinders, resulting not from military nuclear tests but from power plant accidents in the civilian sector, as at Chernobyl and Fukushima. “Downwinders” became an officially recognized designation of victims in the same year as the Fukushima disaster, when the US Senate designated 27 January the National Day of Remembrance for Downwinders. The resolution reads, in part: “The Downwinders paid a high price for the development of a nuclear weapons program for the benefit of the United States.” Who becomes a downwinder is left to the whim of the wind and meteorological predictions… or seemingly so, and their victimhood is the inevitable collateral damage of military-technological progress. The analytic power devoted to microslicing phenomena for microclimatic prediction proved a belated arrival for the disposable communities placed in aleatoric harm’s way.[15]
Kuboyama’s recorded voice later became source material for a piece of experimental electronic music composed at the WDR lab in Cologne, where composers with large Cold War investments in sound labs worked with atomistic building blocks of sound to transform, manipulate, and scale: the phoneme and the sine wave became techniques of aleatoric composition. Herbert Eimert’s “Epitaph for Aikichi Kuboyama” (composed 1957–1962) is a work of tape loops and electronics that uses spoken word, looped utterances, and electronic manipulation to create a darkly chaotic soundscape. The same logic of microframing and slicing at work in Lattice-Boltzmann turbulence prediction, and the sound windowing operative in the FFT’s granular treatment of remote acoustic sensing for treaty verification, also operates in Eimert’s composition as he mimetically invokes the dissolution of the voice and corporeal frame of the Japanese radio operator on the Lucky Dragon fishing boat. Eimert’s use of the Tempophon electronic device allowed him to capture speech and compress it into a spatial rather than temporal object (as with FFT slicing) without losing pitch: the moment had become infinitely repeatable and expanded into static.
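The Tempophon’s operation, replaying small slices of a recording at their original speed while spacing them differently in time, survives in digital form as granular time-stretching. The following Python sketch assumes a mono signal array; the grain and hop sizes are illustrative, and the phase alignment used by real tools is omitted.

```python
import numpy as np

# Naive granular time-stretch: slice a signal into overlapping grains, then
# overlap-add them at a different spacing. Pitch is preserved because each
# grain is replayed at its original speed. Grain/hop sizes are illustrative.

def granular_stretch(x, factor, grain=1024, hop=256):
    window = np.hanning(grain)
    out = np.zeros(int(len(x) * factor) + grain)
    norm = np.zeros_like(out)           # track window overlap for rescaling
    pos = 0.0
    while pos < len(x) - grain:
        g = x[int(pos):int(pos) + grain] * window
        j = int(pos * factor)           # grains land farther apart on output
        out[j:j + grain] += g
        norm[j:j + grain] += window
        pos += hop
    return out / np.maximum(norm, 1e-9)

# A toy "voice": a 220 Hz tone. Stretching by 4x quadruples the duration
# while the dominant spectral peak stays near 220 Hz.
sr = 16000
t = np.arange(0, 1.0, 1 / sr)
voice = np.sin(2 * np.pi * 220 * t)
stretched = granular_stretch(voice, 4.0)
peak_hz = np.argmax(np.abs(np.fft.rfft(stretched))) * sr / len(stretched)
print(f"duration: {len(stretched) / sr:.2f} s, dominant pitch: {peak_hz:.0f} Hz")
```

Each grain preserves its original playback rate, so duration changes while pitch does not: time rendered as a spatial series of repeatable micro-units.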
Like Brownian molecules in constant collision and motion, the phonemes in Eimert’s piece bounce in and out of aural focus from their static-ridden proto-noise/industrial music soundscape. The voice as carrier of the semantic signal becomes impossible to recognize as voice. The particles of sound found in phonetics emerge in their particulate specificity before dissolving into indecipherable blips, a cryptography with little apparent signal except the crypt to which downwinders are carcinogenically consigned by aleatory forces that elude military strategic control.
In our epigraphs for this closing refrain, when the character Buck Turgidson discusses the potential death toll for the United States (the “hair mussed” that would result from launching a full offensive nuclear attack on the Soviet Union), “the breaks” he refers to depend on weather conditions, primarily wind directions. The first-person narrator in Wolf’s novel about living downwind of the Chernobyl nuclear meltdown ponders the same concerns as she parses the platitudinous scientific facts espoused by experts on state radio newscasts regarding degrees of endangerment dependent on unpredictable wind conditions. Chronologically between the film and the novel, Kraftwerk robotically intone our collective aleatory potential: punningly, the radio activity of Wolf’s newscast is the radioactivity of Turgidson’s military megadeath modeling. The chief difference is that the radio activity of terrestrial broadcast is confined to a mostly predictable channel while airborne radioactive isotopes are not. Yet both are in the air for you and me.
Banner Image: “Environmental Racism in ‘Death Alley’, Louisiana,” used with permission from Forensic Architecture.
I wish to thank my colleagues in the Weather Report project for their constant enthusiasm and inspiration, as well as editorial suggestions on this article. The two anonymous reviewers for the journal provided productive insights that have shaped this version of the piece, and their time, attention, and thoughtful consideration of the article are greatly appreciated. I also wish to thank specifically Sean Cubitt, Jussi Parikka, Bobby Pietrusko, and Cera Tan for thoughts and comments on earlier drafts of this article. This article and the special issue to which it contributes are made possible by AHRC-DFG funds for the Weather Report project, for which I serve as PI, as does Birgit Schneider, with Jussi Parikka as Co-I and Maximilian Hepach and JR Carpenter as Research Fellows. Further thanks are due to the University of Southampton for Open Access funding for this article.
Given the various birth years attributed to him and Christie’s own characterization of Poirot, he would not have served in World War I. This opening sequence is thus apocryphal within the diegetic world of Christie’s character.
“Blowback” enjoys the ignominious company of other military euphemisms, including “friendly fire” (killing soldiers on your own side), “collateral damage” (civilian deaths from an attack), “emergency release of airborne ordnance” (accidental aerial bombing of friendly or home territory), “unhousing the workforce” (bombing civilian urban centers), and “controlled flight into terrain” (an aircraft crash). Originally referring to a rifle action that used propellant gases to reload the empty chamber, “blowback” became attached in the late nineteenth century to gases emitted from machinery and, from there, in World War I to a kind of friendly fire in which poison gas blew back into the faces and trenches of those who had fired the canister. This is the sense in which Heidegger was concerned with the term, though he did not use it. Finally, blowback is now used in intelligence circles for an unforeseen and unwanted effect of a given action—for example, training and arming Osama bin Laden to fight the Soviet invasion of Afghanistan, resulting in the World Trade Center terrorist attacks.
In the time since DeLoughrey published this piece, numerous scholars in geography and stratigraphy have explored these issues within Anthropocene discourse.
As I write this, the Russian invasion of Ukraine is entering its second spring, and in this instance spring negates rebirth except that of armed hostilities, as the eerily retro ground war in the region—replete with trenches and tanks—can gain traction again in warmer weather.
Fuller’s account of the latter includes the angry response of King Philip II of Spain that he had sent his country’s vessels to fight the English, not storms.
In fact, the site itself was eventually determined by weather conditions rather than weather prediction. Despite the forecast, the original target, Kokura, was enshrouded by unexpected cloud cover that prevented it from being enshrouded in an atomic mushroom cloud. That fate fell on Nagasaki as the B-29 altered its course.
See Robert G. Pietrusko (2020) on aerial photography of vegetation as a proxy for numerous phenomena from soil composition to crop yield models; and for the military application of pesticides and “the Green Revolution,” see Nick Cullather (2013).
For more on the conversion of Southeast Asian jungles into a remote-sensor-laden experiment in environmental weaponization, see Pujit Guha (2023).
Some of these were just wacky ideas that never got very far, or off the drafting table, but the artificial lightning project had some success and implementation; see Bishop (2011), reprinted in Peter Adey, Mark Whitehead, and Alison J. Williams, eds., From Above: War, Violence and Verticality (London: Hurst & Company, 2014), 186–202; and in Eyal Weizman, ed., Forensic Architecture (Berlin: Sternberg Press), 580–91.
Interestingly, the word “stochastic” derives from the Greek stochos, a target or aim, making a conceptual link with “target” worth some consideration.
Such monitoring demands were codified in a very large and long-running program known as Vela Uniform, which was meant to detect all forms of nuclear testing—underground, terrestrial, stratospheric—and to provide total observational potential for any form of nuclear explosion (in test or in war) from the height of the atmosphere to the bottom of the ocean and underground. Begun under the Eisenhower administration and intensified under Kennedy, the project suddenly made seismology a strategic science in the eyes and ears of DoD. Most, though not all, of the documents and research produced by Vela Uniform were left unclassified to indicate US commitment to nuclear test detection as a global geopolitical project. For a sustained overview of seismology and Vela Uniform, see Barth (2003), part of a special issue of Social Studies of Science titled “Earth Sciences in the Cold War.”
For a more detailed discussion of the FFT in relation to military seismology and geoacoustics, as well as sound installation work for gallery space, see Bishop (2024).
In a sense, Poirot’s birds provided him the same windowing opportunity as the FFT, the microslicing of turbulence dynamics for short-term prediction.
For a related personal and critical reading of the disposability of “downwinders” in the US Southwest, see Borunda (2022).