Network topology diagram from On a Distributed Command and Control System Configuration (Baran, 1960: 26).

In her article Government, Business, and the Making of the Internet (2001), Janet Abbate traces the interactions between competing interests in the technologies of digital networking from the mid-1970s to the 1990s. Abbate provides an account of the initial impetus behind early research on the ARPANET—now generally understood as the technical foundation of the internet as we know it today. Contrary to many other scholars writing on this history, Abbate contends that designing a system that could survive nuclear attack was not the priority of the developers at the Advanced Research Projects Agency: the real motivation was the more immediate problem of sharing resources among researchers at universities and other technical institutions dispersed across the United States (150). The story of the internet's creation is replete with lore and myth, and unsurprisingly the reality is much more complex. To state the obvious, the internet is not a singular technology that was "invented" in an instant: rather, it drew on a multitude of inventions, many with origins older than the Cold War—"the internet is really the work of a thousand people" (Baran, quoted in Isaacson, 2014: 245). The coming-together of telecommunications concepts, digital computing, and electrical systems to form a new communications network had commercial, military, and governmental applications—but where exactly does the connection with national defense against nuclear attack come from?

A memorandum in the RAND archives, titled On a Distributed Command and Control System Configuration and predating the ARPANET by almost a decade, draws a direct link between the possibilities of distributed networking and the high anxieties of the Cold War. Written by Paul Baran and published on New Year's Eve 1960, the document sets out his initial theories of how a new conceptualisation of distributed command and control would enable the military command structure to survive a nuclear attack. The implications of this idea for the careful balance of forces necessitated by Mutually Assured Destruction (MAD) were significant: it could potentially overcome the persistent fear that a carefully planned surprise attack on the "centralised" US defense apparatus could render it completely inoperable, nullifying the possibility of a counterattack. For the Soviets, a surprise nuclear attack that disabled any military response must have been an attractive option—or so the thinking went amongst American defense strategists. A fully distributed command and control system, however, spread out across the entire country, would all but guarantee a nuclear response from the US and thus act as a strong deterrent.

The Nuclear Accident and the "Insane Man" Hypothesis

As justification for his call for a renewed analysis of the US military command structure, Baran cites the military strategist Herman Kahn, who shocked America in the late 1950s with controversial theories published in a series of lectures, memos, and his best-selling book On Thermonuclear War (1960b). Kahn's claim that a nuclear war was in fact "winnable" was qualified in an intensely rational account that, among other things, suggested that MAD was a comforting fantasy (1960a: 2). His theories, echoing the game theory of von Neumann and Morgenstern, attempted to grapple with the possible aftermath of a nuclear war, whether triggered by "an insane man", "an accident, some miscalculation, or even irresponsible behaviour", or the catastrophic reaction of a cornered sovereign (1960a: 4). He writes in The Nature and Feasibility of War and Deterrence (1960a):

Would only an insane man initiate a thermonuclear war? [...] Are there circumstances in which a nation's leaders might rationally decide that a thermonuclear war would be the least undesirable of the possible alternatives? (4)

But all is not lost should a thermonuclear war break out, according to Kahn. On the subject of the economic consequences, he writes that "recuperation looks more feasible than is generally supposed" (8). On the inevitable medical emergency from a hypothetical five-megaton bomb dropped on the United States, he says:

"The war might shorten by one or two years the life expectancy of those who were lucky or protected, and by five or ten years the life expectancy of those who survivors who were not so lucky or well protected. In any case, life would go on" (8).

Needless to say, it is doubtful that Kahn's bleak reassurances would have inspired much confidence in the citizens of New York—or indeed any of the other major cities that were anticipated targets of a nuclear attack. Yet despite his morbidly statistical take on the horrors of nuclear warfare, he was not incorrect to confront the possibility of the nuclear accident. To paraphrase Paul Virilio, every new technology creates the associated possibility of its sudden and catastrophic failure, and this possibility must be confronted. In Eric Schlosser's Command and Control (2013), we are given a lengthy account of the many nuclear accidents and "miscalculations" of the Cold War, ranging from an early warning system that misidentified the moon as an incoming barrage of Soviet warheads (254) to a dropped socket wrench that tore a hole in a Titan II missile (3). A reader of Schlosser's book might (not altogether unreasonably) conclude that it was nothing short of blind luck—or perhaps a miracle—that stopped an all-out thermonuclear war from breaking out between the two superpowers.

How to Survive a Small Circular Error Probable

Paul Baran specifically references Kahn's Nature and Feasibility of War and Deterrence in justifying his study of the survivability of command and control systems. Was he thinking that there must be a way to update MAD for the age of intercontinental ballistic missiles, and so sidestep Kahn's morbid rationality and his calls to ramp up the arms race in preparation for nuclear war? Baran observes that the increasingly small "Circular Error Probable" (CEP) of contemporary missiles—the radius around the aim point within which half of all shots can be expected to land, and thus a measure of a missile's accuracy—meant that US strategists had to engage with the idea that the entire military command structure could be compromised by a well-placed attack:

"Our communications systems are for the most part conceived upon the model of hierarchical networks. An entire network may be made inoperable at the cost of a few missiles cleverly directed against a few switching center targets." (5)

The alternatives he surveys prove unsatisfactory: burying command and control centres ever deeper into the ground, for instance, would mean that communication links between sites would be prohibitively expensive if tunnelled below ground, vulnerable if run above ground, or limited in bandwidth (7). Baran thus asks:

"Is it possible to design decentralised organisations where over-all control may be safely vested in a large number of places—so large a number as to increase to a prohibitive level the cost of attacks directed at the command centers?" (9)

Baran's answer is the distributed network: an array of nodes—command and control centres—interconnected by links in the form of above-ground microwave repeaters and cables. The key to overcoming vulnerability is "redundancy". Should an attacker damage a given link between two nodes, communication is rerouted around the breakage. The network topology diagram at the top of this article, taken from Baran's memorandum, demonstrates how redundancy can function in a distributed network. For instance, if the link between nodes (F)—(G) were damaged, communication could still arrive via (D)—(E), or via (D)—(E)—(H)—(J)—(G), and so on. Baran explains that each node is also "intelligent": upon receiving a message, it authenticates it and repeats it to the nodes it is connected to, while ignoring duplicates by tracking a "handover" number equal to the number of nodes the message has previously travelled through (33). Such a system goes some way towards solving the conundrum Baran poses earlier in the document: "How, then, should we go about designing these structures so that destruction of one portion is no more disabling than destruction of another portion?" (10)
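As a rough illustration of this flooding-with-duplicate-suppression scheme, here is a minimal Python sketch. The topology table and the flood function are hypothetical constructions for this article, loosely echoing the lettered nodes above, not Baran's actual specification:

```python
from collections import deque

# Hypothetical topology loosely echoing the lettered nodes in Baran's
# diagram (invented for illustration; not his actual network).
LINKS = {
    "D": ["E"],
    "E": ["D", "F", "H"],
    "F": ["E", "G"],
    "G": ["F", "J"],
    "H": ["E", "J"],
    "J": ["H", "G"],
}

def flood(source, damaged=frozenset()):
    """Flood a message outwards from `source`, recording the lowest
    'handover' number (links traversed) at which each node first hears
    it. Nodes discard copies arriving with a higher handover number:
    the duplicate-suppression rule Baran describes."""
    best = {source: 0}
    queue = deque([(source, 0)])
    while queue:
        node, handovers = queue.popleft()
        for neighbour in LINKS[node]:
            if frozenset((node, neighbour)) in damaged:
                continue  # this link has been destroyed; route around it
            if neighbour not in best or handovers + 1 < best[neighbour]:
                best[neighbour] = handovers + 1
                queue.append((neighbour, handovers + 1))
    return best

print(flood("D"))                                   # intact network
print(flood("D", damaged={frozenset(("F", "G"))}))  # F-G link destroyed
```

In the second call, cutting the link between (F) and (G) merely lengthens the best route to (G) from three handovers to four; only severing every path to a node can silence it, which is precisely the property Baran's redundancy is meant to buy.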

The Semi-Automatic Ground Environment (SAGE), an elaborate command and control system responsible for detecting incoming Soviet bomber planes, had only gone online a few years before Baran's memorandum was published. Despite its advanced and ambitious development, it was largely obsolete by the time of its completion: ballistic missiles had replaced the bomber plane, and the SAGE system was vulnerable to precisely the kind of strategic nuclear attack Kahn and Baran discuss. In practical terms, SAGE was a network of "Direction Centres" outfitted with cutting-edge IBM computers, each responsible for the surveillance of a given sector of US airspace. A group of Direction Centres would be under the centralised control of a single Combat Centre, which would coordinate operations across the region (see Edwards, 1996: 96-99). Command and control is thus decentralised, but not distributed. Baran provides the following network topology as an example, highlighting its similarities to the SAGE model:

Baran, 1960: 26

Networked Power and Distributed Communication

To diminish such vulnerabilities, Baran proposes a conceptual decoupling of the networks of communication from the networks of power: automation opens up the capacity to distribute the data required for strategic decision-making across multiple command centres. Data is sent out across the network of commanders rather than up a singular hierarchical chain of command. Such a networked system is reproducible and thus scalable:

"What is clearly noted is that the information network need no longer be tightly coupled to the hierarchical chain of command but may be duplicated at will. Thus, by judicious replication of apparatus, as many 'potential' command posts may be created as the pocketbook will allow." (11)

Distributed command, however, raises the spectre of "leadership"—an "almost religious regard for the differential acceptance of one individual in the command role versus another is a factor that cannot be lightly dismissed" (Baran, 1960: 12). Baran's response is what he calls "Virtual Command". Authority, he writes, is contingent on knowledge, and by implication is not an innate individual quality: anybody, given the right data, can lead effectively. The supposition is that by automating and expediently distributing real-time analysis of an attack, both theoretically possible in Baran's distributed model, leadership can "emerge" as necessary and form a calculated response. The problem to solve is then one of technology, not of finding the appropriate authority figures. The idea that distributed networks can productively decentralise power arises again in the utopian ideals of the burgeoning cyberculture of the 1980s and 1990s, whose neoliberalisation was presciently captured in Barbrook and Cameron's 1995 essay The Californian Ideology. The democratising, non-hierarchical ideals of the "electronic agora" were captured and transmuted into an "electronic marketplace": power to shape the web was not distributed as such, but remoulded and decentralised.

Baran defends his distributed delegation of power by taking inspiration from naval battles, which operate on the basis that "knowledge constitutes authority" (13). But is it not possible, as Herman Kahn imagines with regard to nuclear weapons, for the hypothetical "insane man" to take the reins of such a system and set off an irreversible sequence of events? How might social hierarchy re-emerge, or be imposed, on a system that is supposedly technically neutral? While the social vulnerabilities of distributed command and control are not afforded as much attention as the technical ones, the imperative for Baran is that, in a quickly evolving high-stakes situation, it is not only the technologies of control that must have built-in redundancy: there must also be a redundancy of commanders (13). With redundancy of both command and control, "the defense may play the old shell game; this time with a large number of shells and a sizeable number of peas. The enemy must find all the peas in order to win" (13).

Concluding the memorandum, Baran lists nine perceived "new problems" for the engineer trying to design a decentralised modular computer system. These problems, relating to issues such as automation, network organisation, and defense against enemy attacks on the network itself, should be approached with a view to creating a system that prioritises its "post-attack" operation. Baran writes that computer systems should die like the MacArthurian soldier, in a way that enables other computers to continue the battle uninterrupted: "if they must die, they should die gracefully" (17).

Postscript

Baran, 1964a: 7

Four years later, Baran published an 11-volume series of memoranda titled On Distributed Communications, in which he advances his earlier theories into a careful study of precisely how they could be technically implemented. The series was one of the first feasibility studies of the practicalities of distributed networking, elaborating on possible solutions to the perceived engineering challenges, on how the network could be incrementally developed and scaled, and offering a speculative costing for the project—putting development at just under a quarter of a million dollars, with an annual maintenance cost of $60 million (Baran, 1964b: 5). Baran offers a caveat for the proposed budget:

Since it was necessary to "cost" a system not completely defined, using equipment never built, the author must absolve all concerned (and possibly even himself) for the errors and inconsistencies that inevitably occur in compiling costs for an untried system, especially one that uses unorthodox construction techniques to fulfill an unknown, but rapidly growing, demand for a new service. This inconvenience is part of the price we pay for new systems concepts. (1964b: vii)

Walter Isaacson, writing in The Innovators (2014: 241), casts Baran as a victim of inter-departmental military bureaucracy, his radical ideas bounced from one agency to the next and ultimately left unimplemented. In a telling sign that Baran saw his proposals as fitting into the doctrine of Mutually Assured Destruction, he "insisted that it not be classified as secret because he realised such a system worked best if the Russians had one as well" (ibid). Should both superpowers have distributed command and control systems, the strategic value of a surprise nuclear attack would be essentially nil, as an instantaneous counterattack would be a near-certainty. Despite being buried by military institutions, the unclassified report did eventually find its way into the hands of ARPA engineers a few years later, as they attempted to develop their own distributed research network. Along with parallel research undertaken by Donald Davies—a British electronic engineer who coined the term "packet" in relation to networking—some of Baran's ideas were indeed influential on ARPANET's developers (ibid: 242).

Beyond the scope of this article, although certainly worth a read in their own right, are Baran's RAND papers that followed his work on distributed networking. In these works, he strikes a more overtly cautionary tone on the role of technology in society, writing on the need to form a new ethical basis for engineering in the information age and proposing inventions for the social good, such as a gun barrel that etches a unique binary identifier into any bullet fired from it to "discourage illegal use of small arms". In A Little Caution in the Haste to Number (1968), Baran calls on the engineering community to reflect on the kind of future it was having a hand in creating:

"We need to probe the boundary line between legitimate social use of the computer and where the methods and capabilities of the new information technology were oversold and allowed to displace common sense. Restated as a question: 'Are our analytical tools weaker than we would like the public to believe? Are we misleading those we sit in awe of the gods of science and technology?'" (13)

Baran's warning that the rush to number can have undesirable and potentially catastrophic second-order effects sits in tension with his earlier suggestion that, in a distributed network, knowledge constitutes authority. In 1968, the systems analysis of Robert McNamara was, in effect, rushing to number the Vietnam War in the belief that a systematic account of the war narrative would lead to more astute decision-making and strategising. As I have written previously, analysis of Vietnam War-era counterinsurgency programmes has widely shown that this was not the case. The tension between these two texts, written eight years apart, it must be said, demonstrates that the idea of "technology being neutral" should be engaged with on the level of production. When a technology is produced specifically for a highly politicised application—in nuclear defense systems, for instance—should its designers simply consider it neutral? As he grappled with this question in the late 1960s, Baran presciently warned of the possible second-order effects of networked information systems on the unequal distribution of power in society, enacted in areas as varied as the computerisation of insurance records and the broadening capacity of the government to observe its citizens. With such implications, the least that can be expected from those who work with technology is to act "as an early warning system for the rest of society" (Baran, 1968: 11).

Sources

  • Abbate, Janet. (2001) Government, Business, and the Making of the Internet. The Business History Review, 75(1), 147-176. Accessed 1 March 2017. http://www.jstor.org/stable/3116559.
  • Baran, Paul. (1960) On a Distributed Command and Control System Configuration. Santa Monica, CA: RAND Corporation. Accessed 6 March 2017. http://www.rand.org/pubs/research_memoranda/RM2632.html.
  • Baran, Paul. (1964a) On Distributed Communications: I. Introduction to Distributed Communications Networks. Santa Monica, CA: RAND Corporation. Accessed 6 March 2017. http://www.rand.org/pubs/research_memoranda/RM3420.html.
  • Baran, Paul. (1964b) On Distributed Communications: X. Cost Estimate. Santa Monica, CA: RAND Corporation. Accessed 6 March 2017. http://www.rand.org/pubs/research_memoranda/RM3766.html.
  • Baran, Paul. (1968) On the Future Computer Era: Modification of the American Character and the Role of the Engineer, or, A Little Caution in the Haste to Number. Santa Monica, CA: RAND Corporation. Accessed 6 March 2017. http://www.rand.org/pubs/papers/P3780.html.
  • Barbrook, Richard and Cameron, Andy. (1995) The Californian Ideology. Republished at Imaginary Futures. Accessed 6 March 2017. http://www.imaginaryfutures.net/2007/04/17/the-californian-ideology-2/
  • Edwards, Paul N. (1996) The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: MIT Press.
  • Isaacson, Walter. (2014) The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. London: Simon and Schuster.
  • Kahn, Herman. (1960a) The Nature and Feasibility of War and Deterrence. Santa Monica, CA: RAND Corporation. Accessed 6 March 2017. http://www.rand.org/pubs/papers/P1888.html.
  • Kahn, Herman. (1960b) On Thermonuclear War. New Jersey, United States: Transaction Publishers.