Source: Hardcopy World Affairs, Fall 1998, Vol. 161, No. 2, pp. 82-91. With permission of Peter Leitner and World Affairs.



Supercomputers,
Test Ban Treaties,
and the Virtual Bomb

---

By PETER M. LEITNER

Peter M. Leitner is a senior strategic trade advisor at the Department of Defense. The opinions expressed herein are the author's alone and do not represent the views of the Department of Defense, the government of the United States, or any organization.

 

No matter what the Russian expectations were, or their cause, the acquisition of supercomputing power has now become a proxy for the nuclear weapons race. It's basically who has the virtual bomb. We're going to have a cold war on virtual weapons.1

Chelyabinsk-70 and Arzamas-16 are names unfamiliar to most outside the intelligence and nuclear weapons design communities. Yet these two Russian facilities, long maintained by the Soviet Union as hidden, "closed" installations, designed and manufactured several hundred nuclear warheads per year and had a hand in the creation of most of the tens of thousands of nuclear weapons developed during the Soviet era.

Soviet state secrecy practices produced such odd hyphenated names for approximately thirty-five municipalities dedicated to the military-industrial complex, ten of which were huge nuclear design facilities controlled by the Ministry of Atomic Energy (MINATOM) and guarded by special regiments of the Ministry of Internal Affairs.2 In addition to their primary names, these closed sites bear code names derived from cities 50 to 100 kilometers away, followed by a postal zone number (for example, the All-Russian Scientific Research Institute of Experimental Physics, "Arzamas-16," or the All-Russian Scientific Research Institute of Technical Physics, "Chelyabinsk-70"). Since 1989, Russia has opened a number of these sites to limited visits by foreigners, but many details of their specific missions and locations still have not been declassified. Several of the facilities are described in Table 1.

TABLE 1
Selected Russian Nuclear Weapon Design Facilities

Code Name      | Formal Name                                   | Closed City          | Region         | Specialty
Arzamas-16     | All-Russian Institute of Experimental Physics | Sarov                | Urals          | Nuclear warhead design and land-based ICBM re-entry vehicle fabrication
Chelyabinsk-70 | All-Russian Institute of Technical Physics    | Snezhinsk            | Urals          | Nuclear warhead design, high-explosive testing, nuclear bombs, and submarine-launched ballistic missile re-entry vehicles
Sverdlovsk-45  | Elektrokhimpribor                             | Lesnoy               | Urals          | Warhead assembly/disassembly
Zlatoust-36    | Zlatoust                                      | Trekhgornyy          | Urals          | Warhead assembly/disassembly
Penza-19       |                                               | Zarechnyy            | Kuznetsk       | Warhead assembly/disassembly
Tomsk-7        | Siberian Chemical Combine                     | Seversk              | Siberia        | Fissile material component fabrication
Chelyabinsk-65 | Mayak Production Association                  | Ozersk               | Southern Urals | Plutonium and tritium production for nuclear weapons
Krasnoyarsk-26 | Krasnoyarsk Mining and Chemical Combine       | Krasnoyarsk/Atomgrad | Siberia        | Plutonium for nuclear warheads
Krasnoyarsk-45 | Electro-Chemical Plant: Zheleznogorsk Reactor | Krasnoyarsk/Atomgrad | Siberia        | Enriched uranium production
Source: Carnegie Endowment for International Peace, Nuclear Successor States of the Soviet Union 4 (May 1996); Wisconsin Project on Nuclear Arms Control, Risk Report (various issues).

Chelyabinsk-70 and Arzamas-16 have recently become more widely known by virtue of published news accounts of the apparently illegal sale of high-performance supercomputers to these nuclear weapons design and manufacturing bureaus. The illegal transfer occurred after several failed Russian attempts to purchase similar supercomputers legally. Gary Milhollin of the Wisconsin Project on Nuclear Arms Control first broke the story of the illegal export of Silicon Graphics and IBM computers,3 and their role in warhead design and simulation, after it was publicly revealed by Viktor Mikhaylov, the head of MINATOM. News of the illegal export spurred angry statements from Congress and an ongoing investigation by law enforcement agencies.

In an interview, MINATOM spokesmen "could not hide their astonishment at the American side's hints to the effect that the furor broke because of Minister Viktor Mikhaylov's excessive frankness—speaking in Moscow, he referred to the processor models by their number. But the whole point is that the minister had nothing to hide. Everyone knows that the U.S. supercomputers will be used to solve tasks connected with the safe operation of the Russian nuclear arsenal, confirming its reliability, and ensuring its safekeeping."4 Izvestia also reported, "The U.S. Commerce Department says that the whole problem has arisen because of the undue candor of . . . Mikhaylov [who] also said that these computers are to be used for the modeling of nuclear explosions."5

A year ago Silicon Graphics sold eight high-speed R-1000 computers to a Russian scientific research institute known at home and in the rest of the world by its former code name Chelyabinsk-70. U.S. laws do not forbid the sale of such equipment to Russia. Machines with speeds of up to 2 billion operations a second do not need export licenses.
For computers with speeds of between 2 billion and 7 billion operations a second, the rules are different. Manufacturers are obliged to consult Commerce Department experts before shipment commences. And finally the best computers, with speeds over 7 billion operations a second, cannot be sold to countries like Russia without the mandatory license.
A legal investigation has now been launched against Silicon Graphics Inc. The Commerce Department says that a parallel system of several R-1000 computers amounts to a supercomputer of the class that is strictly forbidden for sale to Russia. [In fact, these computers can be easily upgraded from 480 MTOPS to at least 4,500 MTOPS simply by adding additional CPU boards and memory.] However, this is not an old prohibition dating back to the Cold War times, but came into effect fairly recently after the disbandment of the infamous COCOM—the committee controlling the sale of strategic materials to Communist countries.6

Mikhaylov's announcement came in the wake of a controversial U.S. denial of Convex and IBM supercomputer equipment, in fall 1996, to the same two "closed" facilities.7 "Convex originally applied to sell three supercomputers . . . The SPP 1200 model (Exemplar X-Class) operated at 4,564 million theoretical operations per second (MTOPS) but upgradeable to 34,500, while two others, also upgradeable, ran at 1,630 MTOPS and 1,870 MTOPS. The IBM SP 2 model that was intended for sale operates at 780 MTOPS. Another IBM machine was also bound for the Moscow lab, a company official indicated."8 These RS6000 series computers are upgradeable to 250,000 MTOPS.9

In 1995, "President Clinton [unilaterally] decontrolled computers up to 2,000 MTOPS [from the previous CoCom ceiling of 260 MTOPS] for all users and up to 7,000 MTOPS for civilian use in Russia but reserved the authority to block exports that raise proliferation concerns."10

Ostensibly, the powerful IBM and Convex computers withheld from the Russians were to be used to model the migration of radioactive material in ground water in the vicinity of nuclear weapons plants. However, this was considered a highly improbable end use given what was known about the two facilities and their interest in the simulation of nuclear weapons effects.

In an exchange of letters between Mikhaylov and U.S. Energy Secretary Hazel O'Leary, Mikhaylov first indicated that he wanted the supercomputers to maintain the safety and security of nuclear stockpiles under the test ban. In a second letter, on September 9, he denied the computers would be used to improve Moscow's nuclear weapons. But he conceded that at least one machine, the Convex SPP 2000, would be used in confirming the reliability, and the preservation, of Russia's nuclear stockpile. Those words meant that the Russians planned simulated tests to verify the yields of their nuclear bombs. It would be difficult to separate reliability testing of old weapons from development of new ones.11

In a 29 November 1996 letter to Mikhaylov from Assistant Secretary of State for Political-Military Affairs Thomas E. McNamara, the United States officially rejected the export request, stating, "I am writing to inform you of our government's recent decision with respect to the Ministry of Atomic Energy's license requests for advanced computers for use by the nuclear research institutes Arzamas-16 and Chelyabinsk-70. We have informed the U.S. manufacturers [company names were expurgated in the version of the letter released by the State Department] that we are not prepared to approve their license applications. While we consider the promotion of scientific and technical cooperation between the U.S. and Russia one of our most important goals, we must balance such considerations with national security concerns in evaluating sensitive dual-use export cases."12

SUPERCOMPUTERS: A QUID FOR RUSSIAN CTBT SIGNATURE

The U.S. rejection of the Convex and IBM sales prompted Mikhaylov to state publicly that Russia had been promised access to U.S. supercomputer technology by Vice President Gore as a quid for Russian accession to the Comprehensive Test Ban Treaty. Vladislav Petrov, head of MINATOM's Information Department, stated that the Clinton administration promised Russia the computers during the test ban treaty negotiations to allow Russia to engage in virtual testing of warhead designs.13 Indeed, Mikhaylov told reporters in January 1997 that the Silicon Graphics and IBM supercomputers illegally shipped would be used to simulate nuclear explosions.14

Boris Litvinov, the chief of design at Arzamas-16, stated in December 1996 that these computers were needed for "constantly perfecting nuclear warheads."15 He added, "It is simply impossible to improve our knowledge of nuclear processes today without modern computers. We retain our nuclear power status; it is recognized and no one in the world has the right to demand that we scale down our research."

On 24 February 1997, MINATOM's Information Department issued a press release stating:

The 1996 signature of the Comprehensive Test Ban Treaty (CTBT) has become an undoubted success in the struggle for nuclear disarmament. At the expert meetings in London in December 1995 and Vienna in May 1996, which preceded the CTBT signature, special attention was paid to the issue of maintaining security of the nuclear powers' respective arsenals under conditions of discontinued on-site testing. Nuclear arsenal security maintenance is impossible without simulation of physical processes and mathematical algorithms on high-performance parallel computers, which are currently produced in the United States and Japan. In the interests of signing the CTBT in the shortest possible time, the U.S. and Russian experts mutually agreed on the necessity of selling modern high-performance computers to Russia.16

According to a February 1997 report, "The possibility of the theoretical modeling or, in scientific parlance, 'simulation' of nuclear explosions was a crucial part of the Comprehensive Nuclear Test Ban Treaty. When pressing for the conclusion of this treaty the Russians and Americans worked together on problems of the computer simulation of controlled explosions."17 Nikolay Voloshin, chief of the Russian Federation Ministry of Atomic Energy Department for Designing and Testing Nuclear Warheads, revealed, "During the purchase it was stated and guaranteed that Russia is buying the computers for fundamental scientific research in the sphere of ecology and medicine, and this includes the safety of the remaining nuclear arsenal."18

The rejection immediately provoked charges in Moscow that the United States was reneging on promises allegedly made during Gore-Chernomyrdin commission meetings, particularly as the long-delayed decision not to approve the licenses came just days after Russia signed the CTBT.19 One MINATOM official expressed his concern over American intentions:

If one takes into account the fact that nuclear parity between the two states has in many respects been maintained not only through testing, but also with the help of theoretical studies, one can imagine what is behind such a refusal. In many traditional branches of science and technology the creation of an experimental model is preceded by laboratory modeling, but in the atomic branch mathematical computations are a substitute for this stage. In the process of a real-life blast nothing is left of the elements of the nuclear devices except vaporized material, and that is why mathematical computation actually becomes the only way to obtain information on the processes that occur.
The special significance of these theoretical studies has become obvious in the course of the fulfillment of the terms of the comprehensive nuclear test ban treaty. The United States has made much better provisions than Russia for giving up nuclear testing. Supercomputers used for virtual-reality modeling of the processes of nuclear explosions have played a decisive role in that. The Americans rightly figured that since they had such equipment, they would be able to compensate for nuclear explosions by obtaining the necessary data with the aid of supercomputers. This practice of bans, smacking of the cold war, can push Russia, devoid, by contrast with the United States, of the possibility to improve its nuclear weapons with the help of supercomputers, into breaking the moratorium on nuclear tests.20

GOING VIRTUAL—WHAT DOES IT MEAN?

Virtual testing, modeling, and simulation are essential to clandestinely maintain or advance nuclear weapons technology. As the planet shows no sign of nearing the point where nuclear weapons are banned, it is reasonable to assume that current or aspiring nuclear weapons states will vigorously attempt to acquire high-performance computers to advance their nuclear programs with a degree of covertness hitherto impossible to achieve.

There is considerable conjecture within the scientific community as to whether a state could design and deploy a nuclear device without first conducting a full-scale test of the physics package. The arguments boil down to the confidence of designers and government officials that an untested device would behave in the intended manner. Many engineering purists in the United States declare unequivocally that virtual testing alone is insufficient to determine whether a weapon design is predictable or even functional. However, they often ignore one of the most compelling lessons drawn from the Iraqi nuclear weapons program: the necessity for a clandestine program not to expose itself by venturing beyond hydrodynamic testing. Proof of concept was all that the Iraqis could safely achieve without provoking a devastating pre-emptive response from the Israelis. A similar pattern was evident in the Israeli and Swedish weapons programs. In fact, the only publicly known full-scale weapons test by a clandestine program was carried out by South Africa, reportedly with Israeli assistance.

The development of supercomputers has been relentlessly driven and underwritten by the weapons program because of the high cost of physical testing and the severity of the test environment. "The technical limitations are enormous: extreme temperatures (10 million degrees) and material velocities (4 million miles per hour), short time scales (millionths of a second) and complicated physical processes make direct measurement impossible. Computers provide the necessary tools to simulate these processes."21

Perhaps the best way to understand the importance of virtual testing in facilitating weapons maintenance and development is to reason by analogy. DoE's National Ignition Facility (NIF) embodies what many fear will be the worst-case application of U.S. supercomputer technology to Russian nuclear weapons development. The NIF represents the marriage of high-energy lasers and massively parallel supercomputers in support of an inertial confinement fusion program advertised as supporting pure, applied, and weapons sciences. This facility will seek—using lasers, X-rays, and electrical pulses—to measure how bomb components behave in conditions similar to those in a nuclear explosion. The Department of Energy intends, following a concept called Science Based Stockpile Stewardship, "to use the fastest supercomputers yet devised to simulate nuclear explosions along with all the important changes that occur to weapons as they age. The plan has stirred vigorous debate among arms-control advocates, military strategists, and, most recently, university researchers, over whether the approach is cost-effective, feasible and wise."22

The weapons-related research envisioned for the NIF would rely on high-performance computers and test equipment to explore a range of topics, including these:

• Radiation flow

• Properties of matter

• Mix and hydrodynamics

• X-ray laser research

• Computer codes

• Weapons effects

The Department of Energy is promoting each of these as an important potential NIF activity.23 The following descriptions are paraphrased from publicly available materials:

Radiation flow. In most thermonuclear devices X-radiation emitted by the primary supplies the energy to implode the secondary. Understanding the flow of this radiation is important for predicting the effects on weapon performance of changes that might arise over time.

Properties of matter. Two properties of matter that are important at the high-energy densities of a nuclear explosion are equation of state and opacity. The equation of state is the relationship among a material's pressure, density, and temperature expressed over wide ranges of these variables. Opacity is a fundamental property of how radiation is absorbed and emitted by a material. The correct equation of state is required to solve any compressible hydrodynamics problem accurately, including weapons design. Radiation opacities of very hot matter are critical to understanding the radiation flow in a nuclear weapon.

Mix and hydrodynamics. These experiments involve the actual testing of extremely low-yield fission devices (as low as the equivalent of several pounds of TNT) within a confined environment to study the physics of the primary component of thermonuclear warheads by simulating, often with high explosives, the intense pressures and heat on weapons materials. (The behavior of weapons materials under these extreme conditions is termed "hydrodynamic" because they seem to flow like incompressible liquids.) Hydrodynamic experiments are intended to closely simulate, using non-nuclear substitutes, the operation of the primary component of a nuclear weapon, which normally consists of high explosive and fissionable material (the plutonium "pit"). In hydrodynamic experiments, the properties of surrogate pits can be studied up to the point where an actual weapon releases fission energy. High explosives are used to implode a surrogate non-fissile material while special X-ray devices ("dynamic radiography") monitor the behavior of the surrogate material under these hydrodynamic conditions.24

X-ray laser research. Supercomputer-based experiments could provide data for comparison with codes and could be used to further interpret the results of past underground experiments on nuclear-pumped X-ray lasers.

Computer codes. The development of nuclear weapons has depended heavily on use of complex computer codes and supercomputers. The codes encompass a broad range of physics including motion of material, transport of electromagnetic radiation, neutrons and charged particles, interaction of radiation and particles with matter, properties of materials, nuclear reactions, atomic and plasma physics, and more. In general, these processes are coupled together in complex ways applicable to the extreme conditions of temperature, pressure, and density in a nuclear weapon and to the very short time scales that characterize a nuclear explosion.

Weapons effects. Nuclear weapons effects used to be investigated by exposing various kinds of military and commercial hardware to the radiation from actual nuclear explosions. These tests were generally conducted in tunnels and were designed so that the hardware was exposed only to the radiation from the explosion and not the blast. The data were used to "harden" the equipment to reduce its vulnerability during nuclear conflict. Without nuclear testing, radiation must be simulated in above-ground facilities and by numerical calculations.

"NIF . . . will cost approximately US $4.5 billion to construct and operate, [and] will be the world's largest laser, intended to bring about thermonuclear fusion within small confined targets [and] represents the closest laboratory approach to a number of critical parameters in the weapons environment . . . by using 192 laser beams to produce 500 trillion watts of energy for 3 billionths of a second."25 This capability will be combined with DoE's Accelerated Strategic Computing Initiative (ASCI), aimed at developing "the computer simulation capabilities to establish safety, reliability, and performance of weapons in the stockpile, virtual prototyping for maintaining the current and future stockpile, and connecting output from complex experiments to system behavior." A reported goal of ASCI is to create a "virtual testing and prototyping capability for nuclear weapons."26

The parallel between the development of the NIF and ASCI programs and the transfer of U.S. supercomputing power to Russian nuclear weapons labs is striking, particularly considering that Arzamas-16 is home to the Iskra-5 advanced X-ray laser facility. This 67-mega-joule, 12-synchronous-channel pulse laser is designed for experiments in thermonuclear target heating in support of nuclear fusion research activities.27 In addition, the United States has reportedly offered to provide China with computers that could aid in nuclear explosion simulations, in order to persuade Chinese military leaders to halt underground testing.28

As the joint report from Los Alamos, Lawrence Livermore, and Sandia National Labs pointed out,

Computers are more important to nuclear weapons design when agreements limit testing. In support of the atmospheric test ban treaty, we perform[ed] our nuclear tests underground. A weapon's performance in the mode for which it was designed, perhaps an above ground burst, must be inferred from test data by expensive computer calculations. Such calculations take account of the "down hole" environment, such as reflection from test-cavity walls which do not exist in the atmosphere. A second agreement, the threshold test ban, limit[ed] testing to weapons with yields of 150 kilotons or less. To design beyond this limit, computer extrapolations [were] relied upon to verify the performance of the weapon.29

VERIFICATION TECHNOLOGIES MADE IRRELEVANT

On a prima facie level, most would instinctively argue that eliminating nuclear chain-reaction explosions from the planet is highly desirable and would help make the world a safer place. However, the reverse may actually be the case; that is, the elimination of physical tests and their migration to cyberspace may make the world a more dangerous place. Can such a counterintuitive proposition be true? Consider the trillions of dollars' worth of detection, monitoring, and early-warning infrastructure designed to identify and measure foreign nuclear weapons programs that would be rendered useless by virtual testing.

As the availability of data indicating the strength and direction of foreign nuclear weapons activities decreases, the likelihood that the United States or its allies will fall victim to tactical or strategic surprise increases. No longer will analysts have access to tangible seismic data of the type shown in figure 1. High-explosive or hydrodynamic tests simply do not have the energy potential to be identified against background clutter such as natural seismic activity (see figure 2), mining and construction blasting, detonations from oil and gas development, or conventional weapons testing, training, and ordnance disposal. In the United States, for example, there are several thousand chemical explosions of 50 tons or more each year, and a couple of hundred larger than 200 tons.30

Figure 1. Seismic Signals from Chinese Nuclear Test at Lop Nor (26 July 1996)

Figure 1 is a graphic example of the type of empirical data currently collected by seismic stations and indicates the precise time, duration, and magnitude of a specific nuclear weapons test. Under a CTBT regime, characterized by clandestine computer-based modeling and simulation techniques, such hard data would be unattainable.

The term "national technical means of verification" (NTM) is often used to describe satellite-borne sensors, but it is more generally accepted as covering all (long-range) sensors with which the inspected country does not interfere or interact. Ships, submarines, aircraft, and satellites can all carry monitoring equipment employed without the cooperation of the monitored country. Ground-based systems include over-the-horizon (OTH) radar and seismic monitors. Acoustic sensors will continue to provide the main underwater NTM for monitoring treaty compliance.

The first of the high-technology methods of treaty monitoring were the U.S. VELA satellites, designed in the 1960s to monitor the Limited Test Ban Treaty. Their task was to detect nuclear explosions in space and the atmosphere.31

At precisely 0100 GMT on Sept. 22, 1979, an American satellite recorded an image that made intelligence analysts' blood run cold. Looking down over the Indian Ocean, sensors aboard a Vela satellite were momentarily overwhelmed by two closely spaced flashes of light. There was only one known explanation for this bizarre phenomenon. Someone had detonated a nuclear explosion.

The list of suspects quickly narrowed to the only two countries at the time that had the materials, expertise, and motivation to build a nuclear weapon: South Africa and Israel. Both denied responsibility.32

This event was not confirmed until 1997, when Aziz Pahad, South African deputy foreign minister, stated "that his nation detonated a nuclear weapon in the atmosphere vindicating data from a then-aging Vela satellite."33 Pahad's statements were confirmed by the U.S. embassy in Pretoria, South Africa.

VELA's modern counterparts include the global positioning system (GPS) satellites. While these also have the function of providing navigational and positional data, their alternate role is to detect nuclear explosions, and to this end they mount both X-ray and optical sensors. However, "as nuclear detectors in orbit on Global Positioning System satellites age, the credibility of their data again could be challenged, and have subsequent adverse policy impacts."

Without strong evidence of a nuclear test no Administration official is going to charge another nation with violating a test ban treaty, for example. Los Alamos and the U.S. Energy Dept. have expended approximately $50 million to develop a new generation of space-based nuclear detection sensors, but they may never get into orbit. Pentagon budget woes could preclude inclusion of EMP sensors on next generation satellites, according to Los Alamos officials.
Researchers who developed the new sensors said it is ironic that funding constraints could force a decision to keep the detectors grounded. After all, had the old Vela satellite been equipped with a functioning EMP detector, it would have confirmed that the optical flash in September, 1979, was a nuclear blast. The White House panel subsequently stated that, because nuclear detonations had such critical ramifications and possible consequences, it was imperative that systems capable of providing timely, reliable corroboration of an explosion be developed and deployed.34

However, detection does not constitute identification. There are thousands of earthquakes each year in Russia with magnitudes comparable to decoupled kiloton-scale nuclear explosions. Many seismic events are detected that cannot be identified. There are also hundreds of chemical explosions each year that have seismic signals in this range and thus cannot be discriminated from nuclear explosions. Thus, it is obvious that there will be many unidentified seismic events each year that could be decoupled nuclear explosions with militarily significant yields much greater than 1 kiloton.35 A summary of annual seismic activity appears in figure 2.

Figure 2. Annual Earthquake Activity and Magnitude

The following types of useful verification technologies, among others, would be rendered ineffective or irrelevant by the migration of nuclear weapons testing to supercomputer-based simulation and modeling:

Space-based optics and sensors. Several satellites have telescopes and an array of detectors that are sensitive to various regions of the electromagnetic spectrum.

Radar. Lightweight space-based radar aboard satellites is capable of penetrating heavy cloud layers and monitoring surface disturbances at suspected nuclear test sites.

Listening posts. Hydroacoustic stations located on Ascension, Wake, and Moresby Islands and off the western coasts of the United States and Canada, together with infrasound arrays in the United States and Australia, detect underwater and suboceanic events and distinguish between explosions in the water and earthquakes under the oceans. Some seismic stations located on islands or continental coastlines may be particularly useful because they can detect the T phase—an underwater acoustic wave converted to a seismic wave at the edge of the land mass.

Radionuclide monitoring network. A new effort is under way to detect xenon-133 and argon-37 seepage into the atmosphere days or weeks after a nuclear weapons test.36 The inadvertent release of noble gases during clandestine nuclear tests, both above and below ground, represents an important verification technique. Because nuclear explosions produce xenon isotopes that can be detected in the atmosphere, concentrations determined by noble-gas monitoring are a particularly useful indicator.37

Seismic detectors. The United States has set up a worldwide network of seismic detectors, like those used to measure earthquakes, that can gauge the explosive force of large underground nuclear tests. Research programs funded by the Department of Defense have improved monitoring methods for detecting and locating seismic events, for discriminating the seismic signals of explosions from those of earthquakes, and for estimating explosive yield from seismic magnitude determinations.

A 1-kiloton nuclear explosion creates a seismic signal of magnitude 4.0. There are about 7,500 seismic events worldwide each year with magnitudes > 4.0. At this magnitude, all such events in continental regions could be detected and identified with current or planned networks. If, however, a country were able to successfully decouple a 1-kiloton explosion in a large underground cavity, the muffled seismic signal generated by the explosion might be equivalent to 0.015 kilotons and have a seismic magnitude of 2.5. Although a detection threshold of 2.5 could be achieved, there are over 100,000 events worldwide each year with magnitudes > 2.5. Even if event discrimination were 99% successful, many events would still not be identified by seismic means alone. Furthermore, at this level, one must distinguish possible nuclear tests not only from earthquakes but also from chemical explosions used for legitimate industrial purposes.38

CONCLUSION

The proliferation of high-performance computers, made possible by the drastic liberalization of supercomputer export restraints unilaterally undertaken by the United States in 1995, as well as illegal shipments by manufacturers, will serve to undercut the intent and spirit of the Comprehensive Test Ban Treaty. Providing access to advanced modeling and simulation platforms will facilitate the migration of nuclear testing to the world of virtual reality and draw down a curtain of opacity on nuclear weapons development activity where monitoring, verification, and inspections do not apply.

Many may argue that high-performance supercomputers are not a complete substitute for physical nuclear testing. Indeed, they are not. However, they will provide the analytical platform for operators of clandestine programs to acquire a much higher level of confidence in their design, assembly, and detonation processes. This higher level of confidence is absolutely critical to deployment decisions of even the most radical forces.

The growing opacity of weapons development activities between the superpowers marks a historic shift. Prior to the CTBT and the rise of the NIF and ASCI, the trend was clearly in the direction of transparency. Transparency was sought under the terms of the 1974 Threshold Test Ban Treaty, which provided for physical presence and real-time down-hole monitoring by officials from the two countries. In fact, the 1987 U.S. installation of CORRTEX (Continuous Reflectometry for Radius Versus Time Experiments) equipment at Semipalatinsk represented the zenith in transparency. Unfortunately, permitting nuclear weapons design and testing organizations to acquire high-performance computers capable of simulating nuclear explosion scenarios will mark the end of transparency. Figure 3 depicts the trend back toward opacity.

Figure 3. Relative Opacity of Nuclear Development Programs

One of the lessons learned from the destruction of Saddam Hussein's nuclear weapons program was that a proliferant may be quite willing to settle for hydrodynamic testing of its prototype nuclear weapons as a sufficient basis for an uneasy certification of a weapon for inclusion in its arsenal.

The Iraqis were designing exclusively implosion-type nuclear weapons. Their apparent exclusive focus on U235 as a fuel is, therefore, puzzling because plutonium is the preferred fuel for an implosion weapon [as] . . . the mass of high explosives required to initiate the nuclear detonation can be far smaller. On the other hand, given enough U235 it is virtually impossible to design a nuclear device which will not detonate with a significant nuclear yield.39

The Iraqi nuclear weapon design, which appeared to consist of a solid sphere of uranium, incorporated sufficient HEU to be very nearly one full critical mass in its normal state. The more nearly critical the mass in the pit, or core, the more likely the weapon will explode with a significant nuclear yield, even if the design of the explosive set is relatively unsophisticated. Furthermore, the majority of the weight involved in an early-design implosion-type nuclear weapon is consumed by the large quantity of high explosives needed to compress the metal of the pit; the more closely the pit approaches criticality, the less explosive is needed to compress the pit to supercritical densities and trigger the nuclear detonation, and thus the lighter, smaller, and more deliverable the weapon will be.40

Given the problem of limited access to fissile materials facing most potential proliferants, and the potential for a preemptive strike by a wary neighbor, as in the 1981 Israeli destruction of the Iraqi Osirak reactor, physical testing along the lines of the superpower model is not a realistic option for a clandestine program. U.S. actions to promote the availability of high-performance supercomputers will likely contribute to the proliferation problem by facilitating access to modeling and simulation, giving clandestine bomb makers greater confidence in the functionality of their designs. This increased level of confidence may be all that a belligerent requires to make the decision to deploy a weapon. Under such constraints, sophisticated modeling and simulation will enable clandestine programs to advance closer to the design and development of true thermonuclear weapons.

It is worth noting, in this context, that the vintage-1965 Swedish designs were very sophisticated and that at least one appeared to have been designed for use as an artillery shell. The Swedish developers, according to journalist Christer Larsson who broke the story, were fully confident in the performance of their weapons even with no test program planned.41

From a historical perspective, it is interesting to note that the concept of a comprehensive test ban was repeatedly put forward by the Russians throughout the 1980s and consistently rejected by the United States. In the 1990s a strange juxtaposition occurred, with the United States advocating a CTBT and the Russians ever more reluctant to go along. This shift parallels the explosion in high-speed computing potential emanating from the United States and the relatively stagnant progress of Russian indigenous capabilities. There may be much truth in the MINATOM official's statement quoted earlier: "The United States has made much better provisions than Russia for giving up nuclear testing. Supercomputers used for virtual-reality modeling of the processes of nuclear explosions have played a decisive role in that."

If the Russian claim that the United States reneged on a promise of supercomputer technology in exchange for accession to the CTBT is accurate, then the very value of this treaty must be questioned. If, as a price for Russia's signature, the Clinton administration was willing to provide the signatories the means of circumventing both its spirit and explicit goals, then the treaty should be regarded as little more than a sham to be rejected by the U.S. Senate.

If high-performance computers were made available to the Russian nuclear weapons design bureaus, the historical database accumulated from their previous nuclear tests would be the most significant factor in maintaining their stockpiles. In the absence of physical testing, they would be able to simulate a wide range of nuclear weapons design alternatives, including a variety of unboosted and boosted primaries, secondaries, and nuclear directed-energy designs.42

In addition, modeling and simulation efforts will help them to maintain a knowledgeable scientific cadre and to continue to verify the validity of calculational methods and databases. Under a test ban, only computer calculations will be able to approximate the operation of an entire nuclear weapon. Other states would also recognize the value of advanced simulation research in helping to develop or maintain nuclear weapon programs. High-performance computers may also make it possible to investigate the micro-physics regimes of directed-energy nuclear weapon concepts.43

There is increasing speculation that the Clinton administration's furious push to decontrol supercomputers, widely seen as a payoff for generous campaign support and contributions,44 was also intended to underwrite CTBT treaty signatures by providing an avenue for weapons testing, stockpile stewardship, and ongoing weapons development without the need for the physical initiation of a nuclear chain reaction.

Few were happy when the United States helped the United Kingdom become a nuclear power. Even fewer were pleased when the United States helped the French develop an independent nuclear capability. Assisting the Russians in maintaining and further developing their nuclear arsenal is outrageous. Unfortunately, U.S. nuclear proliferation activities do not end there. If the persistent rumors are true that the United States is even considering providing aid to China to sustain its nuclear weapons modernization program in a CTBT environment, then alarm bells should be sounding on Capitol Hill over the unintended consequences of reckless disarmament.

Will the synergistic effect of the CTBT and the decontrol of supercomputers make the world a safer place or a more dangerous place? The predictable outcome of the events described here is that uncertainty in our ability to anticipate the nuclear intentions of potential adversaries will grow as the window into their programs becomes increasingly opaque. As to whether this will translate into a quantifiable increase in the risk of nuclear war or terrorism, intuitively the answer appears to be yes.

U.S. willingness to trade supercomputer technology for treaty signatures and its own rush toward virtual testing make a farce of pretensions to high moral ground in criticizing others for rejecting the CTBT. "Pakistan or India . . . could be forgiven for suspecting that the five major nuclear powers, which asserted for years that testing was critical to maintaining deterrence, have now advanced beyond the need for nuclear tests. All the more reason, perhaps, for them to oppose the treaty."45

Amid the numerous statements by Russian officials of a secret deal to provide U.S. supercomputer technology as an inducement to sign the CTBT is the noticeable absence of official denials from the U.S. side. This may be one of those times when silence speaks louder than words.

NOTES

1. "White House Rejects Pending Sale of U.S. Supercomputers to Russia," Journal of Commerce, 25 November 1996, 1A.

2. "From Nuclear War to the War of the Markets," El Pais, 7 November 1995, 18.

3. Gary Milhollin, "Exporting an Arms Race," New York Times, 20 February 1996, A19; "Weekend All Things Considered," National Public Radio, 1 December 1996; John J. Fialka, "U.S. Investigates Silicon Graphics' Sale of Computers to Russian Weapons Lab," Wall Street Journal, 18 February 1997; David E. Sanger, "U.S. Company Says It Erred in Sales to Russian Arms Lab," New York Times, 19 February 1997; "Supercomputer Deal May Violate Export Rules," Washington Post, 19 February 1997.

4. "Ministry's 'Astonishment' at Furor Over Supercomputers," Izvestiya, 4 March 1997; "Researchers to Buy Supercomputers to Study Nuclear Blasts," ITAR-TASS, 13 January 1997.

5. "Supercomputers for a Former Superpower," Izvestiya, 22 February 1997, 1-2.

6. Ibid.

7. John J. Fialka, "Clinton Weighs Russian Bid to Buy 3 Supercomputers," Wall Street Journal, 11 October 1996, A2; Gary Milhollin, "U.S. Says 'No' to Supercomputers for Russia's Nuclear Weapon Labs," Risk Report 2 (November-December 1996); U.S. General Accounting Office, Nuclear Weapons: Russia's Request for the Export of U.S. Computers for Stockpile Maintenance, GAO/T-NSIAD-96-245, 30 September 1996.

8. "White House Rejects Pending Sale of U.S. Supercomputers to Russia," Journal of Commerce, 25 November 1996, 1A.

9. MTOPS figure provided by the Wisconsin Project on Nuclear Arms Control.

10. Journal of Commerce, 25 November 1996, 1A.

11. Ibid.; Moscow Times, 5 December 1996.

12. Expurgated copy of letter provided by Wisconsin Project on Nuclear Arms Control.

13. "Minister: Computer Block Threatens Arsenal," Moscow Times. 5 December 1996.

14. "Researchers to Buy Supercomputer to Study Nuclear Blasts," ITAR-TASS, 13 January 1997.

15. "Forecasting a Nuclear Explosion," Komsomolsakya Pravada, 10 December 1996, 3.

16. Press release, Information Department, Ministry of Atomic Energy of Russia, presented by G. A. Kaurov, department head, 24 February 1997; carried by ITAR-TASS, 26 February 1997.

17. "Computer Furor Contradicts Albright Signals," Izvestiya, 22 February 1997, 1.

18. "Supercomputers for Arzamas and Chelyabinsk," Izvestiya, 4 March 1997, 3.

19. "White House Still Appears Confused Over Supercomputer Sales to Russia Possible Promise Tied to Test Ban Treaty" Journal of Commerce, 6 December 1996, 3A.

20. "Anything Goes? Virtual Nuclear Reality," Krasnaya Zvezda, 25 January 1997, 5.

21. U.S. Department of Energy, The Need for Supercomputers in Nuclear Weapons Design, 1986, 5.

22. Ibid., 14.

23. U.S. Department of Energy, Office of Arms Control and Nonproliferation, The National Ignition Facility and the Issue of Nonproliferation, 1996, http://198.124.130.244/news/docs/nif/front.htm.

24. Michael Veiluva, John Burroughs, Jacqueline Caabasso, and Andrew Lichterman, Laboratory Testing in a Test Ban/Non-Proliferation Regime, Western States Legal Foundation, April 1995, http://www.chemistry.ucsc.edu/anderso/UC_CORP/testban.html.

25. Ibid.

26. Ibid.

27. Ibid.; A. C. Gascheev, I. V. Galachov, V. G. Bezuglov, and V. M. A. Murugov, Charge Control System of the Energy Capacitor Storage of Laser Device Iskra-5, http://adwww.fnal.gov/www/icalepcs/abstracts/Abstracts_I/Mozin.html [As written, though apparently not accessible].

28. Veiluva et al., Laboratory Testing in a Test Ban.

29. U.S. Department of Energy, The Need for Supercomputers in Nuclear Weapons Design, 5.

30. Prototype International Data Center, Contributing to Societal Needs, http://earth.agu.org/revgeophys/va.4.html [As written, though apparently not accessible].

31. "Means to an End," International Defense Review 24 (1 May 1981): 413.

32. Jim Wilson, "Finding Hidden Nukes," Popular Mechanics (May 1997): 48.

33. William B. Scott, "Admission of 1979 Nuclear Test Finally Validates Vela Data," Aviation Week & Space Technology 147 (21 July 1997): 33.

34. Ibid.

35. "Facing Nuclear Reality," Science, 23 October 1987, 455.

36. Wilson, "Finding Hidden Nukes," 50.

37. Prototype International Data Center, Report of the Radionuclide Export Group, www.cdidc.org:65120/librarybox/ExpertGroup/8dec95radio.html [As written, though apparently not accessible].

38. Prototype International Data Center, Contributing to Societal Needs. http://earth.agu.org/revgeophys/va.4.html [As written, though apparently not accessible].

39. Peter D. Zimmerman, Iraq's Nuclear Achievements: Components, Sources, and Stature, U.S. Congressional Research Service Report #93-323F, 18 February 1993.

40. Ibid.

41. Ibid.

42. U.S. Department of Energy, The National Ignition Facility and the Issue of Nonproliferation.

43. Ibid.

44. Michael Waller, vice president of the American Foreign Policy Council, Testimony before the House National Security Committee, Subcommittee on Military Research and Development, 13 March 1997.

45. W. Wayt Gibbs, "Computer Bombs: Scientists Debate U.S. Plans for 'Virtual Testing' of Nuclear Weapons," Scientific American (March 1997): 16.

© 1998 World Affairs

