
Meet the Human in Nuclear Deterrence

Not long ago, uttering words like “human element” or “psychology” at a physics or engineering national laboratory would make scientists’ eyes roll. Their silence was a clear “does not compute” message.  It was as if Oppenheimer, Ernest Lawrence, or Edward Teller had forbidden the “soft sciences” from entering those hallowed laboratory grounds. Those days are over.

Physicists and engineers now realize that just as most car crashes stem from driver error rather than mechanical failure, the same logic applies to nuclear weapons, their platforms, and their potential use. Whether Americans like it or not, humans are in the system and humans are, almost certainly, the weakest link.

Humans are the weakest component in the quantification of margins and uncertainties (QMU) sense. Engineers often test individual components and full nuclear weapon systems to a 1-in-1,000 certainty that they will function correctly. There has long been a view that nuclear weapons should always detonate when employed and never otherwise. To achieve this “always/never” goal, systems are engineered to perfection while sources of human error are largely ignored.

Humans design and manufacture the components, assemble the weapons, complete the wiring, and install systems onto delivery platforms (i.e., subs, silos, and bombers). Humans verify satellite signals of potential attacks from US Strategic Command, communicate those findings to the President, and, depending on the response, draft and transmit emergency action messages (EAMs). This is a gross simplification because fragile humans play a much larger role, but it illustrates the embeddedness of the human element in the system.

One example of human fragility that took place in September 2023 at the Pantex Plant is instructive. It appears a worker mistakenly cross-connected color-coded electrical wires inside a nuclear weapon.

Around the world, this very task might be performed by a civilian or by an Air Force 2W2X1 Nuclear Weapons Specialist. At first glance, it seems simple: connect the red wire to the red wire and the green wire to the green wire. But around 8 percent of men are born with red-green color vision deficiency (color blindness), which makes it difficult for them to differentiate between red and green (and many other color combinations). The US Air Force correctly requires normal color vision for this role.

Not all color vision tests are created equal. Some catch 99 percent of people with color blindness, while others catch only 90, or even 50, percent of colorblind individuals. An analogy may be useful in illustrating this point.

If, for example, a worker were testing a component and needed to detect 14 MeV neutrons, a detector that simply reported “between 2 and 20 MeV neutrons were detected” would be unacceptable. A tester with adequate sensitivity is required to evaluate critical components, and detectors that verify the specific reading may even be required. Equally sensitive tests are required for the humans who work on nuclear weapons.

The Federal Aviation Administration (FAA) recently updated its standards, rejecting the century-old Ishihara color vision test and the D-15 test due to known shortcomings. The Ishihara test is fairly good at detecting red-green defects but misses 100 percent of blue (tritan) defects. Humans have red-, green-, and blue-sensitive cones in their eyes, yet the Ishihara tests only two of them and ignores blue vision entirely. The D-15 test, commonly used by police departments, can pass up to half of individuals with color blindness, depending on how it is administered.

Figure 1. Simulated Color Vision Defects and Wire Color

Even if Pantex adopted one of the FAA’s “best in class” tests, such as the CAD, Rabin, or Waggoner Computerized Color Vision Test (WCCVT), there is still another issue—test frequency. Color vision should be tested periodically, not just once.

While 8 percent of men and 0.5 percent of women are born with color blindness, an estimated 15 percent of all people will develop an acquired color vision deficiency during their lifetime, most often affecting blue vision. Most people assume color vision is a static ability, but it is more like hearing, which degrades with age and environmental exposure.
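To see why test sensitivity matters at the workforce level, a back-of-the-envelope calculation can combine the prevalence figures above with the test sensitivities mentioned earlier. The sketch below is illustrative only; the workforce size and male/female split are hypothetical assumptions, not figures from any facility.

```python
# Sketch: expected number of color-vision-deficient workers who would slip
# through screening, given the prevalence figures in the text (8% of men,
# 0.5% of women) and a test's sensitivity (share of deficient individuals
# the test actually flags). Workforce size and sex ratio are hypothetical.

def expected_missed(workforce, male_share, sensitivity,
                    male_prev=0.08, female_prev=0.005):
    """Expected count of color-deficient workers the test fails to flag."""
    males = workforce * male_share
    females = workforce - males
    deficient = males * male_prev + females * female_prev
    return deficient * (1 - sensitivity)

# Hypothetical 1,000-person workforce, 60 percent male, screened with tests
# of the three sensitivities cited in the text (99%, 90%, 50%).
for sens in (0.99, 0.90, 0.50):
    missed = expected_missed(1000, 0.6, sens)
    print(f"sensitivity {sens:.0%}: ~{missed:.1f} deficient workers missed")
```

Under these assumptions the workforce contains about 50 color-deficient workers, so a 99-percent-sensitive test misses fewer than one on average, while a 50-percent-sensitive test clears roughly 25 of them for color-critical work.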

Changes in color vision ability can occur rapidly due to medications, diseases, or environmental conditions. For critical roles, annual color vision testing should be a minimum standard.

Finally, different color vision tests examine different axes within the visible spectrum, meaning a person could pass the Rabin but fail the WCCVT, depending on individual differences and the specific axis each test examines. This is especially true for mild vision defects, yet even mild defects can cause sub-par performance on real-world tasks (e.g., reacting to red traffic lights).

Across the United States, teams are working to quantify this human element in complex, high-consequence systems. These include the Air Force’s 711th Human Performance Wing and the social scientists at Sandia National Laboratories.

The next time you hear about a cognitive psychologist, industrial-organizational psychologist, or human factors researcher at a national lab, do not assume they’re experimenting with LSD and goats to perfect psychic warfare. They’re far more likely to be studying how humans interact with technology—quantifying behavior, limitations, cognition, and the human’s reliability within critical systems.

Organizations should, whenever possible, bring these human-focused professionals into projects. They will identify issues most engineers never consider, across a variety of scales, “from neurons to nations.” Factors like color vision, tool slips (as in the Louis Slotin “demon core” incident), dropped sockets (as in the Titan II missile explosion in Damascus, Arkansas), mismatched job abilities, fatigue, attention lapses, and even intentional sabotage can all impact the nation’s deterrence posture. When processes are optimized to include the human, overall risk is minimized.

In the end, deterrence is not just about weapons. It is about the humans behind the weapons, the fallible, unpredictable, indispensable human element that remains both our greatest strength and our greatest risk.

Rob Kittenger, PhD, is a Senior Fellow at the National Institute for Deterrence Studies. The views expressed are his own.
