On July 21, 2023, Christopher Nolan’s Oppenheimer entered wide release. The film follows J. Robert Oppenheimer, the scientist who directed the US program to create the world’s first atomic bombs. The movie, shot in part in New Mexico, tells a story of the bomb and the early Cold War focused closely on Oppenheimer, his personal relationships, and the revocation of his security clearance in 1954. The film depicts a fairly mainstream understanding of the scientist’s life, triumphs, and foibles.
Because Nolan’s camera is kept narrowly focused on Cillian Murphy as Oppenheimer, much of the bomb’s impact and early history is kept largely off screen. The film draws directly from American Prometheus, a 2005 biography of Oppenheimer by Kai Bird and Martin J. Sherwin, which offers a thorough portrait. Los Alamos National Laboratory, where Oppenheimer oversaw the bomb’s development, was founded as part of the atomic bomb effort and today remains an important center of American nuclear weapons research.
Any fuller story of the bomb needs to venture beyond the boundaries of laboratory grounds and test ranges. Here are three details of nuclear weapons and their development not captured in the film.
The bombs of today are much more powerful
At the heart of an atomic bomb of any kind is a fission reaction. In this process, a mass of fissile material, such as uranium-235 or plutonium-239, is rapidly compressed into a supercritical state; neutrons then split the nuclei of those atoms, releasing tremendous energy along with more neutrons, which split further nuclei in a runaway chain reaction.
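To make the mechanism concrete, here is one commonly cited fission pathway for uranium-235, written as a nuclear equation; the barium-krypton fragment pair is just one of many possible outcomes, and the energy released is an approximate textbook figure:

```latex
% One representative fission pathway for uranium-235. The fragment
% pair shown is one of many possibilities; ~200 MeV is a standard
% approximate figure for the energy released per fission.
\[
{}^{1}_{0}n \;+\; {}^{235}_{92}\mathrm{U}
  \;\longrightarrow\; {}^{141}_{56}\mathrm{Ba} \;+\; {}^{92}_{36}\mathrm{Kr}
  \;+\; 3\,{}^{1}_{0}n \;+\; \sim 200\ \mathrm{MeV}
\]
```

The three neutrons emitted are the key: each can split another nucleus, which is what makes the reaction self-sustaining in a supercritical mass.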
It’s a fission bomb that forms the centerpiece of Oppenheimer. The first atomic bomb was nicknamed “Gadget,” a euphemism adopted to keep scientists from openly talking about bomb production. It was tested at Trinity on July 16, 1945.
It yielded an explosion equivalent to 20,000 tons of TNT, or 20 kilotons. Little Boy, the bomb dropped on Hiroshima, Japan, on August 6, 1945, had a yield of 15 kilotons. Fat Man, the bomb dropped on Nagasaki, had a yield of 20 kilotons. These weapons had a massive impact: in the US, fallout from the Gadget caused health effects still observed in people downwind today.
Initial estimates placed the death toll from Hiroshima at 70,000 people and from Nagasaki at 40,000. Later estimates put the deaths at 140,000 for Hiroshima and 70,000 for Nagasaki. Both sets of figures rest on sound methodology, and to either can be added the tens of thousands of people injured by the bombs’ effects but not killed outright.
These numbers are the baseline for understanding how many people a fission weapon of such power can kill. Given the scale of an atomic blast, bombs will invariably kill civilians if used near any population center.
Another type of bomb, called the “Super” during the Manhattan Project, sought to use a small fission reaction to ignite a much larger fusion reaction. It’s also known as an H-bomb or, more broadly, a thermonuclear weapon. The largest one ever detonated by the United States had a yield of 15 megatons; the largest H-bomb detonated by the Soviet Union yielded 50 megatons.
The current nuclear bombs in the US arsenal range from 0.3 kilotons up to 1.2 megatons, putting yields like those seen at Hiroshima and Nagasaki on the small end of what is presently fielded. Weapons with yields as small as Gadget, Fat Man, and Little Boy are sometimes described as “tactical” nuclear weapons, though that adjective is broad and not particularly useful. Most US nuclear weapons have larger, and often much larger, yields. Should a nuclear weapon be used by the United States in the 21st century, it would likely be at least one or two orders of magnitude more powerful than the only atomic bombs used so far in war.
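A back-of-the-envelope comparison, using only the yields cited above, shows what that means in practice:

```latex
% Comparison using the figures in this article: Little Boy at
% 15 kilotons versus the largest current US yield of 1.2 megatons
% (1,200 kilotons).
\[
\frac{1.2\ \mathrm{Mt}}{15\ \mathrm{kt}}
  \;=\; \frac{1200\ \mathrm{kt}}{15\ \mathrm{kt}}
  \;=\; 80
\]
% The top of today's range is roughly 80 times the Hiroshima yield,
% i.e., nearly two orders of magnitude larger.
```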
There were complex, and costly, supply chains involved
Los Alamos is central to Oppenheimer’s story, and to the theory, design, and assembly of the atom bomb. Oppenheimer selected the location because of a fondness for northern New Mexico, and the US Army agreed to use the mesa that became Los Alamos because access to and from the lab could be easily controlled. The Army acquired the land in part by refusing grazing permits to families that had previously used the mesa, in part by offering small cash payments, and in some cases by condemning the land outright, strong-arming inhabitants into leaving.
Los Alamos was just one node at the head of a broader industry built to design and create the bomb. Some parts of the work were done at university laboratories. Other parts, especially enriching uranium and producing plutonium, had to be done elsewhere. The city of Oak Ridge, Tennessee, was created to enrich uranium, concentrating the fissile uranium-235 isotope, which makes up less than 1 percent of natural uranium; the remainder is mostly uranium-238. At the Hanford Engineer Works in Washington State, the Army commissioned reactors to produce plutonium. Environmental harms from the work at Hanford were recognized as a possibility as early as 1949, but they were not revealed until public investigations in the 1980s.
Before uranium could be refined, it needed to be extracted from the ground. Some uranium already existed in stockpiles outside mines; before the war, the element was not terribly sought after compared to radium, a product of uranium’s radioactive decay. During World War II, the Belgian government in exile sold uranium from the Shinkolobwe mine in the Congo to the US. Mining radioactive material is harmful to miners, and Belgian employers worked the miners at Shinkolobwe around the clock, leaving little written record of the human toll of the operation.
Uranium was also mined from areas on Navajo reservations within the United States. The health impact of this work on miners and their families was largely kept from the public until it spilled forth in disaster in New Mexico in the late 1970s. “On July 16, 1979, the Church Rock uranium mill experienced the largest release of radioactive material on United States soil. The south cell disposal pond experienced a massive, twenty foot breach in its wall — likely caused by numerous six-inch cracks in the cement,” records the Atomic Heritage Foundation. “The wall, which acted as a dam to keep the radioactive waste in the pond, was located directly next to Pipeline Arroyo, a tributary for the larger Puerco River. In all, 1,100 tons of solid radioactive waste and 93 million gallons of liquid waste ended up in the river.”
History is complicated, not tidy
The American creation of the bomb was motivated, in large part, by fear that Nazi Germany’s atomic bomb research was already further along. The thinking at the time held that an American bomb could then be used in the war first. German decisions never led to anything like a productive bomb effort, and as Los Alamos neared completion of the Gadget, Germany surrendered.
The first atomic bomb was tested and used in the window between Germany’s surrender and Japan’s unconditional surrender. This timing is often used to argue that dropping the atomic bomb was a course of action directly chosen to end a war that otherwise would not have ended. One popular postwar narrative casts President Harry Truman as deciding to drop the bomb rather than launch a massive invasion with US forces.
History is more complicated than that tidy telling, which was itself a postwar justification. What is notable instead is that President Truman, when he succeeded Franklin Delano Roosevelt, inherited a bomb program that was nearly ready for testing and that was producing a weapon the Army expected to use. As historians like Alex Wellerstein have argued, there was never a specific moment at which a singular decision was made to drop the bomb.
“The day after Nagasaki,” writes Wellerstein in an article for The New Yorker, “Truman issued his first affirmative command regarding the bomb: no more strikes without his express authorization. He never issued the order to drop the bombs, but he did issue the order to stop dropping them.”