Nestled in the hills of Livermore, California, is Lawrence Livermore National Laboratory (LLNL). This is the home of “Sierra,” the second-most powerful supercomputer on the planet, comprising 7,000 square feet of racks, wires, and blinking lights. For reference, the most powerful supercomputer, called “Summit,” is located at Tennessee’s Oak Ridge National Laboratory, which focuses on more civilian-oriented work such as climate simulations and astrophysics. Sierra, however, is purpose-built for an entirely different mission: to learn more about nuclear weapons.
For the past year, the Sierra machine has been used to model earthquakes, study traumatic brain injuries, and support cancer research, but not for long. In early 2019, it will be “air gapped,” meaning cut off from any other computer network, and devoted primarily to its true purpose: testing aging nuclear warheads to determine their viability.
Known for its long history of innovation in technology and science, the Lab houses some of the world’s best and brightest minds. Why not a supercomputer named Sierra? She will work her way through a stockpile of 4,000-plus aging nuclear weapons that, until now, have been untestable. Because aging weapons are thought to become less effective or powerful over time, the supercomputer is designed to determine their functional integrity by running simulations that would otherwise be too dangerous to conduct in real life.
In this way, scientists can help the government and the military decide whether they can, in fact, take old nuclear weapons and potentially turn them into new, usable ones: an ‘upcycling’ exercise of sorts. Unlike the historic ‘mothball fleet’ of Naval ships built to fight in WWII, which never left port in Martinez, California, and were eventually dismantled, these weapons could regain their usefulness.
Testing, 1,2,3—is this thing on?
The U.S., along with Russia and several other nations, has never totally ditched its nuclear weapons. When this stockpile of aging weapons was built, some 40-plus years ago, the weapons were meant to be replaced at a later date but never were. So code physicists at the Lab will use the Sierra supercomputer to check the condition of these old but potentially valuable weapons.
The team will be trying to answer questions like:
- If we had to launch a 40-year-old bomb, would it explode?
- Would it detonate differently?
- Could it stay intact and potentially fall into the wrong hands?
The machine’s massive number of processors can simulate the scenarios the military asks about, allowing it to model the changes required to update, modify, and refurbish the aging weapons in practical ways. Sierra will be used to simulate explosions under the management of the government’s Stockpile Stewardship Program. Over the years, this program has been used to run a wide variety of tests, including experiments with the world’s largest laser as well as explosive tests using plutonium.
Is this a sign of Nuclear Proliferation?
In 1968, the U.S. signed the Treaty on the Non-Proliferation of Nuclear Weapons, commonly known as the Non-Proliferation Treaty (NPT), which entered into force in 1970. The treaty’s objectives are to prevent the spread of nuclear weapons and weapons technology, to promote cooperation in the peaceful uses of nuclear energy, and to further the goal of general and complete nuclear disarmament.
While the job of the scientists at LLNL is simply to run simulations, gather data, and report their findings to the military, some may say this work is fundamentally incompatible with the government’s legal obligation and commitment to disarm under the NPT. According to experts, the lines are beginning to blur. However, the U.S. has no explicit policy barring it from designing and building new nuclear weapons; existing mandates only say research is allowed as long as no missiles are tested or deployed.
The lines are more than blurring; they are being cut. In October of 2018, Donald Trump referred to the 31-year-old Intermediate-Range Nuclear Forces (INF) Treaty, saying “We’re going to terminate the agreement, and we’re going to pull out.” The INF Treaty was signed by President Ronald Reagan and Soviet Union leader Mikhail Gorbachev in December 1987 and was the first and only nuclear arms control agreement to eliminate an entire class of nukes. The treaty forced the superpowers to scrap more than 2,600 missiles with ranges of 310 to 3,420 miles, weapons considered destabilizing to the European continent because of their capability to launch a nuclear strike from anywhere without early warning.
As long as we have nuclear weapons, we will need simulation, so this testing does not seem likely to stop anytime soon. In fact, El Capitan, Sierra’s successor, is already in development. Scientists no longer detonate bombs as a way to understand them better, as they did in the old days, turning islands into gaping holes. Instead, they simulate the bombs’ condition and look back at old test footage to try to reproduce what they see. To date, they haven’t been able to capture all the nuance in that footage.
But with slick new simulations being tested at Los Alamos National Laboratory in New Mexico, the bomb babysitters there say maybe they can. This program essentially allows scientists to track individual particles, to see where they go and what they do in a given situation. In nuclear research, scientists often use particle-in-cell code to understand how plasma mixes with itself, which is particularly important for Los Alamos because nuclear bombs produce plasma. This certainly sounds safer and more predictable than detonating toxic and potentially dangerous nuclear materials in nature. This new, emerging class of supercomputers seems to take a more humane approach than unpredictable humans.
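To give a flavor of the particle-in-cell idea mentioned above, here is a minimal toy sketch in Python. It is not the Los Alamos code (which is classified and vastly more sophisticated); it only illustrates the basic loop every PIC simulation shares: deposit particle charge onto a grid, solve for the electric field on that grid, then push each particle with the field it feels. The function name and normalized units are illustrative choices, not anything from the labs’ software.

```python
import numpy as np

def pic_step(x, v, grid_n, length, dt, qm=-1.0):
    """One step of a toy 1D electrostatic particle-in-cell update.

    x, v   : particle positions and velocities (NumPy arrays)
    grid_n : number of grid cells; length : periodic domain size
    qm     : charge-to-mass ratio (electrons, normalized units)
    """
    dx = length / grid_n
    # 1. Deposit particle charge onto the grid (nearest-grid-point weighting).
    cells = np.floor(x / dx).astype(int) % grid_n
    rho = np.bincount(cells, minlength=grid_n).astype(float)
    rho = rho / rho.mean() - 1.0           # uniform neutralizing background
    # 2. Solve Poisson's equation (d^2 phi / dx^2 = -rho) spectrally via FFT.
    k = 2.0 * np.pi * np.fft.fftfreq(grid_n, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2     # skip k=0 (mean is zero anyway)
    E = np.real(np.fft.ifft(-1j * k * phi_k))  # E = -dphi/dx
    # 3. Gather the field back to each particle and push (leapfrog-style).
    v = v + qm * E[cells] * dt
    x = (x + v * dt) % length              # periodic boundary
    return x, v
```

Repeating this step thousands of times is what lets scientists watch where individual simulated particles go and how plasma mixes with itself, rather than inferring it from old detonation footage.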