Preserving the Past to Protect the Future: Advanced technology revives Cold War–era film reels, providing new data to validate modern nuclear weapons science.
U.S. atmospheric nuclear tests conducted in the mid-20th century, such as the one shown here from 1955 at the Nevada Test Site, were captured on various film formats. The Film Scanning and Reanalysis Project is using modern scanning technology to digitize these aging films and extract key data with unprecedented accuracy. © eranda-fotolia.com

Preserving the Past to Protect the Future

Atmospheric nuclear weapons testing has an important place in history for its scientific, political, and cultural legacies. Between 1945 and 1963, the United States conducted 210 such tests and captured the events on dozens of cameras. Performed mainly at the Nevada Test Site (now called the Nevada National Security Site) and the Pacific Proving Grounds—a collection of remote locations in the Pacific Ocean, including the Marshall Islands—the tests were designed to explore a range of tactical scenarios, such as detonations at high altitude, on the ground, or over water. Some bombs were dropped directly from aircraft or via parachute. Others were placed atop ground-based towers. The character of each explosion depended on the types of explosives, fuels, and detonation methods used. Testing goals included experimenting with new weapons designs, evaluating weapons reliability and performance, and measuring explosive effects.

The scientific record of the nation’s atmospheric nuclear tests survives in aging, deteriorating film reels. Data such as fireball size, shock wave position, and cloud dimensions—all of which can be gleaned through careful frame-by-frame analysis—are critical to Lawrence Livermore and other institutions tasked with stockpile stewardship, that is, ensuring the safety, security, and effectiveness of the U.S. nuclear stockpile. Now, in the post-nuclear-testing era, preserving these decades-old artifacts is a matter of national security. The Film Scanning and Reanalysis Project, a joint effort between Lawrence Livermore and Los Alamos national laboratories, is dedicated to this endeavor.

Livermore nuclear weapons physicist Greg Spriggs helms a team experienced in film preservation, archiving, image processing, shock wave physics, software development, data analysis, and declassification protocol. Since 2011, the team has combed through secure government vaults to inventory and salvage thousands of film rolls. Under the aegis of the Laboratory’s Weapons and Complex Integration Principal Directorate, the team uses modern scanning technology to digitize the films while developing image-processing techniques to extract key data with unprecedented accuracy.

Of the estimated 10,000 atmospheric nuclear test films, Spriggs and colleagues have identified approximately 6,500, scanned 4,200, declassified 750, and analyzed 500. In 2017, for the first time, the public was given access to a series of these films. “We hope viewers appreciate the immense power of these weapons,” says Spriggs. “Further, in the absence of live testing, the information from these films helps validate our computer simulations needed for stockpile stewardship.”

Specialists to the Rescue

Early in the project, Spriggs knew he needed help sorting through the various film stocks. Film expert Jim Moye, whose résumé includes preserving the Zapruder film (footage of President John F. Kennedy’s assassination), joined the team to evaluate the condition of the film reels and digitize the images. Livermore’s Maxine Trost and Los Alamos’s Alan Carr also stepped in to manage film retrieval and identification in the laboratories’ archives. Academy Award–winning filmmaker and documentarian Peter Kuran came aboard to advise on historical film stocks and camera equipment. Understanding film technology of the testing era is crucial to the team’s progress toward preservation and analysis. (See the box below.)

Necessity and Invention in Mid-20th Century Film Technology

The era of atmospheric nuclear testing ushered in advances in film technology to better capture test events. During that time, a pioneering photography company called Edgerton, Germeshausen, and Grier, Inc. (EG&G), provided camera timing and weapons firing systems as well as high-speed photography for the tests. In addition to test films, other artifacts—camera rigs anchored to palm trees or stowed in concrete bunkers, towers assembled for observation, before-and-after images of target structures in the blast zone—were featured in documentary footage and still photography. Many models of high-speed motion picture cameras and film stocks were used, with widths ranging from 8 millimeters to 9.5 inches (241.3 millimeters). Special cameras were designed for studies of motion, velocity, and light intensity. By 1962, cameras had become faster and produced sharper images than were possible just a decade earlier.

According to Peter Kuran, author of How to Photograph an Atomic Bomb, EG&G was navigating a learning curve. As Hollywood filmmakers shifted from the highly flammable cellulose nitrate film to cellulose acetate, EG&G technicians also transitioned to ensure higher-fidelity recordings. Acetate-based film, known as safety film, is less flammable than cellulose nitrate and more sensitive to light, which affects its optical density and exposure. However, safety film has limitations when it comes to ultrabright images of explosions. EG&G worked with film manufacturers to create emulsions that increased the light sensitivity of black-and-white safety film beyond what was commercially available.

Partway through the testing years, EG&G began using a new product called microfile film, with a fine-grained silver-halide emulsion. Smaller grains reduced the film speed, helping prevent overexposure from intense light. This film composition enhanced the quality of high-resolution nuclear detonation photography. “Nuclear explosions are brighter than the Sun. Most film stocks back then were not made for that range of exposure, but microfile film could handle it,” notes Kuran.

As advanced as nuclear weapons technology was at the time, and as rapidly as photography was evolving, photographic analysis tools had yet to catch up. Using a device called a densitometer, analysts painstakingly measured optical density at fixed points, frame by frame, to determine light output over time. The densitometer works by aiming a light source at a photoelectric cell, which converts the light into electricity. When a film strip is placed between the light source and the cell, the amount of light transmitted through the images is measured by the change in electricity produced. The densitometer displays this change as an optical density reading.
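
For readers unfamiliar with the quantity being measured, optical density is the base-10 logarithm of the ratio of incident to transmitted light. The short sketch below illustrates the standard definition; it is a generic example, not part of the project’s software.

```python
import math

def optical_density(incident, transmitted):
    """Standard definition: OD = log10(incident / transmitted).
    An OD of 0 transmits all light; each additional unit of OD
    transmits one-tenth as much."""
    return math.log10(incident / transmitted)

# Example: a region of film passing 1 percent of the source light
print(optical_density(100.0, 1.0))  # -> 2.0
```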

Although the densitometer allowed analysts to measure light output, they were unable to discern minor changes within a given region of the film. For instance, a shock wave boundary could escape detection and simply blend into the sky. “Scientists of that era did not have the technology to look at the shock wave boundary once it was far removed from the fireball,” observes Livermore physicist Greg Spriggs.

A new type of camera attempted to overcome some of these limitations. Called a streak camera, the device admitted light through four to six horizontal slits in the lens cap. The light was captured on a film strip moving at a constant speed, producing continuous streaks along the length of the film. To ensure that at least one streak would not be saturated, neutral-density filters of various optical densities were placed in front of the slits to attenuate the light.

Streak cameras usually ran at 100 frames per second, fast enough to capture both light pulses produced by a detonation, but the continuous records they produced were impractical to resolve in detail with a densitometer. Spriggs explains, “At the time, EG&G knew light output could be captured with streak cameras, but no practical way existed to analyze the results in detail.” EG&G stopped using the streak camera on atmospheric nuclear tests in the late 1950s. Sixty years later, the camera can be appreciated as ahead of its time. “Technology has come a long way since then. Now, for the first time, we can analyze in great detail the entire light output from a nuclear detonation,” continues Spriggs. “We are thankful EG&G had the foresight to capture these images.”

The mid-20th century film analysis toolbox also included a Kodagraph, which enlarged an image and projected it onto a grid pattern. Scientists would measure the size of a fireball in a single frame and compare it to measurements in subsequent frames to establish how quickly it expanded. However, fireballs do not form into perfect spheres, and multiple analysts ended up with different measurements of the same blast. The overall process was limited, subjective, and inconsistent. According to Spriggs, “Data were all over the map, leaving a lot of potential for human error.”


(from left) Jim Moye, Greg Spriggs, and Peter Kuran examine two high-speed, rotating-prism cameras similar to those used for recording mid-20th century atmospheric nuclear tests. The Fastax (left) and Eastman both shoot 16-millimeter film. (Photo by Randy Wong.)

Although images captured on black-and-white film tend to remain stable over the years, various problems can befall an aging film reel. Decades of handling and transport can cause scratches. In other cases, reels were treated with a scratch-protective lacquer that over time can alter the original optical density—that is, the measurement of light absorbance and blockage—resulting in poor data quality. Removing the lacquer can cause further damage. In addition, cellulose acetate film is vulnerable to a condition known as vinegar syndrome, in which the breakdown of acetyl molecular chains produces acetic acid and a strong vinegar smell. Other complications include brittleness, oxidation, and curling.

Films of this age and composition also tend to shrink an average of 1.5 to 2.5 percent, and the team must compensate for this issue to increase the reliability of extracted data. “We measure the horizontal distance between the perforations to calculate the amount of shrinkage,” notes Spriggs. Uneven shrinkage can sometimes cause the edges of the film to buckle or flute—conditions in which the edges and center of the film are no longer the same length.
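
In principle, the correction is a simple rescaling. The sketch below, assuming the nominal 7.62-millimeter perforation pitch of 16-millimeter stock, illustrates the idea; it is not the team’s actual code, and other film stocks have different pitches.

```python
NOMINAL_PITCH_MM = 7.62  # assumed: nominal perforation spacing of 16-mm film

def shrinkage_fraction(measured_pitch_mm, nominal_pitch_mm=NOMINAL_PITCH_MM):
    """Fractional shrinkage inferred from the measured perforation spacing."""
    return (nominal_pitch_mm - measured_pitch_mm) / nominal_pitch_mm

def restore_length(measured_mm, shrinkage):
    """Rescale an on-film measurement back to the film's original dimensions."""
    return measured_mm / (1.0 - shrinkage)

s = shrinkage_fraction(7.47)     # about 2 percent shrinkage
print(restore_length(10.0, s))   # a 10-mm feature was originally ~10.2 mm
```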


Film expert Jim Moye selects one of the thousands of film canisters queued for scanning and analysis. (Photo by Lee Baker.)

Cleanliness is another important aspect of film preservation. Moye usually cleans individual frames to remove or minimize the effect of dirt, debris, ink, and tape adhesive. “Tape of all kinds was used to mark frames for manual analysis. I’ve seen tape placed every five frames for a thousand frames,” he states. When he encounters torn or folded perforations along the edge of a reel, Moye determines the best repair strategy for each type of defect.

Besides age-related problems, the team must account for anomalies caused by conditions at ground zero. For instance, foggy images could be evidence of radiation exposure or a camera’s light leak. Kuran examines each reel to diagnose lens flares, refraction effects, static electricity, flaws caused by camera operator error, and degradation from repeated copying. Altogether, the team’s preservation activities are necessarily thorough. Kuran states, “Unlike Hollywood movies, you can’t remake these films.”


In addition to the ravages of time, mid-20th century film reels used for capturing events from atmospheric nuclear tests were subjected to radiation damage. Film crews attempted to protect cameras from radiation exposure with lead and concrete shielding and by positioning them at lower-risk distances from ground zero. This reel from the first nuclear weapons test (in New Mexico in 1945) is beyond repair. (Photo by Lee Baker.)

Nuclear Age Meets Digital Age

The most diligent preservation efforts can only slow, not stop, decomposition. “All organic substances, including film, will eventually decompose no matter how well they are cared for,” notes Moye. Most of the atmospheric test films are two-thirds of the way into the 100-year expected shelf life of black-and-white film, though Trost cautions, “It’s impossible to know exactly how long they will be usable.” Spriggs explains that in Livermore’s film laboratory, which was specially constructed for the project, the team works efficiently with modern tools to scan millions of frames. He says, “Our scanning technique allows a film to be digitized as a near-perfect copy of the original, thereby preserving this rare scientific information for future use.”

One challenge to consistent digitization comes from the range of film formats used by different cameras. The widest film strips, up to 9.5 inches (241.3 millimeters), were used in aircraft to record the blast field from above or to capture the mushroom cloud from ground-based photo stations located as far as 48 kilometers away. Some of the 70-millimeter reels do not have perforations, which indicates that they were shot by a camera taking still images at several-second intervals, rather than as a continuous movie. This format was used to document cloud movement and dissipation over relatively longer time periods. Unlike standard 70-millimeter-wide film, some reels contain vertically oriented images. For example, in a vertical orientation, a mushroom cloud’s top and stalk grow toward the short ends of the rectangular frame instead of the long sides. Another variation is frame height. Moye states, “Some frames are five perforations high, while others are seven. I was surprised to see so many odd formats.”

A key acquisition for the team was the Golden Eye II scanner from Digital Vision in Sweden, whose versatile aperture and light source can accommodate film formats from 8 to 70 millimeters. (The team uses a flatbed scanner for larger widths.) While most scanners grip a film strip’s perforations to feed it around flanges and through rollers, the Golden Eye II is sprocketless. “I don’t have to restore all the damaged perforations. The scanner handles old, shrunken film well,” says Moye.


Atmospheric nuclear tests were shot on several film stocks. Pictured here are (from top) offset images on 35-millimeter film without perforations, 8-millimeter images on 16-millimeter film, double 8-millimeter images, and standard 16-millimeter images. Digitizing the multiple formats of test films is a challenge for the preservation team.

The scanner’s dual-camera system produces high-resolution images up to 8,000 pixels across with as many as 4,096 different tones (shades of gray, or 12 bits per pixel). To preserve as much of the original optical density as possible, the team scans the film strips first to obtain lower densities, then again for higher densities. Moye explains, “A digital scanner cannot process everything at once, darkest to lightest, from the films. We would lose shadows or highlights. We need more range, so we scan twice.” A computer program combines the two scans into a single digital image that includes the full optical range of the original frame. Even with the extra scan, the speed and resolution of the scanner combined with reduced repair time enable the team to digitize several films per day.
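
Conceptually, the combination step resembles high-dynamic-range merging: wherever the high-exposure scan is saturated, the corresponding pixels are taken from the low-exposure scan instead. A minimal sketch of this idea follows; it is an assumed approach for illustration, not the project’s actual pipeline.

```python
import numpy as np

def merge_scans(low_exposure, high_exposure, threshold=0.95):
    """low_exposure: scan exposed to preserve highlights (fireball detail).
    high_exposure: scan exposed to preserve shadows. Both are float arrays
    scaled to [0, 1] and registered to the same pixel grid; a real pipeline
    would also align the two scans radiometrically."""
    merged = high_exposure.copy()
    saturated = high_exposure >= threshold       # blown-out highlights
    merged[saturated] = low_exposure[saturated]  # recover them from the other scan
    return merged

# Toy example: two 2x2 "scans" of the same frame
dark = np.array([[0.30, 0.80], [0.10, 0.90]])
bright = np.array([[0.60, 1.00], [0.20, 1.00]])  # two saturated pixels
print(merge_scans(dark, bright))
```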


Filmmaker and documentarian Peter Kuran (standing) operates the Golden Eye II scanner, while Moye reviews a scanned image’s digital fidelity and optical density distribution. (Photo by Randy Wong.)

Faster, Better, Newer Data

Members of the Film Scanning and Reanalysis Project can analyze film faster, with fewer people, and with more accuracy than previously possible. The team’s approach is threefold: speed up the process through automation, collect better data for films analyzed with older methods, and gather data from films not previously examined. The team is conducting a complete analysis of the approximately 50 films shot for each atmospheric test. According to physicist Jason Bender, “Our level of analysis is unprecedented.”

Livermore’s innovative computerized image-processing technology eliminates most of the manual work of analyzing every frame. Bender explains, “We have many modern tools we can bring to bear on this problem, making it much easier to analyze digital content than in the past.” The first hurdle was establishing the best method for extracting data from the scanned films. Bender and colleagues turned to the open-source software community to leverage industry-standard visual-processing functions. The project team uses the Python programming language and the OpenCV (Open Source Computer Vision) library to measure blast dimensions and timescales present in the films. The team relies on the Livermore Computing (LC) Division’s parallel computing capabilities for large-scale, batch-processed analysis and fast iteration. “LC’s standardized environment is game changing,” states Bender. “Studies that once took hours or days to complete on a typical film can now be done in minutes. Through LC, we are also able to bring new interns and scientists into the project.”
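
For the technically curious, the sketch below uses common OpenCV calls to locate a bright, roughly circular feature in a grayscale frame, in the spirit of the measurements described here. The file name and thresholds are illustrative assumptions, not the team’s actual code.

```python
import cv2

frame = cv2.imread("frame_0042.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
blurred = cv2.GaussianBlur(frame, (5, 5), 0)   # suppress film grain
edges = cv2.Canny(blurred, 50, 150)            # find sharp brightness boundaries
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)   # assume the fireball dominates the frame
(cx, cy), radius_px = cv2.minEnclosingCircle(largest)
print(f"fireball center ({cx:.0f}, {cy:.0f}), radius {radius_px:.1f} pixels")
```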

Analysis begins with measuring a blast’s fireball radius and growth rate. Much like facial recognition technology, the team’s software algorithms use shape matching, pattern matching, and noise reduction to find the edges and center of the fireball. Enhancing the contrast between light and dark regions of the background makes features easier to identify and yields more data points. Machine-learning algorithms weed out visible defects, film manufacturers’ marks, and false positives, while customized graphical user interfaces enable scientists to compare fireball contours to circular and elliptical overlays. Pixel-level analysis is possible, and results are highly accurate and consistent. Bender says, “Our tools impose quantitative standards and remove human bias. Data are therefore more reproducible.”
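
Because fireballs rarely form perfect spheres, comparing a contour against an elliptical overlay is a natural check. A hedged sketch using OpenCV’s built-in ellipse fit follows; again, this is illustrative, not the project’s code.

```python
import math
import cv2

def fit_fireball_ellipse(contour):
    """Fit an ellipse to a detected fireball contour (cv2.fitEllipse
    requires at least five contour points). Returns the center, the
    full axis lengths, the rotation angle, and the eccentricity,
    where an eccentricity of 0 means a perfect circle."""
    (cx, cy), (axis1, axis2), angle = cv2.fitEllipse(contour)
    ratio = min(axis1, axis2) / max(axis1, axis2)
    eccentricity = math.sqrt(1.0 - ratio ** 2)
    return (cx, cy), (axis1, axis2), angle, eccentricity
```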


Each reel passes through the scanner twice to improve image fidelity. (bottom) The low-exposure range provides higher fidelity shadows outside the fireball, but the fireball itself appears grainy. (middle) The high-exposure range provides more nuanced highlights yet lacks depth in darker tones. (top) The combined frame contains this 1956 test’s full optical range.

Measuring Energy Yield

Crucial to a weapon’s effectiveness is knowing a device will hit the right target and produce the intended effects without inadvertent outcomes. At a time when nuclear deterrence policy prohibits nuclear weapons testing, Livermore scientists rely on three-dimensional computer models to predict shock, thermal blast, and fallout effects occurring in various environments. For instance, a mushroom cloud’s size helps determine the amount, concentration, and dispersal behavior of fallout particles. This information, in turn, affects calculations of radiation dose rate. Spriggs explains, “As the stockpile ages, we must be able to accurately predict any changes in weapons performance. Since we can no longer physically test our weapons, computer simulations are an essential tool for assessing the health of the stockpile. We need reliable data to validate that our simulations are trustworthy.” (See S&TR, March 2012, Extending the Life of an Aging Weapon; and S&TR, July/August 2015, Stockpile Stewardship at 20 Years.)

A key data point in these simulations is a weapon’s yield. Much as an engine’s work rate is measured in horsepower, the energy yield of a nuclear weapon is expressed in equivalent tons of TNT (trinitrotoluene). For example, an explosion with a yield of 17 kilotons produces energy equivalent to 34 million pounds of TNT—roughly the energy released by the nuclear devices dropped on Hiroshima and Nagasaki, Japan, in 1945. The weapons immortalized in the atmospheric nuclear test films often had yields several hundred times greater.
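
The arithmetic behind the example is straightforward; the snippet below spells it out using standard definitions (1 ton equals 2,000 pounds, and 1 kiloton of TNT is defined as 4.184 trillion joules).

```python
POUNDS_PER_KILOTON = 1_000 * 2_000   # 1 kiloton = 1,000 tons of 2,000 pounds each
JOULES_PER_KILOTON = 4.184e12        # standard energy definition of 1 kt of TNT

yield_kt = 17
print(yield_kt * POUNDS_PER_KILOTON)                  # 34,000,000 pounds of TNT
print(f"{yield_kt * JOULES_PER_KILOTON:.2e} joules")  # ~7.11e+13 J
```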

Yield can be estimated as a function of time based on light output. A nuclear detonation produces two light pulses. The first pulse is associated with the shock wave’s formation and its subsequent cooldown during expansion. The second pulse is produced by the light from heated gases in the atmosphere (the fireball). The brightest points of these two pulses and the minimum light output occurring between them help scientists determine a weapon’s yield. Other parameters include the duration of the fireball and the initial rise velocity of the mushroom cloud.
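
The timing of these features scales with yield, so one hedged way to picture the analysis is inverting an empirical power law fitted to tests of known yield. The constants below are hypothetical placeholders chosen for illustration, not published or project values.

```python
def yield_from_t_min(t_min_s, a=0.0025, b=0.42):
    """Invert an assumed empirical relation t_min = a * Y**b, where t_min
    is the time of minimum light output (seconds) and Y is yield in
    kilotons. The constants a and b are hypothetical placeholders."""
    return (t_min_s / a) ** (1.0 / b)

frame_rate = 2_400          # frames per second (hypothetical high-speed camera)
minimum_frame = 30          # frame at which light output bottoms out
print(yield_from_t_min(minimum_frame / frame_rate))  # yield estimate in kilotons
```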


To determine a fireball’s dimensions, the project team uses a measured calibration factor that converts pixels to millimeters and accounts for the camera’s focal length and distance from the detonation point. This fireball was produced by an air-dropped device in 1953. Livermore scientists can confirm its diameter to within 1 pixel (0.4 meters), which represents an energy yield uncertainty of approximately 0.2 percent—significantly more precise than the contemporaneous calculation of 6.0 percent.
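
Such a calibration follows the familiar pinhole-camera relation: meters per pixel equals the pixel’s physical size on the film times the range to the detonation, divided by the lens focal length. The numbers below are illustrative stand-ins chosen to reproduce the 0.4-meter-per-pixel figure in the caption; they are not the actual camera parameters.

```python
pixel_size_mm = 0.01      # hypothetical pixel pitch of the scanned film
focal_length_mm = 250.0   # hypothetical lens focal length
range_m = 10_000.0        # hypothetical camera-to-detonation distance

meters_per_pixel = pixel_size_mm * range_m / focal_length_mm  # similar triangles
print(meters_per_pixel)              # 0.4 meters per pixel

radius_px = 300                      # measured fireball radius in pixels
print(radius_px * meters_per_pixel)  # 120 meters
```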

Shock wave position can also contribute to yield estimates. However, accurately measuring a shock wave’s radius can be difficult when the shock wave is irregularly shaped, and measurements off by only a small amount can adversely affect yield calculations. For instance, if the radius is incorrectly measured by 1 percent, the yield of the blast could be skewed by 5 percent. Spriggs states, “Using modern image-processing techniques, we can make more precise measurements of the shock wave radius. In most cases, we have reduced the uncertainty of the yield estimates by nearly an order of magnitude.”
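
That 1-to-5 sensitivity is exactly what the classic Sedov–Taylor point-blast scaling predicts. The textbook relation below (a general result, not the team’s specific model) ties the shock radius R at time t to the released energy E and the ambient density ρ:

```latex
% Sedov--Taylor point-blast scaling: xi is a dimensionless constant
% of order one determined by the gas properties.
R(t) = \xi \left( \frac{E\,t^{2}}{\rho} \right)^{1/5}
\quad \Longrightarrow \quad
E \propto R^{5}
\quad \Longrightarrow \quad
\frac{\delta E}{E} = 5\,\frac{\delta R}{R}
```

Holding t and ρ fixed, a 1 percent error in the measured radius therefore propagates to a 5 percent error in the inferred yield, matching the sensitivity quoted above.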

Silent Films Speak Volumes

By mining the valuable data found in historic atmospheric nuclear test films, the Film Scanning and Reanalysis Project strengthens the Laboratory’s stockpile stewardship capabilities for future generations. Throughout the project, Spriggs and colleagues have processed the films with assistance from college and graduate students studying math, computer science, and physics. These summer interns have helped develop data extraction tools, work on uncertainty quantification in timing, and build the database of timing data. “We’re establishing a complete analysis platform so scientists can uncover other types of data from these films, such as complex features in fireball structure,” says Bender. The data may also prove useful for modeling cloud rise in more complex environments, such as urban canyons. The team plans to extend their analysis technology to new global security applications for first responders and radiochemists.

Plots derived from contemporaneous and modern-day data analysis depict fireball measurements from a 1955 atmospheric test. Although the graphs use different scales, uncertainty is approximately a factor of 10 smaller with the newer data points, which align more closely with an asymptotic value. Asymptotes are shown as a dashed curve (left) and red line (right). Colors and symbols represent different camera shots of the same test.

In March 2017, the Laboratory posted the first batch of 63 digitized films to its YouTube channel with an introductory video featuring Spriggs and Moye. The videos range in length from 2 seconds to more than 7 minutes, each with a frame counter in the corner. The YouTube release has generated intense public interest, with collective views topping 5.5 million in just 8 months, and the number continues to grow. The project has gained attention from major media outlets including The New York Times, CNN, CBS Sunday Morning, National Public Radio’s Northern California affiliate (KQED), the San Francisco Chronicle, and several San Francisco Bay Area news television stations. Scientific American, WIRED, Esquire, and other national magazines have also covered the release.

Spriggs has received positive feedback from many YouTube viewers. Some offer their services in film archiving or preservation, while others share personal stories of the nuclear testing era. “The videos are resonating with the public, which is what we want,” remarks Spriggs. “Our efforts to maintain a safe, secure, and effective nuclear deterrent are paramount to our national security.”

—Holly Auten

Key Words: atmospheric nuclear test, computer simulation, energy yield, film preservation, Film Scanning and Reanalysis Project, Livermore Computing (LC), Nevada National Security Site, Nevada Test Site, nuclear weapons, OpenCV (Open Source Computer Vision), open-source software, optical density, Pacific Proving Grounds, Python, stockpile stewardship.

For further information contact Greg Spriggs (925) 423-8862 (spriggs1@llnl.gov).