Stanford experts discuss the lessons and legacy of the Fukushima nuclear disaster

A decade after a powerful earthquake and tsunami set off the Fukushima Daiichi nuclear meltdown in Japan, Stanford experts discuss revelations about radiation from the disaster, advances in earthquake science related to the event and how its devastating impact has influenced strategies for tsunami defense and local warning systems.

On a Friday afternoon in the spring of 2011, the largest earthquake in Japan’s recorded history triggered a tsunami that crashed through seawalls, flattened coastal communities and pummeled the Fukushima Daiichi nuclear power plant.

Damage from the earthquake and tsunami. (Image credit: Shutterstock)

More than 19,000 people died and tens of thousands more fled as radiation belched from the world’s worst nuclear accident since Chernobyl.

A decade later, large swaths of land remain contaminated and emptied of most of their former residents. The deadly natural disasters of March 11, 2011, and the catastrophic nuclear meltdown that followed have left a lasting impact on earthquake science, tsunami defense and the politics of nuclear power.

Here, Stanford nuclear security expert Rod Ewing and geophysicists Eric Dunham and Jenny Suckale discuss that legacy, as well as how scientists are continuing to discover new details about the disaster.


What lessons did the damage from the Tohoku disaster provide about preparing for tsunamis?

SUCKALE: The Tohoku tsunami highlighted that even a highly sophisticated and expensive tsunami mitigation system can fail. There has also been increasing interest in alternative approaches to mitigating tsunami risk, such as nature-based or hybrid defenses. We need to learn a lot more about these approaches, but it’s exciting to see progress in that area. It may be no coincidence that much of that thinking comes from Miyagi Prefecture, which was hit hard by the tsunami.


How have local tsunami warning systems changed since the 2011 disaster in Japan?

DUNHAM: Most tsunamis are caused by offshore earthquakes, like the 2011 Tohoku-Oki earthquake, which uplift the seafloor and the ocean surface; the displaced water then flows back toward land as a tsunami. Local tsunami warning systems are still based on a two-step workflow: analysis of seismic waves constrains the earthquake’s location and size, and relations derived from tsunami simulations are then used to predict tsunami arrival times and wave heights.
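As a rough illustration of that two-step chain, here is a minimal sketch; the magnitudes, lookup table and printed forecast are hypothetical stand-ins, not values from any real warning system.

```python
# Minimal sketch of the two-step warning workflow described above.
# All numbers and names are illustrative, not from any operational system.

# Step 1: seismic-wave analysis constrains the earthquake's location and size.
# Here we simply assume that estimate has already been produced.
quake = {"magnitude": 9.0, "lat": 38.3, "lon": 142.4}

# Step 2: precomputed tsunami simulations relate earthquake size to expected
# arrival time and wave height at a coastal site (hypothetical values).
scenario_table = {
    8.0: {"arrival_min": 35, "height_m": 3.0},
    8.5: {"arrival_min": 30, "height_m": 6.0},
    9.0: {"arrival_min": 25, "height_m": 10.0},
}

# Look up the precomputed scenario closest to the estimated magnitude.
nearest = min(scenario_table, key=lambda m: abs(m - quake["magnitude"]))
forecast = scenario_table[nearest]
print(f"M{quake['magnitude']}: ~{forecast['arrival_min']} min to arrival, "
      f"~{forecast['height_m']} m wave height")
```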

But this is about to radically change.

Japan has deployed offshore networks of pressure gauges and seismometers, connected to each other and back to computers on land by thousands of miles of fiber optic cable. Scientists have new methods for reconstructing the tsunami waves in real time using the seafloor pressure data. (Pressure increases when the wave passes over a sensor.) These methods bypass the need to first estimate earthquake properties, and they also work for tsunamis caused by non-earthquake sources such as underwater landslides.
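The physics behind that reconstruction can be sketched simply: for waves much longer than the water depth, the pressure change at the seafloor is roughly hydrostatic, so wave height follows directly from the measured pressure anomaly. A minimal sketch, with an assumed seawater density and an illustrative 20 kPa example:

```python
# Sketch of the hydrostatic relation behind pressure-based tsunami sensing:
# for waves much longer than the water depth, the sea-surface elevation is
# approximately the bottom-pressure anomaly divided by (density * gravity).
RHO = 1025.0  # seawater density, kg/m^3 (typical value, assumed)
G = 9.81      # gravitational acceleration, m/s^2

def surface_elevation(pressure_anomaly_pa: float) -> float:
    """Approximate tsunami wave height (m) from a seafloor pressure anomaly (Pa)."""
    return pressure_anomaly_pa / (RHO * G)

# Example: a 20 kPa pressure increase corresponds to roughly a 2 m wave.
print(f"{surface_elevation(20_000.0):.2f} m")  # -> 1.99 m
```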

Recent offshore earthquakes and tsunamis in Japan have demonstrated that these methods are ready for real-world use, and I anticipate they will start to become part of local tsunami warning systems in Japan within the next few years. Hopefully, other countries that face similar tsunami hazards will invest in offshore sensor networks.


What important insights have scientists gained about earthquakes by studying data from the Tohoku-Oki earthquake and tsunami?

DUNHAM: The Tohoku-Oki earthquake and tsunami were much larger than had been expected for that part of the Japan Trench subduction zone. Scientists now have a much better appreciation for the variability in earthquake (and tsunami) size that can occur in a given region, although the reasons for that variability are still being explored.

Computer simulations of earthquakes have advanced considerably in the past decade, to the point where they can be used to test hypotheses about the roles of frictional properties, fluids and other factors in fault slip behavior. The international scientific community studying earthquake and tsunami hazards from subduction zones is currently planning ambitious experiments involving onshore and offshore instrumentation, which, paired with computer modeling, will revolutionize our understanding of these dangerous regions.


Ten years after the event, what have scientists learned about the particles released from the Fukushima Daiichi Nuclear Power Plant? 

EWING: I think that during the past ten years, one of the most important findings is that volatile radionuclides, such as the isotopes of cesium, were actually transported by micro- to nano-scale particles. Initially, the volatile radionuclides were thought to be simple chemical complexes that were generally soluble and would thus be washed out of the soil column, for example during rain events. However, if a significant fraction of the highly radioactive cesium is incorporated into more durable particles, it may persist for much longer. This is important information for designing the remediation strategy.
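To see why durability matters on these timescales, consider some simple decay arithmetic (the half-lives are standard nuclear data; the 10-year window is illustrative):

```python
# Illustrative decay arithmetic for the cesium isotopes released in 2011
# (half-lives: Cs-137 about 30.1 years, Cs-134 about 2.06 years).
def fraction_remaining(years: float, half_life_years: float) -> float:
    """Fraction of a radionuclide left after `years`, given its half-life."""
    return 0.5 ** (years / half_life_years)

for isotope, half_life in [("Cs-137", 30.1), ("Cs-134", 2.06)]:
    print(f"{isotope}: about {fraction_remaining(10.0, half_life):.0%} left after 10 years")
```

A decade on, roughly 80 percent of the Cs-137 remains, which is why its incorporation into durable particles, rather than soluble complexes, changes the remediation picture.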

The other important discovery is that, depending on their size, these particles can be found kilometers to over one hundred kilometers from the nuclear reactors at Fukushima Daiichi. Finally, these particles are heterogeneous, forming under the different conditions within the reactor at the time of the meltdown; thus, they provide samples of what was happening during the meltdown events.


What do we know about the radiation, how far it traveled and its impact?

EWING: Generally, the level of radiation from any individual particle is relatively low, but a particle’s size and chemical form affect its potential for causing radiation exposure. The chemical speciation of the particle is also a critical consideration in designing remediation strategies. There is a growing scientific effort to follow specific radionuclides through the environment, and their fate contributes to assessments of safety.


If an earthquake and tsunami like the Tohoku-Oki event occurred in the same region today, what would play out differently and why?

DUNHAM: Local tsunami warning systems have improved greatly since 2011, particularly in their ability to handle extremely large events like Tohoku. And they will only continue to improve in the next few years as the offshore sensor data starts to be used for real-time warnings. Data from those offshore sensors could also be used to explore the pattern of foreshocks and slow fault slip that might precede large earthquakes. There are intriguing hints of precursory behavior prior to large subduction zone earthquakes, but conclusive evidence for this requires high-resolution data of the type that can only be collected with offshore sensor networks.


What are the lessons for future power plants?

EWING: The tragedy at Fukushima Daiichi was not an “accident” in the sense that it could not have been anticipated. From a geologic perspective, there were many “red flags” related to the probability of a tsunami event and its scale. I think that one of the important lessons is that we have spent too much time using risk assessments to demonstrate that a reactor site is safe and not enough time imagining how it might fail.

Ewing is the Frank Stanton Professor in Nuclear Security in Stanford’s Center for International Security and Cooperation (CISAC), a senior fellow at the Freeman Spogli Institute for International Studies and at the Precourt Institute for Energy, and a Professor of Geological Sciences in the School of Earth, Energy & Environmental Sciences (Stanford Earth). Dunham is an Associate Professor of Geophysics. Suckale is an Assistant Professor of Geophysics and, by courtesy, of Civil and Environmental Engineering and a center fellow, by courtesy, at the Stanford Woods Institute for the Environment.