Name: Giuliano F. Panza
Residence: Trieste
Country: Italy

What is your background?
I graduated in physics at Bologna University (Alma Mater). I am a former full professor of Seismology and Geophysics at the University of Trieste and, with Prof. Vladimir Keilis-Borok, I co-founded and served as the Head of the SAND Group (Structure and Non-linear Dynamics of the Earth) at ICTP Trieste.
I am Doctor honoris causa of the University of Bucharest; Emeritus Honorary Professor of the Institute of Geophysics of the China Earthquake Administration (CEA), Beijing; Honorary Professor of the Beijing University of Civil Engineering and Architecture; and Member of the Advisory Committee of the Beijing Advanced Innovation Center for Future Urban Design. At the conferring ceremony of the Institute of Geophysics, CEA, Mr. Zhao Ming, deputy director of the Department for International Cooperation of CEA and representative of the CEA leadership, complimented me as “the Marco Polo of seismology”.
I am a member of the following academies: Accademia Nazionale dei Lincei, Accademia Nazionale delle Scienze, Academia Europaea, the Russian Academy of Sciences, and The Academy of Sciences for the Developing World.

I have been interested in earthquakes all my scientific life. Italy has a long history of earthquakes, and many people will remember the dramatic earthquake in L’Aquila in 2009, when more than 300 people died and several scientists were convicted over their failure to warn, but later cleared. Italian justice, in fact, provides three levels of trial:
1 – Corte d’assise: an Italian court found six Italian scientists and an ex-government official guilty of manslaughter over the earthquake in L’Aquila.
2 – Corte d’assise d’appello: acquits six of the seven indicted.
3 – Corte di Cassazione: confirms the above acquittals of six of the seven indicted.

Predicting earthquakes and their effects is still a very difficult task, due to the nonlinear behaviour of the earth system. We have, however, made some progress in the last decade with the so-called Neo-Deterministic Seismic Hazard Assessment (NDSHA), which earned me important international awards, for example the EGU Beno Gutenberg Medal and the AGU International Award, in 2000 and 2018, respectively.

In NDSHA we employ numerical modeling codes based upon (1) the physical description of the earthquake rupture process and (2) the seismic wave propagation pathways, in order to reliably predict the ground motion parameters resulting from the many potential seismic sources considered. Indeed, NDSHA works better than the very popular Probabilistic Seismic Hazard Assessment (PSHA). But science is not democratic! Let me just recall a sentence attributed to Galileo Galilei: “In questioni di scienza, l’autorità di un migliaio di persone non vale tanto quanto l’umile ragionamento di un singolo individuo.” (“In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”)
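The scenario-based logic just described can be caricatured in a few lines: for every candidate source, compute a ground-motion proxy at each site, then keep the maximum over all sources rather than a probabilistic aggregate. This is only a toy sketch; the source list and the attenuation function below are invented for illustration and are not the real NDSHA codes, which compute full synthetic seismograms from physical rupture and propagation models.

```python
import math

# Hypothetical scenario sources: (magnitude, x_km, y_km). Illustrative only.
sources = [(6.5, 0.0, 0.0), (7.5, 100.0, 0.0), (5.9, 50.0, 80.0)]

# Sites where hazard is evaluated (a coarse 4 x 3 grid of coordinates in km).
sites = [(x, y) for x in range(0, 121, 40) for y in range(0, 81, 40)]

def toy_ground_motion(mag, dist_km):
    """Toy attenuation relation (hypothetical): motion grows with
    magnitude and decays with distance from the source."""
    return math.exp(mag) / (dist_km**2 + 10.0)

def scenario_envelope(site):
    """Scenario-based hazard at a site: the MAXIMUM ground motion over
    all considered sources, not a probabilistic aggregation."""
    x, y = site
    return max(
        toy_ground_motion(m, math.hypot(x - sx, y - sy))
        for m, sx, sy in sources
    )

hazard_map = {site: scenario_envelope(site) for site in sites}
```

As expected, the sites closest to the largest scenario source dominate the map, which is the sense in which a scenario-based map is controlled by the Maximum Credible Earthquakes rather than by occurrence rates.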

NDSHA has, for over two decades, provided effective and Reliable Seismic Hazard Assessment (RSHA) tools for understanding, communicating and mitigating earthquake risk. The procedure for deriving NDSHA seismic hazard maps at the regional scale is described in some detail in the published literature.

NDSHA has been well validated by all the events that have occurred in regions where NDSHA maps were available at the time of the subsequent earthquakes, including observations from four recent destructive earthquakes: the M 5.9 Emilia, Italy 2012; M 6.3 L’Aquila, Italy 2009; M 5.5–6.6 Central Italy 2016–2017 seismic crisis; and M 7.8 Nepal 2015 events. This good performance suggests that wider adoption of NDSHA (especially in tectonically active areas with perhaps relatively prolonged seismic quiescence, i.e. where only a few major events have occurred in historical time) can better prepare civil societies for the entire suite of potential earthquakes that can and will occur!

Better to retire, and then bury, PSHA (which is more a concept and “trust in numbers” than a tested pathway to seismic safety: R.I.PSHA) than to “take a chance on a guess” and, in the future, experience more earthquake disasters and catastrophes because erroneous hazard maps depicted only “low hazard” while the active tectonic regions again acted otherwise!

PSHA, unlike NDSHA, (a) has never been validated by “objective testing”; (b) has actually been proven unreliable as a method for forecasting the “rates” (claimed to be probabilities) of earthquake occurrence; and (c) has nevertheless mandated that earthquake-resistant design standards and societal earthquake preparedness and planning be based on “engineering seismic risk analysis” models, models which incorporate assumptions, really fabulations (or “magical realisms”), now known to conflict with what we have learned scientifically about earthquake geology and earthquake physics.
Among the evidence against PSHA: too many damaging and deadly earthquakes (like the 1988 M 6.8 Spitak, Armenia earthquake; the 2011 M 9 Tohoku, Japan megathrust; and the 2012 M 6 Emilia, Italy events) have occurred in regions rated “low-risk” by PSHA seismic hazard maps.

Predicting climate is as difficult as predicting earthquakes, and therefore I am hesitant to accept mainstream climate predictions that foresee climate doom in the future. This is what prompted me to accept the request to sign, as a leading light of NDSHA, the CLINTEL Global Climate Declaration. In other words, I felt it my duty to demonstrate how fundamental drawbacks affect different branches of science that are apparently very distant from each other.

Since when and why are you interested in climate change?
At the turn of the millennium, when I realized that popular climate change assessments are as invalid as popular seismic hazard assessments. Neither readily satisfies Popper’s falsifiability principle: for a theory to be considered scientific, it must be testable and capable of being proven false. To be accepted, scientific theories must make confirmed predictions.

The worldwide maps of earthquake hazard assessed by PSHA under the Global Seismic Hazard Assessment Program (GSHAP) are grossly misleading, as proved by fatal evidence in all the deadliest earthquakes that have occurred since 2000. In fact, the seismicity of the past twenty years disproved the probabilistic GSHAP maps that were published in 1999, as can be verified by any interested person. GSHAP fails both in describing past seismicity and in predicting expected ground shaking and the number of victims.

According to probabilistic seismic hazard analysis (PSHA), the deterministically evaluated or historically defined largest credible earthquakes (often referred to as Maximum Credible Earthquakes, MCEs) are “an unconvincing possibility” and are treated as “likely impossibilities” within individual seismic zones. However, over the last decade such events have kept occurring globally where PSHA predicted seismic hazard to be low. Systematic comparison of the observed ground shaking with that expected by the Global Seismic Hazard Assessment Program (GSHAP) maps discloses gross underestimation worldwide. Several inconsistencies with available observations are found also for national-scale PSHA maps (including Italy’s), developed using updated data sets.
As a result, these maps have underestimated the expected numbers of fatalities in recent disastrous earthquakes by approximately two to three orders of magnitude. The total death toll in 2000–2011 (which exceeds 700,000 people, including tsunami victims) calls for a critical reappraisal of the GSHAP results, as well as of the underlying methods. The theoretical and practical issues of probabilistic seismic hazard assessment range from the overly simplified modelling of the complex nature of telluric motion (the assumption that one could reduce the tensor problem of seismic wave generation and propagation, standard in the mechanics of a continuum like the Earth, to a scalar problem) to the insufficient size and quality of the earthquake catalogs necessary for reliable probability modeling at the local scale. NDSHA, by contrast, is capable of effectively modeling the complex nature of telluric motion and requires only knowledge of past damaging earthquakes, which are quite well represented even in historical catalogues.

This was the motivation to start a neo-deterministic approach to seismic hazard assessment, now internationally well known as NDSHA, which readily satisfies Popper’s falsifiability principle, as proven, since 2000, by the seismicity that has occurred in the countries where NDSHA maps are available. This is not surprising: being based on credible scenarios for real earthquakes and solidly rooted in physics, NDSHA provides a robust, reliable approach to seismic hazard and risk assessment that readily handles Maximum Credible Earthquakes.

In a nutshell: GSHAP is based on probability applied to not fully adequate data sets and is affected by several mistakes of physical and mathematical origin; NDSHA uses scenario earthquakes, including MCEs, and models the resulting ground motion by exploiting the most advanced physical models of wave generation and propagation based on continuum mechanics. GSHAP missed all the deadliest earthquakes that have occurred since 2000!
Incidentally, let me mention that a large international project is currently in progress (the China Seismic Experimental Site, a natural laboratory of earthquake science for seismic disaster resilience), and one of this endeavor’s projects focuses on the “Application research of the neo-deterministic seismic hazard assessment (NDSHA) method in the China Seismic Experimental Site.” See here, here, here and here. We produced NDSHA hazard maps for the region surrounding the Wenchuan (Sichuan, China) earthquake of 12 May 2008, with magnitude 8.1, which claimed 87,587 victims and whose intensity was underestimated by GSHAP by three units!

How did your views on climate change evolve?
Edward Lorenz, in 1963, discovered deterministic chaos (“the butterfly effect”) in an ordinary natural process: a system of ordinary differential equations describing thermal convection in the atmosphere. The deterministic Lorenz attractor triggered interest in the real-world implications of chaotic solutions in a multitude of natural and socioeconomic processes, including the chaotic dynamics of paleoclimate described by Michael Ghil in 1994.
In its turn, the recognition of the hierarchical non-linear nature of geodynamics emerged about two decades later, when Vladimir Keilis-Borok in 1982 published a review of a worldwide test of three long-term premonitory seismicity patterns, which stimulated new approaches to the analysis of seismic hazard and earthquake predictability. In other words, I want to stress here the logical influence of the study of climate change, based on the concept of deterministic chaos (“the butterfly effect”), on the problem of earthquake predictability. Indeed, the climate system is a non-linear complex system, and the available data are not adequate to make reliable predictions. Some inadequacy also affects earthquake catalogues, and that is why earthquakes cannot be predicted with ultimate precision. I am not a specialist, but as a scientist I can say that the literature at hand about climate modeling is scientifically not convincing; thus I am not sure that we are on the verge of a man-driven climate catastrophe, or how long it would take for it to happen.

Is climate change a big issue in your country and how do you notice this?
In this historical moment, in which the “pro-climate protest” is stronger than ever, with millions of young people demonstrating in the streets around the world demanding decisive action against climate change, as if we were on the verge of a catastrophe, in Italy they would have much more to protest against.

A strong earthquake could hit many Italian cities at any time, and I know that if a strong earthquake hit a populous city, our country would receive a very hard blow, with effects on all Italian citizens for years. Perhaps this holds not only in Italy, as recent earthquake history tells. Following the thought of Galileo, in questions of science the authority of thousands [the majority] is not worth the humble reasoning of a single individual [the minority]. The majority of Italian people are betting on the wrong horse, i.e. climate instead of earthquakes.

How would climate policy ideally look like in your view?
I do not think that I can add much to the content of the CLINTEL World Climate Declaration, which is the result of deep and intense discussion among about a thousand (certainly a minority of) highly reputed scientists not biased by current mantras. All I can do is wish that the CLINTEL World Climate Declaration be carefully read and studied by ALL ministers involved with climate monitoring and prediction.
I am skeptical about the real importance of renewables, and I consider nuclear energy a necessity at present. In fact, the only true current alternative to fossil fuels is nuclear energy, so it should have a prominent place in any future development. Only nuclear could guarantee a significant reduction in CO2 emissions and give an extra boost to our economy in these difficult times.
At present the energy mix in Italy is among the most environmentally friendly in Europe, as coal and other highly polluting fossil fuels are scarcely used. By the way, NDSHA evaluation applied to nuclear power plants definitely decreases risk.

What is your motivation to sign the CLINTEL World Climate Declaration?
I learned, while studying seismicity in the framework of complex systems with the aim of intermediate-term, middle-range prediction and reliable earthquake hazard assessment, that gross data like those contained in parametric catalogues require rigorous and robust data-analysis methods, validated by facts against well-documented and well-formulated rules, like the one suggested by the United States National Research Council, Panel on Earthquake Prediction of the Committee on Seismology, in 1976:

“An earthquake prediction must specify the expected magnitude range, the geographical area within which it will occur, and the time interval within which it will happen with sufficient precision so that the ultimate success or failure of the prediction can readily be judged. Only by careful recording and analysis of failures as well as successes can the eventual success of the total effort be evaluated and future directions charted. Moreover, scientists should also assign a confidence level to each prediction.”

This scientific definition fully satisfies Popper’s falsifiability principle (Falsifizierbarkeit) and makes confutation possible. Existing climate temperature models, as far as I understand, do not pass this criterion; nor does the anthropogenic global warming theory (AGWT) that has been mostly promoted by the Intergovernmental Panel on Climate Change (IPCC) of the United Nations.
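The NRC criteria quoted above can be made operational in a few lines of code: a prediction that pins down a magnitude range, a geographic window, a time interval and a confidence level can be judged unambiguously against any catalogued event. This is only an illustrative sketch, with hypothetical names; real prediction-evaluation experiments use formal scoring over whole catalogues.

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    """A testable earthquake prediction in the sense of the 1976 NRC panel:
    magnitude range, geographic box, time window, and a confidence level."""
    mag_min: float
    mag_max: float
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float
    t_start: float   # time window, e.g. in decimal years
    t_end: float
    confidence: float  # assigned by the predictor, in [0, 1]

def judge(p: Prediction, mag: float, lat: float, lon: float, t: float) -> bool:
    """A hit only if the event falls inside ALL three windows, so that
    'the ultimate success or failure of the prediction can readily be judged'."""
    return (
        p.mag_min <= mag <= p.mag_max
        and p.lat_min <= lat <= p.lat_max
        and p.lon_min <= lon <= p.lon_max
        and p.t_start <= t <= p.t_end
    )
```

Scoring many such predictions against an earthquake catalogue yields the hit/miss statistics that make a prediction method falsifiable in Popper’s sense; a method whose claims cannot be cast in this form cannot be confuted at all.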

Climate change is a fact, as is pollution, but the two facts should not be mixed and confused: controlling pollution is within the reach of man; controlling the climate is not.

The climatic uncertainty derives mostly from the fact that the main climatic mechanisms and forcings are only partially known, and this uncertainty adds to that associated with the non-linearity of the model equations.