Taking the Madoi MS 7.4 earthquake of 21 May 2021 as an example, this paper proposes using time series prediction models to predict outgoing long-wave radiation (OLR) anomalies and to study short-term pre-earthquake signals. Five time series prediction models, including autoregressive integrated moving average (ARIMA) and long short-term memory (LSTM), were trained with OLR time series data from aseismic periods in the 5° × 5° spatial range around the epicenter. The model with the highest prediction accuracy was selected to retrospectively predict OLR values in the area during the aseismic period and before the earthquake. By comparing the predicted time series with the actual time series, it was found that the similarity indexes of the two series before the earthquake were lower than those of the aseismic period, indicating that the predicted time series before the earthquake differed significantly from the actual one. Meanwhile, the temporal and spatial distribution characteristics of the anomalies in the 90 days before the earthquake were analyzed, with a 95% confidence interval as the anomaly criterion, and the following was found: out of 25 grids, 18 showed anomalies; the anomalies of the different grids appeared on similar dates, and the high-value anomalies clustered around the time of the earthquake, supporting the hypothesis that these anomalies are pre-earthquake signals associated with the event.
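The anomaly-detection idea described in the abstract — train a time series model on aseismic data, predict retrospectively, and flag observations that leave a 95% confidence interval — can be sketched as follows. This is a minimal illustration with a simple AR(1) model and synthetic data, not the authors' code: the function names, the AR(1) choice (standing in for the ARIMA/LSTM models of the paper), and the injected spike are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): fit an AR(1) model to
# an aseismic OLR-like series, predict one step ahead, and flag values whose
# residual falls outside a 95% confidence interval (|residual| > 1.96 * sigma).
import numpy as np

def fit_ar1(series):
    """Least-squares fit of x[t] = a * x[t-1] + b on a 1-D series."""
    x_prev, x_next = series[:-1], series[1:]
    a, b = np.polyfit(x_prev, x_next, 1)
    sigma = np.std(x_next - (a * x_prev + b))  # residual std on training data
    return a, b, sigma

def flag_anomalies(series, a, b, sigma, z=1.96):
    """One-step-ahead predictions; True where the residual leaves the 95% CI."""
    pred = a * series[:-1] + b
    return np.abs(series[1:] - pred) > z * sigma

# Synthetic stand-in for one grid cell's OLR series (W/m^2): quiet background
# plus an injected spike near the end, mimicking a high-value anomaly.
rng = np.random.default_rng(0)
quiet = 230 + 0.8 * np.sin(np.arange(400) / 20) + rng.normal(0, 1.0, 400)
a, b, sigma = fit_ar1(quiet[:300])   # train on the "aseismic" part only
test = quiet[300:].copy()
test[80] += 8.0                      # injected anomaly in the held-out window
flags = flag_anomalies(test, a, b, sigma)
print("anomalous days:", np.flatnonzero(flags))
```

In the paper this comparison is run per grid cell over the 25 cells of the 5° × 5° study area; repeating the sketch above for each cell and recording the flagged dates would reproduce the kind of spatiotemporal anomaly map the abstract describes.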
Zhang, J.; Sun, K.; Zhu, J.; Mao, N.; Ouzounov, D. Application of Model-Based Time Series Prediction of Infrared Long-Wave Radiation Data for Exploring the Precursory Patterns Associated with the 2021 Madoi Earthquake. Remote Sens. 2023, 15, 4748. https://doi.org/10.3390/rs15194748
This work is licensed under a Creative Commons Attribution 4.0 License.