
Open Access


CEMA-LSTM: Enhancing Contextual Feature Correlation for Radar Extrapolation Using Fine-Grained Echo Datasets

Zhiyun Yang1,#, Qi Liu1,#,*, Hao Wu1, Xiaodong Liu2, Yonghong Zhang3
1 School of Computer and Software, Engineering Research Center of Digital Forensics, Ministry of Education, Nanjing University of Information Science and Technology, Nanjing, 210044, China
2 School of Computing, Edinburgh Napier University, Edinburgh, EH10 5DT, UK
3 School of Automation, Nanjing University of Information Science and Technology, Nanjing, 210044, China
* Corresponding Author: Qi Liu. Email: qi.liu@nuist.edu.cn
# These authors contributed equally to this work and share first authorship

Computer Modeling in Engineering & Sciences https://doi.org/10.32604/cmes.2022.022045

Received 18 February 2022; Accepted 24 May 2022; Published online 28 June 2022


Accurate precipitation nowcasting provides great convenience to the public, allowing arrangements to be made in advance to mitigate the impact of upcoming heavy rain. Recent research has focused on various deep learning models for radar echo extrapolation, in which radar echo maps are used to predict subsequent frames so as to recognize potential severe convective weather events. However, these approaches suffer from inaccurate prediction of echo dynamics and unreliable depiction of echo aggregation or dissipation, owing to the limited size of convolution filters, the lack of global features, and insufficient attention to features from previous states. To address these problems, this paper proposes the CEMA-LSTM recurrent unit, which embeds a Contextual Feature Correlation Enhancement Block (CEB) and a Multi-Attention Mechanism Block (MAB). The CEB enhances contextual feature correlation and helps the model memorize significant features for near-future prediction; the MAB uses position and channel attention mechanisms to capture global features of radar echoes. Two practical radar echo datasets were used: the FREM and CIKM 2017 datasets. Both quantitative and visual comparisons demonstrate that the proposed CEMA-LSTM outperforms recent models such as PhyDNet, MIM and PredRNN++. In particular, compared with the second-ranked model, its average POD, FAR and CSI improve by 3.87%, 1.65% and 1.79%, respectively, on the FREM dataset, and by 1.42%, 5.60% and 3.16%, respectively, on CIKM 2017.
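As an illustration only, the position and channel attention described for the MAB can be sketched in the style of dual self-attention over a feature map: position attention builds an affinity matrix between all spatial locations, while channel attention builds one between feature channels. This is a minimal NumPy sketch under assumed simplifications; the function names are hypothetical, and the paper's actual block likely uses learned convolutional projections for the query/key/value rather than the raw features used here:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_attention(F):
    """Re-weight each spatial position by its affinity to all others.
    F: feature map of shape (C, H, W)."""
    C, H, W = F.shape
    X = F.reshape(C, H * W)               # flatten spatial dims: (C, N)
    A = softmax(X.T @ X, axis=-1)         # (N, N) affinity between positions
    out = (X @ A.T).reshape(C, H, W)      # aggregate global spatial context
    return F + out                        # residual connection

def channel_attention(F):
    """Model inter-channel dependencies via a (C, C) affinity matrix."""
    C, H, W = F.shape
    X = F.reshape(C, H * W)
    A = softmax(X @ X.T, axis=-1)         # (C, C) affinity between channels
    out = (A @ X).reshape(C, H, W)        # re-weight channels by global context
    return F + out
```

Both operations preserve the feature-map shape, so their outputs can be fused (e.g., summed) and passed on inside a recurrent unit.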


Keywords: Radar echo extrapolation; attention mechanism; long short-term memory; deep learning