Nowcasting and early warning of severe convective weather play an extremely important role in meteorological disaster prevention. Because of the very high requirements on forecast accuracy and on temporal and spatial resolution, nowcasting and warning of severe convection is one of the difficult operational tasks and hot research topics in meteorological services. A deep learning approach is explored for the problem of severe convective nowcasting at high temporal and spatial resolution. First, severe convective nowcasting is abstracted as a sequence prediction problem involving both time and space. Then, an encoder-decoder model built on an improved recurrent neural network algorithm is trained with long-sequence, high spatio-temporal resolution weather radar network mosaic data over the Beijing-Tianjin-Hebei region. Finally, the end-to-end neural network trained on the preceding 0.5 h of radar echo mosaic data is used to forecast the echo evolution at 6 min intervals over the following 1 h. A comparison between nowcasting based on a traditional extrapolation algorithm and nowcasting based on the deep learning algorithm shows that the deep learning method can effectively "learn" the intrinsic correlations among the features of high-resolution radar echo sequences, construct abstract deep features through multi-layer neural networks, and effectively capture the evolution and motion of radar echoes. Verification of the radar echo forecasts with the probability of detection (POD), false alarm ratio (FAR) and critical success index (CSI) shows that, compared with the traditional extrapolation nowcasting method, the accuracy of severe convective echo nowcasting is clearly improved.
Abstract:
Nowcasting and early warnings of severe convective weather play an extremely important role in the prevention of meteorological disasters. Because of the requirements for high accuracy and fine temporal-spatial resolution, it is one of the most difficult operational tasks and hottest research topics in meteorological services. A deep learning method is applied to the problem of severe convective nowcasting with high spatial and temporal resolution. First, severe convective nowcasting is abstracted as a sequence prediction problem containing both time and space. Then, long-sequence, high temporal-spatial resolution weather radar network mosaic data over the Beijing-Tianjin-Hebei region are used to train an encoder-decoder model based on an improved recurrent neural network algorithm. The end-to-end neural network, trained on the preceding 0.5 h of radar echo data, is then used to predict the evolution of radar echoes at 6 min intervals in the next one hour. Comparing the traditional extrapolation method with the deep learning algorithm, it is found that the deep learning method can effectively learn the intrinsic correlation of the data features in high temporal-spatial resolution sequences, construct abstract deep features through the multi-layer neural network, and effectively capture the evolution and motion state of radar echoes. Verification in terms of the probability of detection (POD), false alarm ratio (FAR) and critical success index (CSI) shows that, compared with the traditional extrapolation forecast method, the deep learning method improves the nowcasting accuracy of severe convective echoes markedly and demonstrates broad prospects for application.
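The abstract describes an encoder-decoder network built on an improved recurrent neural network, trained end to end on radar echo mosaics to map the preceding 0.5 h (five 6-min frames) to the following 1 h (ten 6-min frames). A common realization of this idea is a ConvLSTM encoder-forecaster; the PyTorch sketch below illustrates only that general structure, under that assumption. The class names, single-layer design and layer widths are illustrative and are not taken from the paper.

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """ConvLSTM cell: the matrix products of a standard LSTM are replaced
    by convolutions so that the spatial structure of radar fields is kept."""
    def __init__(self, in_ch, hid_ch, kernel=3):
        super().__init__()
        self.hid_ch = hid_ch
        # a single convolution produces all four gates at once
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, kernel, padding=kernel // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

class EncoderForecaster(nn.Module):
    """Encoder consumes the observed frames; the forecaster reuses the final
    hidden state and rolls forward to emit the future frames."""
    def __init__(self, channels=1, hidden=64, out_steps=10):
        super().__init__()
        self.hidden, self.out_steps = hidden, out_steps
        self.encoder = ConvLSTMCell(channels, hidden)
        self.forecaster = ConvLSTMCell(channels, hidden)
        self.readout = nn.Conv2d(hidden, channels, kernel_size=1)

    def forward(self, frames):                     # frames: (B, T_in, C, H, W)
        b, t, _, hgt, wid = frames.shape
        state = (frames.new_zeros(b, self.hidden, hgt, wid),
                 frames.new_zeros(b, self.hidden, hgt, wid))
        for i in range(t):                         # encode the past 0.5 h
            state = self.encoder(frames[:, i], state)
        x, outputs = frames[:, -1], []
        for _ in range(self.out_steps):            # unroll the next 1 h
            state = self.forecaster(x, state)
            x = torch.sigmoid(self.readout(state[0]))
            outputs.append(x)
        return torch.stack(outputs, dim=1)         # (B, T_out, C, H, W)

# usage: 5 past 6-min frames in, 10 future 6-min frames out
past = torch.rand(2, 5, 1, 64, 64)                 # normalized reflectivity mosaics
future = EncoderForecaster()(past)                 # shape (2, 10, 1, 64, 64)
```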
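The verification scores named in the abstract (POD, FAR, CSI) are standard contingency-table measures computed after thresholding forecast and observed reflectivity. A minimal sketch follows; the 35 dBZ threshold and the function name are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

def radar_scores(forecast, observation, threshold=35.0):
    """POD / FAR / CSI from a 2x2 contingency table on a reflectivity
    threshold (dBZ); 35 dBZ here is only an illustrative choice."""
    f = forecast >= threshold
    o = observation >= threshold
    hits = np.sum(f & o)            # forecast yes, observed yes
    misses = np.sum(~f & o)         # forecast no,  observed yes
    false_alarms = np.sum(f & ~o)   # forecast yes, observed no
    pod = hits / (hits + misses)                   # probability of detection
    far = false_alarms / (hits + false_alarms)     # false alarm ratio
    csi = hits / (hits + misses + false_alarms)    # critical success index
    return pod, far, csi

# usage with dummy reflectivity fields in dBZ
pod, far, csi = radar_scores(np.random.rand(256, 256) * 70,
                             np.random.rand(256, 256) * 70)
```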