Open Access
ARTICLE
Image Deblurring of Video Surveillance System in Rainy Environment
Jinxing Niu1, *, Yajie Jiang1, Yayun Fu1, Tao Zhang1, Nicola Masini2
1 North China University of Water Resources and Electric Power, Zhengzhou, 450045, China.
2 Institute of Science of Cultural Heritage, National Research Council, C. da Santa Loja, Tito Scalo (PZ), 85050, Italy.
* Corresponding Author: Jinxing Niu. Email: .
Computers, Materials & Continua 2020, 65(1), 807-816. https://doi.org/10.32604/cmc.2020.011044
Received 16 April 2020; Accepted 30 May 2020; Issue published 23 July 2020
Abstract
Video surveillance systems are used in many fields, such as transportation and
social life. Bad weather can degrade the quality of surveillance images. In a rainy
environment, raindrops mix with the background and degrade the image, so removing
raindrops is of great significance for image restoration. In this article, after
analyzing the inter-frame difference method for detecting and removing raindrops, a
background difference method based on a Gaussian model is proposed. In this method, a
raindrop is regarded as an object moving relative to the background. The principle and
procedure of the method for detecting and removing raindrops are given. The parameters
of the single Gaussian background model are studied, and the learning rate of the
Gaussian model, an important parameter, is explored in order to better detect and
remove raindrops. Experiments show that the proposed algorithm removes raindrops
better than the inter-frame difference method, and that the image processing effect is
best when the learning rate is 0.6. These results can provide a technical reference
for similar research on eliminating the influence of rainy weather.
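The abstract's idea of treating raindrops as objects moving relative to a single-Gaussian background can be sketched as a per-pixel model update. The function below is a minimal illustration, not the authors' implementation: the update rule, the threshold `k`, and the function name are assumptions, with the learning rate `alpha` defaulting to the paper's reported best value of 0.6.

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.6, k=2.5):
    """One step of a hypothetical single-Gaussian background model.

    frame, mean, var: float arrays of the same shape (grayscale intensities).
    alpha: learning rate (the paper reports the best effect at 0.6).
    k: assumed threshold, in standard deviations, for foreground pixels.
    Returns (foreground_mask, new_mean, new_var).
    """
    std = np.sqrt(var)
    # A pixel far from the background mean is treated as foreground (raindrop).
    foreground = np.abs(frame - mean) > k * std
    # Blend the current frame into the model only at background pixels,
    # so detected raindrops do not contaminate the background estimate.
    new_mean = np.where(foreground, mean, (1 - alpha) * mean + alpha * frame)
    new_var = np.where(foreground, var,
                       (1 - alpha) * var + alpha * (frame - new_mean) ** 2)
    return foreground, new_mean, new_var
```

Pixels flagged as foreground would then be replaced (e.g., by the background mean) to remove the raindrop, while background pixels keep adapting to slow scene changes.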
Cite This Article
J. Niu, Y. Jiang, Y. Fu, T. Zhang and N. Masini, "Image deblurring of video surveillance system in rainy environment,"
Computers, Materials & Continua, vol. 65, no. 1, pp. 807–816, 2020.