Open Access
ARTICLE
Towards No-Reference Image Quality Assessment Based on Multi-Scale Convolutional Neural Network
Yao Ma1, Xibiao Cai1, *, Fuming Sun2
1 School of Electronics and Information Engineering, Liaoning University of Technology, Jinzhou, 121001, China.
2 School of Information and Communication Engineering, Dalian Minzu University, Dalian, 116600, China.
* Corresponding Author: Xibiao Cai.
(This article belongs to the Special Issue: Security Enhancement of Image Recognition System in IoT based Smart Cities)
Computer Modeling in Engineering & Sciences 2020, 123(1), 201-216. https://doi.org/10.32604/cmes.2020.07867
Received 05 July 2019; Accepted 25 September 2019; Issue published 01 April 2020
Abstract
Image quality assessment has become increasingly important for monitoring image quality
and assuring the reliability of image processing systems. Most existing no-reference image
quality assessment methods mainly exploit the global information of an image while ignoring
vital local information. In fact, the introduced distortion manifests as slight differences
in detail between the distorted image and the undistorted reference image. In light of this,
we propose a no-reference image quality assessment method based on a multi-scale
convolutional neural network, which integrates both the global and local information of an
image. We first adopt the image pyramid method to generate the four scaled images required
as network inputs, and then provide two network models that use different fusion strategies
to evaluate image quality. To better adapt to quality assessment of the entire image, we use
two different loss functions in the training and validation phases. The superiority of the
proposed method is verified by several experiments on the LIVE and TID2008 datasets.
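To illustrate the multi-scale input described above, the following is a minimal sketch, not the authors' implementation: it builds a four-level Gaussian image pyramid with OpenCV's cv2.pyrDown, one plausible realization of the "image pyramid method" the abstract mentions. The pyramid depth, the Gaussian downsampling scheme, and the file name are assumptions for illustration.

```python
import cv2
import numpy as np

def build_pyramid(image: np.ndarray, levels: int = 4) -> list:
    """Generate a `levels`-scale Gaussian image pyramid.

    Each level halves the spatial resolution of the previous one,
    yielding the multi-scale inputs a network like the one in the
    paper could consume.
    """
    pyramid = [image]
    for _ in range(levels - 1):
        image = cv2.pyrDown(image)  # Gaussian blur + 2x downsample
        pyramid.append(image)
    return pyramid

# Example usage ("distorted.png" is a hypothetical test image).
img = cv2.imread("distorted.png", cv2.IMREAD_COLOR)
scales = build_pyramid(img, levels=4)
for i, s in enumerate(scales):
    print(f"scale {i}: {s.shape}")
```

Each of the four scales would then be fed to its own branch of the multi-scale network, with the branch outputs combined by one of the two fusion strategies the abstract refers to.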
Cite This Article
Ma, Y., Cai, X., Sun, F. (2020). Towards No-Reference Image Quality Assessment Based on Multi-Scale Convolutional Neural Network.
CMES-Computer Modeling in Engineering & Sciences, 123(1), 201–216.