Open Access



License Plate Recognition via Attention Mechanism

Longjuan Wang1,2, Chunjie Cao1,2, Binghui Zou1,2, Jun Ye1,2,*, Jin Zhang3

1 School of Computer Science and Cyberspace Security Hainan University, Haikou, 570228, China
2 Key Laboratory of Internet Information Retrieval of Hainan Province, Haikou, 570228, China
3 Hilbert College, Hamburg, NY, 14075, USA

* Corresponding Author: Jun Ye.

Computers, Materials & Continua 2023, 75(1), 1801-1814.


License plate recognition technology is widely used in intelligent traffic management and control, and researchers have worked to improve its speed and accuracy for nearly 30 years. This paper is the first to propose combining an attention mechanism with YOLO-v5 and LPRnet to construct a new license plate recognition model (LPR-CBAM-Net). Through the CBAM (Convolutional Block Attention Module) attention mechanism, the importance of different feature channels in license plate recognition is recalibrated so that the model attends to the most informative features, improving both recognition speed and accuracy. Experimental results show that the constructed model is superior in speed and accuracy to traditional license plate recognition algorithms. Adding the CBAM module increases recognition accuracy by two percentage points, to 97.2%, and the constructed model is only 1.8 M in size, which meets the requirements for real-time execution on embedded low-power devices. The code for training and evaluating LPR-CBAM-Net is available under the open-source MIT License at: .
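To illustrate the mechanism the abstract describes, the following is a minimal PyTorch sketch of a standard CBAM block, which sequentially applies channel attention (recalibrating the importance of each feature channel) and spatial attention. This is a generic reference implementation, not the paper's code; the hyperparameters (reduction ratio, kernel size) and the example feature-map shape are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Recalibrates per-channel importance via pooled descriptors and a shared MLP."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        # Global average- and max-pooled channel descriptors share one MLP.
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        scale = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * scale  # rescale each channel by its learned importance


class SpatialAttention(nn.Module):
    """Highlights informative spatial locations with a single conv over pooled maps."""

    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Pool across the channel axis, then learn a spatial attention map.
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Channel attention followed by spatial attention; output shape equals input shape."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))


# Example: a hypothetical 32-channel feature map from a plate-sized crop.
x = torch.randn(1, 32, 24, 94)
y = CBAM(channels=32, reduction=8)(x)
print(y.shape)  # same shape as the input
```

Because the block preserves the input tensor's shape, it can be dropped into a detection or recognition backbone between existing convolutional stages, which is how attention modules of this kind are typically combined with networks such as YOLO-v5 and LPRnet.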


Cite This Article

L. Wang, C. Cao, B. Zou, J. Ye and J. Zhang, "License plate recognition via attention mechanism," Computers, Materials & Continua, vol. 75, no.1, pp. 1801–1814, 2023.

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.