Vol.127, No.1, 2021, pp.175-189, doi:10.32604/cmes.2021.014635
OPEN ACCESS
ARTICLE
Stereo Matching Method Based on Space-Aware Network Model
Jilong Bian1,*, Jinfeng Li2
1 College of Information & Computer Engineering, Northeast Forestry University, Harbin, 150040, China
2 College of Computer & Information Technology, Mudanjiang Normal University, Mudanjiang, 157011, China
* Corresponding Author: Jilong Bian
(This article belongs to the Special Issue: Blockchain Security)
Received 14 October 2020; Accepted 11 January 2021; Issue published 30 March 2021
Abstract
A stereo matching method based on a space-aware network is proposed. The network is divided into three sections: a basic layer, a scale layer, and a decision layer; this division makes it straightforward to integrate residual and dense networks into the space-aware network model. A vertical splitting method for computing the matching cost with the space-aware network is proposed to overcome the limitation of GPU memory. Moreover, a hybrid loss is introduced to boost the performance of the proposed deep network. In the proposed stereo matching method, the space-aware network is used to calculate the matching cost, and then cross-based cost aggregation and semi-global matching are employed to compute a disparity map. Finally, disparity post-processing steps such as subpixel interpolation, median filtering, and bilateral filtering are applied. Experimental results show that the method achieves good performance in both running time and accuracy, with a percentage of erroneous pixels of 1.23% on KITTI 2012 and 1.94% on KITTI 2015.
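To make the pipeline summarized above concrete, the following is a minimal Python sketch of the matching-cost, aggregation, and post-processing stages. It is not the authors' implementation: the space-aware network's learned cost is replaced by a simple absolute-difference cost, cross-based aggregation and semi-global matching are collapsed into a winner-takes-all step, and post-processing is reduced to median filtering. All function names and the synthetic data are illustrative assumptions.

import numpy as np
from scipy.ndimage import median_filter

def compute_cost_volume(left, right, max_disp):
    # Placeholder matching cost: per-pixel absolute difference at each disparity.
    # In the paper this stage is computed by the space-aware network.
    h, w = left.shape
    cost = np.full((max_disp, h, w), np.inf, dtype=np.float32)
    cost[0] = np.abs(left - right)
    for d in range(1, max_disp):
        cost[d, :, d:] = np.abs(left[:, d:] - right[:, :-d])
    return cost

def winner_takes_all(cost):
    # Stand-in for cross-based cost aggregation + semi-global matching:
    # pick the disparity with minimum cost at every pixel.
    return np.argmin(cost, axis=0).astype(np.float32)

def post_process(disp):
    # Simplified post-processing (median filtering only); the paper also uses
    # subpixel interpolation and bilateral filtering.
    return median_filter(disp, size=3)

# Tiny synthetic example: the right image is the left image shifted by 4 pixels,
# so the recovered disparity should be close to 4 over most of the image.
rng = np.random.default_rng(0)
left = rng.random((64, 96)).astype(np.float32)
right = np.roll(left, -4, axis=1)
disparity = post_process(winner_takes_all(compute_cost_volume(left, right, max_disp=16)))
print("median disparity:", float(np.median(disparity)))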
Keywords
Deep learning; stereo matching; space-aware network; hybrid loss
Cite This Article
Bian, J., Li, J. (2021). Stereo Matching Method Based on Space-Aware Network Model. CMES-Computer Modeling in Engineering & Sciences, 127(1), 175–189.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.