
Parallax-Attention Mechanism (PAM)

Repository for "Parallax Attention for Unsupervised Stereo Correspondence Learning", IEEE TPAMI 2020

[arXiv]

Updates

  • Fixed a mistake in Eq. 8; please see the latest arXiv version.

Overview

1. Network Architecture

2. Left-Right Consistency & Cycle Consistency (see the sketch after this list)

3. Valid Mask
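
To make items 2 and 3 concrete, below is a minimal PyTorch sketch of how the consistency losses and the valid mask can be computed from parallax-attention maps. The (B, H, W, W) attention-map shape, the function names, and the threshold tau are illustrative assumptions, not the repository's exact implementation.

import torch


def warp_with_attention(img_right, M_right_to_left):
    # Reconstruct the left view: a row-wise batched matmul warps each
    # epipolar line of the right image with its attention weights.
    b, c, h, w = img_right.shape
    rows = img_right.permute(0, 2, 3, 1).reshape(b * h, w, c)       # (B*H, W, C)
    warped = torch.bmm(M_right_to_left.reshape(b * h, w, w), rows)  # (B*H, W, C)
    return warped.reshape(b, h, w, c).permute(0, 3, 1, 2)           # (B, C, H, W)


def photometric_loss(img_left, img_right, M_right_to_left, valid_mask):
    # Left-right consistency: the warped right image should match the
    # left image on valid (non-occluded) pixels.
    warped = warp_with_attention(img_right, M_right_to_left)
    return (valid_mask * (img_left - warped).abs()).mean()


def cycle_loss(M_right_to_left, M_left_to_right, valid_mask):
    # Cycle consistency: composing the two attention maps should be
    # close to an identity mapping on valid pixels.
    b, h, w, _ = M_right_to_left.shape
    M_cycle = torch.bmm(M_right_to_left.reshape(b * h, w, w),
                        M_left_to_right.reshape(b * h, w, w))
    eye = torch.eye(w, device=M_cycle.device).expand_as(M_cycle)
    mask = valid_mask.permute(0, 2, 3, 1).reshape(b * h, w, 1)      # (B*H, W, 1)
    return (mask * (M_cycle - eye).abs()).mean()


def valid_mask_from_attention(M, tau=0.1):
    # Valid mask: a pixel is kept when the total attention mass it
    # receives across all queries exceeds tau; low mass indicates
    # occlusion. M: (B, H, W_query, W_key) -> mask: (B, 1, H, W_key).
    return (M.sum(dim=2) > tau).float().unsqueeze(1)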

Applications

1. PAM for Unsupervised Stereo Matching (PASMnet) [code]

1.1 Overview

1.2 Results

2. PAM for Stereo Image Super-Resolution (PASSRnet) [code]

2.1 Overview

2.2 Results

3. PAM for Other Applications

Our PAM provides a compact and flexible module that performs feature fusion and information interaction between stereo images without explicit disparity estimation. It can be extended to stereo 3D object detection, stereo image restoration (e.g., super-resolution [1,2,3,4], denoising, deblurring, deraining, and dehazing [5]), stereo image style transfer, multi-view stereo, and many other tasks [6,7]. A minimal code sketch of the mechanism is given after the reference list below.

[1] Wang et al. "Learning Parallax Attention for Stereo Image Super-Resolution", CVPR 2019.

[2] Ying et al. "A Stereo Attention Module for Stereo Image Super-Resolution", SPL.

[3] Song et al. "Stereoscopic Image Super-Resolution with Stereo Consistent Feature", AAAI 2020.

[4] Xie et al. "Non-Local Nested Residual Attention Network for Stereo Image Super-Resolution", ICASSP 2020.

[5] Pang et al. "BidNet: Binocular Image Dehazing Without Explicit Disparity Estimation", CVPR 2020.

[6] Wu et al. "Spatial-Angular Attention Network for Light Field Reconstruction", arXiv.

[7] Nakano et al. "Stereo Vision Based Single-Shot 6D Object Pose Estimation for Bin-Picking by a Robot Manipulator", arXiv.
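
As a concrete illustration of the mechanism itself, here is a minimal PyTorch sketch of parallax attention: every pixel of the left feature map attends to all positions in the same row (the epipolar line of a rectified stereo pair) of the right feature map, producing a W x W attention map per row. The class name, layer names, and 1x1 query/key projections are illustrative assumptions, not the repository's exact implementation.

import torch
import torch.nn as nn


class ParallaxAttention(nn.Module):
    # Row-wise (epipolar) attention between a rectified stereo pair.

    def __init__(self, channels):
        super().__init__()
        # 1x1 convolutions producing query/key features (illustrative).
        self.query = nn.Conv2d(channels, channels, kernel_size=1)
        self.key = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feat_left, feat_right):
        b, c, h, w = feat_left.shape
        # Queries from the left view: (B*H, W, C).
        Q = self.query(feat_left).permute(0, 2, 3, 1).reshape(b * h, w, c)
        # Keys from the right view: (B*H, C, W).
        K = self.key(feat_right).permute(0, 2, 1, 3).reshape(b * h, c, w)
        # Attention along each epipolar line: (B*H, W, W), rows sum to 1.
        M_right_to_left = torch.softmax(torch.bmm(Q, K), dim=-1)
        # Warp right features to the left view with the attention map.
        V = feat_right.permute(0, 2, 3, 1).reshape(b * h, w, c)
        warped = torch.bmm(M_right_to_left, V).reshape(b, h, w, c)
        return warped.permute(0, 3, 1, 2), M_right_to_left.reshape(b, h, w, w)


# Example usage:
pam = ParallaxAttention(channels=64)
left = torch.randn(2, 64, 32, 96)   # B x C x H x W
right = torch.randn(2, 64, 32, 96)
warped, M = pam(left, right)        # warped: (2, 64, 32, 96), M: (2, 32, 96, 96)

Because the W x W map covers every possible horizontal shift within a row, no maximum-disparity hyperparameter is needed, which is what makes the module easy to drop into the applications listed above.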

Citation

@Article{Wang2020Parallax,
  author    = {Longguang Wang and Yulan Guo and Yingqian Wang and Zhengfa Liang and Zaiping Lin and Jungang Yang and Wei An},
  title     = {Parallax Attention for Unsupervised Stereo Correspondence Learning},
  journal   = {{IEEE} Trans. Pattern Anal. Mach. Intell.},
  year      = {2020},
}

Acknowledgement

We would like to thank @akkaze for the constructive advice.

Contact

For questions, please send an email to wanglongguang15@nudt.edu.cn