
HRAN

This repository is an official PyTorch implementation of the paper "HRAN: Hybrid Residual Attention Network for Single Image Super-Resolution" (IEEE Access 2019).

The paper can be downloaded from here.

All test datasets (Preprocessed HR images) can be downloaded from here.

All original test datasets (HR images) can be downloaded from here.

The trained models are available on Google Drive.


Dependencies

  • Python 3.6
  • PyTorch >= 1.0.0
  • numpy
  • skimage
  • imageio
  • matplotlib
  • tqdm

Contents

  1. Introduction
  2. Train
  3. Results
  4. Citation
  5. Acknowledgements

Introduction

The extraction and proper utilization of convolutional neural network (CNN) features have a significant impact on the performance of image super-resolution (SR). Although CNN features contain both spatial and channel information, current deep learning techniques for SR often fail to maximize performance because they exploit either the spatial information or the channel information, but not both. Moreover, they integrate such information within a deep or wide network rather than exploiting all the available features, eventually resulting in high computational complexity. To address these issues, we present a binarized feature fusion (BFF) structure that utilizes the features extracted from global residuals (GR) in an effective way. Each GR consists of multiple hybrid residual attention blocks (HRAB) that effectively integrate a multiscale feature extraction module and a channel attention mechanism in a single block. Furthermore, to save computational power, instead of using large filter sizes, we use convolutions with different dilation factors to extract multiscale features. We also adopt global skip connections (GSC), short skip connections (SSC), long skip connections (LSC), and the GR structure to ease the flow of information without losing important feature details. In the paper, we call this overall network architecture the hybrid residual attention network (HRAN). In our experiments, we observe the efficacy of our method against state-of-the-art methods in both quantitative and qualitative comparisons.

Figure: Hybrid residual attention block (HRAB) architecture.

Figure: The architecture of our proposed hybrid residual attention network (HRAN).
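
For intuition, below is a minimal PyTorch sketch of what an HRAB-like block might look like: two 3×3 convolutions with different dilation factors extract multiscale features, which are fused and reweighted by channel attention before a short skip connection. The channel count, dilation factors, and attention reduction ratio here are illustrative assumptions, not the paper's exact configuration; please refer to the released code for the real block.

```python
# Minimal HRAB-like sketch. Channel counts, dilation factors, and the
# attention reduction ratio are illustrative assumptions only.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # squeeze: global spatial average
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                  # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.body(x)                            # rescale each channel

class HRABSketch(nn.Module):
    """Multiscale dilated convolutions + channel attention + short skip."""
    def __init__(self, channels=64):
        super().__init__()
        # Same 3x3 kernel, different dilation factors -> different
        # receptive fields without resorting to large filters.
        self.branch1 = nn.Conv2d(channels, channels, 3, padding=1, dilation=1)
        self.branch2 = nn.Conv2d(channels, channels, 3, padding=2, dilation=2)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)   # fuse multiscale features
        self.act = nn.ReLU(inplace=True)
        self.ca = ChannelAttention(channels)

    def forward(self, x):
        multiscale = torch.cat([self.act(self.branch1(x)),
                                self.act(self.branch2(x))], dim=1)
        out = self.ca(self.fuse(multiscale))
        return x + out                                     # short skip connection (SSC)

if __name__ == "__main__":
    y = HRABSketch(64)(torch.randn(1, 64, 48, 48))
    print(y.shape)  # torch.Size([1, 64, 48, 48])
```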

Train

Prepare training data

  1. Download DIV2K training data (800 training + 100 validation images) from the DIV2K dataset or SNU_CVLab.

  2. Specify '--dir_data' based on the HR and LR image paths. In option.py, '--ext' is set to 'sep_reset', which first converts the .png files to .npy. Once all the training images (.png) have been converted to .npy files, set '--ext sep' to skip the conversion. (A rough illustration of this preprocessing is sketched below.)

For more information, please refer to EDSR (PyTorch).
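
As a rough illustration of what the 'sep_reset' preprocessing amounts to, the sketch below decodes each training .png once and caches it as a .npy array, so later epochs can load images without re-decoding. The function name and example path are hypothetical; the actual conversion is handled inside the EDSR-style data loader, not by a standalone script.

```python
# Illustrative sketch only: cache decoded .png images as .npy files,
# mirroring the idea behind '--ext sep_reset'. Names/paths are hypothetical.
import glob
import os

import imageio
import numpy as np

def cache_as_npy(png_dir):
    for png_path in sorted(glob.glob(os.path.join(png_dir, "*.png"))):
        npy_path = os.path.splitext(png_path)[0] + ".npy"
        if not os.path.exists(npy_path):      # skip files already converted
            np.save(npy_path, imageio.imread(png_path))

# e.g. cache_as_npy("DIV2K/DIV2K_train_HR")  # hypothetical dataset path
```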

Results

Quantitative Results

Figure: Quantitative results with the BI degradation model. Best, 2nd best, and 3rd best results are shown in magenta, blue, and green, respectively.

Memory Comparisons

Figure: Comparison of memory and performance. Results are evaluated on Urban100 (×4).

Visual Results

Figure: Visual results with bicubic (BI) degradation (×4) on the Urban100 and Manga109 datasets.

For more results, please refer to our paper.

Citation

If you find this code helpful in your research, please cite the following paper.

@article{muqeet2019hran,
  title={HRAN: Hybrid Residual Attention Network for Single Image Super-Resolution},
  author={Muqeet, Abdul and Iqbal, Md Tauhid Bin and Bae, Sung-Ho},
  journal={IEEE Access},
  volume={7},
  pages={137020--137029},
  year={2019},
  publisher={IEEE}
}

Acknowledgements

This code is built on EDSR (PyTorch). We are grateful to the authors for sharing their code (https://github.com/thstkdgus35/EDSR-PyTorch).