TY - JOUR
T1 - Memory- and time-efficient dense network for single-image super-resolution
AU - Imanpour, Nasrin
AU - Naghsh-Nilchi, Ahmad Reza
AU - Monadjemi, Amirhassan
AU - Karshenas, Hossein
AU - Nasrollahi, Kamal
AU - Moeslund, Thomas B.
PY - 2021/4
Y1 - 2021/4
N2 - Dense connections in convolutional neural networks, which connect each layer to every other layer, can avoid mid/high-frequency information loss and further enhance high-frequency signals. Single-image super-resolution (SISR) can benefit from them in restoring rich details. However, a larger number of propagating feature maps, known as the growth rate, uses a large amount of memory, especially at greater depths. To address this problem, an efficient two-step concatenated feature map learning is proposed in this paper. The idea is to enrich the concatenated feature maps using a convolutional layer with more filters before the concatenation layers, instead of increasing the growth rate. Afterwards, representative concatenated feature maps are extracted using a smaller growth rate. This significantly reduces memory usage without loss of information. The proposed dense block improves the results by 0.24 dB in comparison to SISR with the basic dense block. Moreover, it results in 24% less memory usage and 6% less test time. Furthermore, the proposed method decreases the growth rate by at least a factor of 2 while producing competitive results, and reduces memory and time consumption by up to 40% and 12%, respectively. These results suggest that the proposed approach is a more practical method for SISR.
AB - Dense connections in convolutional neural networks, which connect each layer to every other layer, can avoid mid/high-frequency information loss and further enhance high-frequency signals. Single-image super-resolution (SISR) can benefit from them in restoring rich details. However, a larger number of propagating feature maps, known as the growth rate, uses a large amount of memory, especially at greater depths. To address this problem, an efficient two-step concatenated feature map learning is proposed in this paper. The idea is to enrich the concatenated feature maps using a convolutional layer with more filters before the concatenation layers, instead of increasing the growth rate. Afterwards, representative concatenated feature maps are extracted using a smaller growth rate. This significantly reduces memory usage without loss of information. The proposed dense block improves the results by 0.24 dB in comparison to SISR with the basic dense block. Moreover, it results in 24% less memory usage and 6% less test time. Furthermore, the proposed method decreases the growth rate by at least a factor of 2 while producing competitive results, and reduces memory and time consumption by up to 40% and 12%, respectively. These results suggest that the proposed approach is a more practical method for SISR.
KW - Super-Resolution
UR - https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/sil2.12020
U2 - 10.1049/sil2.12020
DO - 10.1049/sil2.12020
M3 - Journal article
SN - 1751-9675
VL - 15
SP - 141
EP - 152
JO - IET Signal Processing
JF - IET Signal Processing
IS - 2
ER -