Image super-resolution using multi-scale non-local attention
Sowon Kim, Hanhoon Park
Abstract

Recent convolutional neural network (CNN)-based super-resolution (SR) studies have incorporated non-local attention (NLA) to exploit long-range feature correlations and have achieved considerable performance improvements. Here, we propose an innovative NLA scheme called multi-scale NLA (MS-NLA) that computes NLAs at multiple scales and fuses them. To fuse the NLAs effectively, we also propose two learning-based methods and analyze their performance on a recurrent SR network. The effect of weight sharing in the fusion methods is analyzed as well. In 2× and 4× SR experiments on benchmark datasets, our method achieved PSNR values 0.295 and 0.148 dB higher on average than those of single-scale NLA and cross-scale NLA, respectively, and produced visually more pleasing SR results. Weight sharing had a limited but positive effect, depending on the dataset. The source code is available at https://github.com/Dae12-Han/MSNLN.
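For a concrete picture of the MS-NLA idea, the following is a minimal PyTorch sketch, not the authors' implementation (the reference code is in the linked GitHub repository). It assumes an embedded-Gaussian non-local block, bilinear rescaling between scales, and a 1x1-convolution fusion layer; the module names and these design choices are illustrative assumptions.

```python
# Minimal sketch of multi-scale non-local attention (illustrative only;
# see https://github.com/Dae12-Han/MSNLN for the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NonLocalAttention(nn.Module):
    """Embedded-Gaussian non-local block operating on a single feature scale."""

    def __init__(self, channels, reduction=2):
        super().__init__()
        inter = channels // reduction
        self.theta = nn.Conv2d(channels, inter, 1)  # query embedding
        self.phi = nn.Conv2d(channels, inter, 1)    # key embedding
        self.g = nn.Conv2d(channels, inter, 1)      # value embedding
        self.out = nn.Conv2d(inter, channels, 1)    # project back to input channels

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # B x HW x C'
        k = self.phi(x).flatten(2)                     # B x C' x HW
        v = self.g(x).flatten(2).transpose(1, 2)       # B x HW x C'
        attn = torch.softmax(q @ k, dim=-1)            # pairwise similarities, B x HW x HW
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                         # residual connection


class MultiScaleNLA(nn.Module):
    """Compute NLA at several scales and fuse the results with a learned
    1x1 convolution (one possible learning-based fusion)."""

    def __init__(self, channels, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.nlas = nn.ModuleList(NonLocalAttention(channels) for _ in scales)
        self.fuse = nn.Conv2d(channels * len(scales), channels, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        outs = []
        for s, nla in zip(self.scales, self.nlas):
            # Downsample, attend, then upsample back to the original resolution.
            xs = F.interpolate(x, scale_factor=1 / s, mode="bilinear",
                               align_corners=False) if s > 1 else x
            ys = nla(xs)
            if s > 1:
                ys = F.interpolate(ys, size=(h, w), mode="bilinear",
                                   align_corners=False)
            outs.append(ys)
        return x + self.fuse(torch.cat(outs, dim=1))


# Example usage on a dummy feature map:
# y = MultiScaleNLA(64)(torch.randn(1, 64, 48, 48))
```

Weight sharing across scales, whose effect the paper analyzes, would correspond here to reusing a single NonLocalAttention instance for every scale instead of instantiating one per scale.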

© 2023 SPIE and IS&T
Sowon Kim and Hanhoon Park "Image super-resolution using multi-scale non-local attention," Journal of Electronic Imaging 32(1), 013043 (24 February 2023). https://doi.org/10.1117/1.JEI.32.1.013043
Received: 15 November 2022; Accepted: 6 February 2023; Published: 24 February 2023
KEYWORDS
Super resolution, Image quality, Image fusion, Feature fusion, Convolution, Education and training, Lawrencium
