We demonstrate a simple yet highly effective uncertainty quantification method for neural networks that solve inverse imaging problems. We build forward-backward cycles using the physical forward model and the trained network, derive the relationship between cycle consistency and the robustness, uncertainty, and bias of network inference, and obtain uncertainty estimators through regression analysis. An XGBoost classifier based on these uncertainty estimators is trained for out-of-distribution detection using artificially noise-injected images, and it successfully generalizes to unseen real-world distribution shifts. We validate our method on out-of-distribution detection in image deblurring and image super-resolution tasks, where it outperforms other deep neural network-based models.
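As an illustration of the forward-backward cycle idea, the sketch below alternates between a network inference step and a re-application of the physical forward model, collecting per-cycle residuals as candidate uncertainty estimators. This is a minimal, hypothetical sketch: the 1-D blur kernel, the identity stand-in for the trained network, and the residual norm are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

def forward_model(x, kernel):
    # Hypothetical physical forward model A: 1-D blur via convolution (stand-in).
    return np.convolve(x, kernel, mode="same")

def network(y):
    # Placeholder for the trained inverse network f; identity stand-in here.
    return y

def cycle_consistency(y, net, fwd, n_cycles=3):
    """Run forward-backward cycles y -> x_hat -> y_hat -> ... and return
    the per-cycle reconstruction residuals as uncertainty estimators."""
    residuals = []
    y_k = y
    for _ in range(n_cycles):
        x_hat = net(y_k)      # backward step: network inference
        y_next = fwd(x_hat)   # forward step: re-apply the physical model
        residuals.append(float(np.linalg.norm(y_next - y_k)))
        y_k = y_next
    return residuals

kernel = np.array([0.25, 0.5, 0.25])  # assumed smoothing kernel
y = np.sin(np.linspace(0, 2 * np.pi, 64))
print(cycle_consistency(y, network, lambda x: forward_model(x, kernel)))
```

In practice, such residuals (and related cycle statistics) would form the feature vector fed to the downstream out-of-distribution classifier.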