We propose a super-resolution restoration algorithm based on the maximum likelihood (ML) method and edge-oriented diffusion. By the Hammersley-Clifford theorem, an image field assumed to be a Markov random field is Gibbs distributed. An edge-oriented diffusion function is introduced and embedded in the Gibbs prior. By Bayes' theorem, maximizing the a posteriori probability is equivalent to maximizing the likelihood weighted by the prior; we therefore combine the ML estimate with a prior distribution. Experimental results show that our method delivers strong super-resolution restoration performance. Compared with the traditional ML method, our approach not only recovers super-resolution images but also suppresses noise artifacts effectively without smoothing edges.
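As a rough illustration of the MAP framework the abstract describes, the sketch below alternates a least-squares data-fidelity gradient step with an edge-stopping (Perona-Malik-style) diffusion step standing in for the edge-oriented Gibbs prior. All names and parameter choices here (the averaging downsampler, the step size `tau`, the prior weight `lam`, the edge threshold `k`) are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def downsample(x, r):
    """Average-pooling decimation: a simple stand-in for the imaging operator."""
    h, w = x.shape
    return x.reshape(h // r, r, w // r, r).mean(axis=(1, 3))

def edge_diffusion(x, k):
    """Divergence of g(|grad x|) * grad x with an edge-stopping function g.

    g is small where the gradient is large, so edges are preserved while
    flat-region noise is diffused. Periodic boundaries via np.roll, for brevity.
    """
    gx = np.roll(x, -1, axis=1) - x          # forward difference, x-direction
    gy = np.roll(x, -1, axis=0) - x          # forward difference, y-direction
    g = 1.0 / (1.0 + (gx**2 + gy**2) / k**2)  # edge-stopping coefficient
    fx, fy = g * gx, g * gy
    return (fx - np.roll(fx, 1, axis=1)) + (fy - np.roll(fy, 1, axis=0))

def sr_map(y, r, iters=200, tau=1.0, lam=0.05, k=0.1):
    """MAP estimate of a high-resolution image x from low-resolution y.

    Minimizes 0.5*||D x - y||^2 - lam * (edge-oriented prior) by explicit
    gradient descent; D is average-pooling, so its adjoint replicates each
    low-resolution residual over an r x r block scaled by 1/r^2.
    """
    x = np.kron(y, np.ones((r, r)))          # initialize by pixel replication
    for _ in range(iters):
        res = downsample(x, r) - y
        grad_data = np.kron(res, np.ones((r, r))) / r**2   # D^T (D x - y)
        x = x - tau * (grad_data - lam * edge_diffusion(x, k))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hr = np.zeros((16, 16)); hr[:, 8:] = 1.0           # sharp vertical edge
    y = downsample(hr, 2) + 0.01 * rng.standard_normal((8, 8))
    x = sr_map(y, 2)
    print(x.shape, x[:, :6].mean(), x[:, 10:].mean())
```

In the toy run above, the reconstruction keeps the dark/bright halves near 0 and 1, i.e. the data term anchors the estimate while the edge-stopping coefficient keeps the diffusion from blurring the step.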