As vein identification technology advances, an increasing number of vein recognition products are reaching the public, and vein recognition offers substantially higher security than earlier biometric recognition technologies. Convolutional neural network models are widely used in vein recognition research to improve accuracy; however, because such models also learn the noise in the image, accuracy gains eventually plateau. To address this problem, a Canny-based neural network model is proposed for palm vein recognition. A convolutional layer extracts feature information from the input image while a Canny layer constrains it to learn the texture features of the vein image, and a self-attention layer then improves the robustness of the convolutional layer as it learns those features. The experimental results show that the Canny-based vein recognition neural network model extracts texture features from vein images more effectively and prevents the network from learning irrelevant feature information. It also achieves improved accuracy and equal error rates on three publicly available palm vein datasets: 94.33% and 5.83% on the CASIA-PV200 palm vein dataset, 94.88% and 4.86% on the TJU-PV600 palm vein dataset, and 95.64% and 4.86% on the VERA-PV220 palm vein dataset.
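To make the described pipeline concrete, the following is a minimal sketch of a Canny-constrained CNN with a self-attention layer, assuming a PyTorch implementation. The class name PalmVeinNet, the layer widths, the Canny thresholds, and the 200-class head are illustrative assumptions for a dataset such as CASIA-PV200, not the authors' code.

# Minimal sketch, assuming PyTorch + OpenCV; names and sizes are illustrative.
import cv2
import numpy as np
import torch
import torch.nn as nn

class PalmVeinNet(nn.Module):
    def __init__(self, num_classes: int = 200):  # assumed number of identities
        super().__init__()
        # Convolutional stack extracts feature maps from the Canny-filtered input.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # Self-attention over spatial positions; embed_dim matches the channel count.
        self.attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(64, num_classes)

    @staticmethod
    def canny_layer(gray: np.ndarray, low: int = 50, high: int = 150) -> torch.Tensor:
        # Fixed (non-learned) Canny edge map that steers the network toward
        # vein texture rather than image noise; thresholds are assumptions.
        edges = cv2.Canny(gray, low, high)  # uint8 HxW edge map
        return torch.from_numpy(edges).float().div(255.0)[None, None]  # 1x1xHxW

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x)                   # B x 64 x H' x W'
        tokens = f.flatten(2).transpose(1, 2)  # B x (H'*W') x 64 spatial tokens
        attended, _ = self.attn(tokens, tokens, tokens)
        pooled = attended.mean(dim=1)          # average over spatial positions
        return self.classifier(pooled)

# Usage: feed the Canny edge map of a grayscale palm ROI through the network.
gray_roi = np.random.randint(0, 256, (128, 128), dtype=np.uint8)  # stand-in image
logits = PalmVeinNet()(PalmVeinNet.canny_layer(gray_roi))
print(logits.shape)  # torch.Size([1, 200])

In this sketch the Canny step is a fixed filter rather than a trainable layer, which is one plausible reading of "controlled by the Canny layer": the edge map limits what the convolutional stack can attend to, so noise that carries no edge structure is suppressed before learning begins.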