Performances of proposed normalization algorithm for iris recognition

(1) * Yessi Jusman (Department of Electrical Engineering, Faculty of Engineering, Universitas Muhammadiyah Yogyakarta, Indonesia)
(2) Siew Cheok Ng (Department of Biomedical Engineering, Faculty of Engineering, University of Malaya, Malaysia)
(3) Khairunnisa Hasikin (Department of Biomedical Engineering, Faculty of Engineering, University of Malaya, Malaysia)
*corresponding author

Abstract


Iris recognition achieves very high accuracy in comparison with many other biometric features, because iris patterns are unique and differ even between the right and left eyes of the same person. This paper proposes an algorithm to recognize people from iris images. The algorithm consists of three stages. In the first stage, circular Hough transforms segment the given eye image to find the region of interest (ROI). In the second stage, a proposed normalization algorithm based on a modified Daugman's rubber sheet model generates and enhances polar images, then divides each enhanced polar image into 16 divisions of the iris region, so the normalized image consists of 16 sub-images of small, constant dimensions; the Gray-Level Co-occurrence Matrices (GLCM) technique then extracts the normalized image's texture features, namely the contrast, correlation, energy, and homogeneity of the iris. In the last stage, a classification technique, discriminant analysis (DA), is employed to evaluate the proposed normalization algorithm. We compared the proposed normalization algorithm with nine other normalization algorithms; with the proposed algorithm, the DA technique produces an excellent classification performance of 100% accuracy. We also compared our results with previous work and found that the proposed iris recognition algorithm is an effective system for detecting and recognizing persons digitally, so it can be used for security in buildings, airports, and many other automated applications.
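The pipeline described above (rubber-sheet normalization of the segmented iris, then GLCM texture features) can be sketched generically as follows. This is a minimal illustration, not the paper's exact modified algorithm: the function names, the nearest-neighbour sampling, the 8x32 polar grid, and the single GLCM offset are all assumptions for the sketch.

```python
import math

def rubber_sheet(img, cx, cy, r_pupil, r_iris, n_radial=8, n_angular=32):
    """Unwrap the annular iris region into a rectangular polar image
    (Daugman-style rubber sheet; nearest-neighbour sampling)."""
    polar = [[0] * n_angular for _ in range(n_radial)]
    for j in range(n_angular):
        theta = 2.0 * math.pi * j / n_angular
        for i in range(n_radial):
            # Interpolate the radius linearly between pupil and iris boundaries.
            r = r_pupil + (r_iris - r_pupil) * i / (n_radial - 1)
            y = int(round(cy + r * math.sin(theta)))
            x = int(round(cx + r * math.cos(theta)))
            polar[i][j] = img[y][x]
    return polar

def glcm_features(img, levels, offset=(0, 1)):
    """Normalized GLCM for one pixel offset, plus the four texture
    features named in the abstract: contrast, correlation, energy,
    homogeneity."""
    dr, dc = offset
    rows, cols = len(img), len(img[0])
    P = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[img[r][c]][img[r2][c2]] += 1
                total += 1
    for i in range(levels):
        for j in range(levels):
            P[i][j] /= total  # normalize counts to joint probabilities

    idx = range(levels)
    contrast = sum((i - j) ** 2 * P[i][j] for i in idx for j in idx)
    # Energy here is the angular second moment; some libraries report its sqrt.
    energy = sum(P[i][j] ** 2 for i in idx for j in idx)
    homogeneity = sum(P[i][j] / (1.0 + (i - j) ** 2) for i in idx for j in idx)
    mu_i = sum(i * P[i][j] for i in idx for j in idx)
    mu_j = sum(j * P[i][j] for i in idx for j in idx)
    var_i = sum((i - mu_i) ** 2 * P[i][j] for i in idx for j in idx)
    var_j = sum((j - mu_j) ** 2 * P[i][j] for i in idx for j in idx)
    cov = sum((i - mu_i) * (j - mu_j) * P[i][j] for i in idx for j in idx)
    correlation = cov / math.sqrt(var_i * var_j)
    return {"contrast": contrast, "correlation": correlation,
            "energy": energy, "homogeneity": homogeneity}
```

In the full method each of the 16 divisions of the normalized image would be passed through `glcm_features`, and the resulting feature vectors fed to the DA classifier.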

Keywords


Iris recognition; Feature extraction; ROI; GLCM; Discriminant Analysis

   

DOI

https://doi.org/10.26555/ijain.v6i2.397
      





This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

___________________________________________________________
International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by Informatics Department - Universitas Ahmad Dahlan,  UTM Big Data Centre - Universiti Teknologi Malaysia, and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: ijain@uad.ac.id (paper handling issues)
    info@ijain.org, andri.pranolo.id@ieee.org (publication issues)

