# Introduction

We discuss a feature selection method that operates after the physical installation of the imaging system: through a learning stage in which typical images produced by the imaging setting are processed, it selects the most discriminating features according to the specificities of the environment. The non-cooperative image capturing setting, whether under natural light or varying lighting conditions, leads to images whose typical characteristics are determined by the optic device used and by the environment itself. For instance, some imaging conditions are expected to favour the appearance of reflections (specular or lighting) in specific iris regions, while others favour iris occlusion by eyelids and eyelashes. Current iris matching proposals (feature extraction and comparison) are independent of the imaging environment and do not take this information into account in the recognition task.

# II. Feature selection method

The problem of feature selection is to take a set of candidate features and select the subset that performs best under some classification system [11]. This procedure can reduce the cost associated with classification, by reducing the number of features that must be collected, and in some cases it also provides better results due to finite sample size effects: as the number of features is reduced while the number of points is maintained, the feature space becomes more densely populated. Formally, let T and S be, respectively, the candidate and selected feature sets, with S ⊆ T. Also, let |.| denote the cardinality of a set, such that |T| = t and |S| = s. The feature selection criterion function for a set X is represented by J(X). Considering that higher values of J indicate better feature sets, the problem of feature selection is to find the subset S of T with |S| = s such that

$$J(S) = \max_{X \subseteq T,\; |X| = s} J(X) \qquad \text{(equ. 1)}$$
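As a minimal illustration of equ. 1 (a sketch, not the authors' code; the function name and merit values are hypothetical): when the criterion J(X) decomposes into a sum of per-feature merit values, as in the method described below, the maximizing subset of cardinality s is simply the s individually best features, so no exhaustive subset search is needed.

```python
def select_features(merit, s):
    """Return the indices of the s features with highest merit.

    merit : list of per-feature merit values m(1..t)
    s     : cardinality of the selected set S
    """
    # Rank candidate feature indices by merit, highest first.
    ranked = sorted(range(len(merit)), key=lambda i: merit[i], reverse=True)
    # Keep the top s; sort ascending only for readability.
    return sorted(ranked[:s])

# Hypothetical merit values for t = 6 candidate features:
merit = [0.10, 0.85, 0.40, 0.92, 0.05, 0.60]
print(select_features(merit, 3))  # -> [1, 3, 5]
```

Because the criterion is separable, this greedy top-s choice attains the maximum in equ. 1 exactly; only a non-separable J (e.g., one penalizing correlated features) would require searching over subsets.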
The motivation behind this proposal is to give greater weight to the features which, respectively, maximize and minimize the signature dissimilarity in the inter- and intra-class comparisons. As can be seen in (equ. 2), the dissimilarity between two feature values contributes to an increase of the respective merit if they were extracted from different irises and, inversely, contributes to its decrease if the features were extracted from images of the same iris. In the following discussion we use F^p_i to denote the i-th feature set extracted from iris p and f^p_{i,j} to denote the j-th feature of the i-th feature set extracted from iris p. Thus, F^p_i = {f^p_{i,1}, ..., f^p_{i,t}}. Let A = {F^{p_1}_1, ..., F^{p_k}_n} be the set of training feature sets extracted from n images of k subjects. The merit value m(.) of each candidate feature i, with m : {1, ..., t} → R, is given by:

$$m(i) = \sum_{j=1}^{n-1} \sum_{r=j+1}^{n} \frac{\left(1 - 2\,\delta_{p_j,p_r}\right)\, d\!\left(f^{\,p_j}_{j,i},\, f^{\,p_r}_{r,i}\right)}{\delta_{p_j,p_r}\, t_I + \left(1 - \delta_{p_j,p_r}\right)\, t_E} \qquad \text{(equ. 2)}$$

where d(.) is the function that gives the feature dissimilarity (e.g., Hamming or Euclidean distance), δ_{p_j,p_r} is the Kronecker delta (1 when the two compared feature sets come from the same iris, 0 otherwise) and t_I and t_E are, respectively, the number of intra- and inter-class comparisons between elements of A. This definition implies that the highest values occur for features whose dissimilarity is smaller in the intra- and higher in the inter-class comparisons, yielding a value directly correspondent to the feature's discriminant capacity within the respective imaging environment. According to (equ. 1), the function J(.) that performs the feature selection gives us the feature set S, which contains the s features with the highest values of m(.). However, if the features are selected as described above, it is not possible to achieve invariance to iris rotation through signature shifting, which is a very common technique in feature comparison.
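The merit computation of equ. 2 can be sketched as follows. This is a simplified illustration, not the authors' implementation: `feature_sets`, `labels` and the absolute-difference dissimilarity for scalar features are all assumptions made for the example.

```python
def merit_values(feature_sets, labels):
    """Per-feature merit (equ. 2) over a training set.

    feature_sets : list of n feature vectors, one per training image
    labels       : subject identity for each image
    """
    n = len(feature_sets)
    t = len(feature_sets[0])  # number of candidate features
    pairs = [(j, r) for j in range(n - 1) for r in range(j + 1, n)]
    t_I = sum(1 for j, r in pairs if labels[j] == labels[r])  # intra-class pairs
    t_E = len(pairs) - t_I                                    # inter-class pairs
    merit = [0.0] * t
    for j, r in pairs:
        same = labels[j] == labels[r]  # Kronecker delta of the two identities
        for i in range(t):
            d = abs(feature_sets[j][i] - feature_sets[r][i])  # dissimilarity
            # Intra-class dissimilarity lowers merit; inter-class raises it.
            merit[i] += -d / t_I if same else d / t_E
    return merit

# Toy demo (hypothetical data): feature 0 separates subjects A and B,
# feature 1 is constant, so feature 0 should earn the higher merit.
m = merit_values([[0.0, 0.5], [0.1, 0.5], [1.0, 0.5], [0.9, 0.5]],
                 ["A", "A", "B", "B"])
print(m[0] > m[1])  # -> True
```

The sign pattern mirrors the (1 − 2δ) factor of equ. 2: the delta selects the −1/t_I branch for same-iris pairs and the +1/t_E branch otherwise.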
We compensate for this by performing the normalization into the dimensionless polar coordinate system starting from 5 different deviation angles of the segmented iris image (-10°, -5°, 0°, +5°, +10°), obtaining 5 normalized iris images. The subsequent processing is made separately for each of these images, and the dissimilarity between iris signatures is given by the lowest dissimilarity between the enrolled signature and those extracted from each of these images. The algorithm below contains the pseudo-code of the feature selection method described above. Its O(n³) computational complexity is not a concern, as it is executed before the functioning stage of the recognition system and, due to this fact, without critical time constraints. In this algorithm, f(i, j) represents the i-th feature extracted from image j and id(f) the identity of the subject from which the feature f was extracted.

# III. Algorithm for Feature Selection

```
t_I ← number of intra-class comparisons between elements of A
t_E ← number of inter-class comparisons between elements of A
for i = 1 to t do
    merit(i) ← 0
end for
for j = 1 to n − 1 do
    for k = j + 1 to n do
        for i = 1 to t do
            x ← dist(f(i, j), f(i, k))
            if id(f(i, j)) == id(f(i, k)) then
                merit(i) ← merit(i) − x / t_I
            else
                merit(i) ← merit(i) + x / t_E
            end if
        end for
    end for
end for
S ← Select_Features_Highest_Merit(t, s, merit)
return S
```

# Result

# Conclusion

The typical noise regions and characteristics of the images captured within non-cooperative environments are highly influenced by the optic device used and the specific lighting conditions of each environment. This leads to a significant increase of the error rates, which was the main motivation for the proposal in this section. We described a method for feature selection that takes into account the typical characteristics of the images, namely their noise regions determined by the imaging environment. Using a training set composed of images captured after the physical installation of the imaging system, we computed the merit value for each candidate feature and selected those with the highest values.
Since the training set images are representative of the ones that the recognition system will have to deal with, this process contributes to the adaptability of the recognition system to the specific environment. We stress that this approach is compatible with different imaging environments, since each recognition system will select a proper subset of features that are further taken into account in the recognition process, through comparison with the corresponding enrolled features. Experiments led us to conclude that the system's accuracy improves when the cardinality of the selected feature set is between 30% and 50% of the number of candidate features. In this situation, the error rates decreased significantly (by about 50%) in the recognition of noisy iris images, which must be considered an achievement.

# VI. Future work

We are currently analysing the requirements for the physical implementation of the non-cooperative prototype system. This has revealed itself, especially the planning of the optical framework, to be a more difficult task than we initially thought. Simultaneously, we are implementing, and in specific situations adapting and improving, algorithms for real-time human face and eye detection. Our purpose demands algorithms with high performance, which decreased the number of potential alternatives. Regarding the experiments and results contained in this dissertation, we are presently performing the experimental evaluation of the proposed methods with larger data sets, in order to obtain information with higher statistical relevance about the advantages resulting from the methods. Moreover, we are comparing three common iris recognition proposals (Daugman's [3], Wildes' [14] and Ma et al.'s [15]), both as described by the authors and together with the totality of our proposals.
This will bring us new information about the improvements in recognition accuracy according to different recognition strategies. The evaluated types of noise should be the subject of further work, since this work has not dealt, for instance, with off-angle iris images. These will obviously introduce new challenges to the recognition that must be overcome and will predictably demand the adjustment of some of our methods to these new constraints.

![Figure 2.1: Block diagram of the feature selection method](image-2.png)

![Figure 4.1: 1st Original image of the eye showing the iris](image-3.png)

![Figure 4.3: 2nd Original image of the eye showing the iris](image-4.png)

# References

* George K. Matsopoulos, Nicolaos A. Mouravliansky, Konstantinos K. Delibasis, Konstantina S. Nikita. Automatic retinal image registration scheme using global optimization techniques. IEEE Transactions on Information Technology in Biomedicine, 3(1), March 1999.
* Huiqi Li, O. Chutatape. Automated feature extraction in color retinal images by a model based approach. IEEE Transactions on Biomedical Imaging, 51(2), February 2004.
* Mario E. Munich, Pietro Perona. Visual identification by signature tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(2), February 2003.
* Vaclav Matyas Jr., Zdenek Riha. Toward reliable user authentication through biometrics. IEEE Security and Privacy, 1(3), 2003.
* John G. Daugman. Phenotypic versus genotypic approaches to face recognition. In Face Recognition: From Theory to Applications, Springer-Verlag, Heidelberg, 1998.
* Anil K. Jain, R. Bolle, S. Pankanti. Personal Identification in Networked Society. Kluwer Academic Publishers, 2nd edition, 1999.
* Anil K. Jain, A. Ross, S. Prabhakar. An introduction to biometric recognition. IEEE Transactions on Circuits and Systems for Video Technology, 14, January 2004.
* Wei Yun Yau. The "123" of biometric technology. Synthesis Journal, 2002.
* Simon Liu, Mark Silverman. A practical guide to biometric security technology. IT Professional, 3(1), January 2001.
* CBEFF common exchange biometric file format. National Institute of Standards and Technology, 2001.
* Security technology for ubiquitous information society. Hitachi, Ltd. Systems Development Laboratory, 2006.
* John D. Woodward, Katharine Watkins Webb, Elaine M. Newton, Melissa A. Bradley, David Rubenson, Kristina Larson, Jacob Lilly, Katie Smythe, Brian
* Richard P. Wildes. Iris recognition: an emerging biometric technology. Proceedings of the IEEE, 85(9), September 1997.
* Li Ma, Tieniu Tan, Dexin Zhang, Yunhong Wang. Local intensity variation analysis for iris recognition. Pattern Recognition, 37(6), 2006.