# Introduction

Image understanding is one of the most important tasks involving a classification system. Its primary purpose is to extract information from images so that different objects of interest can be discriminated. The classification process is usually based on grey-level intensity, color, shape or texture, and image classification is of great interest in a variety of applications. Most image analysis problems are related to neighborhood properties. Each pixel in a neighborhood or image is considered a random variable $x_r$, which can assume values $x_r \in \{0, 1, \dots, G-1\}$, where $G$ is the number of grey levels of the image, and is characterized by the conditional probability $P(x_r \mid \eta_r)$, where $\eta_r$ is the neighbor set of the element $x_r$. Fig. 1 illustrates different orders of neighborhood for a central pixel. Most research in image processing revolves around the second-order neighborhood only, because all 8 neighboring pixels are well connected with the central pixel and methods based on the second-order neighborhood have given excellent results on various problems. Considering the difficulties and complexities involved in the third-order neighborhood, the present paper derives a new, simple and efficient model for image analysis.

# Derivations of Transitions on Trapezoids of TN-LBP

The proposed method evaluates transitions on "Trapezoids of the Third-Order Neighborhood LBP (T-TN-LBP)" and, based on these, derives algorithms for the recognition of facial expressions. The proposed transition-based T-TN-LBP consists of seven steps, described below; a sketch of Steps 4 to 6 follows the list.

Step 1: Take the facial image as the input image (Img).

Step 2: Convert the RGB image into a grey-scale image using the HSV color model.

Step 3: Crop the grey-scale image.

Step 4: Evaluate the TN-LBP on each 5 x 5 sub-image. The TN contains only 13 of the 25 pixels of the 5 x 5 neighborhood, as shown in Fig. 1. The TN grey-level sub-image is converted into a binary sub-image by comparing each pixel of the TN grey-level sub-image with the mean value of the TN grey-level sub-image, using Equation 1:

$$TN\text{-}P_i = \begin{cases} 0 & \text{if } P_i < V_0 \\ 1 & \text{if } P_i \ge V_0 \end{cases} \qquad i = 1, 2, \dots, 13 \tag{1}$$

where $V_0$ is the mean of the TN sub-matrix.

Step 5: For classification, the present research considers the two reciprocal trapezoids of the TN-LBP, i.e. the Top-Left (TL) and Bottom-Right (BR) trapezoids. Fig. 2 shows the TL and BR trapezoids of the TN-LBP; each trapezoid pattern consists of 5 pixels.

Step 6: Each trapezoid of the TN-LBP therefore yields a five-bit pattern, on which the 0-to-1 and 1-to-0 transitions are counted. In general, three types of transition counts occur in a 5-bit pattern: zero, two and four transitions. The proposed method considers the two- and four-transition patterns only, which account for 87.5% of the patterns.

Step 7: Based on the frequency of occurrence of two and four transitions, the facial image is classified into one of seven categories (neutral, happiness, sadness, surprise, anger, disgust or fear).
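The following minimal sketch (in Python) illustrates Steps 4 to 6. The positions assumed for the 13 TN pixels inside the 5 x 5 window, and the split of those pixels into the TL and BR trapezoids, are illustrative placeholders, since the exact layouts are defined by Figs. 1 and 2; the transition count is taken circularly over the 5-bit pattern.

```python
import numpy as np

# Plausible layout of the 13 TN pixels (third-order neighborhood plus centre) inside a
# 5x5 window: the central 3x3 block plus the four pixels two steps away. The exact
# labelling P1..P13 and the trapezoid membership follow Figs. 1 and 2 of the paper,
# so the positions and index sets below are placeholders, not the authors' definitions.
TN_POSITIONS = [(0, 2),
                (1, 1), (1, 2), (1, 3),
                (2, 0), (2, 1), (2, 2), (2, 3), (2, 4),
                (3, 1), (3, 2), (3, 3),
                (4, 2)]
TL_TRAPEZOID = [0, 1, 2, 4, 5]     # assumed Top-Left trapezoid (5 pixels, see Fig. 2)
BR_TRAPEZOID = [7, 8, 10, 11, 12]  # assumed Bottom-Right trapezoid (5 pixels, see Fig. 2)


def tn_binary_pattern(sub_image):
    """Step 4: threshold the 13 TN pixels of a 5x5 sub-image against their mean V0 (Eq. 1)."""
    tn_values = np.array([sub_image[r, c] for r, c in TN_POSITIONS], dtype=float)
    v0 = tn_values.mean()
    return (tn_values >= v0).astype(int)   # TN-P_i = 1 if P_i >= V0, else 0


def transitions(bits):
    """Step 6: number of 0-to-1 and 1-to-0 transitions in a bit pattern (counted circularly)."""
    return sum(bits[i] != bits[(i + 1) % len(bits)] for i in range(len(bits)))


def trapezoid_transitions(sub_image):
    """Steps 5 and 6: transition counts of the TL and BR trapezoid patterns of one sub-image."""
    tn_bits = tn_binary_pattern(sub_image)
    tl = [tn_bits[i] for i in TL_TRAPEZOID]
    br = [tn_bits[i] for i in BR_TRAPEZOID]
    return transitions(tl), transitions(br)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=(5, 5))  # stand-in for one 5x5 grey-level sub-image
    # Each trapezoid yields 0, 2 or 4 transitions; the method keeps only the 2- and
    # 4-transition patterns and counts how often each occurs.
    print(trapezoid_transitions(block))
```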
# Results and Discussions

The proposed transition-based T-TN-LBP method was experimented on a database containing 213 images of female facial expressions collected by Kamachi and Gyoba at Kyushu University, Japan [1]; a few of these images are shown in Fig. 3. The sample images are grouped into seven expression categories (neutral, happiness, sadness, surprise, anger, disgust and fear). Each T-TN-LBP trapezoid yields a 5-bit pattern, resulting in a total of 32 possible patterns. Two of these, the decimal values 0 and 31, have zero transitions. The decimal values 5, 9, 10, 11, 13, 18, 20, 21, 22 and 26 produce four 0-to-1 or 1-to-0 transitions, and the remaining twenty values (1, 2, 3, 4, 6, 7, 8, 12, 14, 15, 16, 17, 19, 23, 24, 25, 27, 28, 29 and 30) produce two transitions. The strength of the proposed transitions on T-TN-LBP method is that it evaluates the frequency of occurrence of the 2- and 4-transition patterns, which account for a total of 87.5% of the transitions; the zero-transition patterns, which account for 12.5% of the patterns, are not considered (a short sketch reproducing this grouping is given at the end of this section). The method further evaluates the frequency of occurrence of the 2- and 4-transitions separately, and the sums of the 2- and 4-transition occurrences of both the TL and BR trapezoids are listed for the different facial expressions in Tables 1 to 7. In these tables, STLT denotes the sum of transitions (both 2 and 4) of the Top-Left trapezoid, SBRT denotes the sum of transitions (both 2 and 4) of the Bottom-Right trapezoid, and TBT denotes the total number of 2- and 4-transitions of both trapezoids.

## Comparison of the Proposed T-TN-LBP with Other Existing Methods

Table 9 shows the classification rates for the various groups of facial expressions obtained by the proposed T-TN-LBP method and by other existing methods: the feature-based facial expression recognition within an architecture based on a two-layer perceptron by Zhengyou Zhang [2], the facial expression analysis of De la Torre et al. [3], and the facial expression recognition based on distinct LBP and GLCM by Gorti Satyanarayana Murty et al. [4]. These methods were implemented on the Kamachi and Gyoba [5] Kyushu University data set and compared with the proposed method. From Table 9 it is clearly evident that the proposed method exhibits a higher classification rate than the existing methods. A graphical representation is also shown in Fig. 5.
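As a cross-check of the grouping above, the short sketch below enumerates all 32 five-bit patterns and buckets them by their number of 0-to-1 and 1-to-0 transitions. Counting the transitions circularly (an assumption, since the text does not state whether the last bit is compared with the first) reproduces the reported groups: the values 0 and 31 with zero transitions, the ten listed values with four transitions, and the remaining twenty values with two transitions.

```python
from collections import defaultdict


def circular_transitions(value, bits=5):
    """Count 0-to-1 and 1-to-0 transitions in a circular n-bit pattern."""
    pattern = [(value >> i) & 1 for i in range(bits)]
    return sum(pattern[i] != pattern[(i + 1) % bits] for i in range(bits))


groups = defaultdict(list)
for value in range(32):                      # all 5-bit trapezoid patterns
    groups[circular_transitions(value)].append(value)

for n_transitions, values in sorted(groups.items()):
    print(n_transitions, values)
# Expected grouping (matching the decimal values reported in the text):
#   0 transitions -> [0, 31]
#   2 transitions -> the remaining twenty values
#   4 transitions -> [5, 9, 10, 11, 13, 18, 20, 21, 22, 26]
```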
![Figure 1: Neighborhood of a central pixel: (a) first order (b) second order (c) third order (d) fourth order](image-2.png)

![Step 2: Conversion of the RGB image into a grey-scale image using the HSV color model](image-3.png)

![Figure 2: The TL and BR trapezoids of TN-LBP](image-4.png)

![Figure 3: Facial expression database (Kamachi and Gyoba at Kyushu University, Japan)](image-6.png)

![Figure 4: Classification performance of various algorithms](image-7.png)

![Figure 5: Classification chart of the proposed T-TN-LBP method against other existing methods](image-8.png)

Table 1: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the anger expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 2: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the disgust expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 3: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the fear expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 4: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the happiness expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).
Table 5: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the neutral expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 6: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the sadness expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 7: Sums of 2- and 4-transitions on the TL and BR trapezoids of the T-TN-LBP for the surprise expression images (columns: S.No, Image Name, 2, 4, STLT, 2, 4, SBRT, TBT).

Table 9: Classification rates (%) of the proposed T-TN-LBP method and existing methods on the Kamachi and Gyoba (Kyushu University, Japan) data set.

| Image Dataset | Architecture based on a two-layer perceptron [2] | Facial expression analysis [3] | GLCM on DLBP of FCI method [4] | Proposed method (T-TN-LBP) |
| --- | --- | --- | --- | --- |
| Kamachi and Gyoba, Kyushu University, Japan | 80.29 | 91.79 | 96.67 | 100 |

## Conclusions

The present paper opens a new direction for various image processing problems by deriving the LBP on the third-order neighborhood. The third-order neighborhood consists of 12 pixels excluding the centre pixel, which leads to a huge number of patterns, i.e. $2^{12}$. The uniform LBP (U-LBP) on the third-order neighborhood covers only a negligible percentage of these patterns. To overcome this, the present paper proposed transitions on the T-TN-LBP, which considers 87.5% of the transitions and thus overcomes the disadvantage of the U-LBP on the third-order neighborhood. The STLT, SBRT and TBT results of Table 8 clearly indicate average facial expression classification results of 58%, 66% and 100% respectively.
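To make the remark about the U-LBP concrete, the sketch below counts the 12-bit third-order patterns that are uniform in the usual LBP sense (at most two circular transitions); assuming this standard uniformity criterion, only 134 of the $2^{12} = 4096$ patterns (about 3.3%) are uniform, which illustrates the negligible coverage mentioned above.

```python
def circular_transitions(value, bits):
    """Number of 0-to-1 and 1-to-0 transitions in a circular bit pattern."""
    pattern = [(value >> i) & 1 for i in range(bits)]
    return sum(pattern[i] != pattern[(i + 1) % bits] for i in range(bits))


BITS = 12  # the third-order ring has 12 pixels (centre pixel excluded)
uniform = [v for v in range(2 ** BITS) if circular_transitions(v, BITS) <= 2]
share = 100.0 * len(uniform) / 2 ** BITS
print(f"{len(uniform)} of {2 ** BITS} patterns ({share:.1f}%) are uniform")
# prints: 134 of 4096 patterns (3.3%) are uniform
```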
# References

[2] Zhengyou Zhang, "Feature-Based Facial Expression Recognition: Sensitivity Analysis and Experiments with a Multi-Layer Perceptron," International Journal of Pattern Recognition and Artificial Intelligence, vol. 13, no. 6, 1999.

[3] F. De la Torre and J. F. Cohn, "Facial expression analysis," in Guide to Visual Analysis of Humans: Looking at People, Th. B. Moeslund, A. Hilton, V. Krüger and L. Sigal, Eds.

[4] Gorti Satyanarayana Murty, Sasi Kiran and V. Vijaya Kumar, "Facial Expression Recognition Based on Features Derived From the Distinct LBP and GLCM," International Journal on Image, Graphics and Signal Processing, vol. 2, 2014.

[5] M. Lyons, S. Akamatsu, M. Kamachi and J. Gyoba, "Coding facial expressions with Gabor wavelets," in Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, Nara, Japan, Apr. 1998.