# Introduction

Variation in the visual appearance of an object mostly arises from changes in illumination and pose [2]. However, existing approaches either place restrictions on the illumination conditions and objects [1][3] or are computationally intensive because of the iterative optimization procedures used to obtain a solution [4]. Moving object detection and tracking are central to any vision-based surveillance system. Various approaches to object detection have been proposed for surveillance, including feature-based detection, template-based detection, background subtraction [6] and inter-frame difference-based detection. Most algorithms for object detection and tracking are designed for daytime visual surveillance [5]. Every object tracking method needs an object detection mechanism, applied either in each frame or when the object first appears in the video. A common approach to object detection is to use the information in a single frame. However, some detection methods exploit temporal information computed from a sequence of frames to reduce false detections. This temporal information usually takes the form of frame differencing, which highlights regions that change between consecutive frames [7]. Tracking then registers the movement of the segmented object from the first frame to the last frame of the video. The goal of this paper is to survey these techniques with a view to improved object tracking and recognition.

Object tracking algorithms can be categorized into three groups, as shown in Fig. 1 [8]: point tracking, kernel tracking and silhouette tracking. In point tracking, the detected object is represented in consecutive frames by a set of points, and the Kalman filter is widely used for point-based feature tracking [9]. Point tracking becomes complicated in the presence of occlusion and of objects entering or leaving the scene. Kernel tracking is associated with the object's shape and appearance; these algorithms differ in the method used to estimate the object motion and in the number of objects tracked [8]. Kernel-based trackers usually represent the object with a rectangular or elliptical kernel. Silhouette tracking methods provide an accurate shape of the object [9], whose boundary appears as sharp changes in image intensity. The advantage of silhouettes is their flexibility in representing and handling a wide variety of object shapes.

Object tracking can also be described as a serial process of object representation, feature selection, object detection and object tracking [43][8]. The object can be defined as a single point or a set of points, or represented by primitive geometric shapes such as circles, ellipses and rectangles, by an object contour or silhouette, or by skeletal and articulated shape models. These shape representations are combined with appearance representations for tracking; common appearance representations are probability densities of object appearance, active appearance models, multiview appearance models and templates. Feature selection is another important step in object tracking; commonly used features are edges, gradients, texture, colour and optical flow. Every object tracking method requires an object detection mechanism, and commonly used detection methods include point detectors, segmentation and background subtraction.
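As a simple illustration of the inter-frame differencing idea mentioned above (a generic sketch, not a method proposed in any of the surveyed papers), the following Python/OpenCV snippet flags changed regions between two consecutive frames; the threshold value and minimum blob area are illustrative assumptions.

```python
import cv2

def detect_changes(prev_frame, curr_frame, thresh_val=25, min_area=500):
    """Flag changed regions between two consecutive frames by differencing.

    thresh_val and min_area are illustrative values, not tuned settings.
    """
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    # Absolute per-pixel difference highlights regions that moved or changed.
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, thresh_val, 255, cv2.THRESH_BINARY)

    # Suppress small noise blobs and return bounding boxes of the rest.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```

In practice, background subtraction or temporal filtering over several frames is preferred to a single difference, precisely to reduce the false detections noted above.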
After object detection, the tracker's task is to generate the object trajectory over time by locating the object's position in each frame of the video. Object tracking can be difficult because of loss of information, complex object motion and shapes, image noise, illumination changes in the scene, and partial or full occlusion of the object [8].

Illumination is an important factor here. The illumination problem is defined as the change in the degree of visibility or in the appearance of an object under different lighting conditions [10]. The placement of the light sources can change how an object is perceived, and multiple light sources produce combined illumination effects. Another major challenge is occlusion, in which the object is partly or wholly hidden. In dynamic scenes, moving objects take on many spatial configurations relative to other objects; observed from a given viewpoint, a relative depth ordering of the objects and the background structures is imposed along the lines of sight. Such a depth ordering leads to the partial or complete obstruction of some objects of interest by others, a phenomenon known as occlusion [11].

# Related Work

The effect of illumination on object tracking has remained a major challenge, and various researchers have proposed algorithms to address it. Several techniques attempt to pre-compensate for illumination variations between frames caused by changes in the strength or position of light sources in the scene. Some of the earliest attempts used intensity normalization [32][33][34], and most algorithms operate on luminance intensity, a single scalar value per pixel [35][36][37][38][39][40][41]. Few reviews survey the performance of application-independent trackers; the 2006 work of Yilmaz et al. [8] still provides a good frame of reference, describing general-purpose methodologies for tracking, features and data association.

Fuat Cogun and A. Enis Cetin proposed a method that uses 2D-cepstrum characteristics of the target for object tracking under illumination variations. They also describe object tracking algorithms based on the covariance matrix and on the co-difference matrix [12]. The covariance of the feature vectors describing the target is called the covariance matrix [13]; the co-difference method uses a co-difference matrix to model the moving target. In both methods, the aim is to find, in a given frame, the region whose descriptor has the minimum distance from the target matrix, and to assign that region as the moving target, as illustrated in the sketch below.
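The following sketch illustrates the region covariance idea in simplified form. The per-pixel feature set (coordinates, intensity, first derivatives) and the generalized-eigenvalue distance follow the common formulation of covariance tracking [13], but this is not the authors' exact implementation, and the helper names are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def region_covariance(gray_region):
    """Covariance descriptor of a grayscale image region.

    Per-pixel features: (x, y, intensity, |dI/dx|, |dI/dy|) -- a common,
    simplified choice, not necessarily the feature set used in [12].
    """
    h, w = gray_region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dIy, dIx = np.gradient(gray_region.astype(float))
    feats = np.stack([xs.ravel(), ys.ravel(),
                      gray_region.ravel().astype(float),
                      np.abs(dIx).ravel(), np.abs(dIy).ravel()], axis=1)
    return np.cov(feats, rowvar=False)

def covariance_distance(c1, c2, eps=1e-6):
    """Distance between two covariance descriptors via generalized
    eigenvalues: d = sqrt(sum_i ln^2 lambda_i(c1, c2))."""
    d = c1.shape[0]
    lam = eigh(c1 + eps * np.eye(d), c2 + eps * np.eye(d),
               eigvals_only=True)
    return float(np.sqrt(np.sum(np.log(lam) ** 2)))
```

A tracker built this way would evaluate `covariance_distance` between the stored target descriptor and the descriptors of candidate regions in the new frame, and select the region with the minimum distance.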
The first stage of these algorithms is to compute the feature images and feature vectors, from which the covariance and co-difference matrices are obtained. The distance matrix is then evaluated to estimate the target location, and this operation is repeated for each frame. The algorithm is further analysed using the 2D-cepstrum [14]. The 2D-cepstrum is an amplitude-invariant feature, so cepstrum-domain coefficients of a region remain unchanged under light intensity variations; this property provides robustness to illumination variation at the target region. The co-difference method was applied to video sequences in which the intensity of the target region varies, and experimentally it gives better results than ordinary covariance tracking.

Ashwani Kumar et al. proposed an improved colour-based moving object tracker [15]. The object parameters considered are position, speed, size, scale and appearance, and the target parameters are updated conditionally to keep the tracking accurate. The method handles non-rigid deformation of targets, partial occlusion and cluttered backgrounds; its disadvantage is that it does not consider the colour histogram of the target region. Zhou Dan et al. proposed a SURF-based method with higher tolerance to illumination changes, which helps in outdoor object tracking [16]; the method is robust in difficult and complex environments, but it increases the complexity of the algorithm and reduces tracking efficiency. Kai Du et al. proposed an improved algorithm based on SIFT feature matching and the MeanShift method, used for object initialization in tracking, which weights the Bhattacharyya coefficient in each video sequence block [17].

Rama Chellappa et al. proposed a method for object detection and tracking using multiple smart cameras [18]. They use background modelling (background subtraction) for moving object detection and tracking: a test image is subtracted from the template, and pixels with large differences are marked as moving objects. Gaussian distribution functions are used to remove global changes in the scene such as illumination variation or camera jitter, and a mixture of Gaussians (MOG) model handles periodic background disturbance and keeps track of global changes, reducing the effect of illumination on the scene. Multiple smart cameras are arranged appropriately and connected in a distributed network. For detection and tracking, the ground plane can be used as a strong constraint for designing efficient and robust estimators of moving objects. The algorithms are optimized for sensor networks containing a small number of cameras, and the geometric constraints induced by the imaging devices are exploited to derive efficient distributed algorithms for target detection, tracking and recognition.

Object detection and tracking under changing lighting (illumination) conditions was studied by Waqas Hassan et al. [19] using edge orientation. Tracking based on the energy or magnitude of edges can suffer from changes in illumination: a change in illumination can alter the magnitude of an edge, which can produce false tracking outputs. The authors therefore consider an adaptive edge orientation technique, which uses the orientation of the edge rather than its intensity and has no dependency on colour features, giving better results where lighting is not consistent. A simplified sketch of orientation-based matching follows.
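As a rough illustration of matching on edge orientation rather than edge magnitude (a simplification, not the exact algorithm of [19]), the sketch below builds an orientation histogram over strong-gradient pixels of a region and compares two regions by histogram intersection; the bin count and gradient threshold are assumptions.

```python
import cv2
import numpy as np

def edge_orientation_histogram(gray_region, n_bins=18, mag_thresh=30.0):
    """Histogram of gradient orientations over strong-edge pixels only.

    Orientation is kept and magnitude is discarded, so moderate global
    illumination changes have limited effect on the descriptor.
    """
    gx = cv2.Sobel(gray_region, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray_region, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # orientation in [0, 180)

    strong = mag > mag_thresh
    hist, _ = np.histogram(ang[strong], bins=n_bins, range=(0.0, 180.0))
    total = hist.sum()
    return hist / total if total > 0 else hist.astype(float)

def orientation_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical orientation statistics."""
    return float(np.minimum(h1, h2).sum())
```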
Under varying lighting conditions, edge orientation is more stable than both edge magnitude and colour. The algorithm was also applied to video sequences with highly variable lighting and provided better results.

Francois Bardet et al. proposed an illumination-invariant method in which multiple objects are jointly tracked with a Markov chain Monte Carlo particle filter (MCMC PF) [20]. To allow objects to enter or leave the scene, Khan et al. [21] extended the MCMC PF to track a variable number of objects using a Reversible Jump Markov chain Monte Carlo (RJ MCMC) sampler [25]. The RJ MCMC PF has become a popular algorithm for real-time tracking; it allows object classification as well as estimation of object shape. The authors report experiments on pedestrian tracking and highway vehicle tracking: more than ten pedestrians are tracked under variable sunlight, and highway vehicles are tracked and classified simultaneously under time-evolving sunlight. The RJ MCMC PF sampler overcomes the limitation of Isard et al. [24], whose Sequential Importance Resampling particle filter (SIR PF) with a monocular multi-object tracker (MOT) could track at most three objects.

Online tracking under varying outdoor lighting with moving cameras was studied by Yanli Liu and Xavier Granier [22]. They assume a strong correlation in lighting over large spatial and temporal extents and, under that assumption, combine information from the previous frame with the current frame to estimate the relative variations of sunlight and skylight, which are estimated from a sparse set of planar feature points extracted from each frame. The algorithm achieves nearly real-time processing with an unoptimized Matlab implementation, requires no prior knowledge of the 3D scene, works with a moving viewpoint, and can provide a user with an augmented reality experience using a general-purpose camera. Without knowledge of the lighting direction, however, it cannot handle indoor scenes.

The moving object tracking approach of Oksam Chae et al. [45] is based on a parametric edge representation of the object. Image information is concentrated at the edges of objects, and edge information is less sensitive to noise and more consistent than pixel values across a video sequence; edge-based methods are therefore more robust and less sensitive to illumination variation than intensity-based methods. The object boundary appears as sharp changes in image intensity, and a segment-based representation of edge pixels is faster than processing all pixels in the image, which supports a fast, efficient and flexible edge tracking algorithm.

Tianzhu Zhang et al. proposed a robust visual tracking algorithm using multi-task sparse learning [26]; unlike approaches that handle the particles (target candidates) [27] independently, it formulates object tracking in a particle filter framework as a multi-task problem. A particle filter is used to track the target object, and the particles are randomly sampled according to a Gaussian distribution, as illustrated below.
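A minimal sketch of this particle sampling step follows. It is generic particle-filter propagation, not the specific state model of [26]; the state layout (position and scale) and the noise scales are assumptions.

```python
import numpy as np

def sample_particles(prev_state, n_particles=400,
                     sigma=(4.0, 4.0, 0.02), rng=None):
    """Draw candidate states around the previous target state.

    prev_state: (x, y, scale) of the target in the previous frame.
    sigma: assumed standard deviations of the Gaussian motion model.
    Returns an (n_particles, 3) array of candidate states.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, sigma, size=(n_particles, len(prev_state)))
    return np.asarray(prev_state, dtype=float) + noise

# Example: propagate candidates from the last estimated state.
particles = sample_particles((120.0, 80.0, 1.0))
```

Each sampled candidate is then scored by the appearance model; in [26] this score comes from the sparse representation over dictionary templates described next.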
These particles are represented as a linear combination of the updated dictionary templates. Because particles are densely sampled around the target state, their representations are sparse, which makes the representation more robust. The resulting convex optimization problem is solved with an accelerated proximal gradient method. The algorithm improves tracking performance and overall computational complexity.

# Evaluation Parameters

Basic evaluation parameters used for object tracking algorithms [7][8][42][43] are listed below; a short sketch that turns these counts into summary scores is given just before the references.

1. Deviation: how far the track's location deviates from the ground truth.
2. True positive: the tracker identifies a target that is actually a target.
3. False positive: the tracker identifies a target where there is none.
4. True negative: the tracker correctly reports no target when none is present.
5. False negative: the tracker misses identifying and locating the target.

# Conclusions

In this paper we present a literature survey of object tracking approaches and briefly review related topics. We divide the basic approaches into three categories: point tracking, kernel tracking and silhouette tracking. We also describe the degree of applicability and give a qualitative comparison of the tracking algorithms. From the literature survey we conclude that, in order to track an object under varying illumination, the features extracted from the frames must be invariant to illumination. We expect that this survey of moving object tracking in video, with theoretical details of the tracking methods and an accompanying bibliography, will be a valuable contribution to research on object tracking and will encourage new work.

Figure 1: Object tracking classification [8] (figure not reproduced here).
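As a small illustration of how the evaluation counts listed above are typically combined into summary scores (standard precision/recall/F1 definitions, not metrics prescribed by the surveyed papers):

```python
def tracking_scores(tp, fp, tn, fn):
    """Standard summary scores from per-sequence evaluation counts.

    tp, fp, tn, fn: counts of true positives, false positives,
    true negatives and false negatives over a test sequence.
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total if total else 0.0
    return {"precision": precision, "recall": recall,
            "f1": f1, "accuracy": accuracy}

# Example: 180 correct detections, 12 false alarms, 40 correct rejections,
# 8 misses over a sequence.
print(tracking_scores(tp=180, fp=12, tn=40, fn=8))
```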
# References

1. S. Biswas, G. Aggarwal and R. Chellappa, "Robust Estimation of Albedo for Illumination-Invariant Matching and Shape Recovery," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 5, May 2009.
2. R. Ramamoorthi, "Modeling Illumination Variation with Spherical Harmonics," in Face Processing: Advanced Modeling Methods, 2006.
3. L. Zhang and D. Samaras, "Face Recognition from a Single Training Image under Arbitrary Unknown Lighting Using Spherical Harmonics," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 3, Mar. 2006.
4. Y. Wang, Z. Liu, G. Hua, Z. Wen, Z. Zhang and D. Samaras, "Face Re-Lighting from a Single Image under Harsh Lighting Conditions," IEEE Conf. Computer Vision and Pattern Recognition, 2007.
5. W. Hu, T. Tan, L. Wang and S. Maybank, "A survey on visual surveillance of object motion and behaviors," IEEE Trans. Syst., Man, Cybern. C: Appl. Rev., vol. 34, no. 3, Aug. 2004.
6. S.-C. S. Cheung and C. Kamath, "Robust techniques for background subtraction in urban traffic video," Proc. SPIE, vol. 5388, Jan. 2004.
7. C. K. Nagalakshmi, R. Hemavathy and Shobha, "Object detection and tracking in videos: A review," International Journal of Engineering and Computer Science, vol. 3, 2014.
8. A. Yilmaz, O. Javed and M. Shah, "Object tracking: A survey," ACM Comput. Surv., vol. 38, 2006.
9. M. Murshed, M. Ali Akber Dewan and O. Chae, "Moving object tracking: A parametric edge tracking approach," IEEE Conference on Computer and Information Technology, 2009.
10. B. Wu and R. Nevatia, "Detection and tracking of multiple, partially occluded humans by Bayesian combination of edgelet based part detectors," International Journal of Computer Vision, vol. 75, no. 2, 2007.
11. P. Guha, A. Mukerjee and V. K. Subramanian, "Formulation, detection and application of occlusion states in the context of multiple object tracking," IEEE Conf. on Advanced Video and Signal-Based Surveillance, 2011.
12. F. Cogun and A. Enis Cetin, "Object tracking under illumination variation using 2D-cepstrum characteristics of the target," IEEE Conf. on Multimedia Signal Processing (MMSP'10), Oct. 4-6, 2010.
13. F. Porikli, O. Tuzel and P. Meer, "Covariance tracking using model update based on means on Riemannian manifolds," in Proc. IEEE Conf. on Computer Vision and Pattern Recognition, vol. 1, 2006.
14. A. V. Oppenheim and R. W. Schafer, "Frequency to quefrency: A history of the cepstrum," IEEE Signal Processing Magazine, 2004.
15. Ashwani Kumar, Sudhanshu Kumar Mishra and Prajna Parimita Dash, "Robust Detection & Tracking of Object by Particle Filter using Color Information," 4th ICCCNT, Tiruchengode, India, IEEE, July 4-6, 2013.
16. Zhou Dan et al., "A Robust Object Tracking Algorithm Based on SURF," IEEE, 2013.
17. Kai Du, Yongfeng Ju, Yinli Jin, Gang Li, Yanyan Li and Shenglong Qian, "Object Tracking Based on Improved MeanShift and SIFT," IEEE, 2012.
18. C. S. Aswin, V. Ashok and R. Chellappa, "Object tracking, detection and recognition for multiple smart cameras," Proceedings of the IEEE, vol. 96, no. 10, 2008.
19. W. Hassan, P. Birch, R. Young et al., "Illumination invariant stationary object detection," IET Computer Vision, vol. 7.
20. F. Bardet, T. Chateau and D. Ramadasan, "Illumination aware MCMC particle filter for long-term outdoor multi-object simultaneous tracking and classification," 2009.
21. Z. Khan, T. Balch and F. Dellaert, "MCMC-based particle filtering for tracking a variable number of interacting targets," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, 2005.
22. Y. Liu and X. Granier, "Online tracking of outdoor lighting variations for augmented reality with moving cameras," IEEE Transactions on Visualization and Computer Graphics, vol. 18, 2012.
23. M. J. Hossain, M. A. A. Dewan and O. Chae, "Moving object detection for real time video surveillance: An edge based approach," IEICE Trans. Commun., no. 12, 2007.
24. M. Isard and A. Blake, "Condensation - conditional density propagation for visual tracking," International Journal of Computer Vision, vol. 29, no. 1, 1998.
25. P. J. Green, "Reversible jump Markov chain Monte Carlo computation and Bayesian model determination," Biometrika, vol. 82, no. 4, 1995.
26. T. Zhang, B. Ghanem and S. Liu, "Robust visual tracking via multi-task sparse learning," IEEE Conf. on Computer Vision and Pattern Recognition, June 2012.
27. A. Doucet, N. de Freitas and N. Gordon, "Sequential Monte Carlo Methods in Practice," Springer-Verlag, New York, 2001.
28. D. Comaniciu, V. Ramesh and P. Meer, "Real-time tracking of non-rigid objects using mean shift," IEEE Conf. on Computer Vision and Pattern Recognition, 2000.
29. S. Konishi, A. L. Yuille, J. Coughlan and S. Zhu, "Fundamental bounds on edge detection: An information evaluation of different edge cues," IEEE Conf. on Computer Vision and Pattern Recognition, 1999.
30. P. Viola and W. M. Wells, "Alignment by maximization of mutual information," IEEE Int'l Conf. on Computer Vision, 1995.
31. C. Shen, X. Lin and Y. Shi, "Moving object tracking under varying illumination conditions," Pattern Recognition Letters, vol. 27, 2006.
32. R. Lillestrand, "Techniques for change detection," IEEE Transactions on Computers, vol. 21, no. 7, 1972.
33. M. S. Ulstad, "An algorithm for estimating small scale differences," Pattern Recognition, vol. 5, 1973.
34. X. Dai and S. Khorram, "The effects of image misregistration on the accuracy of remotely sensed change detection," IEEE Trans. Geoscience and Remote Sensing, vol. 36, no. 5, Sep. 1998.
35. A. Neri, S. Colonnese, G. Russo and P. Talone, "Automatic moving object and background separation," Signal Processing, vol. 66, no. 2, 1998.
36. K. Skifstad and R. Jain, "Illumination independent change detection for real world image sequences," Computer Vision, Graphics, and Image Processing, vol. 46, June 1989.
37. M. Hotter, R. Mester and F. Muller, "Detection and description of moving objects," Signal Processing: Image Communication, vol. 8, 1996.
38. T. Aach, A. Kaup and R. Mester, "Statistical model-based change detection in moving video," Signal Processing, vol. 31, 1993.
39. S. C. Liu, C. W. Fu and S. Chang, "Statistical change detection with moments," IEEE Transactions on Image Processing, vol. 7, no. 9, 1993.
40. R. Mech and M. Wollborn, "A noise robust method for 2D shape estimation," Signal Processing, vol. 66, 1998.
41. A. Cavallaro and T. Ebrahimi, "Video object extraction based on adaptive background and statistical change detection," Proc. of SPIE, 2001.
42. Barga Deori and Dalton Meitei Thounaojam, "A Survey on Moving Object Tracking in Video," International Journal on Information Theory (IJIT), vol. 3, no. 3, July 2014.
43. Sudhansu Kumar Mishra, Dipti Patra, Prajna Parimita Dash, Satya Ranjan Behera and Subhendu Kumar Behera, "Comparative Performance Evaluation of Three Object Tracking Methods," International Journal of Emerging Technology and Advanced Engineering, ISSN 2250-2459, vol. 2, no. 5, May 2012.
44. Naimish Kasundra and Krishna K. Warhade, "Object Tracking under Varying Illumination Conditions: A Survey," National Conference on Information Theory and Communication Networks, 2014.
45. M. Murshed, Md. Hasanul Kabir and O. Chae, "Moving Object Tracking: An Edge Segment Based Approach," vol. 7, July 2011.
46. A. W. M. Smeulders et al., "Visual Tracking: An Experimental Survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 36, p. 1468, July 2014.
47. P. Rajan and S. Prakash, "Moving Foreground Object Detection and Background Subtraction Using Adaptive-K GMM: A Survey," IJARCSMS, ISSN 2321-7782, vol. 2, no. 1, January 2014.