# I. Introduction

The recognition of human affect is considered a significant feature of many intelligent systems. In certain situations, emotion-aware intelligent systems perform with greater credibility than conventional ones. For this reason, the development and implementation of computer-based emotion-aware systems has become an interesting research area within affective computing. Affective computing is the branch of Human-Computer Interaction (HCI) that deals with the study, recognition, interpretation and simulation of human affect. Human affect recognition has been used in desktop applications as well as in web applications, sometimes as a service recommendation utility in different applications. It holds even greater promise in the mobile scenario. The rapid advancement of modern science has brought powerful, processor-enabled mobile devices into our hands, and people increasingly use handheld devices such as smartphones and tablet computers. Since mobile phones have become an essential part of day-to-day life, they represent an ideal computing platform for monitoring the behavior, movement and moods of their users. Such a processor-equipped smartphone becomes even smarter if it can detect the user's emotion and act accordingly.

It is evident that a user's emotional state significantly affects their social life and interactions. As smartphones are used more frequently than any other device people carry, applications developed with emotion in mind will be more effective, because they address not only usability and user experience but also the interaction methodology itself. Current smartphones usually have a virtual keyboard rather than a physical keypad, which means users must touch the screen directly in order to interact with the device. One of the most significant factors in smartphone usage is the pressure applied to the screen with the fingers while typing a message, playing games, browsing, switching applications and so on. Every time a user taps, swipes or pinches on the screen, a small amount of pressure is applied. We believe that as the user's mental state changes, the amount of pressure applied to the screen changes as well. The variations are very small, so the values need to be normalized and approximated before any conclusions can be drawn about the user's emotional state. Because the user's mental state changes with the context of the environment and surroundings, these changes may lead to a situational impairment [14]. Present phone operating systems are largely unaware of the user's emotion. We have chosen the smartphone over desktop computers or other sensory devices because it is an everyday companion. Researchers have found different touch patterns for typing on on-screen keyboards [15]. The amount of pressure applied to the screen varies with the user's emotion: it increases substantially when the user is excited or angry, while the neutral state usually exhibits steady variation. In our system we will combine both kinds of features (the touch pressure and the signals coming from the gyroscope and accelerometer), normalize them, and establish a specific relation between them so that the user's emotion can later be detected from these results. By examining the gyroscope signal we will be able to detect whether the phone is stationary or in motion.
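Before turning to the accelerometer, a minimal sketch of how the raw tap pressure described above could be captured and normalized on Android. This is an illustrative assumption rather than our final implementation: `MotionEvent.getPressure()` returns a device-dependent value, so the sketch rescales against a running minimum and maximum, and the class name `PressureCapture` is hypothetical.

```java
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Sketch: capture raw tap pressure and rescale it into [0, 1].
// getPressure() is device-dependent, so we normalize against the
// running range of values observed so far.
public class PressureCapture implements View.OnTouchListener {
    private float min = Float.MAX_VALUE;
    private float max = -Float.MAX_VALUE;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        float p = event.getPressure();   // raw, device-dependent pressure
        min = Math.min(min, p);
        max = Math.max(max, p);
        float normalized = (max > min) ? (p - min) / (max - min) : 0f;
        float area = event.getSize();    // approximate touched area, platform-normalized
        Log.d("PressureCapture", "raw=" + p + " norm=" + normalized + " area=" + area);
        return false;                    // do not consume the event
    }
}
```

It could be attached to an application's root view with `rootView.setOnTouchListener(new PressureCapture());`, so that every tap, swipe or pinch contributes a normalized pressure sample.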
The accelerometer sensor is chiefly responsible for detecting the state of the user [9], that is, whether the user is standing, walking, jogging, running, ascending or descending stairs, or performing some other physical activity. Here we summarize what our system is intended to do: (1) it will combine the signals coming from the sensors already built into the phone, namely the gyroscope and accelerometer, with the amount of pressure applied to the touch screen as measured by an external sensor; (2) it will consume little battery, using an algorithm that pauses or resumes sensing depending on whether the smartphone is in use (a sketch of such screen-state gating appears in the Proposed Approach section); (3) the resulting outputs will be collected into a table describing the user's emotional state.

The rest of this paper is organized to describe our research work precisely. In the next sections we discuss the theoretical definitions and other related terms; a brief description of our background study is included, and related works are stated clearly in the following sections. After the state of the art we define the problem description briefly so that it can be easily understood. In the last section we state our future work.

# II. Background Study

"Emotion" is an abstract term: an idea or expression that describes the current state or condition of the human mind or behavior. Emotion can be characterized by psychological and physiological expressions, biological reactions, and mental states. Emotion [10] is one of the fundamental characteristics of being human. It can be derived from human experiences and expressions to represent different emotional states such as joy, hate, anger, pride, excitement, sadness and so on. Current research in psychology and technology suggests a rather different view of the relationship among humans, computers and emotion. Emotion is no longer seen as limited to the occasional ebullition of rage when a computer program crashes for inexplicable reasons, excitement when a video game character overcomes an obstacle, or frustration when an incomprehensible error message appears on the screen. It is now understood that a wide range of emotions plays a vital role in every computer-related, goal-oriented activity: from using a drawing application or editing photos in a photo manipulation app, to browsing web pages and sending a message, to making an online purchase or playing games. The way a user carries out a task or uses an application is strongly influenced by his or her emotional state. Human affect in this sense is the main focus of Affective Computing [10], a major research area concerned with computing that relates to, arises from, or influences human emotion and other affective phenomena.

Emotion recognition refers to the identification of emotional states, and it can be performed in various contexts: facial expressions, gesture, posture, speech analysis, keystrokes and pressure can all be used. Several recognition techniques exist. First, there is emotion recognition from facial expressions, a vast area of psychological study over the last several decades. It draws on neuroscience, digital image processing and pattern recognition, and requires considerable analysis and processing of collected data to infer a probable emotion.
Facial expression patterns also differ across age groups (children, adults, seniors, and so on), and the approach is difficult to implement on small hardware or devices; hence the need to detect emotion in other, more convenient ways arises [11]. Second, there is vocal emotion recognition. The vocal aspect of a message or conversation carries significant variation and emotional information; if we disregard how a sentence was spoken, its meaning may change. From the input audio signal, pitch, intensity and pitch contours are estimated as acoustic features, classified using predefined rules, and mapped to a suitable emotional state. This model suffers some inaccuracy in the case of a neutral voice. Third, there is the newer multimodal emotion recognition. Voice, gesture, force feedback and similar channels can be used for detecting emotion instead of only keyboard and mouse; people sometimes utter a specific sentence with a particular facial expression to indicate an emotional state. The multimodal approach is much more dynamic, but it has constraints and remains an ongoing research area: multiple types of input have to be combined and finally fused with probabilistic models [12]. Fourth, there is emotion recognition using pressure-sensing keyboards and keystroke dynamics [13].

The context of our work is the touch screen smartphone. People use these devices as daily commodities, and emotion frequently changes while using such a device depending on the scenario. The input for these devices is touch, and the pressure applied to the screen as touch stimuli varies according to the mood or mental state of the user. The idea, then, is to capture the pressure data and, after further processing, use it as information for emotion recognition.

There are several types of built-in sensors in today's smartphones, some of which can be used to capture raw pressure-related data. Most smartphones include an accelerometer, a gyroscope, an ambient light sensor, a proximity sensor, GPS and others. These sensors are built into handsets using microelectromechanical systems (MEMS).

Accelerometer: This sensor allows the device to detect its orientation and adapt the displayed content accordingly; it also measures the acceleration of the device relative to free fall, and in practice it controls the switching between portrait and landscape modes. Mobile phone accelerometers are a form of three-axis MEMS and essentially consist of three sensors fixed at right angles to each other. Each of these sensors is capable of monitoring the smallest changes in force and pressure, as applied by gravity or movement, and creating a corresponding electrical impulse. Advances in this technology have allowed relatively cheap manufacture of increasingly accurate and sensitive accelerometers, which can be easily integrated into all manner of applications.
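As a minimal, hedged sketch of how these three axis values are read on Android (the class name `AccelReader` is our own illustrative choice, not an established API):

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: read the three orthogonal acceleration components (m/s^2).
public class AccelReader implements SensorEventListener {
    public static AccelReader attach(Context context) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        AccelReader reader = new AccelReader();
        sm.registerListener(reader, accel, SensorManager.SENSOR_DELAY_NORMAL);
        return reader;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0];  // left-right axis
        float ay = event.values[1];  // top-bottom axis
        float az = event.values[2];  // up-down axis (includes gravity, ~9.81 at rest)
        // A magnitude near 9.81 m/s^2 with low variance over a window
        // suggests the device is stationary rather than in motion.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```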
Gyroscope: A gyroscope is a device for measuring or maintaining orientation, based on the principles of angular momentum. This sensor is built into modern smartphones mainly for navigation systems and gesture recognition, and for finding the position and orientation of the device. Because gyroscopes are capable of measuring angular rates around one or more axes, they represent a fitting complement to MEMS accelerometers: thanks to the combination of accelerometer and gyroscope, it is possible to track and capture complete movements in three-dimensional space. According to Wikipedia, this device detects angular movements, namely rotation around the X-axis (pitch), the Y-axis (roll) and the Z-axis (yaw), as they are known in layman's terms. An accelerometer, on the other hand, detects only three linear axes of motion: left-right (X-axis), top-bottom (Y-axis) and up-down (Z-axis). Unlike a gyroscope, it measures translation and cannot detect whether the device has made a full spin or is experiencing inertial change. Combining the accelerometer and the gyroscope yields a total of six axes of motion sensing, enabling precise motion detection from natural movements of the phone. Mechanical and electronics engineering have since transformed the mechanical gyroscope into a MEMS device, also known as a vibrating structure gyroscope: instead of a spinning wheel inside the microchip, a vibrating mass is placed at the centre of the chip. The mass vibrates whenever an electrical signal passes through it; moving the phone changes the electrical signals picked up by the sensors, and the software interprets these signals to provide the necessary feedback to the user.

Proximity sensor: This sensor detects how close the screen of the phone is to the user's body, allowing the phone to sense when it has been brought close to the ear; it is used to avoid unwanted input.

Ambient light sensor: Mainly used to adjust the screen brightness automatically and to save power.

Magnetometer: This sensor measures the strength of the earth's magnetic field; the result is expressed in teslas (in practice, microteslas on smartphone APIs). Smartphones provide both raw magnetometer data and a compass bearing. It is used as a compass to rotate maps and graphics and to determine orientation, and also as a detector for magnets and other magnetic fields.

These are the main sensors typically found in modern smartphones. As the list shows, there is no dedicated pressure sensor built into an ordinary smartphone, hence the need to obtain pressure data from elsewhere, such as the touch screen of the device.
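Before turning to the touch screens themselves, a small hedged helper makes the motion-sensor axis conventions above concrete. This is our own illustrative code: roll and pitch follow from the gravity components reported by the accelerometer using the standard tilt formulas, while yaw is not observable from the accelerometer alone and would come from integrating the gyroscope's Z-axis angular rate.

```java
// Sketch: estimate roll and pitch (degrees) from the gravity components
// (ax, ay, az) reported by the accelerometer. Yaw cannot be derived from
// the accelerometer alone; it is usually obtained by integrating the
// gyroscope's Z-axis angular rate (rad/s) over time.
public final class TiltEstimator {
    private TiltEstimator() { }

    public static double rollDegrees(float ay, float az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }

    public static double pitchDegrees(float ax, float ay, float az) {
        return Math.toDegrees(Math.atan2(-ax, Math.sqrt(ay * ay + az * az)));
    }

    // One Euler-integration step for yaw from a single gyroscope reading.
    public static double integrateYaw(double yawDegrees, float gyroZRadPerSec, float dtSeconds) {
        return yawDegrees + Math.toDegrees(gyroZRadPerSec * dtSeconds);
    }
}
```

Simple integration of this kind accumulates the gyroscope bias and drift errors mentioned above, which is why fusion approaches such as [4] apply a drift and noise removal filter.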
There are many types of touch screens in smartphones, but the two most commonly used are resistive and capacitive. Resistive touch screens are used in lower-end smartphones and are by now almost outdated. They contain two layers of conductive material separated by a very small gap that acts as a resistance; when the screen is touched with a finger (or stylus), the two layers meet at the point of touch and complete a circuit, which is recognized by the phone's processor. Screens of this kind can be operated with a finger, a fingernail, a stylus or any other object. Capacitive touch screen technology consists of a layer of glass coated with a transparent conductor (such as indium tin oxide). When a capacitive screen is touched by a human finger, an interruption is created in the screen's electrostatic field (measurable as a change in capacitance); this is detected by the phone's processor, which in turn instructs the operating system to trigger the corresponding event or action.

Haptic/tactile touch screens are another technology, used by BlackBerry and Nokia and targeted at the enterprise market. This technology provides tactile feedback for a touch action on the screen, giving the user immediate and unmistakable confirmation. Haptic technology has been found to significantly improve user performance, accuracy and satisfaction while typing on a touch screen. The Retina display is another technology, used by Apple for the IPS LCD (with backlit LED) in the iPhone 4; they call it the Retina display because its pixels are too small to be individually identified by the human eye, making the display super sharp and brilliant. An enhancement of the capacitive touch screen is AMOLED (Active-Matrix Organic Light-Emitting Diode) and Super AMOLED. AMOLED displays are a type of OLED display for mobile devices and are rapidly gaining popularity in the top-end smartphone segment. Super AMOLED displays, developed by Samsung, are a further advance: the touch sensors are built into the display itself, rather than forming a separate touch-sensitive layer as in an ordinary capacitive touch screen.

# III. Related Works

Only a few research works have been carried out in the context of touch screen smartphones. An emotion sensing approach has been recommended along with a proposed affective entity scoring algorithm [1]. This algorithm maintains affective scoring vectors for the various entities on a mobile device: it keeps track of installed applications, multimedia contents and personal contacts, and calculates the difference between prior and posterior emotional states. Recommendations are then proposed based on the emotional state. The device derives the user's preferences from the variance in his emotional states and generates recommendations accordingly, such as a call log showing emoticons that express the current emotional state of the person to be called. Emotion detection is accomplished by collecting sensory data from the device and analyzing contextual information, for example emails and text messages; an affective entity score is then calculated for a target application. While mapping the score, the user's usage patterns and the time intervals between different applications are also considered. The OCC model (Ortony, Clore and Collins' computational emotion model) [2] provides a structure of 22 conditions that influence emotions, together with variables that affect their intensities. From the study of facial expressions of emotion, Ekman [3] defined six basic emotions, 'Joy', 'Anger', 'Fear', 'Disgust', 'Surprise' and 'Sadness', which have been widely used in psychology and robotics. Since today's smartphones contain several low-cost MEMS sensors, their combination can provide accurate measurement of the orientation of the device. An orientation estimation technique has been proposed that fuses different MEMS sensors of the smartphone [4]. Orientation can be determined by fusing the accelerometer and the magnetometer, but this is effective only when the device is stationary or not moving linearly, and it may also suffer from magnetic interference. Fusing the gyroscope with this approach produces a more accurate estimate of the device's orientation: the gyroscope responds quickly to changes in angle and does not suffer from interference, though it has bias and integration errors that can be overcome by applying a drift and noise removal filter. Successful orientation estimation enables the development of mobile games, navigation apps, augmented reality and other kinds of applications.
After the invention of pressure-sensing keyboards for desktop computers, several research works followed. One of them, by Hai-Rong Lv et al., concerned biologic verification based on these keyboards [5]; their subsequent paper took a further approach, recognizing emotion by analyzing the pressure sequences produced by keystrokes [6]. Three features, global features of the pressure sequences, dynamic time warping, and traditional keystroke dynamics, were combined using a classifier fusion technique; analyzing the emotion and age of users was left as future work.

The impact force on a musical instrument is crucial. On multi-touch devices such as smartphones, this impact force can be estimated using the built-in accelerometers, the pressure-sensing capability of the Android API, or external force sensing resistors (FSRs). Georg Essl et al. investigated these three approaches [7]. The accelerometer-based approach proved rather unsuitable for detecting the applied pressure, whereas the Android API showed some promise, and the FSR gave almost accurate and precise pressure values. Although the FSR offered some dynamic range, it was difficult to use in practice: the setup was very sensitive, calibrating it against the phone and a background surface was difficult, and a slight movement could make the sensor unresponsive or bias the measured pressure.

Mayank Goel et al. worked on detecting hand posture and pressure with the help of the accelerometer and gyroscope; their system also exploited the pressure applied to the screen while the vibration motor is pulsed [8]. Accuracy differed from case to case: the system differentiated device usage on a table versus in the hand with 99.7% accuracy, and when in hand it inferred the hand posture with 84.3% accuracy. Using the gyroscope they could differentiate among three levels of pressure, since the higher and lower frequencies generated by touch-induced vibrations increase with increasing pressure. A limitation relevant to our focus on pressure sensing is that they could detect only three levels of pressure, chosen mainly to keep the interaction easy for the user at an accepted level of accuracy. Although the pressure range differed among users and the system achieved high accuracy in practice, we need to detect the exact amount of pressure, not merely discrete levels, so that the mapping to emotion can be done successfully.

Tatsuya Shibata and Yohei Kijima argued that not only facial expressions but also different body parts and gestures play an affective role in nonverbal communication; they analyzed this and found relations between various body gestures and human emotions. In their paper they claim that body gestures have the same structure as facial expressions, which they measured using a pressure sensor on a chair (detecting the amount of pressure exerted on it) and several accelerometers tracking the positions of body parts such as the angles of the neck, legs and arms.
Three types of emotional status (arousal, pleasantness and dominance) were defined in their paper, but an important limitation was that they could not use effective sensors to detect arm and leg positions for estimating "dominance", nor unattached sensors for the body parts.

# IV. Research Challenges

In this section we focus on the challenges our research work faces in the mobile setting.

# a) Absence of a pressure sensor

Smartphones available in the market still lack a pressure sensor embedded underneath the screen. We should mention that a barometric pressure sensor is already available, but it measures the pressure of the atmosphere, not the tap pressure exerted on the screen of the device.

# b) Resource limitations

Mobile phones nowadays have very good processing capabilities, yet they are still not comparable with the powerful processors and hardware of desktop computers. Battery technology, by contrast, has not improved nearly as much, and the processor consumes most of the battery life. This becomes a major issue when working with many sensors at once.

# c) Limitations of the sensors

The sensors shipped with many mobile phones are not always accurate. For example, many smartphones have problematic gyroscopes and accelerometers that do not detect the rotation of the device correctly, and camera quality is often not up to the mark, so face recognition technologies sometimes fail. Accurate algorithms should therefore be designed to bring these biased raw sensor values under control.

# d) Possibility of biased values

There is also a chance of biased results. For example, when someone is stressed he may sleep, lie back in a chair, or keep working as if doing extra hours at the office. These examples show how people can hide their actual emotional state.

# e) Privacy concerns

Heavy use of sensors may seriously violate the privacy of users. Camera photos, microphone audio, accelerometer data that reveals the user's physical state, and GPS data that reveals the user's position can all be sensitive.

# V. Problem Description

While using a touch screen phone, the user generally does not need any physical button to perform a task. The user interface is built so that the user touches specific portions of the screen to complete different actions; for example, while typing a message a keyboard is shown on the screen and the user touches the desired letters. While using the phone, changes in emotional state will affect the user's interaction with the device. Normally a larger touched area also corresponds to a larger pressure value, and vice versa. Likewise, out of anger the user may shake the phone harder than usual, and may remain steady and calm in sadness, so the gyroscope signals will vary according to his emotional state, while the accelerometer will register the continuous changes in the device's orientation. Moreover, the interaction log will contain the type of usage, the type of application, the user's tasks, the time of usage and other necessary information. The challenge is to synchronize all these data, correlate them with the user's interaction log, and normalize the values into a specific range. The resulting data set can then be mapped to the user's different emotional states.
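As a hedged illustration of the normalization step just described (the choice of windowing and per-channel scaling is our own assumption, not a fixed design), each sensor stream could be min-max scaled into a common [0, 1] range so that pressure, gyroscope and accelerometer features become comparable before the mapping stage:

```java
import java.util.Arrays;

// Sketch: min-max normalize one channel of windowed sensor readings into
// [0, 1] so heterogeneous channels (pressure, gyro, accel) are comparable.
public final class FeatureNormalizer {
    private FeatureNormalizer() { }

    public static float[] minMax(float[] window) {
        float min = Float.MAX_VALUE, max = -Float.MAX_VALUE;
        for (float v : window) {
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
        float[] out = new float[window.length];
        if (max == min) {               // constant channel: map to zero
            Arrays.fill(out, 0f);
            return out;
        }
        for (int i = 0; i < window.length; i++) {
            out[i] = (window[i] - min) / (max - min);
        }
        return out;
    }
}
```

Simple summary statistics (mean, variance) of each normalized window could then be concatenated with interaction log attributes to form the per-window feature vector.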
# VI. Proposed Approach

As stated, we are going to build our system for touch screen smartphones. The leading smartphone operating systems are iOS (on Apple products such as the iPhone and iPad), Android (an open source project of Google), Windows Phone and BlackBerry OS. We decided to build our system on Android because (1) Android is an open source project of Google, so resources are quite easy to obtain and manage, and (2) as an open source project it powers most of the leading devices; leading manufacturers such as Samsung, HTC, Motorola and Sony build their phones on this OS, which will make it easier to test our dataset.

We have considered three approaches to the problem described above. In the first, data will be collected from the touch input and the pressure applied to the screen. We assume that in different emotional states the user will apply different amounts of pressure, so we will later map emotional states to pressure; here we use the approximated tap pressure values reported by the touch screen. In the second approach we will work with an external sensor, a Force Sensing Resistor (FSR). The need for an external sensor is obvious, since current phones have no built-in pressure sensor, and to collect more accurate and precise pressure data we focus on this sensor as well. Attached externally to the phone, it will provide accurate tap pressure values for the touch screen: when the user is stressed or angry he will exert more pressure on the device, the applied pressure will be passed to the FSR, and we will obtain readings from it that exhibit significant variation across the emotional states we have assumed. In the third approach, we will collect readings from the built-in sensors, such as the accelerometer and gyroscope, and log them to files. This will give us a concrete picture of how these sensor values vary as the orientation or movement of the device changes in different emotional states, and we will use this data for recognizing the user's emotional state. Additionally, we will collect the interaction log, which tells us what type of applications the user uses and how much time he spends on each; these interaction log data will be analyzed together with the data collected from the sensors.
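A minimal, hedged sketch of the background sensing component implied by the third approach and by the battery-saving requirement stated in the introduction: sampling runs only while the screen is on, i.e., while the phone is plausibly in use. The class name `SensingService` and the sampling rate are illustrative assumptions; the actual logging and feature extraction are omitted.

```java
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.IBinder;

// Sketch: a background service that samples motion sensors only while
// the screen is on, pausing otherwise to conserve battery.
public class SensingService extends Service implements SensorEventListener {
    private SensorManager sensorManager;

    private final BroadcastReceiver screenReceiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
                startSampling();                                    // phone back in use
            } else if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
                sensorManager.unregisterListener(SensingService.this); // pause while idle
            }
        }
    };

    @Override
    public void onCreate() {
        super.onCreate();
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // Screen on/off broadcasts must be registered at runtime, not in the manifest.
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
        filter.addAction(Intent.ACTION_SCREEN_OFF);
        registerReceiver(screenReceiver, filter);
        startSampling();
    }

    private void startSampling() {
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Append event.timestamp and event.values to a local log file here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    @Override
    public void onDestroy() {
        unregisterReceiver(screenReceiver);
        sensorManager.unregisterListener(this);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```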
# VII. Conclusion and Future Works

The recognition of human affection is one of the most challenging research areas, and emotion-based user interfaces and service recommendations are gaining popularity. Even though emotion-aware applications could be developed for the desktop or web platform, we have chosen the mobile platform for its flexibility, mobility and frequency of use. Because these devices have various sensors integrated into their hardware, there is little need for additional equipment, which encourages the study of human affect on mobile devices, particularly smartphones, even more. There are various challenges, such as linking the sensor values to the user's emotional states and collecting user interaction log data, and our work proposes a solution to overcome these problems as well. In future we will implement software for a mobile operating system (e.g. Android or iOS) that runs as a background service collecting data from the built-in sensors, alongside another application that records the size of the touched area. The software will save the data locally, and we will then extract features so that the data can be mapped to different emotional states. We will also collect data from the external FSR attached to the smartphone to obtain exact pressure values. Finally, we will apply machine learning approaches to the selected dataset and train our system to detect human affection.

# References

1. Hyun-Jun Kim and Young Sang Choi, "EmoSens: Affective Entity Scoring, A Novel Service Recommendation Framework for Mobile Platform", Chicago, Illinois, USA, October 23-27, 2011.
2. A. Ortony, G. Clore and A. Collins, "The Cognitive Structure of Emotions", Cambridge University Press, Cambridge, 1988.
3. P. Ekman and R. Davidson, "The Nature of Emotion: Fundamental Questions", Oxford University Press, 1994.
4. Shahid Ayub, Alireza Bahraminisaab and Bahram Honary, "A Sensor Fusion Method for Smart Phone Orientation Estimation", PGNet, 2012.
5. H.-R. Lv et al., "Biologic verification based on pressure sensor keyboards and classifier fusion techniques", IEEE Transactions on Consumer Electronics, 52(3), 2006.
6. Lv Hai-Rong, Lin Zhong-Lin, Yin Wen-Jun and Jin Dong, "Emotion Recognition Based on Pressure Sensor Keyboards", IEEE, 2008.
7. Georg Essl, Michael Rohs and Sven Kratz, "Use the Force (or something) - Pressure and Pressure-Like Input for Mobile Music Performance", NIME 2010, Sydney, Australia, June 15-18, 2010.
8. Mayank Goel, Jacob O. Wobbrock and Shwetak N. Patel, "GripSense: Using Built-In Sensors to Detect Hand Posture and Pressure on Commodity Mobile Phones", UIST '12, Cambridge, Massachusetts, USA, October 7-10, 2012.
9. Shumei Zhang, Paul McCullagh, Chris Nugent and Huiru Zheng, "Activity Monitoring Using a Smart Phone's Accelerometer with Hierarchical Classification".
10. Scott Brave and Clifford Nass, "Emotion in Human Computer Interaction", Stanford University.
11. Ralph Adolphs, "Recognizing Emotion From Facial Expressions: Psychological and Neurological Mechanisms", Behavioral and Cognitive Neuroscience Reviews, 1, University of Iowa College of Medicine, March 2002.
12. Nicu Sebe, Ira Cohen, Theo Gevers and Thomas S. Huang, "Multimodal Approaches for Emotion Recognition: A Survey".
13. Clayton Epp, Michael Lippold and Regan L. Mandryk, "Identifying Emotional States using Keystroke Dynamics", CHI 2011, Vancouver, BC, Canada, May 7-12, 2011.
14. A. Sears, M. Lin, J. Jacko and Y. Xiao, "When Computers Fade: Pervasive Computing and Situationally-Induced Impairments and Disabilities", HCI International, vol. 2, 2003.
15. S. Azenkot and S. Zhai, "Touch Behavior with Different Postures on Soft Smartphone Keyboards", Proc. MobileHCI 2012, 2012.