Abstract:
Generally, human beings reveal various types of emotions through different facial expressions. However, even for the same person, the same emotion can be exhibited through various facial expressions depending on the surrounding environment. Obviously, across different persons, the variation in facial expression for the same emotion is expected to be much higher. Hence, automatic emotion recognition from a given facial expression is a very challenging task. In order to convey an emotion, different regions of a face may exhibit changes in pattern with respect to the neutral face (no emotion). These changes are sometimes very prominent and easily recognized by the naked eye, but in many cases they are not clearly noticeable and occur at the level of pixel intensities. In the proposed schemes, first, various dynamic regions of a face are identified, where significant changes are expected during emotion elicitation with respect to other parts of the facial image. Next, each pixel in those dynamic regions is taken into consideration to form a feature vector from the given facial image. Since entropy is a measure of the information content of a process, an entropy function is proposed as a feature to capture the pixel-intensity variation caused by changes in facial expression within a region. The main idea is to capture the variation in the intensity distribution along the vertical direction of the image, for example as entropy values extracted from various overlapping rows of pixels. Apart from entropy, different statistical measures are also investigated as features. Moreover, a similar feature extraction process is also tested on transformed images, considering the Fourier transform, discrete cosine transform, wavelet transform, and Hilbert transform. Finally, considering ease of computation, binary-level logical operations, such as logical AND, OR, and XOR between pixels, are also performed to extract the feature vector.
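The row-wise entropy feature described above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the band height, step, and histogram bin count are assumed values chosen only for the example.

```python
import numpy as np

def row_entropy_features(img, win=8, step=4, bins=32):
    """Build a 1D feature vector from a 2D grayscale image by sliding a
    horizontal band of `win` overlapping rows down the image and computing
    the Shannon entropy of the pixel-intensity histogram in each band.
    (win, step, and bins are illustrative assumptions, not paper values.)"""
    feats = []
    for top in range(0, img.shape[0] - win + 1, step):
        band = img[top:top + win, :].ravel()
        hist, _ = np.histogram(band, bins=bins, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]                           # drop empty bins
        feats.append(-np.sum(p * np.log2(p)))  # Shannon entropy in bits
    return np.array(feats)

# Toy example: a synthetic 64x64 facial region
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
f = row_entropy_features(img)
print(f.shape)  # one entropy value per overlapping row band
```

The same loop structure applies unchanged to a transformed image (e.g. the DCT or wavelet coefficients of the region), or with the entropy replaced by another statistic such as the variance of each band.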
One advantage of the proposed method is the generation of a 1D feature vector from the 2D facial image. As a result, a differential 1D feature vector with respect to the neutral-face feature can be computed, which is expected to contain better discriminative characteristics for different emotions. Moreover, a simple sliding technique operating on the proposed 1D feature is presented to obtain a recognition system invariant to vertical and horizontal shifts. For classification, a distance-based classifier is used, and two widely used, challenging facial emotion databases, JAFFE and CK+, are tested. The proposed method offers very satisfactory recognition performance with respect to person, pose, illumination, and shift variation.
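The differential feature and the sliding, distance-based matching can be sketched as below. This is a hedged illustration under assumptions: the shift range `max_shift`, the use of Euclidean distance, and the nearest-template decision rule are example choices, not details confirmed by the abstract.

```python
import numpy as np

def differential_feature(expr_feat, neutral_feat):
    """Differential 1D feature: expression feature minus neutral-face feature."""
    return expr_feat - neutral_feat

def sliding_distance(query, template, max_shift=3):
    """Minimum Euclidean distance over small circular shifts of the 1D
    feature, giving a degree of shift invariance in the matching step.
    (max_shift=3 is an assumed value for illustration.)"""
    best = np.inf
    for s in range(-max_shift, max_shift + 1):
        best = min(best, float(np.linalg.norm(np.roll(query, s) - template)))
    return best

def classify(query, templates):
    """Minimum-distance classification against per-emotion template features."""
    return min(templates, key=lambda label: sliding_distance(query, templates[label]))

# Toy usage: a query feature that is a shifted copy of the 'happy' template
templates = {'happy': np.array([1., 0., 2., 0., 3.]),
             'sad':   np.array([0., 5., 0., 5., 0.])}
query = np.roll(templates['happy'], 2)
print(classify(query, templates))  # 'happy'
```

Because the feature is 1D, a shift of the face along the scan direction appears as a shift of the feature vector, which the sliding comparison absorbs at negligible cost.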