The IIITM Face Emotion dataset is derived from the IIITM Face dataset. It consists of 1928 images collected from 107 participants (87 male, 20 female). The images were recorded in three vertical orientations (Front, Up, and Down) while the participants exhibited six facial expressions (Smile, Surprise, Surprise with Mouth Open, Neutral, Sad, and Yawning). The original IIITM Face dataset also provides annotations for other attributes, such as gender, presence of a mustache, beard, or eyeglasses, clothing worn by the subjects, and hair density. For the IIITM Face Emotion dataset, the original images were modified to study facial expressions across orientations: only the facial region is retained for each subject, and all images are resized to fixed dimensions (800 x 1000 pixels, an aspect ratio of 4:5). This ensures a uniform scale across the varying face positions of the same subject.

Each image is named according to the pattern 'SUBXXEEO', where:


XX -> denotes the subject ID

EE -> denotes the expressed emotion (Smile -> SM, Surprise -> SU, Surprise with mouth open -> SO, Neutral -> NE, Sad -> SA, Yawning -> YN)

O  -> denotes the orientation (Front -> F, Down -> D, Up -> U)

For example, 'SUB1NEF' denotes subject 1 depicting the Neutral emotion in the Front orientation.
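The naming scheme above can be decoded programmatically. Below is a minimal sketch of such a parser; the function and dictionary names are illustrative assumptions, not part of the dataset itself.

```python
import re

# Code-to-label mappings taken from the naming scheme described above.
EMOTIONS = {
    "SM": "Smile",
    "SU": "Surprise",
    "SO": "Surprise with mouth open",
    "NE": "Neutral",
    "SA": "Sad",
    "YN": "Yawning",
}
ORIENTATIONS = {"F": "Front", "D": "Down", "U": "Up"}

# 'SUB' + subject ID (digits) + two-letter emotion code + one-letter orientation.
PATTERN = re.compile(r"^SUB(\d+)(SM|SU|SO|NE|SA|YN)([FDU])$")

def parse_name(stem):
    """Parse a filename stem like 'SUB1NEF' into (subject_id, emotion, orientation)."""
    m = PATTERN.match(stem)
    if m is None:
        raise ValueError(f"Unrecognized filename: {stem!r}")
    return int(m.group(1)), EMOTIONS[m.group(2)], ORIENTATIONS[m.group(3)]

print(parse_name("SUB1NEF"))  # (1, 'Neutral', 'Front')
```

Such a helper makes it straightforward to group images by subject, emotion, or orientation when loading the dataset.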

When using the dataset, please cite the following papers:

[1] U. Sharma, K. N. Faisal, R. R. Sharma, and K. V. Arya, "Facial Landmark-Based Human Emotion Recognition Technique for Oriented Viewpoints in the Presence of Facial Attributes," SN Comput. Sci., vol. 4, no. 3, p. 273, 2023.

[2] K. V. Arya, S. Verma, R. K. Gupta, S. Agarwal, and P. Gupta, "IIITM Face: A Database for Facial Attribute Detection in Constrained and Simulated Unconstrained Environments," in Proceedings of the 7th ACM IKDD CoDS and 25th COMAD, 2020, pp. 185-189.