Facial expression databases


A facial expression database is a collection of images or video clips annotated with facial expressions covering a range of emotions. Well-annotated (emotion-tagged) media content of facial behavior is essential for training, testing, and validating algorithms for expression recognition systems. Emotion annotation can take the form of discrete emotion labels or values on a continuous scale. Most databases follow the basic emotions theory (Paul Ekman and Armindo Freitas-Magalhães), which assumes the existence of six discrete basic emotions (anger, fear, disgust, surprise, joy, sadness). However, some databases instead tag emotions on a continuous arousal-valence scale, and some include Action Unit (AU) activations based on the Facial Action Coding System (FACS).[1]
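The difference between discrete labels, continuous arousal-valence values, and AU activations can be sketched as a small data model. This is a hypothetical illustration; the class and field names below are not taken from any particular database.

```python
from dataclasses import dataclass
from typing import Optional

# The six discrete basic emotions assumed by most labeled databases.
BASIC_EMOTIONS = {"anger", "fear", "disgust", "surprise", "joy", "sadness"}

@dataclass
class Annotation:
    """One annotated face image or video frame (hypothetical schema)."""
    emotion: Optional[str] = None        # discrete label, e.g. "joy"
    valence: Optional[float] = None      # continuous scale, e.g. -1.0 .. 1.0
    arousal: Optional[float] = None      # continuous scale, e.g. -1.0 .. 1.0
    action_units: Optional[dict] = None  # FACS AU id -> intensity, e.g. {12: 0.8}

    def is_discrete(self) -> bool:
        """True if the sample carries one of the six basic-emotion labels."""
        return self.emotion in BASIC_EMOTIONS

# A discretely labeled sample (CK+-style) versus a continuously
# labeled one (AffectNet-style, valence/arousal only).
discrete = Annotation(emotion="joy")
continuous = Annotation(valence=0.7, arousal=0.4)
print(discrete.is_discrete(), continuous.is_discrete())  # True False
```

A single database may populate several of these fields at once; AffectNet, for example, provides both discrete labels and valence-arousal values.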

In posed expression databases, the participants are asked to display different basic emotional expressions, while in spontaneous expression databases, the expressions are natural. Spontaneous expressions differ markedly from posed ones in intensity, configuration, and duration. Moreover, the synthesis of some AUs is barely achievable without undergoing the associated emotional state. Therefore, in most cases, posed expressions are exaggerated, while spontaneous ones are subtle and differ in appearance.

Many publicly available databases are categorized here.[2][3] Details of some facial expression databases are given below.

| Database | Facial expressions | Subjects | Images/videos | Gray/Color | Resolution, frame rate | Ground truth | Type |
|---|---|---|---|---|---|---|---|
| Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) [4] | Speech: calm, happy, sad, angry, fearful, surprise, disgust, and neutral. Song: calm, happy, sad, angry, fearful, and neutral. Each expression at two levels of emotional intensity | 24 | 7356 video and audio files | Color | 1280×720 (720p) | Facial expression labels; ratings provided by 319 human raters | — |
| F-M FACS 3.0 (EDU, PRO & XYZ versions) [5] | Neutral, sadness, surprise, happiness, fear, anger, contempt, and disgust; features 8 pioneering Action Units (AUs) and 22 pioneering Tongue Movements (TMs), in addition to functional and structural nomenclature; 3D technology and automatic, real-time recognition | 10 | 4877 videos and image sequences | Color | 3D, 4K | Facial expression labels and AU intensity for each video frame | Posed and spontaneous |
| Extended Cohn-Kanade Dataset (CK+) [6] | Neutral, sadness, surprise, happiness, fear, anger, contempt, and disgust | 123 | 593 image sequences (327 with discrete emotion labels) | Mostly gray | 640×490 | Facial expression labels and FACS AU labels for the final frame of each image sequence | Posed; spontaneous smiles |
| Japanese Female Facial Expressions (JAFFE) [7] | Neutral, sadness, surprise, happiness, fear, anger, and disgust | 10 | 213 static images | Gray | 256×256 | Facial expression labels | Posed |
| MMI Database [8] | — | 43 | 1280 videos and over 250 images | Color | 720×576 | AU labels for the apex frame of each image sequence | Posed and spontaneous |
| Belfast Database [9] | Set 1: disgust, fear, amusement, frustration, surprise; Set 2: adds anger and sadness; Set 3: disgust, fear, amusement | Set 1: 114; Set 2: 82; Set 3: 60 | Set 1: 570 video clips; Set 2: 650 video clips; Set 3: 180 video clips | Color | Sets 1–2: 720×576; Set 3: 1920×1080 | — | Natural emotion |
| DISFA [10] | — | 27 | 4,845 video frames | Color | 1024×768, 20 fps | AU intensity for each video frame (12 AUs) | Spontaneous |
| Multimedia Understanding Group (MUG) [11] | Neutral, sadness, surprise, happiness, fear, anger, and disgust | 86 | 1462 sequences | Color | 896×896, 19 fps | Emotion labels | Posed |
| Indian Spontaneous Expression Database (ISED) [12] | Sadness, surprise, happiness, and disgust | 50 | 428 videos | Color | 1920×1080, 50 fps | Emotion labels | Spontaneous |
| Radboud Faces Database (RaFD) [13] | Neutral, sadness, contempt, surprise, happiness, fear, anger, and disgust | 67 | Three gaze directions × five camera angles (8×67×3×5 = 8040 images) | Color | 681×1024 | Emotion labels | Posed |
| Oulu-CASIA NIR-VIS database | Surprise, happiness, sadness, anger, fear, and disgust | 80 | Three illumination conditions (normal, weak, dark); 2880 video sequences in total | Color | 320×240 | — | Posed |
| FERG-DB (Facial Expression Research Group Database), for stylized characters [14] | Angry, disgust, fear, joy, neutral, sad, surprise | 6 | 55,767 images | Color | 768×768 | Emotion labels | Frontal pose |
| AffectNet [15] | Neutral, happy, sad, surprise, fear, disgust, anger, contempt | — | ~450,000 manually annotated and ~500,000 automatically annotated images | Color | Various | Emotion labels, valence, arousal | Wild setting |
| IMPA-FACE3D [16] | Neutral frontal, joy, sadness, surprise, anger, disgust, fear, opened, closed, kiss, left side, right side, neutral sagittal left, neutral sagittal right, nape and forehead (sometimes acquired) | 38 | 534 static images | Color | 640×480 | Emotion labels | Posed |
| FEI Face Database | Neutral, smile | 200 | 2800 static images | Color | 640×480 | Emotion labels | Posed |
| Aff-Wild [1][17][18] | — | 200 | ~1,250,000 manually annotated frames | Color | Various (average 640×360) | Valence, arousal | In-the-wild setting |
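Databases expose their annotations in different ways; RAVDESS, for example, encodes its labels directly in the file names. The sketch below assumes the seven-field naming convention described in the RAVDESS documentation (modality-channel-emotion-intensity-statement-repetition-actor); verify the convention against the dataset release before relying on it.

```python
# Hypothetical helper: recover the emotion label from a RAVDESS file name.
# Emotion codes follow the convention described in the RAVDESS documentation.
RAVDESS_EMOTIONS = {
    "01": "neutral", "02": "calm", "03": "happy", "04": "sad",
    "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised",
}

def ravdess_emotion(filename: str) -> str:
    """Return the emotion label encoded in the third field of a file name."""
    stem = filename.rsplit(".", 1)[0]          # drop the extension
    fields = stem.split("-")                   # seven two-digit fields
    if len(fields) != 7:
        raise ValueError(f"unexpected RAVDESS file name: {filename}")
    return RAVDESS_EMOTIONS[fields[2]]

print(ravdess_emotion("03-01-06-01-02-01-12.mp4"))  # fearful
```

Other databases instead ship annotations as separate files (e.g. per-frame AU intensity files in DISFA), so a loader of this kind is dataset-specific.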


  1. ^ Freitas-Magalhães, A. (2018). Facial Action Coding System 3.0: Manual of Scientific Codification of the Human Face (English edition). Porto: FEELab Science Books. ISBN 978-989-8766-89-2
  2. ^ "collection of emotional databases". Archived from the original on 2018-03-25.
  3. ^ "facial expression databases".
  4. ^ Livingstone & Russo (2018). The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. doi:10.1371/journal.pone.0196391
  5. ^ Freitas-Magalhães, A. (2018). Facial Action Coding System 3.0: Manual of Scientific Codification of the Human Face (English edition). Porto: FEELab Science Books. ISBN 978-989-8766-89-2
  6. ^ P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar and I. Matthews, "The Extended Cohn-Kanade Dataset (CK+): A complete facial expression dataset for action unit and emotion-specified expression," in 3rd IEEE Workshop on CVPR for Human Communicative Behavior Analysis, 2010
  7. ^ M. J. Lyons, M. Kamachi and J. Gyoba, "Japanese Female Facial Expressions (JAFFE)," Database of digital images, 1997
  8. ^ M. Valstar and M. Pantic, "Induced disgust, happiness and surprise: an addition to the MMI facial expression database," in Proc. Int. Conf. Language Resources and Evaluation, 2010
  9. ^ I. Sneddon, M. McRorie, G. McKeown and J. Hanratty, "The Belfast induced natural emotion database," IEEE Trans. Affective Computing, vol. 3, no. 1, pp. 32-41, 2012
  10. ^ S. M. Mavadati, M. H. Mahoor, K. Bartlett, P. Trinh and J. Cohn., "DISFA: A Spontaneous Facial Action Intensity Database," IEEE Trans. Affective Computing, vol. 4, no. 2, pp. 151–160, 2013
  11. ^ N. Aifanti, C. Papachristou and A. Delopoulos, The MUG Facial Expression Database, in Proc. 11th Int. Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS), Desenzano, Italy, April 12–14, 2010.
  12. ^ S L Happy, P. Patnaik, A. Routray, and R. Guha,  “The Indian Spontaneous Expression Database for  Emotion Recognition,” in IEEE Transactions on Affective Computing,  2016, doi:10.1109/TAFFC.2015.2498174.
  13. ^ Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H.J., Hawk, S.T., & van Knippenberg, A. (2010). Presentation and validation of the Radboud Faces Database. Cognition & Emotion, 24(8), 1377–1388. doi:10.1080/02699930903485076
  14. ^ "Facial Expression Research Group Database (FERG-DB)". grail.cs.washington.edu. Retrieved 2016-12-06.
  15. ^ Mollahosseini, A.; Hasani, B.; Mahoor, M. H. (2017). "AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild". IEEE Transactions on Affective Computing. PP (99): 18–31. arXiv:1708.03985. doi:10.1109/TAFFC.2017.2740923. ISSN 1949-3045.
  16. ^ "IMPA-FACE3D Technical Reports". visgraf.impa.br. Retrieved 2018-03-08.
  17. ^ Zafeiriou, S.; Kollias, D.; Nicolaou, M.A.; Papaioannou, A.; Zhao, G.; Kotsia, I. (2017). "Aff-Wild: Valence and Arousal in-the-wild Challenge" (PDF). Computer Vision and Pattern Recognition Workshops (CVPRW), 2017.
  18. ^ Kollias, D.; Tzirakis, P.; Nicolaou, M.A.; Papaioannou, A.; Zhao, G.; Schuller, B.; Kotsia, I.; Zafeiriou, S. (2019). "Deep Affect Prediction in-the-wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond". International Journal of Computer Vision (IJCV), 2019.