EmojiGrid
The EmojiGrid is an affective self-report tool consisting of a rectangular grid that is labelled with emojis. The facial expressions of the emoji labels vary from disliking via neutral to liking along the x-axis, and gradually increase in intensity along the y-axis. To report their affective appraisal of a given stimulus, users mark the location inside the grid that best represents their impression. The EmojiGrid can be used either as a paper-based or a computer-based response tool. The images needed to implement the EmojiGrid are freely available from the OSF repository.
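Scoring a marked location amounts to a linear rescaling of its coordinates along the two axes. The following sketch illustrates this under assumptions made only for this example (pixel coordinates as input, a 1–9 output range, screen coordinates with the origin at the top-left of the grid); it is not part of the official EmojiGrid materials.

```typescript
// Minimal sketch (not from the EmojiGrid materials): convert a marked point inside
// the grid into valence and arousal values. Axis orientation follows the description
// above: x runs from disliking (left) to liking (right), y from low to high intensity.
// The 1–9 output range is an illustrative assumption, not a prescribed scale.

interface AffectRating {
  valence: number; // 1 (disliking) .. 9 (liking) under the assumed range
  arousal: number; // 1 (calm) .. 9 (intense) under the assumed range
}

function gridPointToAffect(
  x: number,          // horizontal position of the mark in pixels, 0 at the grid's left edge
  y: number,          // vertical position of the mark in pixels, 0 at the grid's top edge
  gridWidth: number,  // width of the grid area in pixels
  gridHeight: number, // height of the grid area in pixels
): AffectRating {
  // Normalize to 0..1 and invert y, because screen coordinates grow downward
  // while intensity (arousal) increases toward the top of the grid.
  const nx = Math.min(Math.max(x / gridWidth, 0), 1);
  const ny = 1 - Math.min(Math.max(y / gridHeight, 0), 1);
  return { valence: 1 + 8 * nx, arousal: 1 + 8 * ny };
}

// A mark in the exact centre of the grid corresponds to a neutral, mid-intensity
// rating of (5, 5) under the assumed 1–9 scale.
console.log(gridPointToAffect(200, 150, 400, 300)); // { valence: 5, arousal: 5 }
```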
Applications
The EmojiGrid was inspired by Russell’s Affect Grid[1] and was originally developed and validated for the affective appraisal of food stimuli,[2] since conventional affective self-report tools (e.g., the Self-Assessment Manikin[3]) are frequently misunderstood in that context.[2][4] It has since been used and validated for the affective appraisal of a wide range of stimuli, such as images,[5][6] audio and video clips,[7] 360 VR videos,[8] touch events,[9] food,[10] and odors.[11][12][13] It has also been used for the affective analysis of architectural spaces,[14] to assess the affective experience of trail racing,[15] and to assess the emotional face evaluation capability of people with early dementia.[16] Since it is intuitive and language-independent, the EmojiGrid is also suitable for cross-cultural research.[4][17]
Implementation
In a computer-based response paradigm, only the image area inside the horizontal and vertical grid borders should be responsive (clickable), so that users can report their affective response by pointing and/or clicking inside the grid. In practice, this may be achieved by superimposing (1) a clickable image of the unlabeled grid area on top of (2) a larger image showing the grid area together with the emoji labels. The images needed to implement the EmojiGrid are freely available from the OSF repository. An implementation of the EmojiGrid rating task in the Gorilla experiment builder is freely available from the Gorilla Open Materials platform.
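As an illustration of the overlay approach described above, the sketch below attaches a click handler to a transparent element that covers only the inner grid area of the labelled image and converts each click to a rating. The element ID, the callback, and the 1–9 scale are assumptions made for this example rather than part of the freely available materials.

```typescript
// Illustrative sketch only: one way to realize the described layout in a web page,
// with a transparent, clickable element positioned (e.g., via CSS absolute
// positioning) over the inner grid area of the larger labelled image.
// The element ID "emojigrid-area" and the 1–9 scale are assumptions for this example.

function setUpEmojiGrid(
  onRating: (valence: number, arousal: number) => void,
): void {
  // Only this overlay is responsive; the surrounding emoji labels are not clickable.
  const area = document.getElementById('emojigrid-area') as HTMLDivElement;

  area.addEventListener('click', (event: MouseEvent) => {
    const rect = area.getBoundingClientRect();
    const nx = (event.clientX - rect.left) / rect.width;     // 0 = disliking, 1 = liking
    const ny = 1 - (event.clientY - rect.top) / rect.height; // 0 = calm, 1 = intense
    onRating(1 + 8 * nx, 1 + 8 * ny);                        // map to an assumed 1–9 scale
  });
}

// Usage: log each rating; an experiment would typically store it with the stimulus ID.
setUpEmojiGrid((valence, arousal) =>
  console.log(`valence=${valence.toFixed(2)}, arousal=${arousal.toFixed(2)}`),
);
```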
See also
- Affect measures
- Emotion classification
- Self-report inventory
- PAD emotional state model
- Valence (psychology)
- Arousal
Further reading
- P. Kuppens, F. Tuerlinckx, J. A. Russell et al., “The relation between valence and arousal in subjective experience”, Psychological Bulletin, 139(4), 917-940 (2013). doi: 10.1037/a0030811
- A. M. Mattek, G. L. Wolford, and P. J. Whalen, “A mathematical model captures the structure of subjective affect”, Perspectives on Psychological Science, 12(3), 508-526 (2017). doi: 10.1177/1745691616685863
- E. Van der Burg, A. Toet, Z. Abbasi et al., “Sequential dependency for affective appraisal of food images”, Humanities and Social Sciences Communications, 8(1), paper nr. 228 (2021). doi: 10.1057/s41599-021-00909-4
- E. Van der Burg, A. Toet, A.-M. Brouwer et al., “Serial dependence of emotion within and between stimulus sensory modalities”, Multisensory Research, 1-22 (2021). doi: 10.1163/22134808-bja10064
References
- ^ Russell, James A.; Weiss, Anna; Mendelsohn, Gerald A. (1989). "Affect Grid: A single-item scale of pleasure and arousal". Journal of Personality and Social Psychology. 57 (3): 493–502. doi:10.1037/0022-3514.57.3.493. ISSN 1939-1315.
- ^ a b Toet, Alexander; Kaneko, Daisuke; Ushiama, Shota; Hoving, Sofie; de Kruijf, Inge; Brouwer, Anne-Marie; Kallen, Victor; van Erp, Jan B. F. (2018). "EmojiGrid: A 2D Pictorial Scale for the Assessment of Food Elicited Emotions". Frontiers in Psychology. 9: 2396. doi:10.3389/fpsyg.2018.02396. ISSN 1664-1078. PMC 6279862. PMID 30546339.
- ^ Bradley, Margaret M.; Lang, Peter J. (1994). "Measuring emotion: The self-assessment manikin and the semantic differential". Journal of Behavior Therapy and Experimental Psychiatry. 25 (1): 49–59. doi:10.1016/0005-7916(94)90063-9. PMID 7962581.
- ^ a b Kaneko, Daisuke; Toet, Alexander; Ushiama, Shota; Brouwer, Anne-Marie; Kallen, Victor; van Erp, Jan B.F. (2019). "EmojiGrid: A 2D pictorial scale for cross-cultural emotion assessment of negatively and positively valenced food". Food Research International. 115: 541–551. doi:10.1016/j.foodres.2018.09.049. PMID 30599977. S2CID 58653600.
- ^ Toet; van Erp (2019). "The EmojiGrid as a Tool to Assess Experienced and Perceived Emotions". Psych. 1 (1): 469–481. doi:10.3390/psych1010036. ISSN 2624-8611.
- ^ Brouwer, Anne-Marie; van Beers, Jasper J.; Sabu, Priya; Stuldreher, Ivo V.; Zech, Hilmar G.; Kaneko, Daisuke (2021-06-22). "Measuring Implicit Approach–Avoidance Tendencies towards Food Using a Mobile Phone outside the Lab". Foods. 10 (7): 1440. doi:10.3390/foods10071440. ISSN 2304-8158. PMC 8305314. PMID 34206278.
- ^ Toet, Alexander; van Erp, Jan B. F. (2020). "Affective rating of audio and video clips using the EmojiGrid". F1000Research. 9: 970. doi:10.12688/f1000research.25088.1. ISSN 2046-1402. PMC 8080979. PMID 33968373.
- ^ Toet, Alexander; Heijn, Fabienne; Brouwer, Anne-Marie; Mioch, Tina; van Erp, Jan B. F. (2019), Bourdot, Patrick; Interrante, Victoria; Nedel, Luciana; Magnenat-Thalmann, Nadia (eds.), "The EmojiGrid as an Immersive Self-report Tool for the Affective Assessment of 360 VR Videos", Virtual Reality and Augmented Reality, vol. 11883, Cham: Springer International Publishing, pp. 330–335, doi:10.1007/978-3-030-31908-3_24, ISBN 978-3-030-31907-6, S2CID 203847617, retrieved 2021-11-28
- ^ Toet, Alexander; van Erp, Jan B. F. (2020). Scilingo, Enzo Pasquale (ed.). "The EmojiGrid as a rating tool for the affective appraisal of touch". PLOS ONE. 15 (9): e0237873. Bibcode:2020PLoSO..1537873T. doi:10.1371/journal.pone.0237873. ISSN 1932-6203. PMC 7467219. PMID 32877409.
- ^ de Wijk, Rene A.; Ushiama, Shota; Ummels, Meeke J.; Zimmerman, Patrick H.; Kaneko, Daisuke; Vingerhoeds, Monique H. (2021-05-13). "Effect of Branding and Familiarity of Soy Sauces on Valence and Arousal as Determined by Facial Expressions, Physiological Measures, Emojis, and Ratings". Frontiers in Neuroergonomics. 2: 651682. doi:10.3389/fnrgo.2021.651682. ISSN 2673-6195.
- ^ Liu, Yingxuan; Toet, Alexander; Krone, Tanja; van Stokkum, Robin; Eijsman, Sophia; van Erp, Jan B. F. (2020). Greco, Alberto (ed.). "A network model of affective odor perception". PLOS ONE. 15 (7): e0236468. Bibcode:2020PLoSO..1536468L. doi:10.1371/journal.pone.0236468. ISSN 1932-6203. PMC 7392242. PMID 32730278.
- ^ Toet, Alexander; Eijsman, Sophia; Liu, Yingxuan; Donker, Stella; Kaneko, Daisuke; Brouwer, Anne-Marie; van Erp, Jan B.F. (2020). "The Relation Between Valence and Arousal in Subjective Odor Experience". Chemosensory Perception. 13 (2): 141–151. doi:10.1007/s12078-019-09275-7. ISSN 1936-5802. S2CID 208302660.
- ^ Van der Burg, Erik; Toet, Alexander; Brouwer, Anne-Marie; van Erp, Jan B. F. (2021). "Sequential Effects in Odor Perception". Chemosensory Perception. doi:10.1007/s12078-021-09290-7. ISSN 1936-5802. S2CID 235650532.
- ^ Sanatani, R.P. (2020). "User-specific predictive affective modeling for enclosure analysis and design assistance". Imaginable Futures: Design Thinking, and the Scientific Method: 54th International Conference of the Architectural Science Association 2020. Auckland, New Zealand: Architectural Science Association (ANZAScA). pp. 1341–1350.
- ^ Aitken, John A.; Kaplan, Seth A.; Pagan, Olivia; Wong, Carol M.; Sikorski, Eric; Helton, William (2021-10-02). "Affective Forecasts for the Experience Itself: An Investigation of the Impact Bias during an Affective Experience". Current Psychology. doi:10.1007/s12144-021-02337-8. ISSN 1046-1310. S2CID 244197232.
- ^ Rutkowski, Tomasz M.; Abe, Masato S.; Koculak, Marcin; Otake-Matsuura, Mihoko (July 2020). "Classifying Mild Cognitive Impairment from Behavioral Responses in Emotional Arousal and Valence Evaluation Task – AI Approach for Early Dementia Biomarker in Aging Societies –". 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). 2020. Montreal, QC, Canada: IEEE: 5537–5543. doi:10.1109/EMBC44109.2020.9175805. ISBN 978-1-7281-1990-8. PMID 33019233. S2CID 221385462.
- ^ Kaneko, Daisuke; Stuldreher, Ivo; Reuten, Anne J. C.; Toet, Alexander; van Erp, Jan B. F.; Brouwer, Anne-Marie (2021). "Comparing Explicit and Implicit Measures for Assessing Cross-Cultural Food Experience". Frontiers in Neuroergonomics. 2: 646280. doi:10.3389/fnrgo.2021.646280. ISSN 2673-6195.