
Cognitive Affective Architecture

Cognitive affective architecture (CogAff architecture) is a framework describing how cognitive-emotional processing may occur in a human or machine.

History of CogAff Architecture

Mapping emotion onto cognitive architecture began when cognitive scientists started shifting from the metaphor of the brain as a computer to the view of the brain as the seat of the mind in embodied cognition. Embodied cognition relies on, first, the mind being situated in a body and, second, emotions being expressed through that body. Studying this requires knowing the underlying structures and functions involved in processing emotions in an organism. Scientists therefore initiated studies to determine what makes an emotion and how to represent emotion in the form of a chart or model. Such a model is called an architecture. An architecture maps out the information-processing mechanisms of an organism; in other words, it describes all the devices or mechanisms that an organism uses to gather information so that it can perform its intended actions.[1] This was the starting point of work in both affective computing and cognitive affective architecture, also called CogAff architecture.[2] CogAff architecture examines how emotions are processed and generated, while affective computing examines how emotions can be recognized.

Affective Computing

Affective computing is the study of emotions and how they can be applied to robots. It combines the fields of cognitive science and computer science, which work together to create robots that can recognize emotions in people. A number of computer programs already perform this type of information processing. Affective computing uses physiological data, algorithms, body gestures, emotion classification, and facial detection to create ways of processing emotions in robots. These data are then used to build multi-level models that further the understanding of emotional processing in robots and organisms.[3]

Affective Computing Models

An example of an affective computing model is the Emotion Recognition project. This program examines vocal expressions of emotion by people and teaches a computer to recognize the emotion. It used seven hundred short phrases, each matched to one of five emotions: happiness, anger, fear, sadness, and a neutral (unemotional) state. Across all cases, it recognized the emotion in a phrase about seventy percent of the time, roughly the same recognition rate that humans achieve. The program was designed by first creating a large database of phrases, then having people listen to the phrases and categorize each one by emotion. With this labeled data, the researchers used a two-layer backpropagation network to train the system to recognize the emotions.[4]
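The paper describes the system's exact features and training setup; as a minimal sketch of the general technique only, a two-layer backpropagation network over acoustic features could look like the following in Python. The feature set, layer sizes, and training data below are invented for illustration, not taken from the project.

    import numpy as np

    # Hypothetical setup: each phrase is summarized by 8 acoustic features
    # (e.g., pitch mean/range, energy, speaking rate) and labeled with one
    # of five emotions. All sizes and data here are illustrative only.
    EMOTIONS = ["neutral", "happiness", "anger", "sadness", "fear"]
    N_FEATURES, N_HIDDEN, N_CLASSES = 8, 16, len(EMOTIONS)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(700, N_FEATURES))    # stand-in for real features
    y = rng.integers(0, N_CLASSES, size=700)  # stand-in for human labels

    # Two-layer network: input -> hidden (tanh) -> output (softmax)
    W1 = rng.normal(scale=0.1, size=(N_FEATURES, N_HIDDEN))
    b1 = np.zeros(N_HIDDEN)
    W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_CLASSES))
    b2 = np.zeros(N_CLASSES)

    def forward(X):
        h = np.tanh(X @ W1 + b1)
        logits = h @ W2 + b2
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return h, e / e.sum(axis=1, keepdims=True)

    lr = 0.1
    onehot = np.eye(N_CLASSES)[y]
    for epoch in range(200):
        h, p = forward(X)
        # Backpropagate the cross-entropy error through both layers.
        d_logits = (p - onehot) / len(X)
        dW2, db2 = h.T @ d_logits, d_logits.sum(axis=0)
        d_h = (d_logits @ W2.T) * (1 - h ** 2)   # tanh derivative
        dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
        W1 -= lr * dW1
        b1 -= lr * db1
        W2 -= lr * dW2
        b2 -= lr * db2

    _, p = forward(X)
    print("training accuracy:", (p.argmax(axis=1) == y).mean())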

Cognitive Affective Model

Diagram of CogAff architecture: a model used in the cognitive sciences as a way to understand how emotions are processed in an organism.

The cognitive affective model describes both cognitive functioning and emotional processing across the many varied states that humans, animals, and robots can experience. There are many ways to represent both cognitive and affective states, and because different emotions can trigger different levels of the system, various model designs are used. The main model is CogAff architecture. It divides the information processing of a robot or organism into three columns: perceptual mechanisms, central processing, and action (motor) mechanisms. It also contains three levels of processing: meta-management, deliberative reasoning, and reactive mechanisms. The columns and levels intersect and cross, and the interconnections between them allow multiple routes and flow patterns for information.[5]

Perceptual mechanisms are the robot's or organism's ability to take in objects or information from the surrounding environment and use it for processing. Central processing is the main unit in which future actions and procedures are planned, allowing routes to be plotted and selected. Action is where the planned routes and paths are put into motion by the system.

The three levels of processing cut across these columns. Meta-management mechanisms ensure that internal information in the architecture is monitored, categorized, evaluated, and controlled or modulated. Deliberative reasoning mechanisms chunk information together; these pieces of data can then be connected in ways that allow learning, planning, and prediction of future events, which the meta-management layer draws upon. Reactive mechanisms take in information and immediately respond with whatever action is needed at that point in the organism's or robot's processing.

With these two three-level groupings, an intricate model can be created that represents more complex mental states and reactions, providing more ways to model how information is processed in these systems. This is helpful because the varied emotions we experience can trigger different levels of the system. Marvin Minsky further applies these different levels to emotional and cognitive processing. He proposes that emotions correspond to certain areas or nodes in the mind that work together in a pattern, and that pattern makes up an emotion. When we are angry, for example, a certain pattern of nodes or modules in the mind fires at the same time, and that pattern of firing represents being angry. Different patterns of firing give different levels of thought, which allows additional levels to be added to the model. Minsky adds a self-conscious reflection level, a self-reflective thinking level, and a learned reactions level.
The learned reactions level consists of all the processes that we have acquired through experience. The self-reflective thinking level deals with how a person thinks about a situation and how well they feel it matches their own personal set of values. Lastly, the self-conscious reflection level deals with how a person feels they are living up to the values and morals they have set for themselves in the actions they take.[6] On top of these six layers, the model contains functions called alarm systems. Alarm systems are designed to detect situations where processing must be quickly redirected from anywhere in the model.[5] They allow abrupt changes in the flow of processing and show how the organism can react in differing ways.
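As a rough illustration of the grid of columns and layers, together with an alarm system that can redirect processing from anywhere in the model, consider the following Python sketch. The class names, the example alarm condition, and the "halt" response are hypothetical, not part of the published architecture.

    from dataclasses import dataclass, field

    COLUMNS = ["perception", "central_processing", "action"]
    LAYERS = ["reactive", "deliberative", "meta_management"]

    @dataclass
    class CogAffGrid:
        """3x3 grid of processing units; any cell may feed any other."""
        # Each cell holds the items currently being processed there.
        cells: dict = field(default_factory=lambda: {
            (layer, col): [] for layer in LAYERS for col in COLUMNS})
        alarms: list = field(default_factory=list)  # (condition, handler)

        def send(self, src, dst, item):
            """Route information between cells (multiple flow paths);
            src identifies the sending cell for illustration only."""
            self.cells[dst].append(item)
            # Alarm systems watch every transfer and may redirect it.
            for condition, handler in self.alarms:
                if condition(item):
                    return handler(self, item)

    def looming_object(item):            # hypothetical alarm condition
        return item.get("time_to_impact", 99) < 1.0

    def emergency_stop(grid, item):      # hypothetical fast redirect
        grid.cells[("reactive", "action")].append({"motor": "halt"})

    grid = CogAffGrid()
    grid.alarms.append((looming_object, emergency_stop))
    # A percept that would normally climb to the deliberative layer is
    # intercepted by the alarm and routed straight to reactive action.
    grid.send(("reactive", "perception"),
              ("deliberative", "central_processing"),
              {"object": "ball", "time_to_impact": 0.4})
    print(grid.cells[("reactive", "action")])   # -> [{'motor': 'halt'}]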

The Multiple Levels of CogAff Architecture and Different Architecture Types

The Main Three Levels

Mechanism: Description
Reactive: Processing at this level is automatic, with a stimulus triggering a reaction instantaneously.
Deliberative: The mechanisms at this level generate alternative plans of action, and the best one is selected from the many generated plans.
Reflective: At this level the organism or robot chooses either to act on an emotion or not to act on it, depending on the circumstances; it also draws on deliberative processes.[7]

Expanded Levels

Mechanism: Description
Learned reactions: Any process of reacting that we have acquired through experience.
Self-reflective thinking: Personal thoughts on a situation, and how the person feels about his or her thoughts on that situation.
Self-conscious reflection: How a person feels about their own morals and whether their actions match up with them.
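Taken together, the tables above suggest a priority ordering: a stimulus is handled reactively if a hard-wired rule fires, deliberatively if plans must be generated and compared, and reflectively when the system must decide whether to act on an emotion at all. A minimal sketch of that ordering follows; the reflex table, the candidate plans, and the intensity threshold are all invented for illustration.

    from typing import Optional

    REFLEXES = {"loud_noise": "startle"}  # stimulus -> instant reaction

    def reactive(stimulus: str) -> Optional[str]:
        return REFLEXES.get(stimulus)      # automatic, instantaneous

    def deliberative(stimulus: str) -> str:
        plans = [("flee", 0.4), ("investigate", 0.7)]  # generated plans
        return max(plans, key=lambda p: p[1])[0]       # select the best

    def reflective(action: str, emotion_intensity: float) -> str:
        # Choose whether to act on the emotion, depending on circumstances.
        return action if emotion_intensity > 0.5 else "suppress_response"

    def process(stimulus: str, emotion_intensity: float) -> str:
        action = reactive(stimulus)
        if action is None:                 # no reflex: deliberate instead
            action = deliberative(stimulus)
            action = reflective(action, emotion_intensity)
        return action

    print(process("loud_noise", 0.9))     # -> startle (reactive)
    print(process("strange_smell", 0.2))  # -> suppress_response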

Since there are many different levels, there are different ways to represent CogAff architecture.

Omega Architecture

This form of architecture uses only a small subset of the functions in CogAff architecture, and the information being processed flows in an Omega (Ω) shape. Only a very small amount of information is selected at each stage. The information flows through a pipeline that takes the possible actions from the reactive mechanisms layer and brings them to the top, where the appropriate action is chosen. The chosen response is then sent back down as a motor signal to be carried out.[8]
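A minimal sketch of this Omega-shaped flow, assuming made-up candidate actions and a simple scoring rule: information rises from the reactive layer through a narrowing pipeline, one action is chosen at the top, and the choice descends again as a motor signal.

    def reactive_candidates(percept):
        # Bottom of the omega: the reactive layer proposes possible actions.
        return [("approach", 0.3), ("avoid", 0.8), ("ignore", 0.1)]

    def ascend(candidates, width=2):
        # Narrow pipeline upward: only the strongest candidates survive.
        return sorted(candidates, key=lambda c: -c[1])[:width]

    def choose(candidates):
        # Top of the omega: the appropriate action is selected.
        return candidates[0][0]

    def descend(action):
        # Back down the other side: the choice becomes a motor signal.
        return f"motor:{action}"

    print(descend(choose(ascend(reactive_candidates("shadow")))))
    # -> motor:avoid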

Subsumption Architecture

This CogAff architecture uses only the reactive mechanism level, with many small sub-layers used to process the information.[8]
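A rough sketch of subsumption-style processing, in which stacked reactive sub-layers each map sensing directly to action and a higher sub-layer may subsume (override) the one below it. The behaviors and sensor fields here are hypothetical.

    def avoid_obstacle(sensors):
        if sensors.get("obstacle_cm", 999) < 20:
            return "turn_away"
        return None                    # defer to the sub-layer below

    def wander(sensors):
        return "move_forward"          # lowest sub-layer: default behavior

    SUB_LAYERS = [avoid_obstacle, wander]   # highest priority first

    def act(sensors):
        for layer in SUB_LAYERS:            # higher layers subsume lower ones
            action = layer(sensors)
            if action is not None:
                return action

    print(act({"obstacle_cm": 12}))   # -> turn_away
    print(act({"obstacle_cm": 80}))   # -> move_forward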

H-CogAff Architecture

This model is designed to show the processing of information in normal adult human beings. It shows how humans can process emotions on multiple levels, while also showing how these systems progressed and developed over the course of evolution.[5]

Cognitive Affective Architecture

Being able to recognize and produce emotions is just one side of affective processing, the side seen in the research on affective computing. The other side is the ability to understand the internal design of an emotional being.[9] In CogAff architecture, the ability to process not only emotional content but also cognitive content is important. One example of this is the MAMID architecture, a cognitive-affective architecture that allows us to explore affective meta-cognitive processes.[10] MAMID maps emotions onto cognitive processes, allowing it to model high-level decision-making and to show the interactions between emotion and cognition in one model. MAMID breaks emotion down into two areas of processing: generating an emotion and mediating the effects of that emotion. These two processes are handled through an affect appraiser module and through general emotion-effect interactions.[11]

One debate currently being worked on in CogAff research is what exactly constitutes an emotion. This matters because we need to define what emotions are, what they do to a system, and how they are processed. Defining them is difficult because the word "emotion" is vague: it can be used in different ways, and there is no single definition. Researchers have asked whether emotions are a certain set of words, such as fear, anger, and distrust, but have found that there are multiple ways to encode these features. Some scientists have explained emotions as a set of physiological responses to a situation; others have treated them as behavioural responses to objects in the surrounding environment; and still others have defined emotions through introspection, as what it feels like to have an emotional response. One scientist may encode fear as something caused by pain, while another may treat it as a response to an intellectual dilemma such as a paradox. The only clear requirement is that whatever notion of emotion is used should be clearly defined in the operation of the model.[5]

One proposed way to deal with this is to treat an emotion as an alarm system in the CogAff model. An emotional state is then defined as the organism responding to the detection of something that interrupts, prevents, disturbs, or modulates a process or processes that would have been triggered independently of this detection. The alarm could also detect something that would interrupt, prevent, or disturb a process currently being suppressed by a filter or priority mechanism.[5]
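A loose sketch, not MAMID's actual implementation, of the two areas it separates: an affect appraiser that generates emotion intensities from the current situation, and a mediation step in which those intensities modulate cognitive processing. All cues, weights, and parameter names below are hypothetical.

    def affect_appraiser(situation):
        """Map situational cues to emotion intensities in [0, 1]."""
        threat = situation.get("threat", 0.0)
        goal_progress = situation.get("goal_progress", 0.5)
        return {
            "fear": min(1.0, threat),
            "frustration": max(0.0, 0.5 - goal_progress),
        }

    def mediate_effects(emotions, cognition):
        """Let emotion intensities modulate cognitive parameters."""
        # e.g., fear narrows attention and biases perceived threat upward.
        cognition["attention_breadth"] *= (1.0 - 0.5 * emotions["fear"])
        cognition["threat_bias"] += emotions["fear"] * 0.3
        return cognition

    cognition = {"attention_breadth": 1.0, "threat_bias": 0.0}
    emotions = affect_appraiser({"threat": 0.8, "goal_progress": 0.2})
    print(mediate_effects(emotions, cognition))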

CogAff architecture allows us to explain three different classes of human emotion: primary, secondary, and tertiary emotions. Primary emotions are alarm mechanisms found in the reactive layer of the model. Secondary emotions arise from a combination of the reactive and deliberative layers, which trigger alarm mechanisms that regulate both layers. Tertiary emotions are alarm mechanisms that interrupt the meta-management mechanisms in the system; this interference leads to a loss of attention control. The range of cognitive complexity that CogAff architecture covers allows it to be used in various ways: on one hand, it can model something as simple as the cognitive processes of a small insect; on the other, scientists can go much deeper and model processes as complex as human thought.[12]
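The mapping from layers to emotion classes can be stated compactly; the following illustrative function uses the terminology from the text, but the function itself is hypothetical.

    def emotion_class(layers_involved):
        layers = set(layers_involved)
        if layers == {"reactive"}:
            return "primary"        # alarm within the reactive layer only
        if layers == {"reactive", "deliberative"}:
            return "secondary"      # alarm regulating both lower layers
        if "meta_management" in layers:
            return "tertiary"       # alarm interrupting meta-management,
                                    # producing loss of attention control
        return "unclassified"

    print(emotion_class(["reactive"]))                     # primary
    print(emotion_class(["reactive", "deliberative"]))     # secondary
    print(emotion_class(["meta_management", "reactive"]))  # tertiary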

CogAff architecture has impacted the field of cognitive science by allowing researchers to ask more specific questions that are less easily swayed by outside ideas. It allows researchers to ask questions about the components of an organism, which parts connect together, and how the systems interact with one another overall. Along with the components, it lets researchers examine the different ontologies used in a system, so that they can compare and contrast the ontologies to see how the systems work. Through this comparing and contrasting of mechanisms among different models, researchers can also weigh the pros and cons of specific pieces and trace the evolutionary development of other mechanisms in the system. The model can thereby be challenged and tested to see when otherwise useful mechanisms would not work within CogAff architecture, which helps researchers understand the limits of the model for emotional processing.[8]

Summary of CogAff Architecture

CogAff architecture is used to understand how emotions and cognitive processes work and interact within an organism. Using the six levels available in a CogAff architecture, many different emotions and thought processes can be modelled, forming a deeper understanding of how an organism will process an event. This understanding can be applied to implementations in robots, and eventually such robots can interact with humans in everyday life. The types of interaction that can be computed from CogAff architecture allow many different kinds of robots to work properly; because the model covers varying levels of complexity, we can have robots that work on the same processing level as insects or small animals. Emotions can be mapped onto the framework to help a robot produce emotion so that it can interact with its intended group. In CogAff architecture, emotion then emerges through the intricate interplay of the different levels of the model, much as the human mind produces emotion as a consequence of different parts of the brain firing. The factors tied to emotions are varied: they can be tied to entirely physical entities (such as animal bodies, brains, or muscles), to states or processes in a machine, or to particular body types.[5]

Applications

An example of where CogAff architecture can be useful is the robot Kismet. This project works on mapping emotions onto an architecture that can then be processed by a computer. The computer can then replay the emotions and show them in the robot, allowing a form of communication between the person and the robot.

References

  1. ^ Sloman, A., Chrisley, R., & Scheutz, M. (2005). The architectural basis of affective states and processes. In J.-M. Fellous & M. A. Arbib (Eds.), Who needs emotions? The brain meets the robot (pp. 203-244). Oxford, UK: Oxford University Press.
  2. ^ Ziemke, T., & Lowe, R. (2009). On the role of emotion in embodied cognitive architectures: From organisms to robots. Cognitive Computation, 1, 104-117. doi:10.1007/s12559-009-9012-0
  3. ^ Grandjean, D., & Scherer, K. R. (2008). Unpacking the cognitive architecture of emotion processes. Emotion, 8(3), 341-351. doi:10.1037/1528-3542.8.3.341
  4. ^ Petrushin, V. A. (2000). Emotion recognition in speech signal: Experimental study, development, and application. In Proceedings of the Sixth International Conference on Spoken Language Processing (ICSLP 2000).
  5. ^ a b c d e f Sloman, A., Chrisley, R., & Scheutz, M. (2005). The architectural basis of affective states and processes. In J.-M. Fellous & M. A. Arbib (Eds.), Who needs emotions? The brain meets the robot (pp. 203-244). Oxford, UK: Oxford University Press.
  6. ^ Minsky, M. (2006). The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind. New York: Simon & Schuster.
  7. ^ Friedenberg, J., & Silverman, G. (2012). Cognitive Science: An Introduction to the Study of Mind. Los Angeles, CA: Sage Publications.
  8. ^ a b c Sloman, A., & Chrisley, R. L. (2005). More things than are dreamt of in your biology: Information-processing in biologically inspired robots. Cognitive Systems Research, 6, 145-174. doi:10.1016/j.cogsys.2004.06.004
  9. ^ Friedenberg, J., & Silverman, G. (2012). Cognitive Science: An Introduction to the Study of Mind. Los Angeles, CA: Sage Publications.
  10. ^ Marsella, S. C., & Gratch, J. (2009). EMA: A process model of appraisal dynamics. Cognitive Systems Research, 10(1), 70-90.
  11. ^ Hudlicka, E. (2008). Modeling the mechanisms of emotion effects on cognition. In Proceedings of the AAAI Fall Symposium on Biologically Inspired Cognitive Architectures (pp. 82-86).
  12. ^ Sloman, A., & Chrisley, R. L. (2005). More things than are dreamt of in your biology: Information-processing in biologically inspired robots. Cognitive Systems Research, 6, 145-174. doi:10.1016/j.cogsys.2004.06.004