User:Simonye28/sandbox
James McClelland
Born: December 1, 1948
Nationality: United States
Scientific career
Fields: Psychology

James McClelland


James Lloyd (Jay) McClelland (born December 1, 1948, in Cambridge, Massachusetts, USA) is the Lucie Stern Professor at Stanford University, where he is currently chair of the Psychology Department. He received his Ph.D. in cognitive psychology from the University of Pennsylvania in 1975. He is best known for his work on statistical learning and parallel distributed processing, applying connectionist models (or neural networks) to explain cognitive phenomena such as spoken word recognition and visual word recognition. McClelland is to a large extent responsible for the "connectionist revolution" of the 1980s, which saw a large increase in scientific interest in connectionism.

In 1986, he published Parallel distributed processing: Explorations in the microstructure of cognition with David Rumelhart, which some still regard as a bible for cognitive scientists. His present work focuses on learning, memory processes and psycholinguistics, still within the framework of connectionist models. He is a former chair of the Rumelhart Prize committee, having collaborated with Rumelhart for many years, and himself received the award in 2010 at the Cognitive Science Society Annual Conference in Portland, Oregon.

In particular, McClelland and collaborator David Rumelhart are known for their debate with Steven Pinker and Alan Prince. McClelland and Rumelhart claimed to have shown that humans could learn language (in particular, the past tense) without language-specific hardware; Pinker and Prince argued that they had not done so. In response, McClelland has continued to revise his connectionist model.

In the fall of 2006, he moved to Stanford University from Carnegie Mellon University, where he had been a professor of psychology and cognitive neuroscience. He also holds a part-time appointment as Consulting Professor at the Neuroscience and Aphasia Research Unit (NARU) within the School of Psychological Sciences, University of Manchester.

Research


James McClelland is interested in a wide range of cognitive neuroscience issues, including learning, memory, and cognitive development, as well as the nature of processing and learning in language.[1]

Psychological processes


Much of McClelland's research has concerned parallel distributed processing (PDP). In 1986, James McClelland, David Rumelhart, and the PDP Research Group published the two-volume work Parallel Distributed Processing: Explorations in the Microstructure of Cognition (Volume 1: Foundations; Volume 2: Psychological and Biological Models). One of their goals was to offer an alternative view of cognitive phenomena, grounded in simple computational mechanisms for information processing.[2] They described models of individual cognitive domains in terms of these processing properties, providing another way of thinking about cognitive processes, and argued that parallel distributed processing mechanisms are a more natural and suitable instrument for dealing with certain aspects of cognition than other approaches.[3]

Parallel Distributed Processing Model (PDP)


James McClelland and David Rumelhart proposed that human thought is built from many elementary cognitive units and the activation of those units. They argued that human beings are better than computers at processing natural information, and suggested that the brain employs a fundamentally different computational architecture that is better suited to such processing.[4] These natural information-processing tasks mostly involve the simultaneous consideration of many constraints, as in reaching and grasping, the joint effect of syntax and semantics, and concurrent constraints in word recognition.[5] Each constraint plays a significant role in shaping the outcome of processing, and together the constraints conspire to produce a solution suited to the specific problem.[6] Without any one of these constraints, the processing would be more likely to fail.

They suggested that this kind of processing requires mechanisms in which each source of information can act on, and be influenced by, the others.[7] They conceptualized such mechanisms in the Parallel Distributed Processing (PDP) model, which is built on a simulated artificial neural network.[8] In this model, information processing occurs through weighted connections among a large number of simple processing units, each of which receives and sends both excitatory and inhibitory signals to the units it is connected to.[9] Each processing unit stands for a possible hypothesis, such as a particular semantic, visual, or acoustic feature present in the input, and each connection represents a constraint among the hypotheses.[10] Because the connections differ in strength, the weights are adjusted as a result of processing.[11]
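The following Python sketch is illustrative only: the unit names, weights, and update rule are simplified stand-ins rather than parameters from McClelland and Rumelhart's models. It shows the basic idea of units exchanging excitatory and inhibitory signals over weighted connections until a pattern of activation settles.

```python
import numpy as np

# Toy PDP-style network of three units. Each entry weights[i, j] is the
# connection into unit i from unit j: positive values are excitatory,
# negative values inhibitory. All numbers are invented for illustration.
weights = np.array([
    [ 0.0,  0.5, -0.3],
    [ 0.5,  0.0,  0.4],
    [-0.3,  0.4,  0.0],
])

activation = np.array([1.0, 0.0, 0.0])  # external input drives unit 0

for _ in range(20):
    net_input = weights @ activation               # weighted sum of incoming signals
    activation = 1.0 / (1.0 + np.exp(-net_input))  # squash into (0, 1)
    activation[0] = 1.0                            # keep the externally driven unit clamped

print(activation.round(2))  # settled pattern of activation over the units
```

Each unit here plays the role of a hypothesis about the input, as in the description above, and the settled activations express how well each hypothesis satisfies the constraints encoded in the weights.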

Schemata

James McClelland and David Rumelhart viewed schemata as structures of the mind that are flexible enough to fit around any kind of cognitive processing.[12] They suggested that schemata emerge from the interaction of a large number of basic cognitive units. Schemata are implicit in one's knowledge, formed by the previous experiences and the environment they are used to interpret.[13] When inputs enter the cognitive system, a set of cognitive units is activated; the units then adjust from this starting state until they form a mutually consistent pattern of activation. Eventually the system reaches a relatively stable state, with little tendency to shift toward another state.[14] The pattern of association among the interconnected units is the schema. Schemata are stored in memory and constitute the general content of memory, and learning a schema amounts simply to adjusting the weights of the connections between units.[15]
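A minimal sketch of this idea of a schema as a stable state, in Python. The stored pattern and Hebbian settling rule are a generic Hopfield-style illustration, not the specific model described in the PDP volumes.

```python
import numpy as np

# A remembered pattern is stored in the connection weights with a simple
# Hebbian rule; a partial cue then settles to the full stored pattern,
# which plays the role of a "schema". Values are invented for illustration.
stored = np.array([1, -1, 1, 1, -1])             # the complete pattern (schema)
weights = np.outer(stored, stored).astype(float)
np.fill_diagonal(weights, 0.0)                   # no self-connections

state = np.array([1, -1, 0, 0, 0], dtype=float)  # incomplete cue
for _ in range(10):
    state = np.sign(weights @ state)             # settle toward a stable state

print(np.array_equal(state, stored))  # True: the cue is completed to the schema
```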

Figure: the inputs to a PDP system affect its outputs to the environment, and the outputs can travel from level to level; the figure depicts the relationship between the model of the world, the interpretation network, and the inputs and outputs used in mental simulation.
Thinking

James McClelland and David Rumelhart also addressed the process of thinking within the PDP framework, suggesting that parallel algorithms are employed in mental processing and serve as a basic architectural design principle.[16] They assumed that the cognitive system changes when new information arrives: a new input enters the system, and the system relaxes to accommodate it. Once the system has become relatively stable, it tends to remain in that state until the stimulus conditions change; when a new input arrives, a relaxation process settles on an interpretation of it in a short period of time, and the system then becomes stable again.[17] They further suggested that the cognitive system can be divided into two sets of units. The first, described above, receives inputs from the world and relaxes to a suitable state to produce an appropriate action.[18] The second takes the actions of the first and predicts how the inputs will change as a consequence, and can be considered a "mental model of the world events".[19] Generating expectations about the state of the world in this way allows the consequences of actions to be predicted: by taking input from one's model of the world rather than from the world itself, one can mentally simulate events and judge the consequences that would follow from a particular action.[20] The Parallel Distributed Processing model thus provides for mental simulation as well as learning through mental practice.[21] Performance therefore involves two main parts: 1) a cognitive system that regulates actions in a particular situation, and 2) prediction of the consequences of performing a particular action.
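A rough Python sketch of the two systems described above; the network sizes, random weights, and function names are placeholders for illustration, not a trained model.

```python
import numpy as np

# Two coupled systems: an "action" network that maps inputs from the world
# to a response, and a "world model" network that predicts the next input
# given the current input and the chosen action. Feeding the model's
# predictions back in place of real inputs gives a crude mental simulation.
# All sizes and weights here are arbitrary placeholders, not trained values.
rng = np.random.default_rng(0)
W_action = rng.normal(size=(3, 4))   # action units <- input units
W_model = rng.normal(size=(4, 7))    # predicted next input <- [input, action]

def act(inputs):
    return np.tanh(W_action @ inputs)

def predict_next_input(inputs, action):
    return np.tanh(W_model @ np.concatenate([inputs, action]))

inputs = rng.normal(size=4)          # an initial input from the world
for step in range(3):                # mentally simulate three steps
    action = act(inputs)
    inputs = predict_next_input(inputs, action)  # use the model, not the world
    print(step, action.round(2))
```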

The TRACE Model


James McClelland and Jeffrey Elman proposed the TRACE model to describe how people perceive speech and spoken words. The model captures the integration of inputs in speech perception by combining the interactive activation account of word perception with the parallel distributed processing framework. They suggested that speech input is organized into three layers of units, at the feature, phoneme, and word levels, with each unit representing a hypothesis about a particular perceptual object.[22] The trace is the set of activated units that forms an active processing structure, which operates both as the perceptual processing device and as the system's short-term memory.[23] Cognitive processing arises from the interaction of a large number of simple processing units: a connected pattern of units is activated, and the units continually update one another on the basis of the previous activation pattern.[24] To account for speech processing, it is necessary to know how features are obtained from the speech signal, the durations of the different features and phonemes, and how they function in context.[25] To address these goals, they implemented two simulation models, TRACE I and TRACE II, which share the same fundamental assumptions but target different aspects of speech perception.[26]
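An illustrative Python sketch of interactive activation across the three layers; the two-phoneme, two-word lexicon and all weights are invented, and this is not the actual TRACE I or TRACE II parameterization.

```python
import numpy as np

# Three layers of units: features feed phonemes, phonemes feed words, and
# word units feed activation back down to phonemes, so the layers settle
# together. The lexicon and weights are toy values for illustration.
phoneme_from_feature = np.array([[1.0, 0.0],    # "phoneme A" from feature 0
                                 [0.0, 1.0]])   # "phoneme B" from feature 1
word_from_phoneme = np.array([[1.0, 0.0],       # "word 1" supported by phoneme A
                              [0.0, 1.0]])      # "word 2" supported by phoneme B

features = np.array([0.9, 0.1])                 # slightly ambiguous input
phonemes = np.zeros(2)
words = np.zeros(2)

for _ in range(10):
    # bottom-up excitation plus top-down feedback from words to phonemes
    phonemes = np.clip(phoneme_from_feature @ features
                       + 0.5 * word_from_phoneme.T @ words, 0.0, 1.0)
    # bottom-up support minus within-layer competition between the words
    words = np.clip(word_from_phoneme @ phonemes - 0.3 * words[::-1], 0.0, 1.0)

print(phonemes.round(2), words.round(2))  # word 1 wins the competition
```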

TRACE I model

The TRACE I model focuses primarily on phoneme recognition in real speech, using the TRACE framework to model speech processing; it addresses issues of the phoneme recognition mechanism that are excluded from TRACE II. At the feature level, TRACE I contains detectors for eight value ranges on 15 different input parameters and can process 500 ms samples of speech.[27] The connections from feature units to phoneme units are determined using a perceptron convergence procedure.[28]

TRACE II model

TRACE II is a simplified version of the TRACE I model. It is used to address the perception of words and phonemes, the influence of word-level information on the perception of phonemes, and the on-line recognition of words, covering aspects of perception excluded from TRACE I. Instead of real speech, this model uses mock speech input, in the form of specified inputs to the processing units at the feature level.[29]

The Programmable Blackboard Model


James McClelland developed an interactive model of reading, inspired by the HEARSAY model of speech understanding.[30] The HEARSAY model assumes that reading comprises several levels of processing, such as the visual feature, letter, word, syntactic, and semantic levels. Hypotheses at each level are guided by a set of structures known as knowledge sources, each concerned with a particular aspect of reading, such as a lexical knowledge source (knowledge of the arrangement of letters in each word), a syntactic knowledge source, and a semantic knowledge source.[31] McClelland proposed that these aspects of reading are processed in parallel: because processing can occur both within and between levels, hypotheses can interact with one another and span a wide range of the input. He also noted that if each word were handled by its own dedicated network, processing more than one word in parallel would require a separate network for each additional word. The model therefore focuses on how information is integrated over fixations in reading and on the mechanisms that process the syntactic and semantic content of sentences.[32] Rather than assuming a fixed relationship between particular letter units and particular word units, the model assumes that the connections between the output lines of letter units and the input lines of word units can be set up dynamically, allowing the network to process various combinations of words correctly by making connections that depend on which units are currently active.[33]

The connection information distribution mechanism (CID)

The connection information distribution (CID) mechanism implements this dynamic programming of connections.[34] Instead of hard-wiring a fixed relationship between particular letter units and particular word units, connection information held in a central store of knowledge is distributed to the processing units as needed, so that the connections in use at any moment depend on which units are currently active.[35][36] This allows the same word knowledge to be applied to different combinations of words, supporting the parallel processing of more than one word without a separate dedicated network for each.[37][38]
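A schematic Python sketch of this idea of programmed rather than hard-wired connections; the letters, words, and weights are invented, and this is a simplification of the idea behind CID, not McClelland's implementation.

```python
import numpy as np

# Central store of letter-to-word connection information. Instead of
# wiring a dedicated network for every word position, the same central
# knowledge is "programmed" into whichever slot currently has active
# letter units. Letters, words, and weights are invented for illustration.
letters = ["c", "a", "t", "d", "o", "g"]
words = ["cat", "dog"]
central_knowledge = np.array([[1, 1, 1, 0, 0, 0],   # letters supporting "cat"
                              [0, 0, 0, 1, 1, 1]])  # letters supporting "dog"

def programmed_slot(letter_activations):
    # Each slot borrows a copy of the central connection information on
    # the fly, rather than having its own hand-wired letter-to-word weights.
    local_connections = central_knowledge.copy()
    return local_connections @ letter_activations

slot_1 = np.array([1, 1, 1, 0, 0, 0], dtype=float)  # letters seen at one fixation
slot_2 = np.array([0, 0, 0, 1, 1, 1], dtype=float)  # letters seen at another
for slot in (slot_1, slot_2):
    word_scores = programmed_slot(slot)
    print(words[int(np.argmax(word_scores))])        # both words handled in parallel
```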


References

  1. ^ McClelland, James. "James L. (Jay) McClelland". Retrieved April 7, 2012.
  2. ^ McClelland, James L.; Rumelhart, David E. (1981). "An interactive activation model of context effects in letter perception: Part 1. An account of basic findings" (PDF). Psychological Review. 88 (5): 375.
  3. ^ McClelland, James (1986). Parallel Distributed Processing, Vol. 2: Psychological and Biological Models. London, England: A Bradford Book. pp. 1–56.
  4. ^ McClelland, James (July 1988). "Parallel distributed processing" (PDF). Explorations in the Microstructure of Cognition. 2: 6–57.
  5. ^ McClelland, James (July 1988). "Parallel distributed processing" (PDF). Explorations in the Microstructure of Cognition. 2: 3–4.
  6. ^ McClelland, James (July 1988). "Parallel distributed processing" (PDF). Explorations in the Microstructure of Cognition. 2: 6–57.
  7. ^ McClelland, James (1986). Parallel Distributed Processing, Vol. 1: Foundations. London, England: A Bradford Book.
  8. ^ McClelland, James. "Chapter 1: Introduction". Retrieved April 9, 2012.
  9. ^ McClelland, James (1986). Parallel Distributed Processing, Vol. 1: Foundations. London, England: A Bradford Book.
  10. ^ McClelland, James. "Parallel distributed processing: Implications for cognition and development" (AIP-47).
  11. ^ McClelland, James. "Chapter 1: Introduction". Retrieved April 9, 2012.
  12. ^ McClelland, James. Explorations in Parallel Distributed Processing – Macintosh Version: A Handbook of Models, Programs, and Exercises. ISBN 9780262631297.
  13. ^ Rogers, Timothy; McClelland, James. Semantic Cognition: A Parallel Distributed Processing Approach. MIT Press. ISBN 9780262182393.
  14. ^ McClelland, James (1986). Chapter 14: Sequential Thought Processes in PDP Models. London, England: A Bradford Book. pp. 7–58.
  15. ^ McClelland, James (1986). Chapter 14: Sequential Thought Processes in PDP Models. London, England: A Bradford Book. pp. 7–58.
  16. ^ McClelland, James (1986). Part I: The PDP Perspective. London, England: A Bradford Book. pp. 3–45.
  17. ^ McClelland, James (1986). Psychological Processes. London, England: A Bradford Book. pp. 9–30.
  18. ^ McClelland, James (1986). Schemata and Sequential Thought Processes in PDP Models. London, England: A Bradford Book. pp. 10–11.
  19. ^ McClelland, James (1986). Schemata and Sequential Thought Processes in PDP Models. London, England: A Bradford Book. pp. 10–11.
  20. ^ Rumelhart, David (1986). Schemata and Sequential Thought Processes in PDP Models.
  21. ^ McClelland, James (1986). The Appeal of Parallel Distributed Processing. London, England: A Bradford Book. pp. 33–45.
  22. ^ McClelland, James L.; Rumelhart, David E. (1985). "Distributed memory and the representation of general and specific information". Journal of Experimental Psychology: General. 114 (2): 159. http://psycnet.apa.org/journals/xge/114/2/159.pdf
  23. ^ McClelland, James. "The TRACE Model of Speech Perception".
  24. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. pp. 58–60.
  25. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. pp. 59–60.
  26. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. p. 70.
  27. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. pp. 50–60.
  28. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. pp. 58–60.
  29. ^ McClelland, James (1986). Interactive Processes in Speech Perception: The TRACE Model. London, England: A Bradford Book. pp. 50–60.
  30. ^ McClelland, James. "Parallel distributed processing: Implications for cognition and development" (AIP-47).
  31. ^ McClelland, James L.; Rumelhart, David E. (1985). "Distributed memory and the representation of general and specific information". Journal of Experimental Psychology: General. 114 (2): 159. http://psycnet.apa.org/journals/xge/114/2/159.pdf
  32. ^ McClelland, James (1986). The Programmable Blackboard Model of Reading. London, England: A Bradford Book. pp. 170–172.
  33. ^ McClelland, James (1986). The Programmable Blackboard Model of Reading. London, England: A Bradford Book. pp. 170–172.
  34. ^ McClelland, James (1986). The Programmable Blackboard Model of Reading. London, England: A Bradford Book. pp. 120–125.
  35. ^ McClelland, James (1986). The Programmable Blackboard Model of Reading. London, England: A Bradford Book. pp. 130–132.
  36. ^ Cohen, Jonathan D.; Servan-Schreiber, David; McClelland, James L. (1992). "A parallel distributed processing approach to automaticity" (PDF). The American Journal of Psychology. 105 (2): 239–269. doi:10.2307/1423029. JSTOR 1423029.
  37. ^ McClelland, James (1986). The Programmable Blackboard Model of Reading. London, England: A Bradford Book. pp. 130–132.
  38. ^ St. John, Mark F.; McClelland, James L. (1990). "Learning and applying contextual constraints in sentence comprehension". Artificial Intelligence. 46 (1–2): 217–257. doi:10.1016/0004-3702(90)90008-N.