Problem solving

Problem solving consists of using generic or ad hoc methods, in an orderly manner, for finding solutions to problems. Some of the problem-solving techniques developed and used in artificial intelligence, computer science, engineering, mathematics, or medicine are related to mental problem-solving techniques studied in psychology.

Definition

The term problem solving is used in many disciplines, sometimes with different perspectives and often with different terminologies. For instance, it is a mental process in psychology and a computerized process in computer science. Problems can also be classified into two different types, ill-defined and well-defined, which call for different approaches. Ill-defined problems are those that do not have clear goals, solution paths, or expected solutions. Well-defined problems have specific goals, clearly defined solution paths, and clear expected solutions; they also allow for more initial planning than ill-defined problems.[1] Being able to solve problems sometimes involves dealing with pragmatics (logic) and semantics (the interpretation of the problem). Understanding what the goal of the problem is and what rules can be applied is the key to solving the problem. Sometimes a problem requires abstract thinking or a creative solution.

Psychology

In psychology, problem solving refers to a state of desire for reaching a definite 'goal' from a present condition that either is not directly moving toward the goal, is far from it, or needs more complex logic for finding a missing description of conditions or steps toward the goal.[2] It can be seen as an evolutionary drive of living organisms. The nature of human problem-solving processes and methods has been studied by psychologists over the past hundred years. Methods of studying problem solving include introspection, behaviorism, simulation, computer modeling, and experiment. Social psychologists also study independent and interdependent problem-solving methods.[3] In psychology, problem solving is the concluding part of a larger process that also includes problem finding and problem shaping.

Considered the most complex of all intellectual functions, problem solving has been defined as a higher-order cognitive process that requires the modulation and control of more routine or fundamental skills.[4] Problem solving has two major domains: mathematical problem solving and personal problem solving; in the latter, some difficulty or barrier is encountered.[5]

Clinical psychology

Simple laboratory-based tasks can be useful for studying problem solving; however, they usually omit the complexity and emotional valence of "real-world" problems. In clinical psychology, researchers have focused on the role of emotions in problem solving (D'Zurilla & Goldfried, 1971; D'Zurilla & Nezu, 1982), demonstrating that poor emotional control can disrupt focus on the target task and impede problem resolution (Rath, Langenbahn, Simon, Sherr, & Diller, 2004). In this conceptualization, human problem solving consists of two related processes: problem orientation (the motivational/attitudinal/affective approach to problematic situations) and problem-solving skills. Working with individuals with frontal lobe injuries, neuropsychologists have discovered that deficits in emotional control and reasoning can be remediated, improving the capacity of injured persons to resolve everyday problems successfully (Rath, Simon, Langenbahn, Sherr, & Diller, 2003).

Cognitive sciences

The early experimental work of the Gestaltists in Germany marked the beginning of the study of problem solving (e.g., Karl Duncker in 1935 with his book The psychology of productive thinking[6]). This experimental work continued through the 1960s and early 1970s with research conducted on relatively simple (but novel for participants) laboratory tasks of problem solving.[7][8] Simple novel tasks were chosen because they had clearly defined optimal solutions and could be solved in a short time, which made it possible for researchers to trace participants' steps in the problem-solving process. The researchers' underlying assumption was that simple tasks such as the Tower of Hanoi correspond to the main properties of "real world" problems and that the cognitive processes underlying participants' attempts to solve simple problems are therefore the same for "real world" problems too; simple problems were used for reasons of convenience and with the expectation that generalizations to more complex problems would become possible. Perhaps the best-known and most impressive example of this line of research is the work by Allen Newell and Herbert A. Simon.[9] Other experts have shown that the principle of decomposition improves the ability of the problem solver to make good judgments.[10]
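
For illustration, the Tower of Hanoi mentioned above is exactly the kind of task whose optimal solution is clearly defined: a tower of n disks can always be transferred in 2^n − 1 moves by a short recursive procedure. The sketch below (in Python, chosen here only for illustration and not drawn from the cited studies) prints that minimal move sequence for three disks.

    # Minimal sketch: the Tower of Hanoi has a well-defined optimal solution of
    # 2**n - 1 moves, one reason it was attractive as a laboratory task.
    def hanoi(n, source, target, spare, moves):
        """Append the moves that transfer n disks from source to target."""
        if n == 0:
            return
        hanoi(n - 1, source, spare, target, moves)   # clear the way
        moves.append((source, target))               # move the largest disk
        hanoi(n - 1, spare, target, source, moves)   # re-stack on top of it

    moves = []
    hanoi(3, "A", "C", "B", moves)
    print(len(moves), "moves:", moves)               # 7 moves: the optimum for 3 disks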

Computer science and algorithmics

In computer science and in the part of artificial intelligence that deals with algorithms ("algorithmics"), problem solving encompasses a number of techniques known as algorithms, heuristics, root cause analysis, etc. In these disciplines, problem solving is part of a larger process that encompasses problem determination, de-duplication, analysis, diagnosis, repair, etc.
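
As a hedged illustration of the distinction between an exact algorithm and a heuristic, consider the coin-change problem below (the denominations are a constructed example, not taken from the sources above): a greedy heuristic that always takes the largest coin is fast but can return a suboptimal answer, whereas an exhaustive dynamic-programming search is guaranteed to find the minimum number of coins.

    # Illustrative sketch (assumed example): exact search vs. greedy heuristic
    # for the coin-change problem.
    from functools import lru_cache

    COINS = (1, 3, 4)   # hypothetical denominations chosen to expose the heuristic

    def greedy_change(amount):
        """Heuristic: repeatedly take the largest coin that still fits."""
        coins = []
        for c in sorted(COINS, reverse=True):
            while amount >= c:
                amount -= c
                coins.append(c)
        return coins

    @lru_cache(maxsize=None)
    def min_coins(amount):
        """Exact algorithm: dynamic programming over all denominations."""
        if amount == 0:
            return 0
        return 1 + min(min_coins(amount - c) for c in COINS if c <= amount)

    print(greedy_change(6))   # [4, 1, 1] -- three coins (heuristic, suboptimal)
    print(min_coins(6))       # 2 -- the optimum is 3 + 3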

Engineering

Problem solving is used when products or processes fail, so corrective action can be taken to prevent further failures. It can also be applied to a product or process prior to an actual failure event, i.e., when a potential problem can be predicted and analyzed and mitigation applied so the problem never actually occurs. Techniques such as failure mode and effects analysis (FMEA) can be used to proactively reduce the likelihood of problems occurring.
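
Failure mode and effects analysis is commonly operationalized by scoring each potential failure mode on severity, occurrence, and detection (often on 1–10 scales) and multiplying the three scores into a Risk Priority Number (RPN), so that the riskiest failure modes are addressed first. The following sketch uses hypothetical failure modes and scores purely for illustration.

    # Hedged sketch of an FMEA-style ranking: RPN = severity * occurrence * detection.
    # The failure modes and scores below are hypothetical.
    failure_modes = [
        {"mode": "seal leaks",         "severity": 8, "occurrence": 3, "detection": 4},
        {"mode": "connector corrodes", "severity": 5, "occurrence": 6, "detection": 7},
        {"mode": "firmware hangs",     "severity": 9, "occurrence": 2, "detection": 2},
    ]

    for fm in failure_modes:
        fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

    # Address the highest-risk failure modes first.
    for fm in sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True):
        print(f"{fm['mode']:<20} RPN={fm['rpn']}")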

Military science

In military science, problem solving is linked to the concept of "end-states", the desired condition or situation that strategists wish to generate.[11]: xiii, E-2  The ability to solve problems is important at any military rank, but is highly critical at the command and control level, where it is strictly correlated to the deep understanding of qualitative and quantitative scenarios. Effectiveness of problem solving is "a criterion used to assess changes in system behavior, capability, or operational environment that is tied to measuring the attainment of an end state, achievement of an objective, or creation of an effect".[11]: IV-24  Planning for problem-solving is a "process that determines and describes how to employ 'means' in specific 'ways' to achieve 'ends' (the problem's solution)."[11]: IV-1 

Other

Forensic engineering is an important technique of failure analysis that involves tracing product defects and flaws. Corrective action can then be taken to prevent further failures.

Reverse engineering[12] attempts to discover the original problem-solving logic used in developing a product by taking it apart.

Other problem solving tools are linear and nonlinear programming, queuing systems, and simulation.[13]
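
As a small illustration of the last two tools, the sketch below simulates a single-server queue with exponential interarrival and service times (an M/M/1 queue) and estimates the average waiting time; the arrival and service rates are arbitrary example values.

    # Illustrative sketch: simulating a single-server (M/M/1) queue to estimate
    # the average waiting time. Rates are arbitrary example values.
    import random

    def simulate_queue(arrival_rate=0.8, service_rate=1.0, n_customers=100_000, seed=1):
        random.seed(seed)
        clock = 0.0            # time of the current arrival
        server_free_at = 0.0   # time the server finishes its current job
        total_wait = 0.0
        for _ in range(n_customers):
            clock += random.expovariate(arrival_rate)   # next arrival
            start = max(clock, server_free_at)          # FIFO service start
            total_wait += start - clock                 # time spent waiting
            server_free_at = start + random.expovariate(service_rate)
        return total_wait / n_customers

    # Queueing theory predicts an average wait of rho / (mu - lambda) = 0.8 / 0.2 = 4.0
    print(round(simulate_queue(), 2))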

Problem-solving strategies

Problem-solving strategies are the steps one uses to find the problems that stand in the way of reaching a goal. Firend's problem solving model (PSM) is practical in application and incorporates the conventional 5WH approach within a systematic cycle of investigation, implementation, and assessment.[14][non-primary source needed] Some refer to this as the "problem-solving cycle" (Bransford & Stein, 1993). In this cycle one recognizes the problem, defines the problem, develops a strategy to fix the problem, organizes knowledge about the problem, identifies the resources at the solver's disposal, monitors progress, and evaluates the solution for accuracy. It is called a cycle because once one problem is solved, another usually emerges.

Blanchard-Fields (2007) distinguishes two facets of problem solving. The first concerns problems that have only one solution (such as mathematical problems or fact-based questions), which are grounded in psychometric intelligence. The second is socioemotional in nature: problems that are unpredictable, with answers that are constantly changing (such as what your favorite color is or what you should get someone for Christmas).

The following techniques are usually called problem-solving strategies:[15]

  • Abstraction: solving the problem in a model of the system before applying it to the real system
  • Analogy: using a solution that solves an analogous problem
  • Brainstorming: (especially among groups of people) suggesting a large number of solutions or ideas and combining and developing them until an optimum solution is found
  • Divide and conquer: breaking down a large, complex problem into smaller, solvable problems
  • Hypothesis testing: assuming a possible explanation to the problem and trying to prove (or, in some contexts, disprove) the assumption
  • Lateral thinking: approaching solutions indirectly and creatively
  • Means-ends analysis: choosing an action at each step to move closer to the goal (a minimal sketch of this idea appears after this list)
  • Method of focal objects: synthesizing seemingly non-matching characteristics of different objects into something new
  • Morphological analysis: assessing the output and interactions of an entire system
  • Proof: try to prove that the problem cannot be solved. The point where the proof fails will be the starting point for solving it
  • Reduction: transforming the problem into another problem for which solutions exist
  • Research: employing existing ideas or adapting existing solutions to similar problems
  • Root cause analysis: identifying the cause of a problem
  • Trial-and-error: testing possible solutions until the right one is found
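
To make the means-ends analysis entry above concrete, the following minimal sketch (a constructed example, not taken from the cited literature) moves an agent across an obstacle-free grid by choosing, at every step, the single action that most reduces the remaining difference between the current state and the goal.

    # Hedged sketch of means-ends analysis on an obstacle-free grid:
    # at every step, pick the action that most reduces the difference
    # (here, Manhattan distance) between the current state and the goal.
    ACTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

    def distance(state, goal):
        return abs(state[0] - goal[0]) + abs(state[1] - goal[1])

    def means_ends(start, goal):
        state, path = start, [start]
        while state != goal:
            # Evaluate every action by how much it shrinks the difference.
            state = min(
                ((state[0] + dx, state[1] + dy) for dx, dy in ACTIONS.values()),
                key=lambda s: distance(s, goal),
            )
            path.append(state)
        return path

    print(means_ends((0, 0), (2, 3)))   # five steps, each one closer to the goal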

Problem-solving methods

Common barriers to problem solving

Common barriers to problem solving are mental constructs that impede our ability to correctly solve problems. These barriers prevent people from solving problems in the most efficient manner possible. Five of the most common processes and factors that researchers have identified as barriers to problem solving are confirmation bias, mental set, functional fixedness, unnecessary constraints, and irrelevant information.

Confirmation bias

Within the field of science there exists a set of fundamental standards, the scientific method, which outlines the process of discovering facts or truths about the world through unbiased consideration of all pertinent information and through impartial observation of and/or experimentation with that information. According to this method, one is able to most accurately find a solution to a perceived problem by performing the aforementioned steps. The scientific method does not prescribe a process that is limited to scientists, but rather one that all people can practice in their respective fields of work as well as in their personal lives. Confirmation bias can be described as one's unconscious or unintentional corruption of the scientific method. Thus when one demonstrates confirmation bias, one is formally or informally collecting data and then subsequently observing and experimenting with that data in such a way that favors a preconceived notion that may or may not have motivation.[16] Research has found that professionals within scientific fields of study also experience confirmation bias. Andreas Hergovich, Reinhard Schott, and Christoph Burger's experiment conducted online, for instance, suggested that professionals within the field of psychological research are likely to view scientific studies that are congruent with their preconceived understandings more favorably than studies that are incongruent with their established beliefs.[17]

Motivation refers to one's desire to defend or find substantiation for beliefs (e.g., religious beliefs) that are important to one.[18] According to Raymond Nickerson, one can see the consequences of confirmation bias in real-life situations, which range in severity from inefficient government policies to genocide. With respect to the latter and most severe ramification of this cognitive barrier, Nickerson argued that those involved in committing genocide of persons accused of witchcraft, an atrocity that occurred from the 15th to 17th centuries, demonstrated confirmation bias with motivation. Researcher Michael Allen found evidence for confirmation bias with motivation in school children who worked to manipulate their science experiments in such a way as to produce their hoped-for results.[19] However, confirmation bias does not necessarily require motivation. In 1960, Peter Cathcart Wason conducted an experiment in which participants first viewed three numbers and then created a hypothesis proposing a rule that could have been used to create that triplet of numbers. When testing their hypotheses, participants tended to only create additional triplets of numbers that would confirm their hypotheses, and tended not to create triplets that would negate or disprove their hypotheses. Thus research also shows that people can and do work to confirm theories or ideas that do not support or engage personally significant beliefs.[20]
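
Wason's task can be made concrete with a small sketch (an illustrative reconstruction; in the commonly described version the presented triple was 2-4-6 and the hidden rule was simply "any ascending sequence"). Triples that merely confirm a narrower hypothesis such as "numbers increasing by two" can never distinguish it from the true rule, whereas a single test designed to disconfirm the hypothesis does.

    # Hedged sketch of Wason's rule-discovery task. The hidden rule and the
    # participant's hypothesis below follow the commonly described 2-4-6 version.
    def hidden_rule(triple):            # experimenter's rule: strictly ascending
        a, b, c = triple
        return a < b < c

    def hypothesis(triple):             # participant's guess: increases by two
        a, b, c = triple
        return b - a == 2 and c - b == 2

    confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
    disconfirming_test = (1, 2, 3)      # violates the hypothesis, not the rule

    for t in confirming_tests:
        # Both functions say "yes", so these tests can never separate them.
        print(t, hidden_rule(t), hypothesis(t))

    # Only a test designed to fail the hypothesis reveals the difference.
    print(disconfirming_test, hidden_rule(disconfirming_test),
          hypothesis(disconfirming_test))   # True, False -> hypothesis refuted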

Mental set

Mental set was first articulated by Abraham Luchins in the 1940s and demonstrated in his well-known water jug experiments.[21] In these experiments, participants were asked to fill one jug with a specific amount of water using only other jugs (typically three) with different maximum capacities as tools. After Luchins gave his participants a set of water jug problems that could all be solved by employing a single technique, he would then give them a problem that could either be solved using that same technique or a novel and simpler method. Luchins discovered that his participants tended to use the same technique that they had become accustomed to despite the possibility of using a simpler alternative.[22] Thus mental set describes one's inclination to attempt to solve problems in such a way that has proved successful in previous experiences. However, as Luchins' work revealed, such methods for finding a solution that have worked in the past may not be adequate or optimal for certain new but similar problems. Therefore, it is often necessary for people to move beyond their mental sets in order to find solutions. This was again demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a household object (pliers) in an unconventional manner. Maier observed that participants were often unable to view the object in a way that strayed from its typical use, a phenomenon regarded as a particular form of mental set (more specifically known as functional fixedness, which is the topic of the following section). When people cling rigidly to their mental sets, they are said to be experiencing fixation, a seeming obsession or preoccupation with attempted strategies that are repeatedly unsuccessful.[23] In the late 1990s, researcher Jennifer Wiley worked to reveal that expertise can work to create a mental set in persons considered to be experts in certain fields, and she furthermore gained evidence that the mental set created by expertise could lead to the development of fixation.[24]
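
The water jug problems Luchins used can themselves be treated as small search problems. In the sketch below, a commonly cited problem from his series (jars of capacity 21, 127, and 3 units, with a target of exactly 100 units) is solved by breadth-first search over jug states, which finds a shortest sequence of fill, empty, and pour actions.

    # Hedged sketch: solving a Luchins-style water jug problem (capacities 21,
    # 127, and 3, target 100) by breadth-first search over jug states.
    from collections import deque

    CAPACITIES = (21, 127, 3)
    TARGET = 100

    def successors(state):
        for i in range(3):
            yield ("fill", i), state[:i] + (CAPACITIES[i],) + state[i+1:]
            yield ("empty", i), state[:i] + (0,) + state[i+1:]
            for j in range(3):
                if i != j:
                    amount = min(state[i], CAPACITIES[j] - state[j])
                    new = list(state)
                    new[i] -= amount
                    new[j] += amount
                    yield ("pour", i, j), tuple(new)

    def solve():
        start = (0, 0, 0)
        queue, seen = deque([(start, [])]), {start}
        while queue:
            state, plan = queue.popleft()
            if TARGET in state:
                return plan
            for action, nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, plan + [action]))

    print(solve())   # e.g. fill the large jug, then pour off 21 and twice 3 units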

Functional fixedness

Functional fixedness is a specific form of mental set and fixation, which was alluded to earlier in the Maier experiment, and it is another way in which cognitive bias can be seen in daily life. Tim German and Clark Barrett describe this barrier as the fixed design of an object hindering the individual's ability to see it serving other functions. In more technical terms, these researchers explained that "[s]ubjects become "fixed" on the design function of the objects, and problem solving suffers relative to control conditions in which the object's function is not demonstrated."[25] In other words, functional fixedness means that knowledge of an object's primary function hinders one's ability to see it serving a purpose other than its original one. Research investigating why young children are immune to functional fixedness stated that "functional fixedness...[is when] subjects are hindered in reaching the solution to a problem by their knowledge of an object's conventional function."[26] Functional fixedness is also easy to observe in commonplace situations. For instance, imagine the following situation: a man sees a bug on the floor that he wants to kill, but the only thing in his hand at the moment is a can of air freshener. If the man starts looking around the house for something to kill the bug with instead of realizing that the can of air freshener could be used for that purpose as well as for freshening the air, he is said to be experiencing functional fixedness. His knowledge of the can as purely an air freshener hindered his ability to realize that it could also serve another purpose, in this instance as an instrument to kill the bug. Functional fixedness can occur on many occasions and is a source of cognitive bias: if we see an object as serving only one primary function, we fail to realize that it can be used in other ways, which in turn causes many issues with regard to problem solving. Common sense might seem a plausible answer to functional fixedness, since it appears simple to consider possible alternative uses for an object; with the example above, it seems obvious to use the can of air freshener to kill the bug rather than to search for something else to serve that function. But, as research shows, this is often not the case.

Functional fixedness limits people's ability to solve problems accurately by causing a very narrow way of thinking. It can be seen in other types of learning behaviors as well. For instance, research has found functional fixedness in many educational settings. Researchers Furio, Calatayud, Baracenas, and Padilla stated that "... functional fixedness may be found in learning concepts as well as in solving chemistry problems,"[27] and emphasized that the effect appears in this subject area as well as in others.

There are several hypotheses about how functional fixedness relates to problem solving.[28] There are also many ways in which thinking of an object only in terms of its usual function can cause difficulty. If a person usually thinks of something in one way rather than in several ways, this constrains how the person thinks of that particular object. This can be seen as narrow-minded thinking, defined as an inability to see or accept certain ideas in a particular context; functional fixedness, as noted, is very closely related to it. This can happen intentionally or unintentionally, but for the most part it seems to occur unintentionally.

Functional fixedness can affect problem solvers in at least two particular ways. The first is with regards to time, as functional fixedness causes people to use more time than necessary to solve any given problem. Secondly, functional fixedness often causes solvers to make more attempts to solve a problem than they would have made if they were not experiencing this cognitive barrier. In the worst case, functional fixedness can completely prevent a person from realizing a solution to a problem. Functional fixedness is a commonplace occurrence, which affects the lives of many people.

Unnecessary constraints

Unnecessary constraints are another very common barrier that people face while attempting to problem-solve. This particular phenomenon occurs when the subject, trying to solve the problem, subconsciously places boundaries on the task at hand, which in turn forces them to strain to be more innovative in their thinking. The solver hits a barrier when they become fixated on only one way to solve the problem, and it becomes increasingly difficult to see anything but the method they have chosen. Typically, the solver experiences this when attempting to use a method they have already had success with, and they cannot help but try to make it work in the present circumstances as well, even if they see that it is counterproductive.[29]

Groupthink, or taking on the mindset of the rest of the group members, can also act as an unnecessary constraint while trying to solve problems.[30] This is because, when everybody is thinking the same thing, the group settles on the same conclusions and inhibits itself from thinking beyond them. This is very common, but the best-known example of this barrier is the famous dot problem. In this example, nine dots are arranged in a square: three dots across and three dots down. The solver is asked to draw no more than four lines, without lifting pen or pencil from the paper, so that the lines connect all of the dots. What typically happens is that the subject assumes the lines must be drawn without letting the pen or pencil go outside the square of dots. Standardized procedures like this can often bring mentally invented constraints of this kind,[31] and researchers have found a 0% correct solution rate in the time allotted for the task.[32] The imposed constraint inhibits the solver from thinking beyond the bounds of the dots. It is from this phenomenon that the expression "think outside the box" is derived.[33]

This problem can be quickly solved with a dawning of realization, or insight. A few minutes of struggling over a problem can bring these sudden insights, where the solver quickly sees the solution clearly. Problems such as this are most typically solved via insight and can be very difficult for the subject depending on how they have structured the problem in their minds, how they draw on their past experiences, and how much they juggle this information in their working memories.[33] In the case of the nine-dot example, the solver has already structured the problem incorrectly in their mind because of the constraint they have placed upon the solution. In addition, people struggle when they try to compare the problem to their prior knowledge, thinking that they must keep their lines within the dots and not go beyond. They do this because trying to envision the dots connected outside of the basic square puts a strain on their working memory.[33]

The solution to the problem becomes obvious as insight occurs following incremental movements made toward the solution. These small movements happen without the solver's awareness. Then, when the insight is realized fully, the "aha" moment happens for the subject.[34] These moments of insight can take a long while to manifest, or arrive quickly, but the way the solution is reached after toiling over these barriers stays the same.

Irrelevant information

Irrelevant information is information presented within a problem that is unrelated or unimportant to the specific problem.[29] Within the specific context of the problem, irrelevant information would serve no purpose in helping solve that particular problem. Often irrelevant information is detrimental to the problem solving process. It is a common barrier that many people have trouble getting through, especially if they are not aware of it. Irrelevant information makes solving otherwise relatively simple problems much harder.[35]

For example: "Fifteen percent of the people in Topeka have unlisted telephone numbers. You select 200 names at random from the Topeka phone book. How many of these people have unlisted phone numbers?"[36]

People who are not listed in the phone book would not be among the 200 names you selected. People looking at this task naturally want to use the 15% given to them in the problem. They see that there is information present and immediately think that it needs to be used. This of course is not true. These kinds of questions are often used to test students taking aptitude tests or cognitive evaluations.[37] They are not meant to be difficult, but they are meant to require thinking that is not necessarily common. Irrelevant information is commonly found in math problems, word problems in particular, where numerical information is included only to challenge the individual.

One reason irrelevant information is so effective at keeping a person off topic and away from the relevant information is the way in which it is represented.[37] The way information is represented can make a vast difference in how difficult the problem is to solve. Whether a problem is represented visually, verbally, spatially, or mathematically, irrelevant information can have a profound effect on how long a problem takes to be solved, or whether it can be solved at all. The Buddhist monk problem is a classic example of irrelevant information and how it can be represented in different ways:

A Buddhist monk begins at dawn one day walking up a mountain, reaches the top at sunset, meditates at the top for several days until one dawn when he begins to walk back to the foot of the mountain, which he reaches at sunset. Making no assumptions about his starting or stopping or about his pace during the trips, prove that there is a place on the path which he occupies at the same hour of the day on the two separate journeys.

This problem is nearly impossible to solve because of how the information is represented. Because it is written out in a way that presents the information verbally, it causes us to try to create a mental image of the paragraph. This is often very difficult to do, especially with all the irrelevant information involved in the question. The example is much easier to understand when the paragraph is represented visually. If the same problem were asked but accompanied by a corresponding graph, it would be far easier to answer; irrelevant information would no longer serve as a roadblock. By representing the problem visually, there are no difficult words to understand or scenarios to imagine. The visual representation of this problem removes the difficulty of solving it.
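
The visual argument can also be rendered as a small numerical sketch (an illustrative construction, not part of the original problem statement): overlay the ascent and the descent on the same clock; because one position curve starts below the other and ends above it, the two curves must cross at some time of day, and that crossing is the required place on the path.

    # Hedged sketch of the "overlay both journeys" representation of the monk
    # problem: the ascent starts at the foot (0.0) and the descent starts at the
    # summit (1.0), so at some sampled time the two positions coincide (or swap
    # order), which is the place occupied at the same hour on both days.
    import random

    def random_journey(start, end, steps=1000, seed=None):
        """A monotone path from start to end with an arbitrary, uneven pace."""
        rng = random.Random(seed)
        increments = [rng.random() for _ in range(steps)]
        total = sum(increments)
        position, path = start, [start]
        for inc in increments:
            position += (end - start) * inc / total
            path.append(position)
        return path

    up = random_journey(0.0, 1.0, seed=1)     # day 1: foot -> summit
    down = random_journey(1.0, 0.0, seed=2)   # day 2: summit -> foot

    # Find the first sampled time at which the ascent is no longer below the descent.
    t = next(i for i, (u, d) in enumerate(zip(up, down)) if u >= d)
    print(f"crossing near time step {t}: ascent at {up[t]:.3f}, descent at {down[t]:.3f}")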

These types of representations are often used to make difficult problems easier.[38] They can be used on tests as a strategy to remove irrelevant information, which is one of the most common forms of barriers when discussing the issues of problem solving.[29] Identifying crucial information presented in a problem and then being able to correctly identify its usefulness is essential. Being aware of irrelevant information is the first step in overcoming this common barrier.

Cognitive sciences: two schools

In cognitive sciences, researchers' realization that problem-solving processes differ across knowledge domains and across levels of expertise (e.g. Sternberg, 1995) and that, consequently, findings obtained in the laboratory cannot necessarily generalize to problem-solving situations outside the laboratory, has led to an emphasis on real-world problem solving since the 1990s. This emphasis has been expressed quite differently in North America and Europe, however. Whereas North American research has typically concentrated on studying problem solving in separate, natural knowledge domains, much of the European research has focused on novel, complex problems, and has been performed with computerized scenarios (see Funke, 1991, for an overview).

Europe

In Europe, two main approaches have surfaced, one initiated by Donald Broadbent (1977; see Berry & Broadbent, 1995) in the United Kingdom and the other one by Dietrich Dörner (1975, 1985; see Dörner & Wearing, 1995) in Germany. The two approaches share an emphasis on relatively complex, semantically rich, computerized laboratory tasks, constructed to resemble real-life problems. The approaches differ somewhat in their theoretical goals and methodology, however. The tradition initiated by Broadbent emphasizes the distinction between cognitive problem-solving processes that operate under awareness versus outside of awareness, and typically employs mathematically well-defined computerized systems. The tradition initiated by Dörner, on the other hand, has an interest in the interplay of the cognitive, motivational, and social components of problem solving, and utilizes very complex computerized scenarios that contain up to 2,000 highly interconnected variables (e.g., Dörner, Kreuzig, Reither & Stäudel's 1983 LOHHAUSEN project; Ringelband, Misiak & Kluwe, 1990). Buchner (1995) describes the two traditions in detail.

North America

In North America, initiated by the work of Herbert A. Simon on "learning by doing" in semantically rich domains (e.g. Anzai & Simon, 1979; Bhaskar & Simon, 1977), researchers began to investigate problem solving separately in different natural knowledge domains – such as physics, writing, or chess playing – thus relinquishing their attempts to extract a global theory of problem solving (e.g. Sternberg & Frensch, 1991). Instead, these researchers have frequently focused on the development of problem solving within a certain domain, that is on the development of expertise (e.g. Anderson, Boyle & Reiser, 1985; Chase & Simon, 1973; Chi, Feltovich & Glaser, 1981).

A number of such natural knowledge domains have attracted rather intensive attention in North America.

Characteristics of complex problems

As elucidated by Dietrich Dörner and later expanded upon by Joachim Funke, complex problems have some typical characteristics that can be summarized as follows:[citation needed]

  • Complexity (large numbers of items, interrelations and decisions)
  • Dynamics (time considerations)
    • temporal constraints
    • temporal sensitivity
    • phase effects
    • dynamic unpredictability
  • Intransparency (lack of clarity of the situation)
    • commencement opacity
    • continuation opacity
  • Polytely (multiple goals)
    • inexpressiveness
    • opposition
    • transience

Collective problem solving

Problem solving is applied on many different levels − from the individual to the civilizational. Collective problem solving refers to problem solving performed collectively.

Social issues and global issues can typically only be solved collectively.

It has been noted that the complexity of contemporary problems has exceeded the cognitive capacity of any individual and requires different but complementary expertise and collective problem solving ability.[39]

Collective intelligence is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals.

In a 1962 research report, Douglas Engelbart linked collective intelligence to organizational effectiveness, and predicted that pro-actively 'augmenting human intellect' would yield a multiplier effect in group problem solving: "Three people working together in this augmented mode [would] seem to be more than three times as effective in solving a complex problem as is one augmented person working alone".[40]

Henry Jenkins, a key theorist of new media and media convergence, draws on the theory that collective intelligence can be attributed to media convergence and participatory culture.[41] He criticizes contemporary education for failing to incorporate online trends of collective problem solving into the classroom, stating that "whereas a collective intelligence community encourages ownership of work as a group, schools grade individuals". Jenkins argues that interaction within a knowledge community builds vital skills for young people, and that teamwork through collective intelligence communities contributes to the development of such skills.[42]

Collective impact is the commitment of a group of actors from different sectors to a common agenda for solving a specific social problem, using a structured form of collaboration.

After World War II, the UN, the Bretton Woods organizations, and (later) the WTO were created, and since the 1980s collective problem solving at the international level has crystallized around these three types of organizations. As these global institutions remain state-like or state-centric, it has been called unsurprising that they continue state-like or state-centric approaches to collective problem solving rather than alternative ones.[43]

It has been observed that models of liberal democracy provide neither adequate designs for collective problem solving nor adequate means of handling substantive challenges in society (such as crime, war, economic decline, illness, and environmental degradation) so as to produce satisfying outcomes.[44]

Crowdsourcing is a process of accumulating ideas, thoughts, or information from many independent participants, with the aim of finding the best solution for a given challenge. Modern information technologies allow massive numbers of subjects to be involved, as well as systems for managing these suggestions that provide good results.[45] The Internet has created a new capacity for collective, including planetary-scale, problem solving.[46]

Notes

  1. ^ Schacter, D.L. et al. (2009). Psychology, Second Edition. New York: Worth Publishers. pp. 376
  2. ^ "In each case "where you want to be" is an imagined (or written) state in which you would like to be. We might use the term 'Problem Identification' or analysis in order to figure out exactly what the problem is. After we have found a problem we need to define what the problem is. In other words, a distinguished feature of a problem is that there is a goal to be reached and how you get there is not immediately obvious.", What is a problem? in S. Ian Robertson, Problem solving, Psychology Press, 2001, p. 2.
  3. ^ Rubin, M.; Watt, S. E.; Ramelli, M. (2012). "Immigrants' social integration as a function of approach-avoidance orientation and problem-solving style". International Journal of Intercultural Relations. 36: 498–505. doi:10.1016/j.ijintrel.2011.12.009.
  4. ^ Goldstein F. C., & Levin H. S. (1987). Disorders of reasoning and problem-solving ability. In M. Meier, A. Benton, & L. Diller (Eds.), Neuropsychological rehabilitation. London: Taylor & Francis Group.
  5. ^ Bernd Zimmermann, On mathematical problem solving processes and history of mathematics, University of Jena.
  6. ^ Duncker, K. (1935). Zur Psychologie des produktiven Denkens [The psychology of productive thinking]. Berlin: Julius Springer.
  7. ^ For example Duncker's "X-ray" problem; Ewert & Lambert's "disk" problem in 1932, later known as Tower of Hanoi.
  8. ^ Mayer, R. E. (1992). Thinking, problem solving, cognition. Second edition. New York: W. H. Freeman and Company.
  9. ^ Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  10. ^ J. Scott Armstrong, William B. Denniston, Jr. and Matt M. Gordon (1975). "The Use of the Decomposition Principle in Making Judgments" (PDF). Organizational Behavior and Human Performance. 14: 257–263. doi:10.1016/0030-5073(75)90028-8.
  11. ^ a b c "Commander's Handbook for Strategic Communication and Communication Strategy" (PDF). United States Joint Forces Command, Joint Warfighting Center, Suffolk, VA. 24 June 2010. Retrieved 10 October 2016.
  12. ^ "Einstein's Secret to Amazing Problem Solving (and 10 Specific Ways You Can Use It) - Litemind". litemind.com. Retrieved 2017-06-11.
  13. ^ Malakooti, Behnam (2013). Operations and Production Systems with Multiple Objectives. John Wiley & Sons. ISBN 978-1-118-58537-5.
  14. ^ Firend, Al R. (2014) The Problem Solving Model "PSM", The International Journal of Business and Management Research. Vol.7, No.1
  15. ^ Wang, Y., & Chiew, V. (2010). On the cognitive process of human problem solving. Cognitive Systems Research, 11(1), 81-92.
  16. ^ Nickerson, R. S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 176. doi:10.1037/1089-2680.2.2.175.
  17. ^ Hergovich, Schott; Burger (2010). "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology". Current Psychology. 29 (3): 188–209. doi:10.1007/s12144-010-9087-5.
  18. ^ Nickerson, R. S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 175–220. doi:10.1037/1089-2680.2.2.175.
  19. ^ Allen (2011). "Theory-led confirmation bias and experimental persona". Research in Science & Technological Education. 29 (1): 107–127. doi:10.1080/02635143.2010.539973.
  20. ^ Wason, P. C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology. 12: 129–140. doi:10.1080/17470216008416717.
  21. ^ Luchins, A. S. (1942). Mechanization in problem solving: The effect of Einstellung. Psychological Monographs, 54 (Whole No. 248).
  22. ^ Öllinger, Jones, & Knoblich (2008). Investigating the effect of mental set on insight problem solving. Experimental Psychology, 55(4), 269–270.
  23. ^ Wiley, J (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392.
  24. ^ Wiley, J (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392.
  25. ^ German, Tim, P., and Barrett, Clark., H. Functional fixedness in a technologically sparse culture. University of California, Santa Barbara. American psychological society. 16 (1), 2005.
  26. ^ German, Tim, P., Defeyter, Margaret A. Immunity to functional fixedness in young children. University of Essex, Colchester, England. Psychonomic Bulletin and Review. 7 (4), 2000.
  27. ^ Furio, C.; Calatayud, M. L.; Baracenas, S; Padilla, O (2000). "Functional fixedness and functional reduction as common sense reasonings in chemical equilibrium and in geometry and polarity of molecules. Valencia, Spain". Science Education. 84 (5).
  28. ^ Adamson, Robert E. (1952). "Functional fixedness as related to problem solving: A repetition of three experiments" (Stanford University, California). Journal of Experimental Psychology. 44 (4). doi:10.1037/h0062487.
  29. ^ a b c Kellogg, R. T. (2003). Cognitive psychology (2nd ed.). California: Sage Publications, Inc.
  30. ^ Cottam, Martha L., Dietz-Uhler, Beth, Mastors, Elena, & Preston, & Thomas. (2010). Introduction to Political Psychology (2nd ed.). New York: Psychology Press.
  31. ^ Meloy, J. R. (1998). The Psychology of Stalking, Clinical and Forensic Perspectives (2nd ed.). London, England: Academic Press.
  32. ^ MacGregor, J.N.; Ormerod, T.C.; Chronicle, E.P. (2001). "Information-processing and insight: A process model of performance on the nine-dot and related problems". Journal of Experimental Psychology: Learning, Memory, and Cognition. 27 (1): 176–201. doi:10.1037/0278-7393.27.1.176.
  33. ^ a b c Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.). California: Wadsworth.
  34. ^ Novick, L. R., & Bassok, M. (2005). Problem solving. In K. J. Holyoak & R. G. Morrison (Eds.), Cambridge handbook of thinking and reasoning (Ch. 14, pp. 321-349). New York, NY: Cambridge University Press.
  35. ^ Walinga, Jennifer (2010). "From walls to windows: Using barriers as pathways to insightful solutions". The Journal of Creative Behavior. 44: 143–167. doi:10.1002/j.2162-6057.2010.tb01331.x.
  36. ^ Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.) California: Wadsworth.
  37. ^ a b Walinga, Jennifer, Cunningham, J. Barton, & MacGregor, James N. (2011). Training insight problem solving through focus on barriers and assumptions. The Journal of Creative Behavior.
  38. ^ Vlamings, Petra H. J. M., Hare, Brian, & Call, Joseph. Reaching around barriers: The performance of great apes and 3-5-year-old children. Animal Cognition, 13, 273-285. doi:10.1007/s10071-009-0265-5
  39. ^ Hung, Woei (24 April 2013). "Team-based complex problem solving: a collective cognition perspective". Educational Technology Research and Development. 61 (3): 365–384. doi:10.1007/s11423-013-9296-3. Retrieved 29 January 2017.
  40. ^ Engelbart, Douglas (1962) Augmenting Human Intellect: A Conceptual Framework - section on Team Cooperation
  41. ^ Flew, Terry (2008). New Media: an introduction. Melbourne: Oxford University Press.
  42. ^ Henry, Jenkins. "INTERACTIVE AUDIENCES? THE 'COLLECTIVE INTELLIGENCE' OF MEDIA FANS" (PDF). Retrieved December 11, 2016.
  43. ^ Park, Jacob; Conca, Ken; Finger, Matthias. The Crisis of Global Environmental Governance: Towards a New Political Economy of Sustainability. Routledge. ISBN 9781134059829. Retrieved 29 January 2017.
  44. ^ Briggs, Xavier de Souza. Democracy as Problem Solving: Civic Capacity in Communities Across the Globe. MIT Press. ISBN 9780262524858. Retrieved 29 January 2017.
  45. ^ Guazzini, Andrea; Vilone, Daniele; Donati, Camillo; Nardi, Annalisa; Levnajić, Zoran (10 November 2015). "Modeling crowdsourcing as collective problem solving". Scientific Reports. 5: 16557. doi:10.1038/srep16557. Retrieved 29 January 2017.
  46. ^ Stefanovitch, Nicolas; Alshamsi, Aamena; Cebrian, Manuel; Rahwan, Iyad (30 September 2014). "Error and attack tolerance of collective problem solving: The DARPA Shredder Challenge". EPJ Data Science. 3 (1). doi:10.1140/epjds/s13688-014-0013-1.

References

  • Altshuller, Genrich (1994). And Suddenly the Inventor Appeared. translated by Lev Shulyak. Worcester, MA: Technical Innovation Center. ISBN 0-9640740-1-X.
  • Amsel, E., Langer, R., & Loutzenhiser, L. (1991). Do lawyers reason differently from psychologists? A comparative design for studying expertise. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 223-250). Hillsdale, NJ: Lawrence Erlbaum Associates. ISBN 978-0-8058-1783-6
  • Anderson, J. R., Boyle, C. B., & Reiser, B. J. (1985). "Intelligent tutoring systems". Science. 228 (4698): 456–462. doi:10.1126/science.228.4698.456. PMID 17746875.
  • Anzai, K., & Simon, H. A. (1979). "The theory of learning by doing". Psychological Review. 86 (2): 124–140. doi:10.1037/0033-295X.86.2.124. PMID 493441.
  • Beckmann, J. F., & Guthke, J. (1995). Complex problem solving, intelligence, and learning ability. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 177-200). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Berry, D. C., & Broadbent, D. E. (1995). Implicit learning in the control of complex systems: A reconsideration of some of the earlier claims. In P.A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 131-150). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Bhaskar, R., & Simon, H. A. (1977). Problem solving in semantically rich domains: An example from engineering thermodynamics. Cognitive Science, 1, 193-215.
  • Blanchard-Fields, F. (2007). "Everyday problem solving and emotion: An adult developmental perspective". Current Directions in Psychological Science. 16 (1): 26–31. doi:10.1111/j.1467-8721.2007.00469.x.
  • Bransford, J. D., & Stein, B. S. (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman.
  • Brehmer, B. (1995). Feedback delays in dynamic decision making. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 103-130). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Brehmer, B., & Dörner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171-184.
  • Broadbent, D. E. (1977). Levels, hierarchies, and the locus of control. Quarterly Journal of Experimental Psychology, 29, 181-201.
  • Bryson, M., Bereiter, C., Scardamalia, M., & Joram, E. (1991). Going beyond the problem as given: Problem solving in expert and novice writers. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 61-84). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Buchner, A. (1995). Theories of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 27-63). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55-81.
  • Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). "Categorization and representation of physics problems by experts and novices". Cognitive Science. 5 (2): 121–152. doi:10.1207/s15516709cog0502_2.
  • Dörner, D. (1975). Wie Menschen eine Welt verbessern wollten [How people wanted to improve the world]. Bild der Wissenschaft, 12, 48-53.
  • Dörner, D. (1985). Verhalten, Denken und Emotionen [Behavior, thinking, and emotions]. In L. H. Eckensberger & E. D. Lantermann (Eds.), Emotion und Reflexivität (pp. 157-181). München, Germany: Urban & Schwarzenberg.
  • Dörner, D. (1992). Über die Philosophie der Verwendung von Mikrowelten oder "Computerszenarios" in der psychologischen Forschung [On the proper use of microworlds or "computer scenarios" in psychological research]. In H. Gundlach (Ed.), Psychologische Forschung und Methode: Das Versprechen des Experiments. Festschrift für Werner Traxel (pp. 53-87). Passau, Germany: Passavia-Universitäts-Verlag.
  • Dörner, D., Kreuzig, H. W., Reither, F., & Stäudel, T. (Eds.). (1983). Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität [Lohhausen. On dealing with uncertainty and complexity]. Bern, Switzerland: Hans Huber.
  • Dörner, D., & Wearing, A. (1995). Complex problem solving: Toward a (computer-simulated) theory. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 65-99). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Duncker, K. (1935). Zur Psychologie des produktiven Denkens [The psychology of productive thinking]. Berlin: Julius Springer.
  • Ewert, P. H., & Lambert, J. F. (1932). Part II: The effect of verbal instructions upon the formation of a concept. Journal of General Psychology, 6, 400-411.
  • D'Zurilla, T. J.; Goldfried, M. R. (1971). "Problem solving and behavior modification". Journal of Abnormal Psychology. 78: 107–126. doi:10.1037/h0031360.
  • D'Zurilla, T. J., & Nezu, A. M. (1982). Social problem solving in adults. In P. C. Kendall (Ed.), Advances in cognitive-behavioral research and therapy (Vol. 1, pp. 201–274). New York: Academic Press.
  • Eyferth, K., Schömann, M., & Widowski, D. (1986). Der Umgang von Psychologen mit Komplexität [On how psychologists deal with complexity]. Sprache & Kognition, 5, 11-26.
  • Frensch, P. A., & Funke, J. (Eds.). (1995). Complex problem solving: The European Perspective. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Frensch, P. A., & Sternberg, R. J. (1991). Skill-related differences in game playing. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 343-381). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, J. (1991). Solving complex problems: Human identification and control of complex systems. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 185-222). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, J. (1993). Microworlds based on linear equation systems: A new approach to complex problem solving and experimental results. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 313-330). Amsterdam: Elsevier Science Publishers.
  • Funke, J. (1995). Experimental research on complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 243-268). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, U. (1995). Complex problem solving in personnel selection and training. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 219-240). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Goldstein F. C., & Levin H. S. (1987). Disorders of reasoning and problem-solving ability. In M. Meier, A. Benton, & L. Diller (Eds.), Neuropsychological rehabilitation. London: Taylor & Francis Group.
  • Groner, M., Groner, R., & Bischof, W. F. (1983). Approaches to heuristics: A historical review. In R. Groner, M. Groner, & W. F. Bischof (Eds.), Methods of heuristics (pp. 1-18). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hayes, J. (1980). The complete problem solver. Philadelphia: The Franklin Institute Press.
  • Hegarty, M. (1991). Knowledge and processes in mechanical problem solving. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 253-285). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Heppner, P. P., & Krauskopf, C. J. (1987). An information-processing approach to personal problem solving. The Counseling Psychologist, 15, 371-447.
  • Huber, O. (1995). Complex problem solving as multi stage decision making. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 151-173). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hübner, R. (1989). Methoden zur Analyse und Konstruktion von Aufgaben zur kognitiven Steuerung dynamischer Systeme [Methods for the analysis and construction of dynamic system control tasks]. Zeitschrift für Experimentelle und Angewandte Psychologie, 36, 221-238.
  • Hunt, E. (1991). Some comments on the study of complexity. In R. J. Sternberg, & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 383-395). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hussy, W. (1985). Komplexes Problemlösen - Eine Sackgasse? [Complex problem solving - a dead end?]. Zeitschrift für Experimentelle und Angewandte Psychologie, 32, 55-77.
  • Kay, D. S. (1991). Computer interaction: Debugging the problems. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 317-340). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Kluwe, R. H. (1993). Knowledge and performance in complex problem solving. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 401-423). Amsterdam: Elsevier Science Publishers.
  • Kluwe, R. H. (1995). Single case studies and models of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 269-291). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Kolb, S., Petzing, F., & Stumpf, S. (1992). Komplexes Problemlösen: Bestimmung der Problemlösegüte von Probanden mittels Verfahren des Operations Research – ein interdisziplinärer Ansatz [Complex problem solving: determining the quality of human problem solving by operations research tools - an interdisciplinary approach]. Sprache & Kognition, 11, 115-128.
  • Krems, J. F. (1995). Cognitive flexibility and complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 201-218). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Lesgold, A., & Lajoie, S. (1991). Complex problem solving in electronics. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 287-316). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Mayer, R. E. (1992). Thinking, problem solving, cognition. Second edition. New York: W. H. Freeman and Company.
  • Müller, H. (1993). Komplexes Problemlösen: Reliabilität und Wissen [Complex problem solving: Reliability and knowledge]. Bonn, Germany: Holos.
  • Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  • Paradies, M.W., & Unger, L. W. (2000). TapRooT - The System for Root Cause Analysis, Problem Investigation, and Proactive Improvement. Knoxville, TN: System Improvements.
  • Putz-Osterloh, W. (1993). Strategies for knowledge acquisition and transfer of knowledge in dynamic tasks. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 331-350). Amsterdam: Elsevier Science Publishers.
  • Rath J. F.; Langenbahn D. M.; Simon D.; Sherr R. L.; Fletcher J.; Diller L. (2004). The construct of problem solving in higher level neuropsychological assessment and rehabilitation. Archives of Clinical Neuropsychology, 19, 613-635.
  • Rath, J. F.; Simon, D.; Langenbahn, D. M.; Sherr, R. L.; Diller, L. (2003). Group treatment of problem-solving deficits in outpatients with traumatic brain injury: A randomised outcome study. Neuropsychological Rehabilitation, 13, 461-488.
  • Riefer, D.M., & Batchelder, W.H. (1988). Multinomial modeling and the measurement of cognitive processes. Psychological Review, 95, 318-339.
  • Ringelband, O. J., Misiak, C., & Kluwe, R. H. (1990). Mental models and strategies in the control of a complex system. In D. Ackermann, & M. J. Tauber (Eds.), Mental models and human-computer interaction (Vol. 1, pp. 151-164). Amsterdam: Elsevier Science Publishers.
  • Schaub, H. (1993). Modellierung der Handlungsorganisation. Bern, Switzerland: Hans Huber.
  • Schoenfeld, A. H. (1985). Mathematical Problem Solving. Orlando, FL: Academic Press.
  • Sokol, S. M., & McCloskey, M. (1991). Cognitive mechanisms in calculation. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 85-116). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Stanovich, K. E., & Cunningham, A. E. (1991). Reading as constrained reasoning. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 3-60). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Sternberg, R. J. (1995). Conceptions of expertise in complex problem solving: A comparison of alternative conceptions. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 295-321). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Sternberg, R. J., & Frensch, P. A. (Eds.). (1991). Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Strauß, B. (1993). Konfundierungen beim Komplexen Problemlösen. Zum Einfluß des Anteils der richtigen Lösungen (ArL) auf das Problemlöseverhalten in komplexen Situationen [Confoundations in complex problem solving. On the influence of the degree of correct solutions on problem solving in complex situations]. Bonn, Germany: Holos.
  • Strohschneider, S. (1991). Kein System von Systemen! Kommentar zu dem Aufsatz "Systemmerkmale als Determinanten des Umgangs mit dynamischen Systemen" von Joachim Funke [No system of systems! Reply to the paper "System features as determinants of behavior in dynamic task environments" by Joachim Funke]. Sprache & Kognition, 10, 109-113.
  • Van Lehn, K. (1989). Problem solving and cognitive skill acquisition. In M. I. Posner (Ed.), Foundations of cognitive science (pp. 527-579). Cambridge, MA: MIT Press.
  • Voss, J. F., Wolfe, C. R., Lawrence, J. A., & Engle, R. A. (1991). From representation to decision: An analysis of problem solving in international relations. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 119-158). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Wagner, R. K. (1991). Managerial problem solving. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 159-183). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Wisconsin Educational Media Association. (1993). "Information literacy: A position paper on information problem-solving." Madison, WI: WEMA Publications. (ED 376 817). (Portions adapted from Michigan State Board of Education's Position Paper on Information Processing Skills, 1992).