Causality

The Illustrated Sutra of Cause and Effect. 8th century, Japan

Causality (also referred to as causation[1]) is the relation between an event (the cause) and a second event (the effect), where the second event is understood as a consequence of the first.[2]

In common usage, causality is also the relation between a set of factors (causes) and sub-factors. Anything that affects an effect is a factor of that effect. A direct factor is a factor that affects an effect directly, that is, without any intervening factors (intervening factors are sometimes called "intermediate factors"). The connection between a cause or causes and an effect in this way can also be referred to as a causal nexus.

Though the causes and effects are typically related to changes or events, candidates include objects, processes, properties, variables, facts, and states of affairs; characterizing the causal relation can be the subject of much debate.

The philosophical treatment of the subject of causality extends over millennia. In the Western philosophical tradition, discussion stretches back at least to Aristotle, and the topic remains a staple in contemporary philosophy.

History

Western philosophy

Aristotelian

Aristotle identified four kinds of answer, or explanatory modes, to various "Why?" questions. As a result of traditional peculiarities of language, with translations between ancient Greek, Latin, and English, the word 'cause' is nowadays customarily used to label all four of Aristotle's kinds.[3][4]

  • Material cause, the material from whence a thing has come or that which persists while it changes, as for example, one's mother or the bronze of a statue (see also substance theory).[5]
  • Formal cause, whereby a thing's dynamic form or static shape determines the thing's properties and function, as a human differs from a statue of a human or as a statue differs from a lump of bronze.[6]
  • Efficient cause, which imparts the first relevant movement, as a human lifts a rock or raises a statue.
  • Final cause, the criterion of completion, or the end; it may refer to an action or to an inanimate process. Examples: Socrates takes a walk after dinner for the sake of his health; earth falls to the lowest level because that is its nature.

Of Aristotle's four kinds or explanatory modes, only one, the 'efficient cause', is a cause as defined in the lead paragraph of this article. The other three explanatory modes would now be called material composition, structure and dynamics, and, again, criterion of completion. The word that Aristotle used was αἰτία. For the present purpose, that Greek word would be better translated as "explanation" than as "cause" as those words are most often used in current English. Another translation of Aristotle is that he meant "the four Becauses" as four kinds of answer to "why" questions.[3]

In some works of Aristotle, the four causes are listed as (1) the essential cause, (2) the logical ground, (3) the moving cause, and (4) the final cause. In this listing, a statement of essential cause is a demonstration that an indicated object conforms to a definition of the word that refers to it. A statement of logical ground is an argument as to why an object statement is true. These are further examples of the idea that a "cause" in general in the context of Aristotle's usage is an "explanation".[3]

The word "efficient" used here can also be translated from Aristotle as "moving" or "initiating".[3]

Efficient causation was connected with Aristotelian physics, which recognized the four elements (earth, air, fire, water) and added a fifth (aether). By their intrinsic property of gravitas, or heaviness, water and earth fall toward Earth's center, the motionless center of the universe, while by their intrinsic property of levitas, or lightness, air and fire rise away from it; each moves in a straight line, accelerating as it approaches its natural place.

Air, however, remained near Earth rather than escaping it while accelerating to infinite speed, an absurdity; Aristotle therefore inferred that the universe is finite in size and contains an invisible substance that held planet Earth and its atmosphere, the sublunary sphere, centered in the universe. And since celestial bodies exhibit perpetual, unaccelerated motion orbiting Earth in unchanging relations, Aristotle inferred that the fifth element, aether, which fills space and composes celestial bodies, intrinsically moves in perpetual circles, the only constant motion between two points. (An object traveling a straight line from point A to B and back must stop at either point before returning to the other.)

Left to itself, a thing exhibits natural motion, but can—according to Aristotelian metaphysics—exhibit enforced motion imparted by an efficient cause. The form of plants endows plants with the processes of nutrition and reproduction, the form of animals adds locomotion, and the form of humankind adds reason atop these. A rock normally exhibits natural motion—explained by the rock's material cause of being composed of the element earth—but a living thing can lift the rock, an enforced motion diverting the rock from its natural place and natural motion. As a further kind of explanation, Aristotle identified the final cause, specifying a purpose or criterion of completion in light of which something should be understood.

Aristotle himself explained,

Cause means

(a) in one sense, that as the result of whose presence something comes into being—e.g., the bronze of a statue and the silver of a cup, and the classes which contain these [i.e., the material cause];

(b) in another sense, the form or pattern; that is, the essential formula and the classes which contain it—e.g. the ratio 2:1 and number in general is the cause of the octave—and the parts of the formula [i.e., the formal cause].

(c) The source of the first beginning of change or rest; e.g. the man who plans is a cause, and the father is the cause of the child, and in general that which produces is the cause of that which is produced, and that which changes of that which is changed [i.e., the efficient cause].

(d) The same as "end"; i.e. the final cause; e.g., as the "end" of walking is health. For why does a man walk? "To be healthy", we say, and by saying this we consider that we have supplied the cause [the final cause].

(e) All those means towards the end which arise at the instigation of something else, as, e.g., fat-reducing, purging, drugs and instruments are causes of health; for they all have the end as their object, although they differ from each other as being some instruments, others actions [i.e., necessary conditions].

— Metaphysics, Book 5, section 1013a, translated by Hugh Tredennick[7]

Aristotle further discerned two modes of causation: proper (prior) causation and accidental (chance) causation. All causes, proper and accidental, can be spoken of as potential or as actual, particular or generic. The same language refers to the effects of causes, so that generic effects are assigned to generic causes, particular effects to particular causes, and actual effects to operating causes.

Averting infinite regress, Aristotle inferred the first mover—an unmoved mover. The first mover's motion, too, must have been caused, but, being an unmoved mover, must have moved only toward a particular goal or desire. So the universe of material causes, formal causes, and efficient causes reflected the universe's final cause.[citation needed]

Middle Ages

In line with Aristotelian cosmology, Thomas Aquinas proposed a hierarchy prioritizing Aristotle's four causes: "final > efficient > material > formal".[8] Aquinas sought to identify the first efficient cause, now simply the first cause, which everyone, he said, would agree to call God. Later in the Middle Ages, many scholars conceded that the first cause was God, but explained that many earthly events occur within God's design or plan, and thereby sought freedom to investigate the numerous secondary causes.

After the Middle Ages

With the end of the Middle Ages, however, Aristotle's approach, especially concerning formal and final causes, was criticized by authors such as Niccolò Machiavelli, in the field of political thinking, and Francis Bacon, concerning science more generally. A widely used modern definition of causality was originally given by David Hume.[8] He denied that we can ever perceive cause and effect, except by developing a habit or custom of mind whereby we come to associate two types of object or event, always contiguous and occurring one after the other.[9] In Part III, section XV of his Treatise, Hume expanded this to a list of eight ways of judging whether two things might be cause and effect. The first three:

1. "The cause and effect must be contiguous in space and time."
2. "The cause must be prior to the effect."
3. "There must be a constant union betwixt the cause and effect. 'Tis chiefly this quality, that constitutes the relation."

And then additionally there are three connected criteria which come from our experience and which are "the source of most of our philosophical reasonings":

4. "The same cause always produces the same effect, and the same effect never arises but from the same cause. This principle we derive from experience, and is the source of most of our philosophical reasonings."
5. Hanging upon the above, Hume says that "where several different objects produce the same effect, it must be by means of some quality, which we discover to be common amongst them."
6. And "founded on the same reason": "The difference in the effects of two resembling objects must proceed from that particular, in which they differ."

And then two more:

7. "When any object encreases or diminishes with the encrease or diminution of its cause, 'tis to be regarded as a compounded effect, deriv'd from the union of the several different effects, which arise from the several different parts of the cause."
8. An "object, which exists for any time in its full perfection without any effect, is not the sole cause of that effect, but requires to be assisted by some other principle, which may forward its influence and operation."

In 1949, physicist Max Born distinguished determination from causality. For him, determination meant that actual events are so linked by laws of nature that certainly reliable predictions and retrodictions can be made from sufficient present data about them. For him, there are two kinds of causation, which we may here call nomic or generic causation, and singular causation. Nomic causality means that cause and effect are linked by more or less certain or probabilistic general laws covering many possible or potential instances; we may recognize this as a probabilized version of criterion 3. of Hume mentioned just above. Singular causation means that unique particular chains of actual events are essentially and physically linked by antecedence and contiguity, which we may here recognize as criteria 1. and 2. of Hume mentioned just above.[10]

19th century: The Second Law of Thermodynamics

In thermodynamics, a branch of physics, the Second Law of Thermodynamics, discovered in the 19th century, helps define an arrow of time. This provides an opportunity to describe physically how causes differ from effects: the sum of the effects can never have lower entropy than the sum of the causes, provided equilibrium conditions hold.

This is more thoroughly described below.

Causality, determinism, and existentialism

The deterministic world-view is one in which the universe is no more than a chain of events following one after another according to the law of cause and effect. For an incompatibilist who holds this worldview, there is no such thing as "free will". However, compatibilists argue that determinism is compatible with, or even necessary for, free will. Existentialists argue that while no intrinsic meaning has been designed in a deterministic universe, we each can provide a meaning for ourselves.[11]

Indian philosophy

Karma is the belief, held by Sanathana Dharma and other major religions, that a person's actions cause certain effects in the current life and/or in a future life, positively or negatively. The various philosophical schools (darsanas) provide different accounts of the subject. The doctrine of satkaryavada affirms that the effect inheres in the cause in some way; the effect is thus either a real or apparent modification of the cause. The doctrine of asatkaryavada affirms that the effect does not inhere in the cause, but is a new arising. See Nyaya for some details of the theory of causation in the Nyaya school. In the Brahma Samhita, Brahma describes Krishna as the prime cause of all causes.[12]

The four types of causes identified by Aristotle (see above) are also recognized in the Vedic literature: the material cause (upādāna), the instrumental cause (nimitta), the formal cause (rūpa) and the ultimate cause (parāyana).[citation needed]

Bhagavad-gītā 18.14 identifies five causes of any action, knowledge of which allows it to be perfected: the body, the individual soul, the senses, the efforts, and the supersoul.

Buddhist philosophy

According to the theory of action and result (karmaphala), our karmic actions are the principal cause of our happiness or suffering. From the Buddhist point of view, a positive or wholesome action is one that will lead to greater happiness for ourselves and others, and a negative or unwholesome action is one that will lead to greater suffering for ourselves or others.

The general or universal definition of pratityasamutpada (or "dependent origination" or "dependent arising" or "interdependent co-arising") is that everything arises in dependence upon multiple causes and conditions; nothing exists as a singular, independent entity.[b][c] A traditional example used in Buddhist texts is of three sticks standing upright and leaning against each other and supporting each other. If one stick is taken away, the other two will fall to the ground. Thich Nhat Hanh explains:[9]

Pratitya samutpada is sometimes called the teaching of cause and effect, but that can be misleading, because we usually think of cause and effect as separate entities, with cause always preceding effect, and one cause leading to one effect. According to the teaching of Interdependent Co-Arising, cause and effect co-arise (samutpada) and everything is a result of multiple causes and conditions... In the sutras, this image is given: "Three cut reeds can stand only by leaning on one another. If you take one away, the other two will fall." For a table to exist, we need wood, a carpenter, time, skillfulness, and many other causes. And each of these causes needs other causes to be. The wood needs the forest, the sunshine, the rain, and so on. The carpenter needs his parents, breakfast, fresh air, and so on. And each of those things, in turn, has to be brought about by other causes and conditions. If we continue to look in this way, we'll see that nothing has been left out. Everything in the cosmos has come together to bring us this table. Looking deeply at the sunshine, the leaves of the tree, and the clouds, we can see the table. The one can be seen in the all, and the all can be seen in the one. One cause is never enough to bring about an effect. A cause must, at the same time, be an effect, and every effect must also be the cause of something else. Cause and effect inter-are. The idea of first and only cause, something that does not itself need a cause, cannot be applied.[d]

Logic

Necessary and sufficient causes

A similar concept occurs in logic; for this, see Necessary and sufficient conditions.

Causes are often distinguished into two types: Necessary and sufficient.[13] A third type of causation, which requires neither necessity nor sufficiency in and of itself, but which contributes to the effect, is called a "contributory cause."[14]

Necessary causes:

If x is a necessary cause of y, then the presence of y necessarily implies the presence of x. The presence of x, however, does not imply that y will occur.

Sufficient causes:

If x is a sufficient cause of y, then the presence of x necessarily implies the presence of y. However, another cause z may alternatively cause y. Thus the presence of y does not imply the presence of x.

Contributory causes:

A cause may be classified as a "contributory cause", if the presumed cause precedes the effect, and altering the cause alters the effect. It does not require that all those subjects which possess the contributory cause experience the effect. It does not require that all those subjects which are free of the contributory cause be free of the effect. In other words, a contributory cause may be neither necessary nor sufficient but it must be contributory.[15][16]
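In the notation of propositional logic, the first two relationships can be summarized as follows; this is only a compact restatement of the definitions above, and the arrow denotes material implication, not a causal operator:

$$x \text{ is a necessary cause of } y \iff (y \Rightarrow x)$$
$$x \text{ is a sufficient cause of } y \iff (x \Rightarrow y)$$

A contributory cause admits no comparably simple formula, since it is characterized above by difference-making (altering the cause alters the effect) rather than by implication alone.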

J. L. Mackie argues that usual talk of "cause" in fact refers to INUS conditions (insufficient but non-redundant parts of a condition which is itself unnecessary but sufficient for the occurrence of the effect).[17] Consider, for example, a short circuit as a cause of a house burning down. Consider the collection of events: the short circuit, the proximity of flammable material, and the absence of firefighters. Together these are unnecessary but sufficient for the house's burning down (since many other collections of events certainly could have led to the house burning down, for example shooting the house with a flamethrower in the presence of oxygen, and so on). Within this collection, the short circuit is an insufficient part (since by itself it would not have caused the fire) but a non-redundant part (since, everything else being equal, the fire would not have happened without it) of a condition which is itself unnecessary (since something else could also have caused the house to burn down) but sufficient for the occurrence of the effect. So, the short circuit is an INUS condition for the occurrence of the house burning down.
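As a rough illustration, the short-circuit example can be encoded as a toy boolean model; this is only a sketch, and the function house_burns_down below is a hypothetical simplification, not Mackie's own formalization:

# Toy boolean model of Mackie's INUS analysis (hypothetical simplification).
def house_burns_down(short_circuit, flammable_material, no_firefighters, flamethrower):
    # One sufficient but unnecessary scenario: short circuit + flammable material
    # + absence of firefighters. An alternative sufficient cause: a flamethrower.
    return (short_circuit and flammable_material and no_firefighters) or flamethrower

# The short circuit by itself is insufficient:
assert not house_burns_down(True, False, False, False)

# But within its scenario it is non-redundant: the complete scenario suffices,
# and removing only the short circuit (everything else equal) prevents the fire.
assert house_burns_down(True, True, True, False)
assert not house_burns_down(False, True, True, False)

# The scenario as a whole is unnecessary, since another cause suffices on its own:
assert house_burns_down(False, False, False, True)

All four assertions pass, which is exactly the pattern Mackie's acronym describes: insufficient but non-redundant within a condition that is itself unnecessary but sufficient.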

Causality contrasted with conditionals

Conditional statements are not statements of causality. An important distinction is that statements of causality require the antecedent to precede or coincide with the consequent in time, whereas conditional statements do not require this temporal order. Confusion commonly arises since many different statements in English may be presented using "If ..., then ..." form (and, arguably, because this form is far more commonly used to make a statement of causality). The two types of statements are distinct, however.

For example, all of the following statements are true when interpreting "If ..., then ..." as the material conditional:

  1. If Barack Obama is president of the United States in 2011, then Germany is in Europe.
  2. If George Washington is president of the United States in 2011, then <arbitrary statement>.

The first is true since both the antecedent and the consequent are true. The second is true in sentential logic and indeterminate in natural language, regardless of the consequent statement that follows, because the antecedent is false.

The ordinary indicative conditional has somewhat more structure than the material conditional. For instance, although the first is the closest, neither of the preceding two statements seems true as an ordinary indicative reading. But the sentence

  • If Shakespeare of Stratford-on-Avon did not write Macbeth, then someone else did.

intuitively seems to be true, even though there is no straightforward causal relation in this hypothetical situation between Shakespeare's not writing Macbeth and someone else's actually writing it.

Another sort of conditional, the counterfactual conditional, has a stronger connection with causality, yet even counterfactual statements are not all examples of causality. Consider the following two statements:

  1. If A were a triangle, then A would have three sides.
  2. If switch S were thrown, then bulb B would light.

In the first case, it would not be correct to say that A's being a triangle caused it to have three sides, since the relationship between triangularity and three-sidedness is that of definition. The property of having three sides actually determines A's state as a triangle. Nonetheless, even when interpreted counterfactually, the first statement is true.

A full grasp of the concept of conditionals is important to understanding the literature on causality. A crucial stumbling block is that conditionals in everyday English are usually loosely used to describe a general situation. For example, "If I drop my coffee, then my shoe gets wet" relates an infinite number of possible events. It is shorthand for "For any fact that would count as 'dropping my coffee', some fact that counts as 'my shoe gets wet' will be true". This general statement will be strictly false if there is any circumstance where I drop my coffee and my shoe doesn't get wet. However, an "If ..., then ..." statement in logic typically relates two specific events or facts—a specific coffee-dropping did or did not occur, and a specific shoe-wetting did or did not follow. Thus, with explicit events in mind, if I drop my coffee and wet my shoe, then it is true that "If I dropped my coffee, then I wet my shoe", regardless of the fact that yesterday I dropped a coffee in the trash for the opposite effect—the conditional relates to specific facts. More counterintuitively, if I didn't drop my coffee at all, then it is also true that "If I drop my coffee then I wet my shoe", or "Dropping my coffee implies I wet my shoe", regardless of whether I wet my shoe or not by any means. This logical usage would not seem counterintuitive were it not for the competing everyday usage. Briefly, "If A then B" is equivalent to the first-order logic statement "A implies B", or "not (A and not B)", where A and B are predicates, but the more familiar usage of an "if A then B" statement would need to be written symbolically using a higher-order logic using quantifiers ("for all" and "there exists").
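The material reading can be made concrete with a small truth-table script; this is a sketch in Python, and the function name material_conditional is just an illustrative label:

# "If A then B" read materially, i.e. as "not (A and not B)", i.e. "(not A) or B".
def material_conditional(a: bool, b: bool) -> bool:
    return (not a) or b

for a in (True, False):
    for b in (True, False):
        print(f"A={a!s:<5}  B={b!s:<5}  (if A then B) = {material_conditional(a, b)}")

The printout shows the counterintuitive rows discussed above: whenever A is false, the material conditional is true regardless of B, which is why the George Washington example comes out true in sentential logic.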

Questionable cause

Fallacies of questionable cause, also known as causal fallacies, non-causa pro causa (Latin for "non-cause for cause"), or false cause, are informal fallacies where a cause is incorrectly identified.

Theories

Counterfactual theories

A counterfactual conditional (sometimes called a subjunctive conditional or remote conditional and abbreviated cf) is a modal subjunctive conditional statement indicating what would be the case if its antecedent were true. This is to be contrasted with a material conditional, which indicates what is (in fact) the case if its antecedent is (in fact) true.

Psychological research shows that people's thoughts about the causal relationships between events influence their judgments of the plausibility of counterfactual alternatives, and conversely, their counterfactual thinking about how a situation could have turned out differently changes their judgments of the causal role of events and agents. Nonetheless, their identification of the cause of an event, and their counterfactual thought about how the event could have turned out differently, do not always coincide.[18] People distinguish between various sorts of causes, e.g., strong and weak causes.[19] Research in the psychology of reasoning shows that people make different sorts of inferences from different sorts of causes.

Probabilistic causation

Interpreting causation as a deterministic relation means that if A causes B, then A must always be followed by B. In this sense, war does not cause deaths, nor does smoking cause cancer. As a result, many turn to a notion of probabilistic causation. Informally, A probabilistically causes B if A's occurrence increases the probability of B. This is sometimes interpreted to reflect imperfect knowledge of a deterministic system but other times interpreted to mean that the causal system under study is inherently probabilistic, such as quantum mechanics.
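One common informal way to make this precise, stated here only as a sketch that different probabilistic theories refine or qualify, is to require that the occurrence of A raise the probability of B relative to A's absence:

$$P(B \mid A) > P(B \mid \neg A)$$

Further conditions, for example holding fixed the relevant background factors, are then added to rule out merely spurious correlations.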

Causal calculus

When experiments are infeasible or illegal, the derivation of cause-and-effect relationships from observational studies must rest on some qualitative theoretical assumptions, for example, that symptoms do not cause diseases, usually expressed in the form of missing arrows in causal graphs such as Bayesian Networks or path diagrams. The mathematical theory underlying these derivations relies on the distinction between conditional probabilities, as in $P(\text{cancer} \mid \text{smoking})$, and interventional probabilities, as in $P(\text{cancer} \mid \text{do(smoking)})$. The former reads: "the probability of finding cancer in a person known to smoke", while the latter reads: "the probability of finding cancer in a person forced to smoke". The former is a statistical notion that can be estimated directly in observational studies, while the latter is a causal notion (also called the "causal effect") which is what we estimate in a controlled randomized experiment.

The theory of "causal calculus"[20] permits one to infer interventional probabilities from conditional probabilities in causal Bayesian Networks with unmeasured variables. One very practical result of this theory is the characterization of confounding variables, namely, a sufficient set of variables that, if adjusted for, would yield the correct causal effect between variables of interest. It can be shown that a sufficient set for estimating the causal effect of X on Y is any set of non-descendants of X that d-separate X from Y after removing all arrows emanating from X. This criterion, called the "backdoor" criterion, provides a mathematical definition of "confounding" and helps researchers identify accessible sets of variables worthy of measurement.
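When a set of covariates Z satisfies this backdoor criterion relative to an ordered pair of variables (X, Y), the interventional distribution can be computed from purely observational quantities by the standard adjustment formula of the causal calculus:

$$P(y \mid \text{do}(x)) = \sum_{z} P(y \mid x, z)\, P(z)$$

In words: stratify on the admissible set Z, estimate the conditional probability within each stratum, and average the strata by their observational frequencies.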

Structure learning

While derivations in causal calculus rely on the structure of the causal graph, parts of the causal structure can, under certain assumptions, be learned from statistical data. The basic idea goes back to Sewall Wright's 1921 work[21] on path analysis. A "recovery" algorithm was developed by Rebane and Pearl (1987)[22] which rests on Wright's distinction between the three possible types of causal substructures allowed in a directed acyclic graph (DAG):

Type 1: X → Y → Z
Type 2: X ← Y → Z
Type 3: X → Y ← Z

Type 1 and type 2 represent the same statistical dependencies (i.e., X and Z are independent given Y) and are, therefore, indistinguishable within purely cross-sectional data. Type 3, however, can be uniquely identified, since X and Z are marginally independent and all other pairs are dependent. Thus, while the skeletons (the graphs stripped of arrows) of these three triplets are identical, the directionality of the arrows is partially identifiable. The same distinction applies when X and Z have common ancestors, except that one must first condition on those ancestors. Algorithms have been developed to systematically determine the skeleton of the underlying graph and, then, orient all arrows whose directionality is dictated by the conditional independencies observed.[20][23][24][25]
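The asymmetry that makes type 3 (the "collider") identifiable can be seen in a small simulation; this is a sketch on synthetic data, not the Rebane–Pearl algorithm itself:

# Collider X -> Y <- Z: X and Z are marginally independent, but become
# dependent once Y is conditioned on. Synthetic data for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
z = rng.normal(size=n)
y = x + z + 0.1 * rng.normal(size=n)   # Y is a common effect of X and Z

print("corr(X, Z) =", round(np.corrcoef(x, z)[0, 1], 3))   # close to 0

mask = np.abs(y) < 0.1                 # crude conditioning: select Y near 0
print("corr(X, Z | Y near 0) =", round(np.corrcoef(x[mask], z[mask])[0, 1], 3))  # clearly negative

A chain or a fork (types 1 and 2) shows the opposite pattern: X and Z are correlated marginally but become independent given Y, which is why those two structures cannot be told apart from such data alone.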

Alternative methods of structure learning search through the many possible causal structures among the variables, and remove ones which are strongly incompatible with the observed correlations. In general this leaves a set of possible causal relations, which should then be tested by analyzing time series data or, preferably, by designing appropriately controlled experiments. In contrast with Bayesian Networks, path analysis (and its generalization, structural equation modeling) serves better to estimate a known causal effect or to test a causal model than to generate causal hypotheses.

For nonexperimental data, causal direction can often be inferred if information about time is available. This is because (according to many, though not all, theories) causes must precede their effects temporally. This can be determined by statistical time series models, for instance, or with a statistical test based on the idea of Granger causality, or by direct experimental manipulation. The use of temporal data can permit statistical tests of a pre-existing theory of causal direction. For instance, our degree of confidence in the direction and nature of causality is much greater when supported by cross-correlations, ARIMA models, or cross-spectral analysis using vector time series data than by cross-sectional data.
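A minimal sketch of such a test, assuming the statsmodels package is available, is shown below; the synthetic series are constructed so that x leads y by one time step:

# Granger-causality test on synthetic data (sketch; requires numpy and statsmodels).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
y[1:] = 0.8 * x[:-1] + 0.3 * rng.normal(size=n - 1)   # y depends on lagged x

# The function expects a two-column array and tests whether the second
# column Granger-causes the first.
results = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
print("p-value at lag 1:", results[1][0]["ssr_ftest"][1])   # very small here

A small p-value indicates only that lagged x improves the prediction of y; as the text notes, this supports rather than proves a pre-existing causal hypothesis.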

Derivation theories

Nobel laureate Herbert A. Simon and philosopher Nicholas Rescher[26] claim that the asymmetry of the causal relation is unrelated to the asymmetry of any mode of implication that contraposes. Rather, a causal relation is not a relation between values of variables, but a function of one variable (the cause) onto another (the effect). So, given a system of equations and a set of variables appearing in these equations, we can introduce an asymmetric relation among individual equations and variables that corresponds perfectly to our commonsense notion of a causal ordering. The system of equations must have certain properties; most importantly, if some values are chosen arbitrarily, the remaining values will be determined uniquely through a path of serial discovery that is perfectly causal. They postulate that the inherent serialization of such a system of equations may correctly capture causation in all empirical fields, including physics and economics.

Manipulation theories

Some theorists have equated causality with manipulability.[27][28][29][30] Under these theories, x causes y only in the case that one can change x in order to change y. This coincides with commonsense notions of causation, since often we ask causal questions in order to change some feature of the world. For instance, we are interested in knowing the causes of crime so that we might find ways of reducing it.

These theories have been criticized on two primary grounds. First, theorists complain that these accounts are circular: attempting to reduce causal claims to manipulation requires that manipulation be more basic than causal interaction, but describing manipulations in non-causal terms has proved substantially difficult.

The second criticism centers on concerns of anthropocentrism. It seems to many people that causality is some existing relationship in the world that we can harness for our desires. If causality is identified with our manipulation, then this intuition is lost. In this sense, it makes humans overly central to interactions in the world.

Some recent attempts to defend manipulability theories are accounts that do not claim to reduce causality to manipulation. These accounts use manipulation as a sign or feature of causation without claiming that manipulation is more fundamental than causation.[20][31]

Process theories

Some theorists are interested in distinguishing between causal processes and non-causal processes (Russell 1948; Salmon 1984).[32][33] These theorists often want to distinguish between a process and a pseudo-process. As an example, a ball moving through the air (a process) is contrasted with the motion of a shadow (a pseudo-process). The former is causal in nature while the latter is not.

Salmon (1984)[32] claims that causal processes can be identified by their ability to transmit an alteration over space and time. An alteration of the ball (a mark by a pen, perhaps) is carried with it as the ball goes through the air. On the other hand, an alteration of the shadow (insofar as it is possible) will not be transmitted by the shadow as it moves along.

These theorists claim that the important concept for understanding causality is not causal relationships or causal interactions, but rather identifying causal processes. The former notions can then be defined in terms of causal processes.

Why-Because Graph of the capsizing of the Herald of Free Enterprise.

Systemic causality

George Lakoff writes, in relation to the cause of Hurricane Sandy,[34]

Systemic causation, because it is less obvious, is more important to understand. A systemic cause may be one of a number of multiple causes. It may require some special conditions. It may be indirect, working through a network of more direct causes. It may be probabilistic, occurring with a significantly high probability. It may require a feedback mechanism. In general, causation in ecosystems, biological systems, economic systems, and social systems tends not to be direct, but is no less causal. And because it is not direct causation, it requires all the greater attention if it is to be understood and its negative effects controlled. Above all, it requires a name: systemic causation.

Fields

Science

Scientists set up experiments, normally with the aim of determining causality in the physical world. For instance, one may want to know whether a high intake of carrots causes humans to develop the bubonic plague. Since an observation of a correlation does not imply causation, it is necessary to use inductive reasoning from particular observations in order to strengthen (through observed reproducibility) or disprove hypotheses about causal relationships. The fundamentally uncertain nature of inductive reasoning has been claimed to give rise to scientific paradigm shifts, as described by Thomas Kuhn.

This framework is sometimes called the scientific method, and forms part of the Philosophy of science. The dichotomy between hard and soft science can be regarded as stemming from the increased uncertainty and vagueness connected to the inductive proofs of causal links in "softer" sciences.

Physics

Informally, physicists use the terminology of cause and effect in the same everyday fashion as most other people do. In the context of physical theory itself, for example, some physicists will say that forces cause motions (or accelerations). Strictly speaking, however, this is not the same as a formal theory of causality. Causality is not inherently implied in equations of motion, but is postulated as an additional constraint that needs to be satisfied (i.e. a cause always precedes its effect). This constraint has mathematical implications[35] such as the Kramers-Kronig relations.

Causal notions appear in physics in the context of information, where "information" is what links a cause to its effect. Formally, it is expected that information cannot travel faster than the speed of light, since otherwise reference coordinate systems could be constructed (using the Lorentz transform of special relativity) in which an observer would see an effect precede its cause (i.e. the postulate of causality would be violated).

Causal notions also appear in the related context of the flow of mass-energy (since mass-energy flow is generally considered to be linked to information flow). For example, it is commonplace to make use of the causality argument to argue that the group velocity of waves (such as electromagnetic waves) can not exceed the speed of light.

Causal notions are important in general relativity to the extent that to have an arrow of time demands that the universe's semi-Riemannian manifold be orientable, so that "future" and "past" are globally definable quantities.

Arguably the most prominent role of causal notions in physics, however, is in statistical mechanics. The Second Law of Thermodynamics states that entropy, which can be thought of as a measure of disorder, never decreases in any closed system (see also the fluctuation theorem). The irreversible increase of entropy therefore provides another "arrow of time" by which past and future can be distinguished. (As an analogy, if a stacked cube of 64 dice sits in a box and someone shakes the box, the dice will no longer be stacked in a cube. The process is not reversible; shaking the box again will not cause the dice to be reassembled into a neat cube.) A formal physical definition of cause and effect, if such a thing is possible, may be related to the second law. However, while much has been written about this topic, there is not yet any generally accepted formal theory of causation tied to the second law.

Engineering

A causal system is a system whose output and internal states depend only on the current and previous input values. A system that has some dependence on input values from the future (in addition to possible past or current input values) is termed an acausal system, and a system that depends solely on future input values is an anticausal system. Acausal filters, for example, can only exist as postprocessing filters, because these filters can extract future values from a memory buffer or a file.
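The distinction can be illustrated with two simple moving-average filters; this is a sketch, with the function names chosen only for this example:

# A causal filter uses only current and past samples; a centered (acausal)
# filter also needs future samples, so it can only be applied to recorded data.
def causal_moving_average(x, width=3):
    return [sum(x[max(0, i - width + 1): i + 1]) / min(width, i + 1)
            for i in range(len(x))]

def acausal_centered_average(x, half=1):
    return [sum(x[max(0, i - half): i + half + 1]) / len(x[max(0, i - half): i + half + 1])
            for i in range(len(x))]

signal = [0, 0, 1, 0, 0]                 # an impulse at index 2
print(causal_moving_average(signal))     # responds only at and after the impulse
print(acausal_centered_average(signal))  # responds one sample *before* the impulse

The acausal output reacting before its input arrives is exactly what cannot happen in a real-time system, which is why such filters exist only as postprocessing steps.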

Biology, medicine & epidemiology

Austin Bradford Hill built upon the work of Hume and Popper and suggested in his paper "The Environment and Disease: Association or Causation?" that aspects of an association such as strength, consistency, specificity and temporality be considered in attempting to distinguish causal from noncausal associations in the epidemiological situation. See Bradford-Hill criteria. He did not note, however, that temporality is the only necessary criterion among those aspects. Directed acyclic graphs (DAGs) are increasingly used in epidemiology to help enlighten causal thinking.[36]

Psychology

Psychologists take an empirical approach to causality, investigating how people and non-human animals detect or infer causation from sensory information, prior experience and innate knowledge.

Attribution

Attribution theory is the theory concerning how people explain individual occurrences of causation. Attribution can be external (assigning causality to an outside agent or force, claiming that some outside thing motivated the event) or internal (assigning causality to factors within the person, taking personal responsibility or accountability for one's actions and claiming that the person was directly responsible for the event). Taking causation one step further, the type of attribution a person provides influences their future behavior.

The intention behind the cause or the effect can be covered by the subject of action. See also accident; blame; intent; and responsibility.

Causal powers

Whereas David Hume argued that causes are inferred from non-causal observations, Immanuel Kant claimed that people have innate assumptions about causes. Within psychology, Patricia Cheng (1997)[37] attempted to reconcile the Humean and Kantian views. According to her power PC theory, people filter observations of events through a basic belief that causes have the power to generate (or prevent) their effects, thereby inferring specific cause-effect relations.

Causation and salience

Our view of causation depends on what we consider to be the relevant events. Another way to view the statement, "Lightning causes thunder" is to see both lightning and thunder as two perceptions of the same event, viz., an electric discharge that we perceive first visually and then aurally.

Naming and causality

David Sobel and Alison Gopnik from the Psychology Department of UC Berkeley designed a device known as the blicket detector which would turn on when an object was placed on it. Their research suggests that "even young children will easily and swiftly learn about a new causal power of an object and spontaneously use that information in classifying and naming the object."[38]

Perception of Launching Events

Some researchers such as Anjan Chatterjee at the University of Pennsylvania and Jonathan Fugelsang at the University of Waterloo are using neuroscience techniques to investigate the neural and psychological underpinnings of causal launching events in which one object causes another object to move. Both temporal and spatial factors can be manipulated.[39]

See Causal Reasoning (Psychology) for more information.

Statistics and economics

Statistics and economics usually employ pre-existing data or experimental data to infer causality by regression methods. The body of statistical techniques involves substantial use of regression analysis. Typically a linear relationship such as

$$ y_i = \beta_0 + \beta_1 x_{1,i} + \cdots + \beta_k x_{k,i} + e_i $$

is postulated, in which $y_i$ is the ith observation of the dependent variable (hypothesized to be the caused variable), $x_{j,i}$ for j = 1, ..., k is the ith observation on the jth independent variable (hypothesized to be a causative variable), and $e_i$ is the error term for the ith observation (containing the combined effects of all other causative variables, which must be uncorrelated with the included independent variables). If there is reason to believe that none of the $x_j$'s is caused by y, then estimates of the coefficients $\beta_j$ are obtained. If the null hypothesis that $\beta_j = 0$ is rejected, then the alternative hypothesis that $\beta_j \neq 0$, and equivalently that $x_j$ causes y, cannot be rejected. On the other hand, if the null hypothesis that $\beta_j = 0$ cannot be rejected, then equivalently the hypothesis of no causal effect of $x_j$ on y cannot be rejected. Here the notion of causality is one of contributory causality as discussed above: if the true value $\beta_j \neq 0$, then a change in $x_j$ will result in a change in y unless some other causative variable(s), either included in the regression or implicit in the error term, change in such a way as to exactly offset its effect; thus a change in $x_j$ is not sufficient to change y. Likewise, a change in $x_j$ is not necessary to change y, because a change in y could be caused by something implicit in the error term (or by some other causative explanatory variable included in the model).
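A sketch of this testing procedure on synthetic data, assuming the statsmodels package (any ordinary-least-squares routine would serve), is as follows:

# Regression-based test of a contributory causal hypothesis (sketch, synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1_000
x1 = rng.normal(size=n)                 # hypothesized causative variable
x2 = rng.normal(size=n)                 # control variable with no true effect
e = rng.normal(size=n)                  # error term, uncorrelated with x1 and x2
y = 2.0 + 1.5 * x1 + 0.0 * x2 + e       # true coefficients: 2.0, 1.5, 0.0

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print(fit.params)     # estimated coefficients, close to (2.0, 1.5, 0.0)
print(fit.pvalues)    # tiny p-value for x1 (reject beta_1 = 0); large for x2

Rejecting the null hypothesis for x1 while failing to reject it for x2 mirrors the logic described above, and it presupposes, as the next paragraph discusses, that y does not in turn cause x1.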

The above way of testing for causality requires belief that there is no reverse causation, in which y would cause $x_j$. This belief can be established in one of several ways. First, the variable $x_j$ may be a non-economic variable: for example, if rainfall amount is hypothesized to affect the futures price y of some agricultural commodity, it is impossible that in fact the futures price affects rainfall amount (provided that cloud seeding is never attempted). Second, the instrumental variables technique may be employed to remove any reverse causation by introducing a role for other variables (instruments) that are known to be unaffected by the dependent variable. Third, the principle that effects cannot precede causes can be invoked, by including on the right side of the regression only variables that precede in time the dependent variable; this principle is invoked, for example, in testing for Granger causality and in its multivariate analog, vector autoregression, both of which control for lagged values of the dependent variable while testing for causal effects of lagged independent variables.

Regression analysis controls for other relevant variables by including them as regressors (explanatory variables). This helps to avoid false inferences of causality due to the presence of a third, underlying, variable that influences both the potentially causative variable and the potentially caused variable: its effect on the potentially caused variable is captured by directly including it in the regression, so that effect will not be picked up as an indirect effect through the potentially causative variable of interest.

Management

Used in management and engineering, an Ishikawa diagram shows the factors that cause the effect. Smaller arrows connect the sub-causes to major causes.

For quality control in manufacturing in the 1960s, Kaoru Ishikawa developed a cause and effect diagram, known as an Ishikawa diagram or fishbone diagram. The diagram categorizes causes into six main categories, which are then sub-divided. Ishikawa's method identifies "causes" in brainstorming sessions conducted among various groups involved in the manufacturing process. These groups can then be labeled as categories in the diagrams. The use of these diagrams has now spread beyond quality control, and they are used in other areas of management and in design and engineering. Ishikawa diagrams have been criticized for failing to make the distinction between necessary conditions and sufficient conditions. It seems that Ishikawa was not even aware of this distinction.[40]

Humanities

History

In the discussion of history, events are sometimes considered as if in some way being agents that can then bring about other historical events. Thus, poor harvests, the hardships of the peasants, high taxes, lack of representation of the people, and kingly ineptitude are among the causes of the French Revolution. This is a somewhat Platonic and Hegelian view that reifies causes as ontological entities. In Aristotelian terminology, this use approximates to the case of the efficient cause.

Some philosophers of history such as Arthur Danto have claimed that "explanations in history and elsewhere" describe "not simply an event – something that happens – but a change".[41] Like many practicing historians, they treat causes as intersecting actions and sets of actions which bring about "larger changes", in Danto’s words: to decide "what are the elements which persist through a change" is "rather simple" when treating an individual’s "shift in attitude", but "it is considerably more complex and metaphysically challenging when we are interested in such a change as, say, the break-up of feudalism or the emergence of nationalism".[42]

Much of the historical debate about causes has focused on the relationship between communicative and other actions, between singular and repeated ones, and between actions, structures of action or group and institutional contexts and wider sets of conditions.[43] John Gaddis has distinguished between exceptional and general causes (following Marc Bloch) and between "routine" and "distinctive links" in causal relationships: "in accounting for what happened at Hiroshima on August 6, 1945, we attach greater importance to the fact that President Truman ordered the dropping of an atomic bomb than to the decision of the Army Air Force to carry out his orders."[44] He has also pointed to the difference between immediate, intermediate and distant causes.[45] For his part, Christopher Lloyd puts forward four "general concepts of causation" used in history: the "metaphysical idealist concept, which asserts that the phenomena of the universe are products of or emanations from an omnipotent being or such final cause"; "the empiricist (or Humean) regularity concept, which is based on the idea of causation being a matter of constant conjunctions of events"; "the functional/teleological/consequential concept", which is "goal-directed, so that goals are causes"; and the "realist, structurist and dispositional approach, which sees relational structures and internal dispositions as the causes of phenomena".[46]

Law

According to law and jurisprudence, legal cause must be demonstrated in order to hold a defendant liable for a crime or a tort (i.e. a civil wrong such as negligence or trespass). It must be proven that causality, or a "sufficient causal link", relates the defendant's actions to the criminal event or damage in question. Causation is also an essential legal element that must be proven to qualify for remedy measures under international trade law.[47]

Theology

In Abrahamic theology, the concept of omnicausality is the belief that God has set in motion all events at the dawn of time; He is the determiner and the cause of all things. It is thus an attempt to reconcile the apparent incompatibility between determinism and the existence of an omnipotent god.[48]

References

  1. ^ 'The action of causing; the relation of cause and effect' OED
  2. ^ Random House Unabridged Dictionary
  3. ^ a b c d Graham, D.W. (1987). Aristotle's Two Systems, Oxford University Press, Oxford UK, ISBN 0-19-824970-5
  4. ^ http://www.wisdomsupreme.com/dictionary/aristotles-four-causes.php
  5. ^ Soccio, D.J. (2011). Archetypes of Wisdom: An Introduction to Philosophy, 8th Ed.: An Introduction to Philosophy. Wadsworth. p. 167. ISBN 9781111837792.
  6. ^ Falcon, Andrea (2012). Edward N. Zalta (ed.). "Aristotle on Causality". The Stanford Encyclopedia of Philosophy (Winter 2012 ed.). In the Physics, Aristotle builds on his general account of the four causes by developing explanatory principles that are specific to the study of nature. Here Aristotle insists that all four causes are involved in the explanation of natural phenomena, and that the job of "the student of nature is to bring the why-question back to them all in the way appropriate to the science of nature" (Phys. 198 a 21–23). The best way to understand this methodological recommendation is the following: the science of nature is concerned with natural bodies insofar as they are subject to change, and the job of the student of nature is to provide the explanation of their natural change. The factors that are involved in the explanation of natural change turn out to be matter, form, that which produces the change, and the end of this change. Note that Aristotle does not say that all four explanatory factors are involved in the explanation of each and every instance of natural change. Rather, he says that an adequate explanation of natural change may involve a reference to all of them. Aristotle goes on by adding a specification on his doctrine of the four causes: the form and the end often coincide, and they are formally the same as that which produces the change (Phys. 198 a 23–26).
  7. ^ Aristotle. Aristotle in 23 Volumes, Vols.17, 18, translated by Hugh Tredennick. Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1933, 1989. (hosted at perseus.tufts.edu.)
  8. ^ a b William E. May (April 1970). "Knowledge of Causality in Hume and Aquinas". The Thomist. 34. Retrieved 6 April 2011.
  9. ^ Hume, David (1896) [1739], Selby-Bigge (ed.), A Treatise of Human Nature, Clarendon Press
  10. ^ Born, M. (1949). Natural Philosophy of Cause and Chance, Oxford University Press, London, p. 9.
  11. ^ "What Eminent People Have Said about the Meaning of Life". Richard T. Kinnier, Jerry L. Kernes, Nancy Tribbensee, Christina M. Van Puymbroeck. Retrieved 9 April 2012.
  12. ^ "Brahma Samhita, Chapter 5: Hymn to the Absolute Truth". Bhaktivedanta Book Trust. Retrieved 19 May 2014.
  13. ^ Epp, Susanna S.: "Discrete Mathematics with Applications, Third Edition", pp 25-26. Brooks/Cole—Thomson Learning, 2004. ISBN 0-534-35945-0
  14. ^ Necessary»Sufficient»Contributory cause retrieved 31 August 2009.
  15. ^ PMID 450828.
  16. ^ What Is Cause And Effect?: Main and Contributory Causes by Carolyn K., retrieved 31 August 2009.
  17. ^ Mackie, John L. The Cement of the Universe: A study in Causation. Clarendon Press, Oxford, England, 1988.
  18. ^ Byrne, R.M.J. (2005). The Rational Imagination: How People Create Counterfactual Alternatives to Reality. Cambridge, Massachusetts: MIT Press.
  19. ^ Miller, G. & Johnson-Laird, P.N. (1976). Language and Perception. Cambridge: Cambridge University Press.
  20. ^ a b c Pearl, Judea (2000). Causality: Models, Reasoning, and Inference, Cambridge University Press.
  21. ^ Wright, S., "Correlation and Causation", Journal of Agricultural Research, vol. 20, #7, pp. 557–585.
  22. ^ Rebane, G. and Pearl, J., "The Recovery of Causal Poly-trees from Statistical Data", Proceedings, 3rd Workshop on Uncertainty in AI, (Seattle) pp. 222-228,1987
  23. ^ Spirtes, P. and Glymour, C., "An algorithm for fast recovery of sparse causal graphs", Social Science Computer Review, Vol. 9, pp. 62-72, 1991.
  24. ^ Spirtes, P. and Glymour, C. and Scheines, R., Causation, Prediction, and Search, New York: Springer-Verlag, 1993
  25. ^ Verma, T. and Pearl, J., "Equivalence and Synthesis of Causal Models", Proceedings of the Sixth Conference on Uncertainty in Artificial Intelligence, (July, Cambridge, Massachusetts), pp. 220-227, 1990. Reprinted in P. Bonissone, M. Henrion, L.N. Kanal and J.F.\ Lemmer (Eds.), Uncertainty in Artificial Intelligence 6, Amsterdam: Elsevier Science Publishers, B.V., pp. 225-268, 1991
  26. ^ Simon, Herbert, and Rescher, Nicholas (1966) "Cause and Counterfactual." Philosophy of Science 33: 323–40.
  27. ^ Collingwood, R.(1940) An Essay on Metaphysics. Clarendon Press.
  28. ^ Gasking, D. (1955) "Causation and Recipes" Mind (64): 479-487.
  29. ^ Menzies, P. and H. Price (1993) "Causation as a Secondary Quality" British Journal for the Philosophy of Science (44): 187-203.
  30. ^ von Wright, G.(1971) Explanation and Understanding. Cornell University Press.
  31. ^ Woodward, James (2003) Making Things Happen: A Theory of Causal Explanation. Oxford University Press, ISBN 0-19-515527-0
  32. ^ a b Salmon, W. (1984) Scientific Explanation and the Causal Structure of the World. Princeton University Press.
  33. ^ Russell, B. (1948) Human Knowledge. Simon and Schuster.
  34. ^ http://blogs.berkeley.edu/2012/11/05/global-warming-systemically-caused-hurricane-sandy/
  35. ^ Kinsler, P. (2011). "How to be causal". Eur. J. Phys. 32 (6): 1687. arXiv:1106.1692. Bibcode:2011EJPh...32.1687K. doi:10.1088/0143-0807/32/6/022.
  36. ^ Chiolero, A (1 January 2014). "Assessing the possible direct effect of birth weight on childhood blood pressure: a sensitivity analysis". American journal of epidemiology. 179 (1): 4–11. PMID 24186972.
  37. ^ Cheng, P.W. (1997). "From Covariation to Causation: A Causal Power Theory." Psychological Review 104: 367-405.
  38. ^ Gopnik, A; Sobel, David M. (September–October 2000). "Detecting Blickets: How Young Children Use Information about Novel Causal Powers in Categorization and Induction". Child Development. 71 (5): 1205–1222. doi:10.1111/1467-8624.00224. PMID 11108092.
  39. ^ Straube, B; Chatterjee, A (2010). "Space and time in perceptual causality". Frontiers in Human Neuroscience. 4: 28. doi:10.3389/fnhum.2010.00028. PMC 2868299. PMID 20463866.
  40. ^ Gregory, Frank Hutson (1992) Cause, Effect, Efficiency & Soft Systems Models, Warwick Business School Research Paper No. 42 (ISSN 0265-5976), later published in Journal of the Operational Research Society, vol. 44 (4), pp 333-344.
  41. ^ Danto, Arthur (1965) Analytical Philosophy of History, 233.
  42. ^ Ibid., 249.
  43. ^ Hewitson, Mark (2014) History and Causality, 86-116.
  44. ^ Gaddis, John L. (2002), The Landscape of History: How Historians Map the Past, 64.
  45. ^ Ibid., 95.
  46. ^ Lloyd, Christopher (1993) Structures of History, 159.
  47. ^ "Dukgeun Ahn & William J. Moon, Alternative Approach to Causation Analysis in Trade Remedy Investigations, Journal of World Trade". Retrieved 5 October 2010.
  48. ^ See for example van der Kooi, Cornelis (2005). As in a mirror: John Calvin and Karl Barth on knowing God: a diptych. Studies in the history of Christian traditions. Vol. 120. Brill. p. 355. ISBN 978-90-04-13817-9. Retrieved 3 May 2011. [Barth] upbraids Polanus for identifying God's omnipotence with his omnicausality.

Further Reading

  • Azamat Abdoullaev (2000). The Ultimate of Reality: Reversible Causality, in Proceedings of the 20th World Congress of Philosophy, Boston: Philosophy Documentation Centre, internet site, Paideia Project On-Line: http://www.bu.edu/wcp/MainMeta.htm
  • Arthur Danto (1965). Analytical Philosophy of History. Cambridge University Press.
  • Idem, 'Complex Events', Philosophy and Phenomenological Research, 30 (1969), 66-77.
  • Idem, 'On Explanations in History', Philosophy of Science, 23 (1956), 15-30.
  • Dorschel, Andreas, 'The Crypto-Metaphysic of 'Ultimate Causes'. Remarks on an alleged Exposé' (transl. Edward Craig), in: Ratio, N.S. I (1988), nr. 2, pp. 97–112.
  • Green, Celia (2003). The Lost Cause: Causation and the Mind-Body Problem. Oxford: Oxford Forum. ISBN 0-9536772-1-4 Includes three chapters on causality at the microlevel in physics.
  • Hewitson, Mark (2014). History and Causality. Palgrave Macmillan. ISBN 978-1-137-37239-0.
  • Little, Daniel (1998). Microfoundations, Method and Causation: On the Philosophy of the Social Sciences. New York: Transaction.
  • Lloyd, Christopher (1993). The Structures of History. Oxford: Blackwell.
  • Idem (1986). Explanation in Social History. Oxford: Blackwell.
  • Maurice Mandelbaum (1977). The Anatomy of Historical Knowledge. Baltimore: Johns Hopkins Press.
  • Judea Pearl (2000). Causality: Models, Reasoning, and Inference. Cambridge University Press. ISBN 978-0-521-77362-1
  • Rosenberg, M. (1968). The Logic of Survey Analysis. New York: Basic Books, Inc.
  • Spirtes, Peter, Clark Glymour and Richard Scheines Causation, Prediction, and Search, MIT Press, ISBN 0-262-19440-6
  • University of California journal articles, including Judea Pearl's articles between 1984 and 1998 [2].
  • Schimbera, Jürgen / Schimbera, Peter (2010), Determination des Indeterminierten. Kritische Anmerkungen zur Determinismus- und Freiheitskontroverse, Hamburg: Verlag Dr. Kovac, ISBN 978-3-8300-5099-5
