Talk:Collectively exhaustive events


I changed the P(head) = 0.5, P(tail) = 0.5, and P(head) + P(tail) = 1 explanation because it gives the wrong impression of what collectively exhaustive means. For instance, say you pick an integer x uniformly at random from 1 to 10. Then P(x is even) = 0.5 and P(x < 6) = 0.5, so P(x is even) + P(x < 6) = 1, but the two events aren't collectively exhaustive, because the odd numbers greater than 6 (namely 7 and 9) belong to neither event. In fact, P(x is even OR x < 6) = 0.8.
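
A quick numeric check of the counterexample (a minimal Python sketch, assuming x is drawn uniformly from the integers 1 through 10):

from fractions import Fraction

# Sample space: the integers 1 through 10, assumed equally likely.
omega = set(range(1, 11))

even = {x for x in omega if x % 2 == 0}   # the event "x is even"
below_6 = {x for x in omega if x < 6}     # the event "x < 6"

def p(event):
    # Probability under the uniform distribution on omega.
    return Fraction(len(event), len(omega))

print(p(even), p(below_6))        # 1/2 1/2
print(p(even) + p(below_6))       # 1, and yet...
print(p(even | below_6))          # 4/5, so the union misses part of omega
print(omega - (even | below_6))   # {7, 9} (set order may vary): in neither event

The last line prints exactly the outcomes that keep the two events from being collectively exhaustive.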

I added the section comparing mutual exclusivity with collective exhaustion. I did the same thing with the Wikipedia stub on mutually exclusive and cross-referenced the two articles with each other. I am not a professional mathematician, so if I have made an error in this update, please correct it and I'll have no worries.

capitalist

I removed the stub tag from this article because its scope is narrow enough to justify the short length. capitalist 03:39, 6 September 2005 (UTC)

Quotation from Couturat 1914:23


The following appears as a footnote on page 23:

"As Mrs. LADD·FRANKLlN has truly remarked (BALDWIN, Dictionary of Philosophy and Psychology, article "Laws of Tbought"), the principle of contradiction is not sufficient to define contradictories; the principle of excluded middle must be added which equally deserves the name of principle of contradiction. This is why Mrs. LADD-FRANKLIN proposes to call them respectively the principle of exclusion and the principle of exhaustion, inasmuch as, according to the first, two contradictory terms are exclusive (the one of the other); and, according to the second, they are exhaustive (of the universe of discourse)." (Louis Couturat, translated by Lydia Gillingham Robinson, 1914, The Algebra of Logic, The Open Court Publishing Company, Chicago and London.)

The above can be downloaded from Google Books as a PDF. I'm not sure what to do with this yet, except that the last paragraph looks quite wrong, especially from a historical perspective. The other place to look is Stephen Kleene's 1952 Introduction to Metamathematics, or even way back to Boole etc. in the ca. 1850s. BillWvbailey (talk) 19:28, 1 September 2012 (UTC)

Removed History Section


I removed the history section because whoever wrote it was writing about the history of the term "mutually exclusive" when this Wiki entry is about "collectively exhaustive". For mutually exclusive events, go to that Wiki entry--we shouldn't be focusing on that concept in this entry. — Preceding unsigned comment added by 173.3.109.197 (talk) 16:06, 21 November 2012 (UTC)

Two of the quotes discuss "exhaustive": "(of the universe of discourse)" (Ladd-Franklin) and "only one of the three alternatives holds" (Kleene). I pruned out the two quotes concerned with mutual exclusion only and left in the ones about "collectively" exhaustive, i.e. both remaining quotes are pertinent. BillWvbailey (talk) 16:48, 21 November 2012 (UTC)
I added italics to emphasize that the word "exhaustion" does in fact appear in the quotes (apparently anonymous didn't read the quotes very carefully). Also note that the article lead does in fact discuss "mutually exclusive"; the two ideas "mutual exclusion" and "exhaustion" are intimately connected. The adjective "collectively" is redundant. BillWvbailey (talk) 17:08, 21 November 2012 (UTC)

History insufficiently nearly exhaustive.


Jeremy Bentham used the term exhaustive in the set-theory sense in his "Chrestomathia" of 1817 (available at http://archive.org/details/chrestomathiabe00bentgoog), and what is more, he explained it in the following terms: "words can, it is supposed, be necessary. To be exhaustive, the parts which, at each partition or division so made, are the results of the operation

— must, if put together again, be equal to the whole,
— and thus, and in this sense, exhaust (to use the word employed by logicians) the contents of the whole."

Accordingly it must already have been in use by that time in much the sense in which it is used here. Ngram also gives several books using the term in such senses in the 19th century. This was not in connection with statistics or probability, but the sense is so closely connected with set theory (in connection with which I was looking it up) that I am not sure why the two are separated. JonRichfield (talk) 14:30, 3 June 2013 (UTC)

Nice find. This would make a good addition to the article. As I recall, when I added the Ladd-Franklin and Kleene quotes I first hunted through my various Google Books volumes dating from the 19th C (starting around the time of Boole) but didn't find the words used anywhere. I'd expect (because of the classical "logicians", aka "the schoolmen" with their fussy syllogisms) that if there's anything earlier than the 18th C it would go all the way back to the ancient Greeks.
I agree with you that this logical principle has much broader scope than statistics; e.g. the "switch" or "case" statements in programming (see the sketch below). Bill Wvbailey (talk) 21:58, 3 June 2013 (UTC)
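
To make the switch/case aside concrete, here is a minimal sketch using Python's match statement (Python 3.10+; the example is hypothetical, not drawn from any of the sources above). The branches behave like events: they are mutually exclusive (at most one is taken) and, because of the wildcard, collectively exhaustive (one is always taken):

def classify(n: int) -> str:
    # n % 3 is always 0, 1, or 2 in Python, even for negative n.
    match n % 3:
        case 0:
            return "divisible by 3"
        case 1:
            return "remainder 1"
        case _:          # wildcard: catches everything else (here, 2)
            return "remainder 2"

print(classify(7))   # remainder 1

Omitting the wildcard would leave the cases mutually exclusive but not exhaustive, which is exactly the distinction the article is about.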

Thanks Bill. Would you like to expand the content to at least mention such concepts, or create an article on partitioning and exhaustion? I'd prefer to keep it all together in this article, but the title seems to me to be too restrictive for that. If you would prefer that I did the legwork, I wouldn't mind, but then I'd like some helpful suggestions concerning what you would see as the right approach (say, splitting, uniting, renaming articles, redirs etc). Cheers, Jon JonRichfield (talk) 11:27, 5 June 2013 (UTC)

Anything you can do to improve this article would be appreciated by the community -- as it stands now this article is awful. When I added the quotes, I also added the four references. But none of them offer anything more than a citation (i.e. origin) of the quotes. I've looked in my books and can find nothing -- in a book on information theory (in particular, conditional probability), in a book on machine learning (in particular "Bayesian Learning" and a formula that sums all the probabilities to 1), in a graduate-level engineering text on combinatorial and sequential "logic" (in particular, the generation of a Karnaugh map with its 2^n minterms and their combinations to "cover the map"), etc. Nothing at all about "collectively exhaustive events." I can't seem to locate in my library any statistics books worthy of the name. So my knowledge and library are not going to be of much use. Bill Wvbailey (talk) 14:23, 7 June 2013 (UTC)
Oh Blast! I'm not promising anything, but I'll have a look and see. I'll probably come back here before doing anything bold though! Cheers,

Jon JonRichfield (talk) 18:52, 8 June 2013 (UTC)

"Mutually Exclusive and Collectively Exhaustive" in Information theory via statistics


The following is thin gruel, but it does point to where we might find some useful references. Although the words were not to be found in the index, I did find something in Reza 1961 [An Introduction to Information Theory, McGraw-Hill; reprinted 1994 by Dover, ISBN 0-486-68210-2], section 2-12 "Bayes's Theorem". [U here represents the "universal set" (universe of discourse) and Ø represents the empty set (cf page 24); the original uses engineering symbolism, omitting the ∩ and substituting + for ∪; earlier, on page 24, he defines "mutually exclusive" sets as two sets such that A ∩ B = Ø.]:

"In many problems we wish to concentrate on two mutually exclusive and exhaustive events of the sample space, that is two events A1 and A2 such that
A1 ∩ A2 = Ø
A1 ∪ A2 = U " (page 46)
"Bayes's theorem, like Thevenin's theorem, can be extended to a partitioning of the sample space into mutually exclusive and exhaustive parts. Suppose that an event E can occur as the result of the occurrence of several mutually exclusive and exhaustive events A1, A2, . . ., An. [etc]" (p. 47) From this he derives Bayes's theorem in order "to find the a posteriori probability of the occurrence of event Ak, given the occurrence of E". (p. 48)

Then on page 76, "3-1. A Measure of Uncertainty", he extends the above pair of equations to a "sample space" as follows:

"Consider the sample space of events pertaining to a random experiment. We partition the sample space in a finite number of mutually exclusive events Ek, whose probabilities pk are assumed to be known [figure 3-1]. The set of all events under consideration can be designated as a row matrix [E] and the corresponding probabilities as another row matrix [P].
[E] = [E1, E2, . . ., En], with E1 ∪ E2 ∪ . . . ∪ En = U
[P] = [p1, p2, . . ., pn], with p1 + p2 + . . . + pn = 1."
"[The two equations above] contain all the information that we have about the probability space, which is called a complete finite scheme." (p. 77)
From this he derives H, "a measure of surprise or uncertainty", of such a "scheme" i.e. the "entropy" (cf p. 77ff). Later on page 81 he shows "a partitioning" of the probability space of a given En into disjoint subsets F1, F2, . . ., Fn such that their union is the given En (although he doesn't use the word "collectively exhaustive").
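
For concreteness, a small sketch of H for such a complete finite scheme, using the standard Shannon formula H = -Σ pk log2 pk (the probabilities are made-up values, not Reza's):

import math

# A complete finite scheme: the pk belong to mutually exclusive,
# exhaustive events Ek, so they must sum to 1.
p = [0.5, 0.25, 0.125, 0.125]
assert abs(sum(p) - 1.0) < 1e-12   # exhaustiveness check

H = -sum(pk * math.log2(pk) for pk in p)   # entropy in bits
print(H)   # 1.75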

So from this thin gruel I have to assume that there is, somewhere, a book (statistics? information theory?) that overtly defines the notion "collectively exhaustive", or its equivalent, the way the various equations above do. Also, clearly the use of the word "partitioning" is justified in this context.

There's an interesting historical footnote on page 49: "Reverend Thomas Bayes's article An Essay towards solving a Problem in the Doctrine of Chances was published in Philosophical Transactions of the Royal Society of London (vol. 1, no. 3, p. 370, 1763). However, Bayes's work remained rather unknown until 1774, when Laplace discussed it in one of his memoirs".

Bill Wvbailey (talk) 15:57, 9 June 2013 (UTC)

Aristotle by way of William Hamilton 1860 and the "Doctrine of Division"


RE history: I found something in Aristotle, via William Hamilton 1860 Lectures on Metaphysics and Logic, Vol. II: Logic. In Hamilton's chapter on "The Doctrine of Division" (pages 350ff) he lists seven principles of "Division", numbers 5 and 6 being of interest here:

“Logical Division proposes to render the characters contained under an object, that is, the extension of a notion, Distinct and Exhaustive. Division is, therefore, the evolution of the extension of a notion; and it is expressed in a disjunctive proposition, of which the notion divided constitutes the subject, and the notions contained under it, the predicate. It is, therefore, regulated by the law which governs Disjunctive Judgments, (the Principle of Excluded Middle), although it is usually expressed in the form of a Copulative Categorical Judgment.
The rules by which this process is regulated are seven:
1. Every Division should be governed by some principle;
2. Every Division should be governed by only a single principle.
3. The principle of Division should be an actual and essential character of the divided notion, and the division, therefore, neither complex nor without a purpose.
4. No dividing member of the predicate must by itself exhaust the subject.
5. The dividing members, taken together, must exhaust, but only exhaust, the subject.
6. The divisive members must be reciprocally exclusive.
7. The divisions must proceed continuously from immediate to mediate differences.

(See the three stipulations proposed by Aristotle, below).

In this archaic, fussy treatment Hamilton distinguishes between "partition" and "logical division", etc. Among his footnotes, we find on page 354 the note that these ideas appear in Aristotle: Posterior Analytics Book II Chapter 13 93 [25]. In my volume, this begins at 93 [25] with: "Divisions according to differentiae . . ." The phrase "exhaust the genus" appears a number of times thereafter. I haven't read the whole Aristotle thing (difficult, fussy), nor Hamilton (fussy, fussy). But this appears to be the source of the words and the ideas. Whether this is committing the sin of O.R. I'm not sure; I don't think so, but it would be nice to find a source that references these. My Aristotle is from the Great Books of the Western World, and the William Hamilton I derived from books.google.com. BillWvbailey (talk) 20:01, 9 June 2013 (UTC)

RE Aristotle: All of this is in the context of "definition by division", by use of what we now call the law of excluded middle [ambiguous: "the middle" appears to have to do with the construction of the middle term of a syllogism; he invokes the law of contradiction: "The law that it is impossible to affirm and deny simultaneously the same predicate of the same subject" (Book I, Ch 11, cf p. 106)]; Aristotle's Chapter 3 begins with "It is clear, then, that all questions are a search for a 'middle'. Let us now state how essential nature is revealed, and in what way it can be reduced to demonstration; what definition is, and what things are definable." (p. 123) In chapter 13, the notion of "exhaustion" appears as "(3) the omission of no such elements" in a "definition by division":

"In establishing a definition by division one should keep three objects in view: (1) the admission only of elements in the definable form, (2) the arrangement of these in the right order, (3) the omission of no such elements." (page 132)

BillWvbailey (talk) 14:35, 11 June 2013 (UTC)

John Venn 1881


RE History: From John Venn, Symbolic Logic, Macmillan and Co., London, 1881:

"At the basis of our Symbolic Logic, however represented, whether by words by letters or by diagrams, we shall always find the same state of things. What we ultimately have to do is to break up the entire field into a definite number of classes or compartments which are mutually exclusive and collectively exhaustive." [p. 101]

BillWvbailey (talk) 14:12, 11 June 2013 (UTC)

The singular of "dice" is "die."


This is a common error - editors keep changing the word "die" to "dice." It's a six-sided die. That's not a typo. Thanks! capitalist (talk) 02:15, 4 March 2024 (UTC)