Counting

Counting is the action of finding the number of elements of a finite set of objects. The traditional way of counting consists of continually increasing a (mental or spoken) counter by one unit for every element of the set, in some order, while marking (or displacing) those elements to avoid visiting the same element more than once, until no unmarked elements are left. If the counter was set to one after the first object, the value after visiting the final object gives the desired number of elements. The related term enumeration refers to uniquely identifying the elements of a finite (combinatorial) set or infinite set by assigning a number to each element.
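
For illustration, the procedure described above can be sketched in Python (the function name and example data are illustrative, not part of any standard):

    def count_elements(items):
        """Count a finite collection by visiting each element exactly once."""
        counter = 0
        visited = set()  # marks elements already counted
        for element in items:
            if element not in visited:
                visited.add(element)  # mark, so the element is not visited again
                counter += 1          # increase the counter by one unit
        return counter

    print(count_elements(["a", "b", "c", "b"]))  # prints 3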

[Image: Counting using tally marks at Hanakapiai Beach]

Counting sometimes involves numbers other than one; for example, when counting money, counting out change, "counting by twos" (2, 4, 6, 8, 10, 12, ...), or "counting by fives" (5, 10, 15, 20, 25, ...).
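
Counting by a step other than one amounts to repeatedly adding a fixed increment; a minimal Python illustration:

    print(list(range(2, 13, 2)))  # counting by twos: [2, 4, 6, 8, 10, 12]
    print(list(range(5, 26, 5)))  # counting by fives: [5, 10, 15, 20, 25]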

There is archeological evidence suggesting that humans have been counting for at least 50,000 years.[1] Counting was primarily used by ancient cultures to keep track of social and economic data such as number of group members, prey animals, property, or debts (i.e., accountancy). The development of counting led to the development of mathematical notation, numeral systems, and writing.

Forms of counting

Counting can occur in a variety of forms.

Counting can be verbal; that is, speaking every number out loud (or mentally) to keep track of progress. This is often used to count objects that are already present, rather than to count events occurring over time.

Counting can also be in the form of tally marks, making a mark for each item and then counting all of the marks when done tallying. This is useful when counting objects over time, such as the number of times something occurs during the course of a day. Tallying is base 1 counting; normal counting is done in base 10. Computers use base 2 counting (0s and 1s).
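
The three bases mentioned above represent the same count in different ways; a small Python sketch (rendering tallies as strokes is an assumption for display):

    n = 13
    print("|" * n)  # base 1 (tally): one mark per unit counted
    print(str(n))   # base 10: positional decimal notation, "13"
    print(bin(n))   # base 2, as used by computers: "0b1101"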

Counting can also be in the form of finger counting, especially when counting small numbers. This is often used by children to facilitate counting and simple mathematical operations. Finger counting uses unary notation (one finger = one unit), and is thus limited to counting to 10 (unless the toes are used as well). Older finger counting used the four fingers and the three bones in each finger (phalanges) to count to the number twelve.[2] Other hand-gesture systems are also in use, for example the Chinese system by which one can count to 10 using only gestures of one hand. By using finger binary (base 2 counting), it is possible to keep a finger count up to 1023 = 2¹⁰ − 1.
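
In finger binary, each finger stands for one power of two, so ten fingers can encode any value from 0 through 1023. A sketch under the assumption that fingers are ordered from least to most significant:

    fingers = [1, 0, 1, 1, 0, 0, 0, 0, 0, 0]  # 1 = raised, least significant first
    print(sum(bit << i for i, bit in enumerate(fingers)))  # 13
    print(sum(1 << i for i in range(10)))      # all ten raised: 1023 = 2**10 - 1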

Various devices can also be used to facilitate counting, such as hand tally counters and abacuses.

Inclusive counting

Inclusive counting is usually encountered when dealing with time in the Romance languages.[3] In countries that use exclusive counting, such as the United States, when counting eight days from Sunday, Monday will be day 1, Tuesday day 2, and the following Monday will be the eighth day. When counting inclusively, the Sunday (the start day) will be day 1, and therefore the following Sunday will be the eighth day. For example, the French phrase for "fortnight" is quinzaine (15 [days]), and similar words are present in Greek (δεκαπενθήμερο, dekapenthímero), Spanish (quincena) and Portuguese (quinzena). In contrast, the English word "fortnight" itself derives from "a fourteen-night", as the archaic "sennight" does from "a seven-night"; the English words are not examples of inclusive counting.
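
The two conventions differ only by a constant offset of one, as a short Python sketch shows (the start date is an illustrative Sunday):

    import datetime

    start = datetime.date(2016, 1, 17)  # a Sunday (illustrative date)
    print((start + datetime.timedelta(days=8)).strftime("%A"))      # exclusive day 8: Monday
    print((start + datetime.timedelta(days=8 - 1)).strftime("%A"))  # inclusive day 8: Sunday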

Names based on inclusive counting appear in other calendars as well: in the Roman calendar the nones (meaning "nine") is 8 days before the ides; and in the Christian calendar Quinquagesima (meaning 50) is 49 days before Easter Sunday.

Musical terminology also uses inclusive counting of intervals between notes of the standard scale: going up one note is a second interval, going up two notes is a third interval, etc., and going up seven notes is an octave.
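
Because both endpoints are counted, an interval name always exceeds the number of scale steps by one; illustratively, in Python:

    names = {2: "second", 3: "third", 4: "fourth", 5: "fifth",
             6: "sixth", 7: "seventh", 8: "octave"}
    for steps_up in range(1, 8):
        print(steps_up, "note(s) up ->", names[steps_up + 1])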

Education and development

Learning to count is an important educational/developmental milestone in most cultures of the world. Learning to count is a child's very first step into mathematics, and constitutes the most fundamental idea of that discipline. However, some cultures in Amazonia and the Australian Outback do not count,[4][5] and their languages do not have number words.

Many children at just 2 years of age have some skill in reciting the count list (i.e., saying "one, two, three, ..."). They can also answer questions of ordinality for small numbers, e.g., "What comes after three?". They can even be skilled at pointing to each object in a set and reciting the words one after another. This leads many parents and educators to the conclusion that the child knows how to use counting to determine the size of a set.[6] Research suggests that it takes about a year after learning these skills for a child to understand what they mean and why the procedures are performed.[7][8] In the meantime, children learn how to name cardinalities that they can subitize.

Counting in mathematics

In mathematics, the essence of counting a set and finding a result n is that it establishes a one-to-one correspondence (or bijection) of the set with the set of numbers {1, 2, ..., n}. A fundamental fact, which can be proved by mathematical induction, is that no bijection can exist between {1, 2, ..., n} and {1, 2, ..., m} unless n = m; this fact (together with the fact that two bijections can be composed to give another bijection) ensures that counting the same set in different ways can never result in different numbers (unless an error is made). This is the fundamental mathematical theorem that gives counting its purpose: however you count a (finite) set, the answer is the same. In a broader context, the theorem is an example of a theorem in the mathematical field of (finite) combinatorics; hence (finite) combinatorics is sometimes referred to as "the mathematics of counting."
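
For very small sets the theorem can be verified exhaustively: enumerating all functions from {1, ..., n} to {1, ..., m} shows that a bijection exists exactly when n = m. A brute-force Python sketch (the helper name is illustrative):

    from itertools import product

    def bijection_exists(n, m):
        """Brute-force check: is there a bijection {1..n} -> {1..m}?"""
        codomain = range(1, m + 1)
        # try every one of the m**n functions, looking for one that is bijective
        for images in product(codomain, repeat=n):
            if len(set(images)) == n and set(images) == set(codomain):
                return True
        return False

    for n in range(4):
        for m in range(4):
            assert bijection_exists(n, m) == (n == m)
    print("a bijection exists exactly when n = m (checked for n, m < 4)")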

Many sets that arise in mathematics do not allow a bijection to be established with {1, 2, ..., n} for any natural number n; these are called infinite sets, while those sets for which such a bijection does exist (for some n) are called finite sets. Infinite sets cannot be counted in the usual sense; for one thing, the mathematical theorems which underlie this usual sense for finite sets are false for infinite sets. Furthermore, different definitions of the concepts in terms of which these theorems are stated, while equivalent for finite sets, are inequivalent in the context of infinite sets.

The notion of counting may be extended to them in the sense of establishing (the existence of) a bijection with some well-understood set. For instance, if a set can be brought into bijection with the set of all natural numbers, then it is called "countably infinite." This kind of counting differs in a fundamental way from counting of finite sets, in that adding new elements to a set does not necessarily increase its size, because the possibility of a bijection with the original set is not excluded. For instance, the set of all integers (including negative numbers) can be brought into bijection with the set of natural numbers, and even seemingly much larger sets like that of all finite sequences of rational numbers are still (only) countably infinite. Nevertheless, there are sets, such as the set of real numbers, that can be shown to be "too large" to admit a bijection with the natural numbers, and these sets are called "uncountable." Sets for which there exists a bijection between them are said to have the same cardinality, and in the most general sense counting a set can be taken to mean determining its cardinality. Beyond the cardinalities given by each of the natural numbers, there is an infinite hierarchy of infinite cardinalities, although only very few such cardinalities occur in ordinary mathematics (that is, outside set theory that explicitly studies possible cardinalities).
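
The classic bijection between the natural numbers and all integers alternates between non-negative and negative values; a minimal Python sketch:

    def nat_to_int(n):
        """Bijection from N to Z: 0, 1, 2, 3, 4, ... maps to 0, -1, 1, -2, 2, ..."""
        return n // 2 if n % 2 == 0 else -(n + 1) // 2

    print([nat_to_int(n) for n in range(9)])  # [0, -1, 1, -2, 2, -3, 3, -4, 4]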

Counting, mostly of finite sets, has various applications in mathematics. One important principle is that if two sets X and Y have the same finite number of elements, and a function f: X → Y is known to be injective, then it is also surjective, and vice versa. A related fact is known as the pigeonhole principle, which states that if two sets X and Y have finite numbers of elements n and m with n > m, then any map f: X → Y is not injective (so there exist two distinct elements of X that f sends to the same element of Y); this follows from the former principle, since if f were injective, then so would be its restriction to a strict subset S of X with m elements, and that restriction would then be surjective, contradicting the fact that for x in X outside S, f(x) cannot be in the image of the restriction. Similar counting arguments can prove the existence of certain objects without explicitly providing an example. In the case of infinite sets this can even apply in situations where it is impossible to give an example; for instance, there must exist real numbers that are not computable numbers, because the latter set is only countably infinite, but by definition a non-computable number cannot be precisely specified.
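
The pigeonhole principle can likewise be checked by brute force for small sets; an illustrative Python sketch:

    from itertools import product

    n, m = 4, 3  # |X| = n > m = |Y|
    Y = range(m)
    # each of the m**n maps f: X -> Y, listed as tuples of images, repeats a value
    assert all(len(set(images)) < n for images in product(Y, repeat=n))
    print("every map from a 4-element set to a 3-element set has a collision")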

The domain of enumerative combinatorics deals with computing the number of elements of finite sets without actually counting them; the latter is usually impossible because infinite families of finite sets are considered at once, such as the sets of permutations of {1, 2, ..., n} for any natural number n.
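
For example, the number of permutations of {1, 2, ..., n} is n!, which can be computed without generating them; compare, in Python:

    import math
    from itertools import permutations

    n = 6
    print(math.factorial(n))                         # 720, computed directly
    print(len(list(permutations(range(1, n + 1)))))  # 720, counted one by one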

References

  1. ^ Eves, Howard (1990). An Introduction to the History of Mathematics (6th ed.). p. 9.
  2. ^ Macey, Samuel L. (1989). The Dynamics of Progress: Time, Method, and Measure. Atlanta, Georgia: University of Georgia Press. p. 92. ISBN 978-0-8203-3796-8.
  3. ^ Evans, James (1998). The History and Practice of Ancient Astronomy. Oxford University Press. Chapter 4, p. 164. ISBN 019987445X.
  4. ^ Butterworth, B., Reeve, R., Reynolds, F., & Lloyd, D. (2008). Numerical thought with and without words: Evidence from indigenous Australian children. Proceedings of the National Academy of Sciences, 105(35), 13179–13184.
  5. ^ Gordon, P. (2004). Numerical cognition without words: Evidence from Amazonia. Science, 306, 496–499.
  6. ^ Fuson, K.C. (1988). Children's counting and concepts of number. New York: Springer–Verlag.
  7. ^ Le Corre, M., & Carey, S. (2007). One, two, three, four, nothing more: An investigation of the conceptual sources of the verbal counting principles. Cognition, 105, 395–438.
  8. ^ Le Corre, M., Van de Walle, G., Brannon, E. M., Carey, S. (2006). Re-visiting the competence/performance debate in the acquisition of the counting principles. Cognitive Psychology, 52(2), 130–169.