Alphabet of human thought
The alphabet of human thought is a concept originally proposed by Gottfried Leibniz that provides a universal way to represent and analyze ideas and relationships by breaking them down into their component pieces. On this view, all ideas are compounds of a small number of simple ideas, each of which can be represented by a unique character.
Logic and the Universal Language
Logic was Leibniz's earliest philosophical interest, going back to his teens. René Descartes had suggested that the lexicon of a universal language should consist of primitive elements; the systematic combination of these elements, according to syntactical rules, would generate the unbounded range of structures required to represent human language. In this way Descartes and Leibniz were precursors to computational linguistics as later defined by Noam Chomsky.
In the early 18th century, Leibniz outlined his characteristica universalis, an artificial language in which grammatical and logical structure would coincide, which would allow reasoning to be reduced to calculation. Leibniz acknowledged the work of Ramon Llull, particularly the Ars generalis ultima (1305), as one of the inspirations for this idea. The basic elements of his characteristica would be pictographic characters representing unambiguously a limited number of elementary concepts. Leibniz called the inventory of these concepts "the alphabet of human thought." There are quite a few mentions of the characteristica in Leibniz's writings, but he never set out any details save for a brief outline of some possible sentences in his Dissertation on the Art of Combinations.
His main interest lay in what is known in modern logic as classification and composition: deduction follows naturally from combining classified concepts into new classes. In modern terminology, Leibniz's alphabet was a proposal for an automated theorem prover or ontology classification reasoner, conceived centuries before the technology existed to implement either.
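The classification idea can be made concrete with Leibniz's own later device of "characteristic numbers": assign each primitive concept a distinct prime, let a compound concept be the product of its primitives' primes, and the universal affirmative "every A is B" holds exactly when B's number divides A's. The sketch below is a modern reconstruction under that scheme, not Leibniz's notation; the particular primitives ("animal", "rational") and function names are illustrative.

```python
# Reconstruction of Leibniz's arithmetical encoding of concepts:
# each primitive concept gets a distinct prime, and a compound
# concept is the product of the primes of its primitive parts.
# The primitives chosen here are illustrative, not Leibniz's.
PRIMITIVES = {"animal": 2, "rational": 3, "mortal": 5}

def concept(*parts):
    """Characteristic number of a compound: product of its primitives' primes."""
    n = 1
    for p in parts:
        n *= PRIMITIVES[p]
    return n

def is_a(a, b):
    """Universal affirmative 'every A is B': true iff b's number divides a's."""
    return a % b == 0

human = concept("animal", "rational")   # 2 * 3 = 6
animal = concept("animal")              # 2

print(is_a(human, animal))  # every human is an animal -> True
print(is_a(animal, human))  # every animal is a human -> False
```

Because composition is multiplication and classification is divisibility, "deduction as a natural consequence of combining classified items" reduces to arithmetic, which is the sense in which the proposal anticipates mechanical reasoning.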
Semantic web implementation
In a recorded talk, John Giannandrea, co-founder and former CTO of Metaweb Technologies, acknowledged that Freebase was at least inspired by Leibniz's alphabet of human thought, if not an implementation of it.
- Geiger, Richard A.; Rudzka-Ostyn, Brygida, eds. (1993). Conceptualizations and Mental Processing in Language (1st International Cognitive Linguistics Conference, Duisburg, 1989). Walter de Gruyter. pp. 25–26. ISBN 978-3-11-012714-0.
- Bunnin, Nicholas; Jiyuan Yu (2004). The Blackwell Dictionary of Western Philosophy. Blackwell Publishing. p. 715. ISBN 978-1-4051-0679-5.
- Hatfield, Gary (2014). "René Descartes". The Stanford Encyclopedia of Philosophy (Summer 2014 ed.). Stanford University. plato.stanford.edu. Retrieved 12 July 2014.
he offered a new vision of the natural world that continues to shape our thought today: a world of matter possessing a few fundamental properties and interacting according to a few universal laws.
- Chomsky, Noam. New Horizons in the Study of Language and Mind (Kindle ed.). Cambridge University Press. pp. 425–428. ISBN 0521658225.
I mentioned that modern generative grammar has sought to address concerns that animated the tradition; in particular, the Cartesian idea that "the true distinction" (Descartes 1649/1927: 360) between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without any finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice.
- Russell, L.J. (1985). "Leibniz, Gottfried Wilhelm". In Paul Edwards (ed.). The Encyclopedia of Philosophy, Volumes 3 and 4. Macmillan Publishing. pp. 422–423. ASIN B0017IMQME.
his main emphasis... was on classification; deduction was a natural consequence of combining classified items into new classes.
- "PARCForum Presentation by Giannandrea, J.". YouTube. min 37+. Retrieved 2015-10-30.