Bayesian probability
Bayesian probability is an interpretation of the concept of probability and belongs to the category of evidential probability. The Bayesian interpretation of probability can be seen as an extension of propositional logic, a branch of mathematical logic, that enables reasoning with propositions whose truth or falsity is uncertain. To evaluate the probability of a proposition, Bayesians specify a prior probability, which is then updated as new, relevant data are obtained.[1]
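In symbols, this updating step is Bayes' theorem, stated here in standard notation as a worked equation (the notation itself is not part of the original draft): the posterior probability of a proposition H after observing evidence E is

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad \text{i.e.}\qquad
\text{posterior} \propto \text{likelihood} \times \text{prior}.
```

Here P(H) is the prior probability, P(E | H) the likelihood of the evidence under H, and P(E) the overall probability of the evidence.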
The Bayesian interpretation provides a standard set of procedures and formulae to perform this calculation. Bayesian probability interprets the concept of probability as "an abstract concept, a quantity that we assign theoretically, for the purpose of representing a state of knowledge, or that we calculate from previously assigned probabilities,"[2] in contrast to interpreting it as a frequency or "propensity" of some phenomenon.
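As a minimal sketch of such a calculation (the scenario and all numbers below are illustrative assumptions, not taken from the article), the following Python snippet updates a prior belief about a single binary hypothesis after observing one piece of evidence:

```python
# Minimal sketch of a Bayesian update for a binary hypothesis H.
# All numbers are illustrative assumptions, not values from the article.

def posterior(prior, likelihood_given_h, likelihood_given_not_h):
    """Return P(H | E) via Bayes' theorem for a binary hypothesis H."""
    evidence = likelihood_given_h * prior + likelihood_given_not_h * (1.0 - prior)
    return likelihood_given_h * prior / evidence

# Example: a condition with 1% prevalence (prior), a test with 95% sensitivity
# P(positive | condition) and a 5% false-positive rate P(positive | no condition).
prior = 0.01
p_pos_given_condition = 0.95
p_pos_given_healthy = 0.05

updated = posterior(prior, p_pos_given_condition, p_pos_given_healthy)
print(f"P(condition | positive test) = {updated:.3f}")  # ~0.161
```

Feeding the resulting posterior back in as the prior for the next observation gives the sequential updating described in the lead.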
The term "Bayesian" refers to the 18th century mathematician and theologian Thomas Bayes, who provided the first mathematical treatment of a non-trivial problem of Bayesian inference.[3] Nevertheless, it was the French mathematician Pierre-Simon Laplace who pioneered and popularised what is now called Bayesian probability.[4]
Broadly speaking, there are two views on Bayesian probability that interpret the probability concept in different ways. According to the objectivist view, the rules of Bayesian statistics can be justified by requirements of rationality and consistency and interpreted as an extension of logic.[2][5] According to the subjectivist view, probability quantifies a "personal belief".[6] Many modern machine learning methods are based on objectivist Bayesian principles.[7] In the Bayesian view, a probability is assigned to a hypothesis, whereas under the frequentist view, a hypothesis is typically tested without being assigned a probability.
- ^ Paulos, John Allen (August 5, 2011). "The Mathematics of Changing Your Mind". The New York Times. Retrieved August 6, 2011.
- ^ a b c Jaynes, E. T. (1986). "Bayesian Methods: General Background". In J. H. Justice (ed.), Maximum-Entropy and Bayesian Methods in Applied Statistics. Cambridge: Cambridge University Press.
- ^ Stigler, Stephen M. (1986). The History of Statistics. Harvard University Press. p. 131.
- ^ Stigler, Stephen M. (1986). The History of Statistics. Harvard University Press. pp. 97–98, 131.
- ^ Cox, Richard T. (2001). Algebra of Probable Inference. The Johns Hopkins University Press.
- ^ de Finetti, B. (1974). Theory of Probability (2 vols.). New York: J. Wiley & Sons.
- ^ Bishop, C. M. (2007). Pattern Recognition and Machine Learning. Springer.