Problem with the Bayesian Divergence example
The only reason the second character in this story fails to update her posterior probability is that her prior was ill-formed (and worded in a confusing way). While the Beta(0,0) distribution (characterized as "either always or never") can be used as an improper prior, the character in the story did not use it correctly. Better than an improper prior would be a parameterized two-point prior, as follows:
AliceX thinks the coin-flipper chose the unfair coin with probability X and the fair coin with probability 1 − X. The probability of heads with the fair coin is 0.5; with the unfair coin it is 1.
Alice1 and Alice0 correspond to the characters in the article's story who never update. Any other AliceX (with X strictly between 0 and 1) will update, and the posterior probabilities of all those other Alices will converge to the same value as the coin-flipper flips more consecutive heads (and will all instantly agree on the same posterior as soon as a tail is flipped).
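The updating described above can be sketched as a short computation (the function name and the example X values are mine, not from the article; the model is just two hypotheses, a fair coin and an always-heads coin):

```python
def posterior_unfair(prior_x, flips):
    """AliceX's posterior probability that the coin is the unfair
    (always-heads) coin, given a sequence of flips ('H'/'T') and
    her prior prior_x on the unfair coin."""
    # Likelihood of the observed sequence under each hypothesis.
    lik_unfair = 1.0 if all(f == 'H' for f in flips) else 0.0
    lik_fair = 0.5 ** len(flips)
    num = prior_x * lik_unfair
    den = num + (1.0 - prior_x) * lik_fair
    return num / den

# Every Alice with 0 < X < 1 converges toward 1 as heads accumulate...
for x in (0.01, 0.5, 0.99):
    print(x, [round(posterior_unfair(x, ['H'] * n), 4) for n in (1, 5, 20)])

# ...and every Alice drops to 0 the moment a single tail appears,
print(posterior_unfair(0.5, ['H'] * 5 + ['T']))

# ...while Alice0 and Alice1 never move from their priors.
print(posterior_unfair(0.0, ['H'] * 20), posterior_unfair(1.0, ['H'] * 20))
```

Note that the degenerate priors X = 0 and X = 1 reproduce the non-updating behaviour complained about above: zero prior mass on a hypothesis can never be revived by evidence.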
Bad examples with poor wording lead non-experts to misunderstandings and should be purged. — Preceding unsigned comment added by 188.8.131.52 (talk) 22:00, 23 December 2014 (UTC)