Talk:Chernoff bounds

Base 2 vs natural logarithms

Is there a reason base-2 logarithms are being used? Although relative entropy has applications in information theory, these theorems were not proved in that context and apply beyond it. Using natural logarithms would also clean things up by removing some pesky subscripts and constants. --Steve Kroon 14:31, 8 December 2006 (UTC)
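
For comparison, the two conventions differ only by a factor of ln 2 in the exponent. Here is the Chernoff–Hoeffding bound for a sum X of n i.i.d. Bernoulli(p) variables and a > p, written both ways (standard textbook notation, not necessarily the article's):

% Relative entropy in nats (natural log) versus bits (base-2 log):
\[
  \Pr[X \ge an] \;\le\; e^{-n\,D(a\,\|\,p)},
  \qquad
  D(a\,\|\,p) \;=\; a\ln\frac{a}{p} + (1-a)\ln\frac{1-a}{1-p},
\]
\[
  \Pr[X \ge an] \;\le\; 2^{-n\,D_2(a\,\|\,p)},
  \qquad
  D_2(a\,\|\,p) \;=\; \frac{D(a\,\|\,p)}{\ln 2},
\]

so switching to natural logarithms only rescales the divergence and drops the base-2 subscripts, without changing the strength of the bound.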

History

The historical part of this article is completely erroneous. Most of the inequalities mentioned were proved by Bernstein (in the 1920s) and Cramér (in the 1930s). Sodin 02:32, 9 August 2007 (UTC)

Second (Relative) Chernoff Bound

I think that the second bound, stated in the proof section to be "obtainable using a similar proof strategy", should be

for instead of the weaker

.

This can be proven by applying Markov's inequality on (instead of , as done for the first bound) and substituting . The weaker bound given in the article right now then follows from basic calculus by taking logarithms and comparing derivatives.
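
Spelled out, with X the sum of the variables, μ = E[X], and the usual multiplicative form of the bound (this is the form I have in mind; the article's constants may be written differently):

% Markov's inequality applied to e^{tX} for t > 0, using
% E[e^{t X_i}] <= exp(p_i (e^t - 1)) for independent Bernoulli summands:
\[
  \Pr[X \ge (1+\delta)\mu]
  \;=\; \Pr\!\left[e^{tX} \ge e^{t(1+\delta)\mu}\right]
  \;\le\; \frac{\mathbb{E}\!\left[e^{tX}\right]}{e^{t(1+\delta)\mu}}
  \;\le\; \frac{e^{\mu(e^{t}-1)}}{e^{t(1+\delta)\mu}} ,
\]
% and the substitution t = ln(1 + delta), valid for every delta > 0, gives
\[
  \Pr[X \ge (1+\delta)\mu]
  \;\le\; \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu} .
\]

The weaker exponential form then follows by taking logarithms and bounding δ − (1+δ)ln(1+δ) by the appropriate quadratic, which is the derivative comparison mentioned above.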

Also, this second bound should probably be stated in the theorem itself. --Björn —The preceding unsigned comment was added by 85.179.24.59 (talk) 11:24, August 22, 2007 (UTC)

Response

I agree; we probably ought to state the second bound. I think you are correct in this; I was just lazy when I wrote the page originally and thought a slightly simpler bound might be nice. --John (Jduchi 21:41, 5 October 2007 (UTC))

First Theorem

I am not quite sure why the first theorem is stated here. I'm not an expert on probability theory, so I cannot tell whether it's valid or not. But surely it cannot be derived from the Chernoff bounds: applying the Chernoff bounds to the left side of the second bound, one obtains

Therefore, I would expect (in case the theorem were a simple consequence of the Chernoff bounds) that

Now, the terms are equal at . Comparing derivatives in , one should then have

which is not true, as  is continuously growing. Therefore, I think that either the theorem is cited wrongly or it doesn't really belong here. Any suggestions? --Bjoern —The preceding unsigned comment was added by 85.179.24.59 (talk) 12:33, August 22, 2007 (UTC)

Response to above

These theorems are actually slightly different and not applications of one another, which I hope explains some of the confusion. I have modified the main page to reflect those differences. One theorem deals with the absolute error of the mean of the random variables; the other deals with the relative error. So they are derived using the same strategy (i.e., exponentiating and using Markov's inequality), but they are different bounds and are used in different contexts. -- John (Jduchi 21:41, 5 October 2007 (UTC))
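
For anyone reading along, the two flavours in their standard textbook forms (the article's exact constants and hypotheses may differ slightly) are roughly:

% Additive (absolute-error) form, Hoeffding-type, for independent X_i in [0, 1],
% X their sum, mu = E[X]:
\[
  \Pr\!\left[\,|X - \mu| \ge \varepsilon n\,\right] \;\le\; 2\,e^{-2n\varepsilon^{2}},
  \qquad \varepsilon > 0,
\]
% Multiplicative (relative-error) form, for independent Bernoulli X_i:
\[
  \Pr[X \ge (1+\delta)\mu]
  \;\le\; \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu},
  \qquad \delta > 0.
\]

Both come from exponentiating and applying Markov's inequality; they just optimise the exponent against different deviation scales.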

Proof and statement

In the statement of the theorem (absolute error) we claim . But in the proof we use  when we say "Now, knowing that , ". --gala.martin (what?) 22:06, 8 October 2007 (UTC)

You're right. I've changed it. Thanks for the catch! Jduchi 06:37, 23 October 2007 (UTC)

Bounds on delta

Is the last formulation of the theorem: still valid for (in particular, )? It would be quite helpful if these bounds were mentioned explicitly, even if they're the same as in the other formulation of the theorem. —Preceding unsigned comment added by 69.202.72.56 (talk) 05:38, 10 February 2008 (UTC)
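
It depends on which formulation is meant, but the usual situation is that the full multiplicative bound holds for every δ > 0, while the simplified exponential forms are only derived on a restricted range. In standard textbook versions (the constants may differ from the article's):

% Valid for all delta > 0:
\[
  \Pr[X \ge (1+\delta)\mu]
  \;\le\; \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu},
\]
% common simplifications, each valid only on the indicated range of delta:
\[
  \Pr[X \ge (1+\delta)\mu] \;\le\; e^{-\mu\delta^{2}/3}
  \quad \text{for } 0 < \delta \le 1,
  \qquad
  \Pr[X \ge (1+\delta)\mu] \;\le\; e^{-\mu\delta/3}
  \quad \text{for } \delta \ge 1.
\]

So if the last formulation is one of the simplified ones, its range of δ should indeed be stated explicitly alongside it.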