Talk:Limiting density of discrete points

From Wikipedia, the free encyclopedia
WikiProject Mathematics (Rated Start-class, Mid-importance; Field: Probability and statistics)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks. This article has comments.

WikiProject Statistics
This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion. This article has not yet received a rating on the quality or importance scales.

I will eventually expand this to the more general concept of limiting density and move it to that title then. For the meantime, however, I just started with the limiting density in the application with which I am most familiar. Please feel free to contribute/correct. WDavis1911 (talk) 23:47, 1 May 2008 (UTC)

I'm puzzled as to what this article is about. There seems to be an abrupt change of subject, and the limit stated before that is true under any of various different sorts of assumptions, none of which are stated or even hinted at. Michael Hardy (talk) 05:00, 9 May 2008 (UTC)
Totally understandable, and thanks for the attention to the article. I originally started this stub because, in my quest to understand Maximum entropy better, I ran into the term limiting density of discrete points, with "limiting density" redlinked. It wasn't clear to me if this meant something special in the context of information theory, so I needed to delve into this more. I couldn't find much more information on how this related to the concept except for Jaynes' original paper (or at least what I think is the first mention of the term by him in this context) and a few other references discussing Jaynes' work.
Since starting this, I haven't been able to give this the full attention that it needed, but I haven't worried so much because I was sure that if I didn't do it, others in the community would. My feeling, however, is that Jaynes' use of this is not specific to his work at all, and just an application of limiting density to adjust Shannon's entropy measure for continuous distributions. In other words, I think this should be eventually converted to an article talking about limiting density from a probability theory perspective, and not this specific application to maximum entropy. I'm not an expert in probability theory however, so I'd rather let someone else be bold in that regard, at least for now. In the meanwhile, I am trying to supply as much information as I can from my limited perspective. WDavis1911 (talk) 18:47, 15 May 2008 (UTC)


I've added some extra background to try to make this article easier to understand - I hope that's ok.

However I would suggest that this article could/should maybe be merged into principle of maximum entropy, from which it is linked -- these few lines of derivation are quite important for understanding why Jaynes introduced a new formula for the continuous entropy, and there are not many other pages that are likely to link to it. Nathanielvirgo (talk) 17:17, 17 December 2008 (UTC)

Having thought about it a bit more I think this page should be extended a bit and renamed to something more like "Jaynes' continuous entropy formula" ("limiting density of discrete points" sounds too general to me), as I quite often need to explain this to people and it would be good to have a decent page for it on Wikipedia. I will try to find time to do this some time soon. Nathanielvirgo (talk) 14:01, 18 December 2008 (UTC)


I added a small clarification on the relationship between this concept and the Kullback-Leibler divergence, because the article implied that they were the same concept. Although formally similar, they are conceptually different, and I hope this is now clear. I also removed a few comments stating that Jaynes' formula gives the "relative information entropy." I hope that doesn't annoy anyone - the reason is this: the term "relative information entropy" refers to the Kullback-Leibler divergence, and it's "relative" because there are two probability distributions involved. It's effectively the entropy of one probability distribution relative to another. In Jaynes' formula one of these probability distributions is replaced by a function defining a density of points, not probabilities. In the framework Jaynes gave for the use of his formula there is nothing "relative" about it - m(x) is an invariant that is defined by the problem under consideration and is not to be updated in the manner of a probability distribution. Jaynes saw his formula simply as the correct formula for the (absolute) entropy of a continuous distribution. Nathanielvirgo (talk) 14:20, 13 August 2009 (UTC)
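For reference, a sketch of the comparison in standard notation (not from the article; m(x) is Jaynes' invariant measure, and note the overall sign difference as well as the different role of the second function):

```latex
% Jaynes' continuous entropy: m(x) is an invariant density of points,
% fixed by the problem under consideration, not a probability distribution.
H(p) = -\int p(x)\,\log\frac{p(x)}{m(x)}\,dx

% Kullback--Leibler divergence: both p and q are probability
% distributions, and q may be updated like any other distribution.
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx
```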



Also, I think it would be inappropriate to merge this article with the one on the Kullback-Leibler divergence. The formulae are similar but the symbols stand for different quantities. The two definitions are used in different circumstances and motivated in entirely different ways.

Nathanielvirgo (talk) 14:29, 13 August 2009 (UTC)

Merging

I don't think this article should be merged with Kullback-Leibler divergence for reasons given by others elsewhere on this page. Right now, I think only one person tentatively supports the merging. I propose we keep this article separate, and remove the merge tag. Njerseyguy (talk) 16:27, 17 August 2009 (UTC)

I agree. There has been no further action on this for quite a while and not much in favour, so I will remove the merge tags. --mcld (talk) 12:41, 12 April 2010 (UTC)

Name of this page

The name of this page, "limiting density of discrete points" is not quite right, because it refers to a step in Jaynes' derivation of his version of the continuous entropy, rather than the quantity itself. I would like to change the page's name.

The problem is that Jaynes didn't give his quantity a name, since he saw it as simply a correction of Shannon's formula. So what should this page be called? I would like to call it "Jaynes continuous entropy", but I don't think it's been referred to as that in the literature. So I'm thinking of calling the page "differential entropy (Edwin Thompson Jaynes)". Does anyone have an opinion on this? Nathaniel Virgo (talk) 21:53, 22 August 2009 (UTC)

I agree that the name is a bit wrong. There are actually TWO ideas on this page:
(1) Introduction of m, with the understanding that it should transform the same way as p.
(2) Passing from discrete points to a continuum limit.
The limiting procedure serves as a motivation for m, but it is not the core reason the new entropy is sensible. The core idea is really the covariance of m and p. 178.38.142.81 (talk) 01:20, 2 February 2015 (UTC)
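A sketch of the two ideas in standard notation (assuming the usual change-of-variables conventions; not taken from the article itself):

```latex
% (1) Covariance: under an invertible change of variables y = f(x),
% both p and m transform as densities,
p'(y) = p(x)\left|\frac{dx}{dy}\right|, \qquad
m'(y) = m(x)\left|\frac{dx}{dy}\right|,
% so the ratio p/m, and hence the entropy, is coordinate-free:
H = -\int p(x)\,\log\frac{p(x)}{m(x)}\,dx
  = -\int p'(y)\,\log\frac{p'(y)}{m'(y)}\,dy.

% (2) Limiting density: the discrete points x_i become dense with
% density m(x), in the sense that
\lim_{N\to\infty} \frac{1}{N}\,\#\{\,x_i \in (a,b)\,\} = \int_a^b m(x)\,dx.
```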

Negative Entropy

Another issue: the continuous entropy can still be negative, even after introducing m. Negativity was presented (in several entropy articles!) as a problem with differential entropy that requires handling, and as part of the motivation for Jaynes' entropy. Actually, it never goes away, and it's not a problem, just a difference from the discrete case. 178.38.142.81 (talk) 01:20, 2 February 2015 (UTC)
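As a concrete illustration (a minimal sketch, not from the article): for a uniform density of the given width and a constant measure m, the integral H = -∫ p log(p/m) dx has the closed form log(width·m), which is negative whenever the distribution is narrower than the scale set by m.

```python
import math

def jaynes_entropy_uniform(width, m=1.0):
    """Jaynes' continuous entropy H = -integral of p*log(p/m) dx for a
    uniform density p = 1/width on an interval of the given width, with
    a constant measure m. Closed form: log(width * m)."""
    p = 1.0 / width
    # H = -width * p * log(p / m), which simplifies to log(width * m)
    return -width * p * math.log(p / m)

# A distribution narrower than the scale set by m has negative entropy;
# this is a difference from the discrete case, not a pathology.
print(jaynes_entropy_uniform(0.1))  # log(0.1), approximately -2.303
print(jaynes_entropy_uniform(1.0))  # log(1.0) = 0.0
```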

What is wrong with the page that needs a mathematics expert

Re-order the emphasis and thrust of the page to address the issues raised under "Name of this page" above. 178.38.142.81 (talk) 01:20, 2 February 2015 (UTC)

Need for clarification

I don't understand the relationship between the set of points x_i and p(x). There must be one. Are these points drawn according to p(x)? Thanks for any clarification. Renato (talk) —Preceding undated comment added 13:37, 27 February 2013 (UTC)