This article is within the scope of WikiProject Physics, a collaborative effort to improve the coverage of Physics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
Section doesn't feel like it's quite there yet. Maybe consider what happens when you're using a Renyi entropy as a measure of ecological diversity, and then realise that you need to split one species into two... -- Jheald 22:19, 23 January 2006 (UTC).
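A quick numeric sketch of the species-splitting scenario above (the function name and the example abundances are mine, not from any post here): computing the "effective number of species" exp(H_α) before and after splitting one species into two of half the abundance shows that different orders α react differently to the split.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (natural log) of a probability vector p."""
    if alpha == 1:
        # Shannon limit of the formula at alpha = 1
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

# Three equally abundant species
before = [1/3, 1/3, 1/3]
# One species split into two, each with half its abundance
after = [1/3, 1/3, 1/6, 1/6]

for alpha in (0, 1, 2):
    d_before = math.exp(renyi_entropy(before, alpha))
    d_after = math.exp(renyi_entropy(after, alpha))
    print(f"alpha={alpha}: diversity {d_before:.3f} -> {d_after:.3f}")
```

At α = 0 the diversity is just the species count (3 → 4), while at α = 2 the split increases diversity by less (3 → 3.6), since rare species are down-weighted.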
The first thing I saw was that the 1/(1−α) prefactor blows up at α = 1: 1/(1−1) = ∞. Full Decent (talk) 16:07, 11 December 2009 (UTC)
Renyi entropy was defined axiomatically in Renyi's Berkeley entropy paper. There, weakening one of the Shannon axioms yields Renyi entropy, which is why α=1 (the Shannon case) is special. Also, some of Renyi entropy's applications — statistical physics, general statistics, machine learning, signal processing, cryptography (a measure of randomness and robustness), Shannon theory (generalizing and proving theorems), source coding — should be added with context. I don't have all this handy right now, but I'm sure each piece of this is familiar to at least one person reading this page.... Calbaer 05:59, 5 May 2006 (UTC)
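To illustrate why α = 1 is the special case (a sketch of my own, assuming the standard definition H_α = log(Σ p_i^α)/(1−α)): the defining formula is 0/0 at α = 1, but its limit as α → 1 is the Shannon entropy, which can be checked numerically.

```python
import math

def renyi(p, alpha):
    """Rényi entropy of order alpha != 1, natural log."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
shannon = -sum(x * math.log(x) for x in p)  # Shannon entropy in nats

# The formula is undefined at alpha = 1, but approaches Shannon entropy
for alpha in (0.9, 0.99, 0.999):
    print(alpha, renyi(p, alpha))
print("Shannon:", shannon)
```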
Also the continuous case is missing. -- Zz 11:58, 24 October 2006 (UTC)
The statement that H_α is non-decreasing in α seems to contradict the statement that H_0 ≥ H_1 ≥ H_2 ≥ H_∞. Also, should that be a weak inequality? LachlanA 23:24, 21 November 2006 (UTC)
Mathworld says they're non-decreasing. I think that's an error; it probably depends on whether you're using positive or negative entropies. I've fixed the inequalities; an obvious example of H_0 > H_∞ is any non-uniform distribution. ⇌Elektron 18:58, 29 June 2012 (UTC)
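The non-increasing behaviour discussed above can be checked directly (a sketch of my own; the distribution is an arbitrary non-uniform example): H_α strictly decreases in α for a non-uniform distribution, down to the min-entropy H_∞ = −log max p.

```python
import math

def renyi(p, alpha):
    """Rényi entropy of order alpha != 1, natural log."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]           # non-uniform distribution
alphas = [0, 0.5, 2, 3]
hs = [renyi(p, a) for a in alphas]
h_inf = -math.log(max(p))        # min-entropy, the alpha -> infinity limit

for a, h in zip(alphas, hs):
    print(f"H_{a} = {h:.4f}")
print(f"H_inf = {h_inf:.4f}")
```

For this distribution the values come out strictly decreasing in α, consistent with H_0 ≥ H_1 ≥ H_2 ≥ H_∞ rather than MathWorld's "non-decreasing".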