skbio.diversity.alpha.renyi

skbio.diversity.alpha.renyi(counts, order=2, base=None)

Calculate Renyi entropy.

Renyi entropy (\(^qH\)) is a generalization of the Shannon index, with an arbitrary exponent (order) \(q\) in place of 1. It is defined as:

\[^qH = \frac{1}{1-q}\log_b{(\sum_{i=1}^S p_i^q)}\]

where \(S\) is the number of taxa and \(p_i\) is the proportion of the sample represented by taxon \(i\).
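The formula above can be sketched directly with NumPy. This is an illustrative reference implementation mirroring the documented signature, not the scikit-bio source; the function name `renyi_entropy` and the explicit handling of the \(q = 1\) limit are assumptions for the sketch:

```python
import numpy as np

def renyi_entropy(counts, order=2, base=None):
    """Illustrative sketch of Renyi entropy from a vector of counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()      # proportions p_i
    p = p[p > 0]                   # zero counts contribute nothing to the sum
    if order == 1:
        # q -> 1 limit: Shannon entropy
        h = -np.sum(p * np.log(p))
    else:
        h = np.log(np.sum(p ** order)) / (1 - order)
    if base is not None:
        h /= np.log(base)          # convert from natural log to the given base
    return h
```

Note that with zero proportions filtered out, `order=0` yields \(\log{S}\) over the observed taxa only, consistent with the max-entropy special case below.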

Parameters:
counts : 1-D array_like, int

Vector of counts.

order : int or float, optional

Order (\(q\)). Ranges from 0 to infinity. Default is 2.

base : int or float, optional

Logarithm base to use in the calculation. Default is e.

Returns:
float

Renyi entropy.

Notes

Renyi entropy was originally defined in [1]. It is a generalization of multiple entropy notions, as determined by the order (\(q\)). Special cases of Renyi entropy include:

  • \(q=0\): Max-entropy (\(\log{S}\)).

  • \(q \to 1\): Shannon entropy (index).

  • \(q=2\): Collision entropy, a.k.a. Renyi’s quadratic entropy or “Renyi entropy”. Equivalent to the logarithm of the inverse Simpson index.

  • \(q \to \infty\): Min-entropy (\(-\log{\max{p}}\)).

Renyi entropy is equivalent to the logarithm of the Hill number of the same order.
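The special cases above can be verified numerically. The proportions below are arbitrary example values, and the helper `renyi` is a minimal sketch of the formula assuming the natural logarithm:

```python
import numpy as np

p = np.array([10.0, 6.0, 3.0, 1.0])
p /= p.sum()  # proportions of an example sample

def renyi(p, q):
    # Renyi entropy of order q (natural log), q != 1
    return np.log(np.sum(p ** q)) / (1 - q)

# q = 0: max-entropy, log(S)
assert np.isclose(renyi(p, 0), np.log(len(p)))

# q -> 1: approaches Shannon entropy
shannon = -np.sum(p * np.log(p))
assert np.isclose(renyi(p, 1 - 1e-6), shannon)

# q = 2: collision entropy, log of the inverse Simpson index
inv_simpson = 1 / np.sum(p ** 2)
assert np.isclose(renyi(p, 2), np.log(inv_simpson))

# q large: approaches min-entropy, -log(max p)
assert np.isclose(renyi(p, 1000), -np.log(p.max()), atol=1e-2)
```

The \(q = 2\) check also illustrates the Hill-number relation: `np.exp(renyi(p, 2))` is the Hill number of order 2, i.e. the inverse Simpson index.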

References

[1]

Rényi, A. (1961, January). On measures of entropy and information. In Proceedings of the fourth Berkeley symposium on mathematical statistics and probability, volume 1: contributions to the theory of statistics (Vol. 4, pp. 547-562). University of California Press.