Trait ndarray_stats::EntropyExt

pub trait EntropyExt<A, S, D>
where
    S: Data<Elem = A>,
    D: Dimension,
{
    // Required methods
    fn entropy(&self) -> Result<A, EmptyInput>
    where
        A: Float;
    fn kl_divergence<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
    where
        S2: Data<Elem = A>,
        A: Float;
    fn cross_entropy<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
    where
        S2: Data<Elem = A>,
        A: Float;
    fn __private__(&self, _: PrivateMarker);
}

Extension trait for ArrayBase providing methods to compute information theory quantities (e.g. entropy, Kullback–Leibler divergence, etc.).

Required Methods§

fn entropy(&self) -> Result<A, EmptyInput>
where A: Float,

Computes the entropy S of the array values, defined as

      n
S = - ∑ xᵢ ln(xᵢ)
     i=1

If the array is empty, Err(EmptyInput) is returned.

Panics if ln of any element in the array panics (which can occur for negative values for some A).

§Remarks

The entropy is a measure used in Information Theory to describe a probability distribution: it only makes sense when the array values sum to 1, with each entry between 0 and 1 (extremes included).

The array values are not normalised by this function before computing the entropy, to avoid introducing potentially unnecessary numerical errors (e.g. if the array is already normalised).

By definition, xᵢ ln(xᵢ) is set to 0 if xᵢ is 0.
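
A minimal usage sketch (illustrative, not part of the upstream documentation; it assumes f64 elements and ndarray's array! macro):

    use ndarray::array;
    use ndarray_stats::EntropyExt;

    // Illustrative values: a normalised, uniform distribution over four outcomes.
    let p = array![0.25, 0.25, 0.25, 0.25];

    // The entropy of a uniform distribution over n outcomes is ln(n).
    let s = p.entropy().unwrap();
    assert!((s - 4.0_f64.ln()).abs() < 1e-12);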

fn kl_divergence<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
where S2: Data<Elem = A>, A: Float,

Computes the Kullback-Leibler divergence Dₖₗ(p,q) between two arrays, where self=p.

The Kullback-Leibler divergence is defined as:

             n
Dₖₗ(p,q) = - ∑ pᵢ ln(qᵢ/pᵢ)
            i=1

If the arrays are empty, Err(MultiInputError::EmptyInput) is returned. If the array shapes are not identical, Err(MultiInputError::ShapeMismatch) is returned.

Panics if, for a pair of elements (pᵢ, qᵢ) from p and q, computing ln(qᵢ/pᵢ) is a panic cause for A.

§Remarks

The Kullback-Leibler divergence is a measure used in Information Theory to describe the relationship between two probability distributions: it only makes sense when each array sums to 1 with entries between 0 and 1 (extremes included).

The array values are not normalised by this function before computing the divergence, to avoid introducing potentially unnecessary numerical errors (e.g. if the arrays are already normalised).

By definition, pᵢ ln(qᵢ/pᵢ) is set to 0 if pᵢ is 0.
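
An illustrative sketch along the same lines (again assuming f64 elements; the expected value follows the definition above):

    use ndarray::array;
    use ndarray_stats::EntropyExt;

    // Illustrative distributions p and q over two outcomes.
    let p = array![0.5, 0.5];
    let q = array![0.9, 0.1];

    // Dₖₗ(p, q) = -(0.5 ln(0.9/0.5) + 0.5 ln(0.1/0.5)) ≈ 0.511
    let d = p.kl_divergence(&q).unwrap();
    let expected = -(0.5_f64 * (0.9_f64 / 0.5).ln() + 0.5 * (0.1_f64 / 0.5).ln());
    assert!((d - expected).abs() < 1e-12);

    // A shape mismatch is reported as an error rather than panicking.
    let r = array![1.0];
    assert!(p.kl_divergence(&r).is_err());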

fn cross_entropy<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
where S2: Data<Elem = A>, A: Float,

Computes the cross entropy H(p,q) between two arrays, where self=p.

The cross entropy is defined as:

           n
H(p,q) = - ∑ pᵢ ln(qᵢ)
          i=1

If the arrays are empty, Err(MultiInputError::EmptyInput) is returned. If the array shapes are not identical, Err(MultiInputError::ShapeMismatch) is returned.

Panics if any element in q is negative and taking the logarithm of a negative number is a panic cause for A.

§Remarks

The cross entropy is a measure used in Information Theory to describe the relationship between two probability distributions: it only makes sense when each array sums to 1 with entries between 0 and 1 (extremes included).

The array values are not normalised by this function before computing the cross entropy, to avoid introducing potentially unnecessary numerical errors (e.g. if the arrays are already normalised).

The cross entropy is often used as an objective/loss function in optimization problems, including machine learning.

By definition, pᵢ ln(qᵢ) is set to 0 if pᵢ is 0.
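
An illustrative sketch under the same assumptions as the examples above; with natural logarithms, the cross entropy decomposes as H(p,q) = S(p) + Dₖₗ(p,q):

    use ndarray::array;
    use ndarray_stats::EntropyExt;

    let p = array![0.5, 0.5];
    let q = array![0.9, 0.1];

    // H(p, q) = -(0.5 ln 0.9 + 0.5 ln 0.1)
    let h = p.cross_entropy(&q).unwrap();
    let expected = -(0.5_f64 * 0.9_f64.ln() + 0.5 * 0.1_f64.ln());
    assert!((h - expected).abs() < 1e-12);

    // Cross entropy = entropy + Kullback-Leibler divergence.
    let s = p.entropy().unwrap();
    let d = p.kl_divergence(&q).unwrap();
    assert!((h - (s + d)).abs() < 1e-12);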

fn __private__(&self, _: PrivateMarker)

This method makes this trait impossible to implement outside of ndarray-stats so that we can freely add new methods, etc., to this trait without breaking changes.

We don’t anticipate any other crates needing to implement this trait, but if you do have such a use-case, please let us know.

Warning This method is not considered part of the public API, and client code should not rely on it being present. It may be removed in a non-breaking release.

Object Safety§

This trait is not object safe.

Implementations on Foreign Types§

impl<A, S, D> EntropyExt<A, S, D> for ArrayBase<S, D>
where S: Data<Elem = A>, D: Dimension,

fn entropy(&self) -> Result<A, EmptyInput>
where A: Float,

fn kl_divergence<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
where A: Float, S2: Data<Elem = A>,

fn cross_entropy<S2>(&self, q: &ArrayBase<S2, D>) -> Result<A, MultiInputError>
where S2: Data<Elem = A>, A: Float,

fn __private__(&self, _: PrivateMarker)

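Because the trait is implemented for ArrayBase<S, D> with any Dimension, the same methods apply to arrays of higher dimension; an illustrative sketch (assuming f64 elements):

    use ndarray::array;
    use ndarray_stats::EntropyExt;

    // Illustrative 2×2 joint distribution; the entropy sums over all elements.
    let joint = array![[0.25, 0.25], [0.25, 0.25]];
    assert!((joint.entropy().unwrap() - 4.0_f64.ln()).abs() < 1e-12);
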
Implementors§