Hi, I am a bit confused about why we cannot use variance for discrete values and have to use entropy instead of variance. I would appreciate it if someone could let me know WHEN and WHY we should use variance or entropy.
Both measures, variance and entropy, are used to evaluate uncertainty.
Variance is given by:
$$\mathrm{Var}(X) = \sum_x p(x)\,(x - \mu)^2, \qquad \mu = \sum_x x\,p(x).$$
The variance measures the spread of the values around the mean.
The entropy is given by:
$$H(X) = -\sum_x p(x)\,\log_2 p(x).$$
Entropy is maximised when every outcome occurs with the same probability (a lot of uncertainty) and minimised when there is only a single possible outcome (no uncertainty).
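To make the two extremes concrete, here is a minimal sketch (the `entropy` helper is my own illustration, not from the original answer) showing that a uniform distribution maximises entropy while a single certain outcome gives zero:

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p == 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximum uncertainty.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits

# A single certain outcome: no uncertainty.
print(entropy([1.0]))  # 0.0 bits
```

No distribution over 4 outcomes can exceed 2 bits, and any unequal probabilities give a value strictly between these extremes.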
The difference between variance and entropy is: variance measures the spread of numeric values around the mean, so it requires the outcomes to lie on a numeric scale, whereas entropy depends only on the probabilities of the outcomes, not on their values, so it does not capture spread but applies even to purely categorical outcomes.
In multimodal distributions, variance is not a suitable summary and entropy is preferred.
Thus, the preference between variance and entropy is as follows: for unimodal distributions over numeric values, variance is used; otherwise, entropy is employed.