Entropy

Definition

Entropy is expressed in bits and, for a discrete variable X with probability distribution p(x), is defined as follows:

$$H(X) = -\sum_{x} p(x)\log_2 p(x)$$

Example

Let's assume we have four containers, A through D, which are filled with balls that can be either blue or red.

  • Container A is filled exclusively with blue balls.

  • Container B contains an equal number of red and blue balls.

  • In Container C, 10.89% of all balls are blue, and the remainder is red.

  • Container D only holds red balls.

  • Within each container, the order of balls is entirely random.

A volunteer who already knows the proportions of red and blue balls in each container now randomly draws one ball from each container. What is his degree of uncertainty regarding the color of the ball at the moment of each draw?

Needless to say, there is no uncertainty at all with Containers A and D: he will draw a blue ball and a red ball, respectively, with perfect certainty. But what about the degree of uncertainty for Containers B and C?

The concept of Entropy formally represents this degree of uncertainty.

Using the definition of Entropy from above, we can compute the Entropy value applicable to each draw: 0 bits for Containers A and D, 1 bit for Container B, and approximately 0.5 bits for Container C, as the sketch below illustrates.
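As an illustration, here is a minimal Python sketch (not BayesiaLab functionality; the container proportions are taken from the example above) that computes these values directly from the definition:

```python
import math

def entropy(probabilities):
    """Shannon Entropy in bits: H = -sum(p * log2(p)), with 0 * log2(0) taken as 0."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h if h else 0.0  # normalize -0.0 to 0.0

containers = {
    "A": [1.0, 0.0],        # all blue
    "B": [0.5, 0.5],        # half blue, half red
    "C": [0.1089, 0.8911],  # 10.89% blue, the remainder red
    "D": [0.0, 1.0],        # all red
}

for name, distribution in containers.items():
    print(f"Container {name}: H = {entropy(distribution):.4f} bits")
```

Running this prints 0 bits for A and D, exactly 1 bit for B, and roughly 0.4967 bits for C.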

We can also plot Entropy as a function of the probability of drawing a red ball: the curve is 0 at probabilities of 0 and 1 and peaks at 1 bit when both colors are equally likely.
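A minimal plotting sketch, assuming NumPy and Matplotlib are available; the shape of the curve is the point, not the tooling:

```python
import numpy as np
import matplotlib.pyplot as plt

p = np.linspace(0.001, 0.999, 500)              # probability of drawing a red ball
h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)  # binary Entropy in bits

plt.plot(p, h)
plt.xlabel("Probability of drawing a red ball")
plt.ylabel("Entropy (bits)")
plt.title("Entropy of a two-state variable")
plt.show()
```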

Maximum Entropy as a Function of the Number of States

This was an example of a variable with only two states. As we introduce more possible states, e.g., another ball color, the maximum possible Entropy increases: a variable with n equally probable states has an Entropy of log₂(n) bits, so the maximum grows from 1 bit for two states to roughly 1.585 bits for three.

As a result, Entropy values of variables with different numbers of states cannot be compared directly.
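To make this concrete, a small sketch showing how the maximum Entropy, log₂(n) bits for n equally probable states, grows with the number of states:

```python
import math

# Maximum Entropy is reached when all n states are equally probable,
# in which case H = -n * (1/n) * log2(1/n) = log2(n).
for n in range(2, 6):
    print(f"{n} states: maximum Entropy = {math.log2(n):.4f} bits")
```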

To make Entropy comparable across variables, the Normalized Entropy metric divides a variable's Entropy by its Maximum Entropy, log₂(n), yielding a value between 0 and 1 regardless of the number of states.
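A minimal sketch of this normalization, reusing the entropy function from the earlier example (the names are illustrative, not BayesiaLab API):

```python
import math

def entropy(probabilities):
    """Shannon Entropy in bits, with 0 * log2(0) taken as 0."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h if h else 0.0  # normalize -0.0 to 0.0

def normalized_entropy(probabilities):
    """Entropy divided by the Maximum Entropy log2(n); always between 0 and 1."""
    n = len(probabilities)
    return entropy(probabilities) / math.log2(n)

# A uniform two-state and a uniform three-state variable have different raw
# Entropies (1 bit vs. ~1.585 bits) but the same Normalized Entropy of 1.
print(normalized_entropy([0.5, 0.5]))       # 1.0
print(normalized_entropy([1/3, 1/3, 1/3]))  # 1.0
```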
