Definition

Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p:\mathcal{X}\to[0,1]$, the entropy is defined as:

$$\Eta(X) := -\sum_{x\in\mathcal{X}} p(x)\log p(x)$$
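
For example, a fair coin with $p(\text{heads}) = p(\text{tails}) = \tfrac{1}{2}$ has

$$\Eta(X) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = 1\ \text{bit}.$$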

The unit of $\Eta(X)$ depends on the base used for the $\log$ operation:

  • Base 2: bits or “shannons”
  • Base $e$: nats (for natural units)
  • Base 10: “dits”, “bans”, or “hartleys”
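
As a minimal sketch of how the definition translates to code (the function name `entropy` and the use of Python's standard `math` module are illustrative assumptions, not part of the original definition):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution given as a list of probabilities p(x).

    base=2 yields bits (shannons), base=math.e yields nats, base=10 yields hartleys.
    """
    # Terms with p(x) = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))           # 1.0 bits
print(entropy([0.5, 0.5], math.e))   # ~0.693 nats
```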

See also

Resources