# Entropy Bound

Let \(X\) be a discrete random variable taking values in a finite sample space \(\mathcal{X}=\{x_1,\dots,x_m\}\) with probability mass function (pmf) \(p\), and let \(\mathcal{C}\) be a uniquely decodable \(D\)-ary code mapping \(\mathcal{X}\) to \(D^*\), the set of finite strings over a \(D\)-ary alphabet. Define the expected length of \(\mathcal{C}\) as \(L=\sum_{i=1}^m p_i l_i\), where \(l_i\) is the length of the codeword assigned to \(x_i\). The **entropy bound** states that
\[ H_D(X)\leq L\]
where \(H_D\) is the Shannon entropy of \(X\) computed with base-\(D\) logarithms, \(H_D(X)=-\sum_{i=1}^m p_i \log_D p_i\).
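As a quick numerical sanity check of the bound, the sketch below (a hypothetical example, not part of the statement above) computes \(H_D(X)\) and \(L\) for a dyadic pmf paired with the binary prefix code \(\{0, 10, 110, 111\}\); for a dyadic pmf the bound is tight, so \(H_2(X) = L\).

```python
import math

def entropy(p, D=2):
    """Shannon entropy of pmf p with a base-D logarithm."""
    return -sum(pi * math.log(pi, D) for pi in p if pi > 0)

def expected_length(p, lengths):
    """Expected codeword length L = sum_i p_i * l_i."""
    return sum(pi * li for pi, li in zip(p, lengths))

# Hypothetical example: 4 symbols, binary (D = 2) prefix code
# {0, 10, 110, 111}, so the codeword lengths are (1, 2, 3, 3).
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

H = entropy(p, D=2)
L = expected_length(p, lengths)

# Entropy bound: H_D(X) <= L (equality here since p is dyadic,
# up to floating-point rounding; both equal 1.75 bits).
assert H <= L + 1e-12
print(f"H = {H:.4f}, L = {L:.4f}")
```

Any uniquely decodable code over the same pmf must satisfy the same inequality; only when each \(p_i\) is a negative power of \(D\) can a code achieve \(L = H_D(X)\) exactly.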