Entropy Rate

Given an information source \(\{X_k\}_{k \in \mathbb{N}}\), a discrete-time random process whose elements \(X_k\) are called letters and whose Shannon entropies \(H(X_k)\) are assumed finite, the entropy rate of \(\{X_k\}_{k \in \mathbb{N}}\) is defined to be \[ H_X = \lim_{n \to \infty}\frac{1}{n} H(X_1,\cdots,X_n)\] whenever the limit exists.
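For a concrete instance of the definition: if the \(X_k\) are independent fair coin flips, then \(H(X_1,\cdots,X_n) = n\) bits, so \[ H_X = \lim_{n\to\infty}\frac{n}{n} = 1 \text{ bit per letter}.\]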

This limit certainly exists when the \(X_i\) are i.i.d.: by independence \(H(X_1,\cdots,X_n) = nH(X_1)\), so \(H_X = H(X_1)\). It fails to exist, however, if the \(X_i\) are merely mutually independent with entropies \(H(X_k) = k\).
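Indeed, in the latter case independence gives \[ \frac{1}{n}H(X_1,\cdots,X_n) = \frac{1}{n}\sum_{k=1}^{n} H(X_k) = \frac{1}{n}\cdot\frac{n(n+1)}{2} = \frac{n+1}{2} \longrightarrow \infty,\] so the normalized joint entropy diverges.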

Another natural limit to consider is \(H'_X = \lim_{n\to \infty}H(X_n\mid X_1,\cdots,X_{n-1})\). This limit exists when the information source is stationary, i.e. when for every \(m\) and every shift \(l\) the block \((X_1,\cdots,X_m)\) has the same distribution as \((X_{1+l},\cdots,X_{m+l})\). In fact, for stationary sources \(H'_X = H_X\), which can be proved using Cesàro means.
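A sketch of the argument: since conditioning reduces entropy and the source is stationary, \(H(X_n\mid X_1,\cdots,X_{n-1}) \le H(X_n\mid X_2,\cdots,X_{n-1}) = H(X_{n-1}\mid X_1,\cdots,X_{n-2})\), so the conditional entropies form a nonincreasing, nonnegative sequence and converge to \(H'_X\). The chain rule then exhibits \(\frac{1}{n}H(X_1,\cdots,X_n)\) as the Cesàro mean of this sequence, \[ \frac{1}{n}H(X_1,\cdots,X_n) = \frac{1}{n}\sum_{k=1}^{n} H(X_k\mid X_1,\cdots,X_{k-1}) \xrightarrow[n\to\infty]{} H'_X,\] because the Cesàro mean of a convergent sequence converges to the same limit; hence \(H_X = H'_X\).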
