Abstract:
Estimating the entropy of finite strings has applications in areas such as event detection, similarity measurement, and the performance assessment of compression algorithms. This report compares a variety of computable information measures for finite strings that may be used in entropy estimation. These include Shannon’s n-block entropy, the three variants of the Lempel-Ziv production complexity, and the lesser-known T-entropy. We apply these measures to strings derived from the logistic map, for which Pesin’s identity allows us to deduce the corresponding Shannon entropies (Kolmogorov-Sinai entropies) without resorting to probabilistic methods.
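
To make the setup concrete, here is a minimal sketch (not the report's code; function names, parameters, and the choice of block lengths are illustrative). At r = 4 the logistic map's Lyapunov exponent is ln 2, so Pesin's identity gives a Kolmogorov-Sinai entropy of ln 2 nats, i.e. 1 bit per symbol under the binary generating partition at x = 1/2. An n-block entropy estimate from a finite symbolic string can be checked against that value:

```python
import math
from collections import Counter

def logistic_symbols(n, r=4.0, x0=0.3, transient=1000):
    """Binary symbolic sequence from the logistic map, partitioned at x = 1/2."""
    x = x0
    syms = []
    for i in range(transient + n):
        x = r * x * (1.0 - x)
        # Guard against floating-point collapse onto the fixed point at 0
        # (can occur if rounding ever yields exactly 1.0).
        if x <= 0.0 or x >= 1.0:
            x = x0
        if i >= transient:
            syms.append('1' if x >= 0.5 else '0')
    return ''.join(syms)

def block_entropy(s, n):
    """Shannon n-block entropy H_n in bits, from empirical block frequencies."""
    counts = Counter(s[i:i + n] for i in range(len(s) - n + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

s = logistic_symbols(200_000)
# The conditional entropy h_n = H_n - H_{n-1} estimates the entropy rate;
# for r = 4 it should approach ln 2 nats = 1 bit/symbol.
h = block_entropy(s, 6) - block_entropy(s, 5)
print(f"estimated entropy rate: {h:.3f} bits/symbol")
```

The block-length choice trades bias against variance: longer blocks capture more correlations but need exponentially more data for reliable frequency estimates.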