Abstract:
The concept of entropy plays a central role in communication theory. Shannon entropy is a
measure of uncertainty with respect to an a priori probability distribution. In algorithmic information
theory, the information content of a message is measured by the size, in bits, of the smallest
program that computes that message.
This paper discusses classical entropy and entropy rate
for discrete- or continuous-time Markov sources, with finite or continuous alphabets, and their relations
to program-size complexity and algorithmic probability. The emphasis is on ideas, constructions, and
results; no proofs are given.
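For orientation, the notions named above admit the following standard definitions, where $U$ denotes a fixed universal prefix machine (the choice of $U$ affects $K$ only up to an additive constant):
\[
H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x) \qquad \text{(Shannon entropy)}
\]
\[
H(\mathcal{X}) = \lim_{n \to \infty} \tfrac{1}{n} H(X_1, \ldots, X_n) \qquad \text{(entropy rate, when the limit exists, e.g.\ for stationary sources)}
\]
\[
K(x) = \min \{\, |p| : U(p) = x \,\} \qquad \text{(program-size complexity)}
\]
\[
m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|} \qquad \text{(algorithmic probability)}
\]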