This revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information is defined for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) whose probabilities add up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two concepts derived from it, designated as information and surprise, which describe...
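For orientation, a minimal sketch of the classical setting referred to above (standard Shannon background, not this book's own formulas or notation): for mutually exclusive propositions $A_1,\dots,A_n$ forming a partition of $\Omega$ with probabilities $p_i$ summing to 1, the information of a single proposition and its expectation (the entropy) are

```latex
% Classical Shannon setting: a partition A_1, ..., A_n of \Omega
% with P(A_i) = p_i and \sum_i p_i = 1.
\[
  I(A_i) = -\log_2 p_i,
  \qquad
  H = \sum_{i=1}^{n} p_i \, I(A_i)
    = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]
% The generalization described in the blurb replaces the partition by a
% cover, i.e. a family of possibly overlapping propositions; the resulting
% quantity is called novelty (its precise definition is developed in the book).
```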