Abstract:
Information-spectrum methods address the non-independently and identically distributed (non-i.i.d.) setting and have been used to establish general source coding and channel coding theorems. In practice, resources are finite and operations can be achieved only approximately, which makes it necessary to consider the nonasymptotic and even the one-shot scenario. This paper studies the information-spectrum relative entropy in the one-shot scenario and its related properties. In particular, the relationship between the information-spectrum relative entropy and the hypothesis testing relative entropy is given. The entropy rate, the conditional entropy, and the mutual information defined with respect to the information-spectrum relative entropy are considered, and the equivalence relations of the information-spectrum mutual information are discussed. Finally, chain rules for these quantities are given.