Information Measures

  • Raymond W. Yeung
Chapter
Part of the Information Technology: Transmission, Processing and Storage book series (PSTE)

Abstract

Shannon’s information measures refer to entropy, conditional entropy, mutual information, and conditional mutual information. They are the most important measures of information in information theory. In this chapter, we introduce these measures and establish some basic properties they possess. The physical meanings of these measures will be discussed in depth in subsequent chapters. We then introduce informational divergence, which measures the “distance” between two probability distributions, and prove some useful inequalities in information theory. The chapter ends with a section on the entropy rate of a stationary information source.
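For concreteness, the following is a sketch of the conventional definitions behind the measures named above, stated for discrete random variables X, Y, Z with joint distribution p. The notation is the standard one and is not quoted from the chapter; in particular, the base of the logarithm is left unspecified here, though the text may fix a specific base.

  H(X) = -\sum_x p(x) \log p(x)
  H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)
  I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)
  I(X;Y \mid Z) = \sum_{x,y,z} p(x,y,z) \log \frac{p(x,y \mid z)}{p(x \mid z)\,p(y \mid z)}
  D(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}

For a stationary source \{X_k\}, the entropy rate is defined as H_{\mathcal{X}} = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n), which exists for any stationary source.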

Keywords

Markov Chain, Divergence Inequality, Mutual Information, Chain Rule, Information Measure

Copyright information

© Springer Science+Business Media New York 2002

Authors and Affiliations

  • Raymond W. Yeung
    1. The Chinese University of Hong Kong, Hong Kong
