1. THE SCIENCE OF INFORMATION
2. INFORMATION MEASURES
2.1 Independence and Markov Chains
2.2 Shannon's Information Measures
2.3 Continuity of Shannon's Information Measures
2.4 Chain Rules
2.5 Informational Divergence
2.6 The Basic Inequalities
2.7 Some Useful Information Inequalities
2.8 Fano's Inequality
2.9 Entropy Rate of a Stationary Source
Problems
Historical Notes
3. ZERO-ERROR DATA COMPRESSION
3.1 The Entropy Bound
3.2 Prefix Codes
3.2.1 Definition and Existence
3.2.2 Huffman Codes
3.3 Redundancy of Prefix Codes
Problems
Historical Notes
4. WEAK TYPICALITY
4.1 The Weak AEP
4.2 The Source Coding Theorem
4.3 Efficient Source Coding
4.4 The Shannon-McMillan-Breiman Theorem
Problems
Historical Notes
5. STRONG TYPICALITY
5.1 Strong AEP
5.2 Strong Typicality Versus Weak Typicality
5.3 Joint Typicality
5.4 An Interpretation of the Basic Inequalities
Problems
Historical Notes
6. THE I-MEASURE
6.1 Preliminaries
6.2 The I-Measure for Two Random Variables
6.3 Construction of the I-Measure μ*
6.4 μ* Can be Negative
6.5 Information Diagrams
6.6 Examples of Applications
Appendix 6.A: A Variation of the Inclusion-Exclusion Formula
Problems
Historical Notes
7. MARKOV STRUCTURES
7.1 Conditional Mutual Independence
7.2 Full Conditional Mutual Independence
7.3 Markov Random Field
7.4 Markov Chain
Problems
Historical Notes
8. CHANNEL CAPACITY
8.1 Discrete Memoryless Channels
8.2 The Channel Coding Theorem
8.3 The Converse
8.4 Achievability of the Channel Capacity
8.5 A Discussion
8.6 Feedback Capacity
8.7 Separation of Source and Channel Coding
Problems
Historical Notes
9. RATE-DISTORTION THEORY
9.1 Single-Letter Distortion Measures
9.2 The Rate-Distortion Function R(D)
9.3 The Rate-Distortion Theorem
9.4 The Converse
9.5 Achievability of R(D)
Problems
Historical Notes
10. THE BLAHUT-ARIMOTO ALGORITHMS
10.1 Alternating Optimization
10.2 The Algorithms
10.2.1 Channel Capacity
10.2.2 The Rate-Distortion Function
10.3 Convergence
10.3.1 A Sufficient Condition
10.3.2 Convergence to the Channel Capacity
Problems
Historical Notes
11. SINGLE-SOURCE NETWORK CODING
11.1 A Point-to-Point Network
11.2 What is Network Coding?
11.3 A Network Code
11.4 The Max-Flow Bound
11.5 Achievability of the Max-Flow Bound
11.5.1 Acyclic Networks
11.5.2 Cyclic Networks
Problems
Historical Notes
12. INFORMATION INEQUALITIES
12.1 The Region Γn*
12.2 Information Expressions in Canonical Form
12.3 A Geometrical Framework
12.3.1 Unconstrained Inequalities
12.3.2 Constrained Inequalities
12.3.3 Constrained Identities
12.4 Equivalence of Constrained Inequalities
12.5 The Implication Problem of Conditional Independence
Problems
Historical Notes
13. SHANNON-TYPE INEQUALITIES
13.1 The Elemental Inequalities
13.2 A Linear Programming Approach
13.2.1 Unconstrained Inequalities
13.2.2 Constrained Inequalities and Identities
13.3 A Duality
13.4 Machine Proving - ITIP
13.5 Tackling the Implication Problem
13.6 Minimality of the Elemental Inequalities
Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms
Problems
Historical Notes
14. BEYOND SHANNON-TYPE INEQUALITIES
14.1 Characterizations of Γ2*, Γ3*, and Γn*
14.2 A Non-Shannon-Type Unconstrained Inequality
14.3 A Non-Shannon-Type Constrained Inequality
14.4 Applications
Problems
Historical Notes
15. MULTI-SOURCE NETWORK CODING
15.1 Two Characteristics
15.1.1 The Max-Flow Bounds
15.1.2 Superposition Coding
15.2 Examples of Application
15.2.1 Multilevel Diversity Coding
15.2.2 Satellite Communication Network
15.3 A Network Code for Acyclic Networks
15.4 An Inner Bound
15.5 An Outer Bound
15.6 The LP Bound and Its Tightness
15.7 Achievability of Rin
Appendix 15.A: Approximation of Random Variables with Infinite Alphabets
Problems
Historical Notes
16. ENTROPY AND GROUPS
16.1 Group Preliminaries
16.2 Group-Characterizable Entropy Functions
16.3 A Group Characterization of Γn*
16.4 Information Inequalities and Group Inequalities
Problems
Historical Notes
Bibliography
Index