Comments on information theory
Stanford courses EE376A (taught by Cover) and EE479 (a course on Multiuser Information Theory taught by El Gamal) offer a clean view of information theory. While the first course focuses on the intuition behind information theory, developing basic tools to deal with simple channels, the second builds a solid framework for assessing the achievability and converse proofs of most capacity problems. Both courses, however, are the result of years of iteration, each round polishing the proofs and concepts to make everything simple and elegant. In fact the notion of typicality, while useful for deriving capacity in an elegant way, hides a lot of concepts and machinery behind it.
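To give a rough feel for what typicality captures, here is a small Python sketch (my own illustration, not from either course) of the asymptotic equipartition property for a Bernoulli source: for large n, almost every sampled sequence has empirical per-symbol self-information close to the entropy H(p), i.e. it is typical.

```python
import math
import random

def empirical_entropy(seq, p):
    """Per-symbol self-information -(1/n) log2 p(x^n) of a Bernoulli(p) sample."""
    n = len(seq)
    ones = sum(seq)
    return -(ones * math.log2(p) + (n - ones) * math.log2(1 - p)) / n

def typical_fraction(p, n, trials, eps):
    """Fraction of length-n Bernoulli(p) sequences that are eps-typical."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # source entropy H(p)
    hits = 0
    for _ in range(trials):
        seq = [1 if random.random() < p else 0 for _ in range(n)]
        if abs(empirical_entropy(seq, p) - h) < eps:
            hits += 1
    return hits / trials

random.seed(0)
# For n = 1000 nearly all sequences land within eps of H(p) ~ 0.8813 bits.
print(typical_fraction(p=0.3, n=1000, trials=2000, eps=0.05))
```

The machinery hidden behind this one-liner observation is just the law of large numbers applied to log-probabilities, which is exactly what the typicality lemmas package up.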
Of course, simple proofs make it possible to solve complicated problems. However, there is a risk that students remain unaware of the concepts hidden behind them. For example, I got the impression that typicality arguments in a random coding communications scheme arise naturally when trying to derive capacity. It was surprising to find in [G69] that capacity is also the outcome when applying Maximum Likelihood decoding to a point-to-point communication scheme. Although a couple of apparently magic bounds are used in the derivation, it introduces the concept of the error exponent, which links the purely theoretical results to more practical schemes.
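As a sketch of that link, the random-coding bound gives an error probability decaying like 2^(-n Er(R)), with Er(R) = max over rho in [0,1] of E0(rho) - rho R. The snippet below (my own illustration, assuming a binary symmetric channel with uniform input, for which E0 has a closed form) shows that the exponent is positive below capacity and vanishes at and above it, which is exactly how ML decoding recovers the capacity result.

```python
import math

def e0(rho, eps):
    """Gallager's E0(rho) for a BSC with crossover eps and uniform input."""
    s = 1.0 / (1.0 + rho)
    return rho - (1.0 + rho) * math.log2(eps**s + (1.0 - eps)**s)

def error_exponent(rate, eps, steps=1000):
    """Random-coding exponent Er(R): maximize E0(rho) - rho*R over rho in [0,1]."""
    return max(e0(i / steps, eps) - (i / steps) * rate for i in range(steps + 1))

eps = 0.1
capacity = 1 + eps * math.log2(eps) + (1 - eps) * math.log2(1 - eps)  # C = 1 - H(eps)
print(round(capacity, 4))
print(error_exponent(0.3, eps))  # positive exponent: rate below capacity
print(error_exponent(0.6, eps))  # essentially zero: rate above capacity
```

The "magic" bounds in the derivation are precisely what produce the E0 function being maximized here.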
To complete this post I should mention [H03], a book that tries to generalize many aspects of classical information theory through the concept of information spectrum. While I have not read it in depth, the approach it follows also gives interesting insights into the problem.
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.
Robert G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
Te Sun Han, Information-Spectrum Methods in Information Theory, Springer, 2003. (Original Japanese edition published by Baifukan, Tokyo, 1998.)
Labels: information theory