Academic Catalog

CEIC 400 INFORMATION THEORY AND CODING

This course covers information theory and coding in the context of modern digital communications applications. It begins with a directed review of probability and digital modulation schemes, then introduces: information and entropy; information rate; classification of codes; the Kraft-McMillan inequality; the source coding theorem; Shannon-Fano coding; Huffman coding and extended Huffman coding; joint and conditional entropies; mutual information; discrete memoryless channels, including the binary symmetric channel (BSC) and the binary erasure channel (BEC); channel capacity; and the Shannon limit.

Credits

3

Offered

Semester 2