Information Theory and Source Coding

Subject description

The course provides the basic mathematical tools and concepts of information theory, which are used to define the amount of information, the amount of transmitted information, the capacity of a channel, and other criteria used in the evaluation and planning of communication channels.

The main requirements and properties of coding are introduced, and typical examples of lossless coding are presented, such as the uniform code, the Huffman code, arithmetic coding and Lempel-Ziv coding.
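
To give a flavour of how such a code is built, below is a minimal Python sketch of Huffman code construction; the four-symbol source and its probabilities are invented for the example:

    import heapq

    def huffman_code(probabilities):
        """Build a binary Huffman code for a dict {symbol: probability}."""
        # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far})
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probabilities.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            p1, _, code1 = heapq.heappop(heap)   # the two least probable subtrees
            p2, _, code2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in code1.items()}        # prefix one with 0,
            merged.update({s: "1" + c for s, c in code2.items()})  # the other with 1
            heapq.heappush(heap, (p1 + p2, tie, merged))
            tie += 1
        return heap[0][2]

    # Invented source: more probable symbols get shorter codewords
    print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.1}))
    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

The average codeword length here is 1.75 bits per symbol, which cannot fall below the source entropy of about 1.74 bits.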

The transmission of analogue signals over digital communication paths is examined, and the implications of time discretization and the effects of quantization on the analogue signal are analysed.
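
The central constraint of time discretization is the sampling theorem: a signal band-limited to f_max is recoverable from its samples only if the sampling frequency exceeds 2·f_max. The short sketch below (frequencies chosen purely for illustration) shows what goes wrong otherwise: sampled at 8 kHz, a 7 kHz tone is indistinguishable from a 1 kHz tone.

    import numpy as np

    fs = 8000                                     # sampling frequency in Hz
    n = np.arange(16)                             # sample indices
    tone_7k = np.cos(2 * np.pi * 7000 * n / fs)   # 7 kHz tone, above fs/2
    tone_1k = np.cos(2 * np.pi * 1000 * n / fs)   # its 1 kHz alias (8 kHz - 7 kHz)

    # The two tones produce identical sample sequences
    print(np.allclose(tone_7k, tone_1k))          # True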

In the last part, several approaches to audio signal compression are presented, ranging from logarithmic compression (A- and μ-law) and differential coding (DPCM and ADPCM) to compression that exploits the psychoacoustic characteristics of hearing, typical of MP2, MP3, AAC, etc., and the basics of speech source coding (LPC, CELP).
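
As an example of the logarithmic compression mentioned above, a minimal sketch of the ideal μ-law compressor and expander characteristics (μ = 255, with the input assumed normalised to [-1, 1]) might look like this:

    import numpy as np

    def mu_law_compress(x, mu=255.0):
        """Compressor: a logarithmic characteristic that boosts small amplitudes."""
        return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

    def mu_law_expand(y, mu=255.0):
        """Expander: the exact inverse of the compressor."""
        return np.sign(y) * ((1.0 + mu) ** np.abs(y) - 1.0) / mu

    x = np.array([-1.0, -0.1, -0.01, 0.0, 0.01, 0.1, 1.0])
    y = mu_law_compress(x)
    print(np.round(y, 3))                    # small inputs occupy a wide output range
    print(np.allclose(mu_law_expand(y), x))  # True: the mapping is invertible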

The subject is taught in the following programs:

Electrical Engineering, 1st level

Objectives and competences

Basic principles of information transmission and the related background. Entropy as the basic measure of information. Source coding and basic data compaction algorithms. Fundamental limits of reliable communication over a noisy channel. Properties of analogue signals that are important for coding schemes.
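
The entropy in question is the Shannon entropy H(X) = -Σ p(x)·log2 p(x), in bits per symbol; a minimal sketch of its computation (the source probabilities are invented):

    import math

    def entropy(probabilities):
        """Shannon entropy H = -sum(p * log2(p)) in bits per symbol."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A binary source is hardest to predict, and hence carries the most
    # information per symbol, when both symbols are equally likely
    print(entropy([0.5, 0.5]))   # 1.0 bit
    print(entropy([0.9, 0.1]))   # ~0.47 bit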

Distinction between redundancy and irrelevance. Redundancy removal in advanced speech coding. Basic principles of perceptual coding of audio signals.

Teaching and learning methods

Lectures, tutorials, homework assignments

Expected study results

After successful completion of the course, students should be able to:

– explain the difference between irrelevance and redundancy in the communication process,

– explain the expression for the amount of information of a symbol of a discrete source,

– prove mathematically that the entropy of a discrete memoryless source is maximal when its symbols are equiprobable,

– calculate the matrix of conditional probabilities of the transmission of multilevel symbols in the presence of additive Gaussian noise,

– compare the differential entropy of a random variable with the entropy of a discrete information source,

– describe the procedures required for analogue signal coding,

– calculate the minimum number of bits of an A/D converter needed to achieve a desired signal-to-noise ratio within the required dynamic range of the input signal (see the sketch after this list).
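
For the last outcome, a common starting point is the rule of thumb SNR ≈ 6.02·n + 1.76 dB for a full-scale sinusoid quantised uniformly with n bits; a minimal sketch with invented target figures:

    import math

    def required_bits(snr_db, dynamic_range_db=0.0):
        """Minimum bit count so that the quantization SNR of a sinusoid still
        meets snr_db when the input sits dynamic_range_db below full scale.
        Assumes the uniform-quantization rule SNR = 6.02*n + 1.76 dB."""
        return math.ceil((snr_db + dynamic_range_db - 1.76) / 6.02)

    # Invented targets: 60 dB SNR maintained over a 30 dB input dynamic range
    print(required_bits(60.0, 30.0))   # 15 bits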

Basic sources and literature

  1. S. Tomažič, Osnove telekomunikacij I, Založba FE in FRI, 2002
  2. N. Pavešič, Informacija in kodi, Založba FE in FRI, 2010
  3. J.R. Deller, J.G. Proakis, J.H. Hansen, Discrete-Time Processing of Speech Signals, Macmillan, New York, 1993
  4. N. Moreau, Tools for Signal Compression, ISTE Ltd. and John Wiley & Sons, Inc., 2011
