Subject description
Probability, random variables (trials, events, definition of probability, probability density function, mean values, central limit theorem). Stochastic processes (sample function, time averages, ergodicity, power density spectrum). Information (metrics, information sources, entropy, redundancy). Coding and data compaction (source coding theorem, entropy coding, Lempel-Ziv coding). Mutual information and channel capacity (mutual information, information channel, joint entropy of discrete sources, differential entropy, information capacity theorem).
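For orientation, two of the central quantities named above can be summarized by their standard textbook forms (these are general results, not excerpts from the listed course sources):

```latex
% Entropy of a discrete memoryless source with symbol probabilities p_i,
% bounded above by the entropy of the uniform distribution:
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{[bits/symbol]},
\qquad 0 \le H(X) \le \log_2 n .

% Information capacity theorem (Shannon--Hartley) for a band-limited
% channel of bandwidth B with signal-to-noise power ratio S/N:
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{[bits/s]}.
```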
Analogue signal coding – basic formatting (ideal and flat-topped sampling, reconstruction of continuous-time signals, band-pass signal sampling; quantization, granular and overload noise, dynamic range). Audio signals (sound and hearing, properties of audio signals, perceptual properties of human hearing, frequency masking, redundancy and irrelevance, properties of speech, vocal tract modeling, speech redundancy).
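The quantization topics above lead to a familiar rule of thumb relating word length to quantization noise; as a reminder, for a uniform N-bit quantizer (a standard result, stated here only for reference):

```latex
% Step size over a full-scale range V_{FS}, and the resulting
% granular quantization noise power of a uniform quantizer:
\Delta = \frac{V_{FS}}{2^{N}}, \qquad \sigma_q^2 = \frac{\Delta^2}{12},

% which for a full-scale sinusoidal input yields the SQNR estimate
\mathrm{SQNR} \approx 6.02\,N + 1.76 \ \mathrm{dB}.
```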
Speech coding (non-linear quantization, A-law compression, predictive coding; scalar quantization: DPCM, ADPCM; vector quantization: CELP). Audio signal coding (standard coding formats: CD, DVD-Audio, DSD; lossy compression: MP2, MP3, AAC).
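To make the A-law item above concrete, here is a minimal Python sketch of the continuous A-law compression curve (ITU-T G.711 uses A = 87.6; the function name and the [-1, 1] normalization are illustrative assumptions, not part of the syllabus):

```python
import numpy as np

def a_law_compress(x, A=87.6):
    """Continuous A-law companding curve (illustrative sketch).

    Assumes x is normalized to [-1, 1]; ITU-T G.711 uses A = 87.6.
    """
    x = np.asarray(x, dtype=float)
    ax = np.abs(x)
    denom = 1.0 + np.log(A)
    # Linear segment near zero, logarithmic segment for larger amplitudes;
    # np.maximum guards the log against the unused small-amplitude branch.
    y = np.where(ax < 1.0 / A,
                 A * ax / denom,
                 (1.0 + np.log(np.maximum(A * ax, 1e-12))) / denom)
    return np.sign(x) * y

# Small amplitudes are boosted (finer effective resolution),
# large amplitudes are compressed toward full scale:
print(a_law_compress([0.001, 0.01, 0.1, 0.5, 1.0]))
```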
The subject is taught in the following study programs:
Electrical engineering, 1st cycle
Objectives and competences
Basic principles of information transmission and the theory behind them. Entropy as the basic measure of information. Source coding and basic data-compaction algorithms. Fundamental limits of reliable communication over a noisy channel. Properties of analogue signals that matter for coding schemes.
The distinction between redundancy and irrelevance. Redundancy removal in advanced speech coding. Basic principles of perceptual coding of audio signals.
Teaching and learning methods
Lectures, tutorials, homework assignments
Expected study results
After successful completion of the course, students should be able to:
– explain the difference between irrelevance and redundancy in the process of conveying information,
– explain the expression for the amount of information carried by a symbol of a discrete source,
– derive the mathematical proof of the maximal entropy of a discrete memoryless source,
– calculate the matrix of conditional probabilities for the transmission of multilevel symbols in the presence of additive Gaussian noise,
– compare the differential entropy of a random variable with the entropy of a discrete information source,
– describe the procedures required for analogue signal coding,
– calculate the minimum number of bits of an A/D converter needed to achieve a desired signal-to-noise ratio within the required dynamic range of the input signal (a numerical sketch follows this list).
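As a sketch of the last outcome, the snippet below estimates the minimum A/D word length from the 6.02 N + 1.76 dB rule, charging the required dynamic range as headroom; the helper name and this simple headroom model are assumptions made for illustration:

```python
import math

def min_adc_bits(snr_db, dyn_range_db=0.0):
    # An N-bit uniform quantizer gives roughly 6.02*N + 1.76 dB of SQNR
    # for a full-scale sinusoid; a signal sitting dyn_range_db below full
    # scale loses that much SQNR, so the requirement is applied at the
    # bottom of the dynamic range (illustrative model).
    return math.ceil((snr_db + dyn_range_db - 1.76) / 6.02)

print(min_adc_bits(60))      # 10 bits for 60 dB SQNR at full scale
print(min_adc_bits(60, 20))  # 13 bits with 20 dB of dynamic range
```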
Basic sources and literature
1. S. Tomažič, Osnove telekomunikacij I, Založba FE in FRI, 2002
2. N. Pavešič, Informacija in kodi, Založba FE in FRI, 2010
3. J.R. Deller, J.G. Proakis, J.H. Hansen, Discrete-Time Processing of Speech Signals, Macmillan, New York, 1993
4. N. Moreau, Tools for Signal Compression, ISTE Ltd. and John Wiley & Sons, Inc., 2009