Information Theory

Master Parisien de Recherche en Informatique (MPRI), 2003-2006: 8 × 3 hours.

Originally, Information Theory was called a Theory of Communication: its goal is to characterize the very limits of reliable and efficient information transmission through unreliable media where signals undergo various kinds of distortion. In these introductory lectures, we will focus on discrete signals, that is, on texts.

Source coding

The simplest problem consists of transmitting a text faithfully and efficiently in a reliable environment (such as the hard disk of your laptop): this amounts to performing lossless compression. Information Theory assumes that texts are produced by a stochastic source (a source is nothing but a probability distribution on the set of texts). This assumption paves the way toward an operational definition of the optimal achievable compression rate on a given source. The first landmark result of Information Theory (due to C. E. Shannon) asserts that the optimal compression rate coincides with the entropy of the source (that is, with a functional of the probability distribution that defines the source). Moreover, this characterization is constructive: there exist efficient lossless compression algorithms (Huffman coding, arithmetic coding).
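To make the source coding theorem concrete, the following minimal Python sketch (an illustration of mine, with an invented four-symbol source) builds a Huffman code for a memoryless source and compares its expected code length with the entropy of the source.

    import heapq, math

    # Illustrative memoryless source: symbols and their probabilities.
    source = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

    # Entropy H = -sum p * log2(p): the optimal compression rate in bits per symbol.
    entropy = -sum(p * math.log2(p) for p in source.values())

    # Huffman coding: repeatedly merge the two least probable subtrees.
    heap = [(p, [sym]) for sym, p in source.items()]
    heapq.heapify(heap)
    codes = {sym: '' for sym in source}
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)
        p2, syms2 = heapq.heappop(heap)
        for s in syms1:
            codes[s] = '0' + codes[s]
        for s in syms2:
            codes[s] = '1' + codes[s]
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))

    avg_length = sum(p * len(codes[s]) for s, p in source.items())
    print(entropy, avg_length)   # both equal 1.75 bits for this dyadic source

On this (dyadic) example the expected Huffman code length meets the entropy exactly; in general it exceeds it by less than one bit per symbol.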

The next problem consists of compressing the output of an unknown source: this is the universal coding problem. Characterizing the cost (or redundancy) of universal coding over a given family of sources amounts to solving a statistical inference problem. This is an opportunity to advocate mixture coding techniques. Basic arguments from convex optimization theory allow one to assess the potential of Dirichlet coding methods. Universal coding may be considered one of the greatest achievements of Information Theory. Some outgrowths of Information Theory have reached desktop operating systems, the most famous ones being dictionary methods (gzip, zip, compress) and the Burrows-Wheeler transform (bzip2).
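As an illustration of Dirichlet mixture coding, here is a small sketch (my own, not course material) of the Krichevsky-Trofimov sequential probability assignment for binary texts, i.e. the Dirichlet(1/2, 1/2) mixture over all memoryless binary sources; the ideal code length it induces exceeds that of the best Bernoulli source by roughly (1/2) log2 n bits.

    import math

    def kt_code_length(bits):
        # Ideal code length (in bits) assigned to a binary string by the
        # Krichevsky-Trofimov estimator, i.e. the Dirichlet(1/2, 1/2) mixture.
        counts = [0, 0]            # numbers of 0s and 1s seen so far
        log_prob = 0.0
        for b in bits:
            # Predictive probability of the next symbol under the mixture.
            p = (counts[b] + 0.5) / (sum(counts) + 1.0)
            log_prob += math.log2(p)
            counts[b] += 1
        return -log_prob

    x = [1, 0, 1, 1, 1, 0, 1, 1] * 16        # illustrative input of length 128
    print(kt_code_length(x))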

Lossy data compression (or compression with a fidelity criterion) is a horse of a different color. Tackling this problem requires a new notion: the rate/distortion function of the source. This functional is defined implicitly as the objective function of an optimization problem. The second milestone of Information Theory asserts that the rate/distortion function characterizes the optimal trade-off between compression rate and distortion for the source. The proof relies on an innovative argument called random selection; it is not constructive, but it provides guidelines for designing audio and video compression techniques (such as jpeg, mpeg, mp3, rpe-lpc...). Last but not least, computing the rate/distortion function leads to an alternating minimization algorithm which is of general interest.
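The alternating minimization alluded to above is the Blahut-Arimoto algorithm. The sketch below (an illustration under assumed inputs: a Bernoulli source and Hamming distortion) computes one point of the rate/distortion curve for a fixed Lagrange multiplier beta; sweeping beta traces the whole curve.

    import numpy as np

    def blahut_arimoto_rd(p_x, d, beta, iters=200):
        # p_x: source distribution, d[x, xhat]: distortion matrix,
        # beta: Lagrange multiplier trading rate against distortion.
        q = np.full(d.shape[1], 1.0 / d.shape[1])    # output marginal, start uniform
        for _ in range(iters):
            # Minimize over the test channel Q(xhat | x) with q fixed.
            Q = q * np.exp(-beta * d)
            Q /= Q.sum(axis=1, keepdims=True)
            # Minimize over the output marginal q with Q fixed.
            q = p_x @ Q
        distortion = np.sum(p_x[:, None] * Q * d)
        rate = np.sum(p_x[:, None] * Q * np.log2(Q / q))
        return rate, distortion

    p_x = np.array([0.8, 0.2])                       # Bernoulli(0.2) source
    d = np.array([[0.0, 1.0], [1.0, 0.0]])           # Hamming distortion
    print(blahut_arimoto_rd(p_x, d, beta=3.0))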

Channel coding

Transmitting information in an unreliable environment (a channel) is called channel coding. This problem is also tackled thanks to stochastic modelling: a channel defines a probability transition between input and output symbols. Here again, the limits of reliable transmission over a channel can be characterized as the value of an optimization problem (the capacity). As for the lossy source coding problem, the characterization of the capacity of a channel is not constructive. Information Theory thus set a challenge to Computer Science, a challenge that has not yet been fully met.
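For concreteness, the capacity of a discrete memoryless channel is the maximum, over input distributions, of the mutual information between input and output. The sketch below (my own illustration, on a binary symmetric channel with strictly positive transition probabilities) computes it by Blahut-Arimoto alternating maximization.

    import numpy as np

    def blahut_arimoto_capacity(W, iters=500):
        # W[x, y]: probability of output y given input x (assumed > 0 here).
        p = np.full(W.shape[0], 1.0 / W.shape[0])    # input law, start uniform
        for _ in range(iters):
            q = p @ W                                # induced output law
            # Exponential of the KL divergence between each row W[x, :] and q.
            D = np.exp(np.sum(W * np.log(W / q), axis=1))
            p = p * D / np.sum(p * D)                # multiplicative update of p
        q = p @ W
        return np.sum(p[:, None] * W * np.log2(W / q))   # mutual information in bits

    W = np.array([[0.9, 0.1], [0.1, 0.9]])           # binary symmetric channel
    print(blahut_arimoto_capacity(W))                # about 0.531 = 1 - h(0.1) bits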

The series of lectures finally surveys two practical solutions to channel coding:

    • algebraic coding is presented through Reed-Solomon codes (a minimal encoding sketch follows this list). This method illustrates the relevance of sophisticated matrix algorithms. Algebraic codes have also proved to be useful tools in Combinatorics, in Complexity Theory and in the design of algorithms.
    • low-density parity-check (LDPC) codes. This blossoming direction revisits in a subtle way the random selection method that is at the core of the proof of the channel coding theorem.
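To give a flavour of the algebraic part, here is a minimal encoding sketch (illustrative only, over a prime field rather than the GF(2^m) fields used in practice): a Reed-Solomon codeword is obtained by viewing the k message symbols as a polynomial of degree less than k and evaluating it at n distinct field elements, so two distinct codewords differ in at least n - k + 1 positions.

    # Reed-Solomon encoding over the prime field GF(p), with illustrative parameters.
    p = 929            # a prime, so arithmetic modulo p defines a field
    n, k = 8, 4        # code length and dimension, with n <= p

    def poly_eval(coeffs, x):
        # Horner evaluation of sum coeffs[i] * x**i modulo p.
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % p
        return acc

    def rs_encode(message):
        # Encode k field elements as the n evaluations of the message polynomial.
        assert len(message) == k
        return [poly_eval(message, x) for x in range(n)]

    print(rs_encode([3, 270, 1, 100]))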