Information theory is a branch of mathematics that studies formal notions of information. It addresses and answers questions such as: What is information? How can information be quantified? How much can information be compressed?
In the course, we first motivate and introduce the key measure of information theory, known as the (Shannon) entropy. Intuitively, the entropy quantifies the uncertainty in the outcome of a random process. Information is then defined as the reduction of uncertainty. After discussing some elementary properties, we show applications of the notions of entropy and information to cryptography, data compression, coding theory, and randomness extraction.
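To make the intuition concrete, here is a minimal sketch (not part of the course materials) of how the Shannon entropy of a discrete distribution can be computed; a fair coin carries exactly one bit of uncertainty, while a biased coin carries less:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
print(entropy([1.0]))       # deterministic outcome: 0.0 bits
```

The convention `p * log2(p) = 0` for `p = 0` is handled by skipping zero-probability outcomes.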
In the second part of the course, we address more advanced topics in information-theoretic cryptography. In contrast to standard cryptography (like RSA encryption), which relies on the (assumed) hardness of computational problems, information-theoretic cryptography aims at offering security even in the presence of a computationally unrestricted adversary or, perhaps more realistically, one who has access to a quantum computer. We discuss several important cryptographic primitives that withstand even such strong (hypothetical) adversaries. Along the way, these examples also exhibit interesting connections with algebra, number theory, combinatorics, and probability theory.
A (tentative) list of topics is as follows: foundations of information theory, source coding, channel capacity and Shannon’s channel coding theorem, randomness extraction and the leftover hash lemma, perfect security, secure message transmission, authentication codes, secret sharing, secure computation, secret-key agreement by public discussion.
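As a small taste of the "perfect security" topic: the one-time pad is the canonical scheme that achieves it, and Shannon showed it remains secure against computationally unbounded adversaries. A minimal sketch (for illustration only, not the course's own notation):

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with a fresh uniformly random key byte.
    Encryption and decryption are the same operation."""
    assert len(key) == len(data), "key must be as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # must be uniform and used only once
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message   # XOR with the same key decrypts
```

With a uniformly random, single-use key, the ciphertext is statistically independent of the message, which is exactly the perfect-security guarantee discussed in the course.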
Basic undergraduate probability theory and algebra. Prior knowledge of cryptology is helpful, but certainly not necessary.