Definition. The Entropy of a discrete random variable $X$ is defined as
$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x).$$
The entropy is equivalently stated as $H(X) = \mathbb{E}\left[\log_2 \frac{1}{p(X)}\right]$.
Entropy is a measure of the uncertainty of a random variable and is expressed in bits. We use the convention that $0 \log_2 0 = 0$.
In general, we note that $0 \le H(X) \le \log_2 |\mathcal{X}|$.
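The definition and the bound above are easy to check numerically. Here is a minimal Python sketch (the helper name `entropy` is my own):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a list of
    probabilities, using the convention that 0 * log2(0) = 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))   # one bit for a fair coin
print(entropy([1.0, 0.0]))   # no uncertainty in a certain outcome
print(entropy([0.25] * 4))   # the uniform maximum log2(4) = 2 bits
```

Note how the uniform distribution attains the upper bound $\log_2 |\mathcal{X}|$, while a deterministic outcome attains the lower bound.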
Definition. The Conditional Entropy of a discrete random variable $X$ conditioned on another random variable $Y$ is defined as
$$H(X \mid Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log_2 p(x \mid y).$$
The conditional entropy is equivalently stated as $H(X \mid Y) = -\mathbb{E}[\log_2 p(X \mid Y)]$.
We observe that $H(X \mid Y) = \sum_{y \in \mathcal{Y}} p(y)\, H(X \mid Y = y)$.
The following Chain Rule holds:
$$H(X, Y) = H(Y) + H(X \mid Y) = H(X) + H(Y \mid X).$$
Here $H(X, Y)$ denotes the entropy of the joint random variable $(X, Y)$.
In general, we have $H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1})$.
The uncertainty of a random variable $X$ can never increase by knowledge of the outcome of another random variable $Y$, i.e. $H(X \mid Y) \le H(X)$, with equality iff $X$ and $Y$ are independent.
We thus have $H(X, Y) \le H(X) + H(Y)$, with equality iff $X$ and $Y$ are independent.
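The chain rule can be verified numerically. The sketch below uses a small hypothetical joint distribution `pxy` (example values of my own) and checks that $H(X, Y) = H(X) + H(Y \mid X)$:

```python
import math

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over {0, 1} x {0, 1}.
pxy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginal p(x) and the conditional entropy H(Y|X) computed directly.
px = {0: pxy[(0, 0)] + pxy[(0, 1)], 1: pxy[(1, 0)] + pxy[(1, 1)]}
H_y_given_x = -sum(v * math.log2(v / px[x]) for (x, _), v in pxy.items())

# Chain rule: H(X, Y) = H(X) + H(Y | X).
print(H(pxy.values()))               # joint entropy H(X, Y)
print(H(px.values()) + H_y_given_x)  # same value via the chain rule
```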
Definition. The relative entropy between two probability distributions $p$ and $q$ of a random variable $X$ is defined as
$$D(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log_2 \frac{p(x)}{q(x)}.$$
The relative entropy is a measure of the distance between two probability distributions. It can be thought of as the inefficiency of assuming distribution $q$ when the correct distribution is $p$.
Note $D(p \,\|\, q) \ge 0$, with equality iff $p = q$.
In general $D(p \,\|\, q) \ne D(q \,\|\, p)$.
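Both facts are easy to check numerically; in this sketch the helper name `kl` is my own:

```python
import math

def kl(p, q):
    """Relative entropy D(p||q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.75, 0.25]
print(kl(p, q), kl(q, p))  # two different non-negative values: not symmetric
print(kl(p, p))            # 0.0, since the distributions coincide
```

Because the relative entropy is not symmetric (and violates the triangle inequality), it is not a true metric despite being called a "distance".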
Definition. The Mutual Information between random variables $X$ and $Y$ is defined as
$$I(X; Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)} = D\big(p(x, y) \,\|\, p(x)\, p(y)\big).$$
The mutual information measures the information (in bits) we receive about the random variable $X$ when observing the outcome of the random variable $Y$.
It also describes the information we receive about the random variable $Y$ when observing the outcome of the random variable $X$.
Mutual Information is also a measure of the price for encoding $(X, Y)$ as a pair of independent random variables, when in reality they are not.
The Mutual Information obeys the following properties:
$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y),$$
$$I(X; Y) = I(Y; X), \qquad I(X; X) = H(X).$$
We have that $I(X; Y) \ge 0$, with equality if and only if $X$ and $Y$ are independent.
The Conditional Mutual Information is
$$I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z) = D\big(p(x, y \mid z) \,\|\, p(x \mid z)\, p(y \mid z)\big),$$
where the right-hand side is the conditional relative entropy, i.e. the relative entropy averaged over the outcomes of $Z$.
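The sketch below evaluates $I(X; Y)$ as the relative entropy between the joint distribution and the product of the marginals; the distributions are hypothetical example values of my own:

```python
import math

def mutual_information(pxy, px, py):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) in bits."""
    return sum(v * math.log2(v / (px[x] * py[y]))
               for (x, y), v in pxy.items() if v > 0)

# Correlated pair: X is uniform, and Y agrees with X with probability 3/4.
pxy = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 1: 0.5}
print(mutual_information(pxy, px, py))  # positive: X and Y are dependent

# For an independent pair the mutual information vanishes.
ind = {(x, y): px[x] * py[y] for x in px for y in py}
print(mutual_information(ind, px, py))  # 0.0
```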
Consider a cipher in which a message $M$ is encrypted under a secret key $K$ into a ciphertext $C$, observed by an eavesdropper Eve. We have the key entropy $H(K)$
and the message entropy $H(M)$.
The key entropy describes the uncertainty Eve faces regarding the unknown key a priori and the message entropy describes the uncertainty regarding the transmitted message.
The key equivocation $H(K \mid C)$
and the message equivocation $H(M \mid C)$
describe the remaining uncertainty after Eve observes the transmitted ciphertext.
We have that $H(K \mid C) \le H(K)$ and $H(M \mid C) \le H(M)$.
This reflects the fact that the uncertainties never increase by knowledge of the ciphertext.
An encryption scheme is said to have perfect secrecy if $H(M \mid C) = H(M)$.
In a system with perfect secrecy, the plaintext and the ciphertext are independent. When Eve observes the ciphertext, she obtains no information at all about the plaintext,
i.e., $I(M; C) = 0$.
Theorem. For an encryption scheme with perfect secrecy we have $H(K) \ge H(M)$.
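The classical example achieving perfect secrecy is the one-time pad, where $C = M \oplus K$ with a uniform key. The following sketch (a one-bit one-time pad with example message probabilities of my own) checks numerically that $H(M \mid C) = H(M)$:

```python
import math
from collections import defaultdict

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One-bit one-time pad: C = M XOR K with a uniform key K,
# and an arbitrary (here biased) message distribution.
pm = {0: 0.7, 1: 0.3}
pk = {0: 0.5, 1: 0.5}

pmc = defaultdict(float)  # joint distribution p(m, c)
for m, p_m in pm.items():
    for k, p_k in pk.items():
        pmc[(m, m ^ k)] += p_m * p_k

pc = defaultdict(float)   # ciphertext marginal p(c)
for (_, c), v in pmc.items():
    pc[c] += v

# H(M | C) = H(M, C) - H(C); perfect secrecy means it equals H(M).
H_m_given_c = H(pmc.values()) - H(pc.values())
print(H_m_given_c, H(pm.values()))  # equal: the ciphertext reveals nothing
```

Here $H(K) = 1$ bit while $H(M) \approx 0.88$ bits, consistent with the theorem's bound $H(K) \ge H(M)$.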
— Mar 1, 2023