Math Problem Statement

Exercise 5.4 (Entropy of the key) Show that, in any cryptosystem, it holds that H(K|C) >= H(P|C). Under which condition do we have equality?

Solution

Proof: H(K | C) \geq H(P | C)

To prove this, let's define the following terms:

  • K: the random variable representing the key.
  • P: the random variable representing the plaintext.
  • C: the random variable representing the ciphertext.
  • H(X | Y): the conditional entropy of X given Y, which quantifies the uncertainty about X given knowledge of Y.
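
For reference, the conditional entropy used throughout is

H(X | Y) = -\sum_{x, y} p(x, y) \log_2 p(x | y)

which is always non-negative and equals zero exactly when X is a deterministic function of Y.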

Step 1: Using the Chain Rule for Entropy

The chain rule for joint entropy states:

H(K, P | C) = H(K | C) + H(P | K, C)

Since knowing the key K and the ciphertext C fully determines the plaintext P (due to the deterministic nature of decryption in any cryptosystem), we have:

H(P | K, C) = 0

Thus, the equation simplifies to:

H(K, P | C) = H(K | C)
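
As a quick numerical sanity check (an illustrative sketch, not part of the exercise), the following Python snippet builds the joint distribution of a hypothetical toy shift cipher over Z_3 with a uniformly random key and a non-uniform plaintext, and evaluates H(P | K, C). The result is 0, since decryption p = (c - k) mod 3 is deterministic.

from collections import defaultdict
from math import log2

# Toy example (assumed for illustration): shift cipher over {0, 1, 2}
plaintext_dist = {0: 0.5, 1: 0.3, 2: 0.2}   # Pr[P = p], arbitrary non-uniform choice
key_dist = {0: 1/3, 1: 1/3, 2: 1/3}         # Pr[K = k], uniform and independent of P

# Joint distribution Pr[P = p, K = k, C = c] with encryption c = (p + k) mod 3
joint = defaultdict(float)
for p, pp in plaintext_dist.items():
    for k, pk in key_dist.items():
        joint[(p, k, (p + k) % 3)] += pp * pk

def cond_entropy(joint, target, given):
    # H(target | given); target and given are index tuples into the (p, k, c) outcomes
    p_given, p_both = defaultdict(float), defaultdict(float)
    for outcome, prob in joint.items():
        g = tuple(outcome[i] for i in given)
        t = tuple(outcome[i] for i in target)
        p_given[g] += prob
        p_both[(t, g)] += prob
    return sum(prob * log2(p_given[g] / prob)
               for (t, g), prob in p_both.items() if prob > 0)

print(cond_entropy(joint, (0,), (1, 2)))    # H(P | K, C) = 0.0: (K, C) determines P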

Step 2: Relating to H(P | C)

On the other hand, we can apply the chain rule again, expanding the same joint entropy in the other order:

H(K, P | C) = H(P | C) + H(K | P, C)

Here, however, knowing both the plaintext P and the ciphertext C does not in general determine the key: several different keys may encrypt the same plaintext to the same ciphertext. So H(K | P, C) need not be zero; like any conditional entropy, it only satisfies:

H(K | P, C) \geq 0

Step 3: Comparing Both Results

From the previous steps, we now have two expansions of H(K, P | C):

  1. H(K, P | C) = H(K | C)
  2. H(K, P | C) = H(P | C) + H(K | P, C)

Thus:

H(K | C) = H(P | C) + H(K | P, C)

Since H(K | P, C) \geq 0, this implies:

H(K | C) \geq H(P | C)
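
To see that the inequality can be strict, here is a small hypothetical example (an illustrative sketch, not part of the exercise): a one-bit cipher with four equally likely keys, where keys 0 and 1 both leave the plaintext unchanged and keys 2 and 3 both flip it. Knowing P and C only narrows the key down to two possibilities, so H(K | P, C) = 1 bit and H(K | C) exceeds H(P | C) by exactly that amount.

from collections import defaultdict
from math import log2

plaintext_dist = {0: 0.5, 1: 0.5}                   # one-bit plaintext, uniform
key_dist = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}     # keys 0, 1: identity; keys 2, 3: flip the bit

# Joint distribution Pr[P = p, K = k, C = c]
joint = defaultdict(float)
for p, pp in plaintext_dist.items():
    for k, pk in key_dist.items():
        c = p if k in (0, 1) else 1 - p
        joint[(p, k, c)] += pp * pk

def cond_entropy(joint, target, given):
    # H(target | given); target and given are index tuples into the (p, k, c) outcomes
    p_given, p_both = defaultdict(float), defaultdict(float)
    for outcome, prob in joint.items():
        g = tuple(outcome[i] for i in given)
        t = tuple(outcome[i] for i in target)
        p_given[g] += prob
        p_both[(t, g)] += prob
    return sum(prob * log2(p_given[g] / prob)
               for (t, g), prob in p_both.items() if prob > 0)

print(cond_entropy(joint, (1,), (2,)))     # H(K | C)    = 2.0 bits
print(cond_entropy(joint, (0,), (2,)))     # H(P | C)    = 1.0 bit
print(cond_entropy(joint, (1,), (0, 2)))   # H(K | P, C) = 1.0 bit, so H(K|C) = H(P|C) + H(K|P,C)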

Step 4: Condition for Equality

Equality holds when H(K | P, C) = 0, which means that knowing the plaintext P and the ciphertext C uniquely determines the key K. In other words, for every plaintext–ciphertext pair there is exactly one key that could have produced that encryption. (Note that this is a property of the key structure, and not the same thing as perfect secrecy, which concerns H(P | C) = H(P).)

Thus, the condition for equality is:

  • For every plaintext–ciphertext pair, there is exactly one key that maps that plaintext to that ciphertext (equivalently, H(K | P, C) = 0).
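
For example, in the one-time pad the key can be read off from a plaintext–ciphertext pair as K = P ⊕ C, so H(K | P, C) = 0 and equality holds; the toy shift cipher sketched earlier has the same property, since k = (c - p) mod 3. The four-key one-bit cipher above, by contrast, leaves two candidate keys per plaintext–ciphertext pair, so there the inequality is strict.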

Conclusion:

H(K | C) \geq H(P | C), with equality holding if and only if the key is uniquely determined by the plaintext and ciphertext, i.e., H(K | P, C) = 0.


Would you like further details on any part of this explanation, or do you have any other questions?

Here are some related questions for deeper understanding:

  1. What is conditional entropy, and how is it used in cryptography?
  2. Can you explain the concept of perfect secrecy in cryptosystems?
  3. How does the chain rule for entropy help in analyzing cryptographic security?
  4. What are some real-world examples of perfect cryptosystems?
  5. Why is H(K|C) \geq H(P|C) a fundamental property of secure cryptosystems?

Tip: In cryptography, understanding entropy and its conditional variants is key to grasping the security guarantees provided by cryptographic protocols.

Math Problem Analysis

Mathematical Concepts

Entropy
Conditional Entropy
Cryptosystems
Information Theory

Formulas

H(K|C) >= H(P|C)
Chain Rule for Entropy: H(K, P | C) = H(K | C) + H(P | K, C)
H(K | P, C) = 0 when the plaintext and ciphertext uniquely determine the key

Theorems

Chain Rule for Entropy
Non-negativity of Conditional Entropy

Suitable Grade Level

University level (Cryptography or Information Theory)