Math Problem Statement

Exercise 5.5 (Entropies in the affine cipher) Compute H(K|C) and H(K|P,C) for the Affine cipher when used to encrypt a single letter from the English alphabet. Assume that keys and plaintexts are uniformly chosen.

Solution

To compute H(K|C) and H(K|P,C) for the Affine cipher, we need to review the concept of conditional entropy in the context of cryptography. The Affine cipher encryption function is given by:

C = (aP + b) \mod 26

where:

  • P is the plaintext (a letter from the alphabet).
  • C is the ciphertext (also a letter from the alphabet).
  • a and b are the components of the key K.
  • a must be coprime with 26 (the number of letters in the alphabet).
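The encryption and decryption maps above can be sketched in a few lines of Python (the function names are mine, chosen for illustration):

```python
from math import gcd

M = 26  # alphabet size

def affine_encrypt(p: int, a: int, b: int) -> int:
    """Encrypt plaintext letter p (0..25) with key (a, b)."""
    assert gcd(a, M) == 1, "a must be invertible mod 26"
    return (a * p + b) % M

def affine_decrypt(c: int, a: int, b: int) -> int:
    """Invert encryption: p = a^{-1} * (c - b) mod 26."""
    a_inv = pow(a, -1, M)  # modular inverse (Python 3.8+)
    return (a_inv * (c - b)) % M

# Example: encrypt 'h' (= 7) with key (a=5, b=8):
c = affine_encrypt(7, 5, 8)  # (5*7 + 8) % 26 = 17, i.e. 'r'
assert affine_decrypt(c, 5, 8) == 7
```

The coprimality requirement on a is exactly what makes the modular inverse (and hence decryption) exist.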

Entropy Definitions

  1. Entropy of a random variable: For a random variable X, the entropy H(X) is given by: H(X) = - \sum_{x} P(X=x) \log_2 P(X=x), where P(X=x) is the probability that X takes the value x.

  2. Conditional entropy: The conditional entropy H(X|Y) is the entropy of X given Y and represents the remaining uncertainty about X after Y is known. It is given by: H(X|Y) = - \sum_{x,y} P(X=x, Y=y) \log_2 P(X=x | Y=y)
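The first definition translates directly into code; a minimal sketch (the helper name `entropy` is mine):

```python
from math import log2

def entropy(probs):
    """H(X) = -sum p*log2(p), over outcomes with nonzero probability."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A uniform distribution over n outcomes has entropy log2(n):
print(entropy([1/26] * 26))  # log2(26) ≈ 4.700
```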

Step-by-Step Computation

1. Computing H(K|C) (Conditional Entropy of the Key Given the Ciphertext)

  • Let K = (a, b) be the key, where a is chosen from the set of integers coprime with 26, and b is chosen from the set \{0, 1, \dots, 25\}.
  • There are 12 possible values for a (only 12 integers between 1 and 25 are coprime with 26) and 26 possible values for b, giving a total of 12 \times 26 = 312 possible keys.

When a ciphertext C is known, the key K is not constrained at all. For any fixed key, encryption is a bijection on the alphabet, so a uniformly chosen plaintext produces a uniformly distributed ciphertext: P(C = c) = 1/26 for every c. Conversely, for every pair (k, c) there is exactly one plaintext that key k encrypts to c, so by Bayes' rule:

P(K = k \mid C = c) = \frac{P(K = k)\, P(P = d_k(c))}{P(C = c)} = \frac{(1/312)(1/26)}{1/26} = \frac{1}{312}

The posterior distribution on the key is still uniform over all 312 keys; in other words, K and C are independent. Observing a single ciphertext letter therefore removes no uncertainty about the key:

H(K|C) = H(K) = \log_2 312 \approx 8.285 \text{ bits}
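This value can be verified exhaustively by enumerating all 312 keys and 26 plaintexts and computing H(K|C) = H(K,C) − H(C) from the joint distribution (a brute-force sketch under the uniform assumptions stated above):

```python
from math import gcd, log2
from collections import Counter

M = 26
keys = [(a, b) for a in range(1, M) if gcd(a, M) == 1 for b in range(M)]
assert len(keys) == 312

# Joint distribution of (key, ciphertext) under uniform keys and plaintexts.
joint = Counter()
for a, b in keys:
    for p in range(M):
        c = (a * p + b) % M
        joint[((a, b), c)] += 1 / (len(keys) * M)

# Marginal distribution of the ciphertext.
ciph = Counter()
for (_, c), pr in joint.items():
    ciph[c] += pr

H_KC = -sum(pr * log2(pr) for pr in joint.values())  # H(K, C)
H_C = -sum(pr * log2(pr) for pr in ciph.values())    # H(C)
print(H_KC - H_C, log2(312))  # both ≈ 8.285
```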

2. Computing H(K|P, C) (Conditional Entropy of the Key Given the Plaintext and Ciphertext)

If both the plaintext P and the ciphertext C are known, the key K is still not fully determined. A single known pair (P, C) gives one congruence C \equiv aP + b \pmod{26}; for each of the 12 invertible choices of a, it is satisfied by exactly one b = (C - aP) \bmod 26. Hence exactly 12 keys remain consistent with the observation, and since keys are uniform a priori, each of them is equally likely a posteriori. (Pinning down the key completely requires a second known plaintext–ciphertext pair.)

Thus:

H(K|P, C) = \log_2 12 \approx 3.585 \text{ bits}
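The claim that exactly 12 keys survive any single known pair can be checked exhaustively (a small sketch; the helper name `consistent_keys` is mine):

```python
from math import gcd

M = 26
keys = [(a, b) for a in range(1, M) if gcd(a, M) == 1 for b in range(M)]

def consistent_keys(p: int, c: int):
    """All keys (a, b) with c = (a*p + b) mod 26."""
    return [(a, b) for (a, b) in keys if (a * p + b) % M == c]

# For every (plaintext, ciphertext) pair, exactly 12 keys remain possible.
counts = {len(consistent_keys(p, c)) for p in range(M) for c in range(M)}
print(counts)  # {12}
```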

Summary

  • H(K|C) = \log_2 312 \approx 8.285 \text{ bits} — a single ciphertext letter reveals nothing about the key.
  • H(K|P, C) = \log_2 12 \approx 3.585 \text{ bits} — a single plaintext–ciphertext pair narrows the key down to 12 possibilities.

Would you like more details or have any questions?

Here are 5 related questions for further exploration:

  1. How does the structure of the Affine cipher compare to other classical ciphers like the Caesar cipher?
  2. How would the entropies change if the size of the alphabet was different (e.g., a larger or smaller set)?
  3. What is the role of coprimality in ensuring the Affine cipher’s reversibility?
  4. How would you compute H(K|P), the entropy of the key given only the plaintext?
  5. How does the uniform distribution assumption affect the entropies in the Affine cipher?

Tip: In cryptography, understanding the relationship between plaintext, ciphertext, and the key is essential for assessing the security of the cipher. The more information a cipher leaks, the less secure it is.


Math Problem Analysis

Mathematical Concepts

Cryptography
Conditional Entropy
Affine Cipher
Probability Theory

Formulas

C = (aP + b) mod 26
H(X) = - ∑ P(X=x) log2 P(X=x)
H(X|Y) = - ∑ P(X=x, Y=y) log2 P(X=x | Y=y)
H(K|C) = log2 312
H(K|P,C) = log2 12

Theorems

Conditional Entropy Theorem

Suitable Grade Level

Undergraduate Level (Cryptography or Information Theory Course)