
(1) Consider a process (A) that generates one of the following characters every second: A-Z, a-z, 0-9, $, or %.

a.

| Characters | Probability of one of the characters |
| --- | --- |
| 0-9 | 1/16 |
| A-Z | 1/(8*26) |
| a-z, $, % | 1/(4*28) |

What is the Entropy of this system? You can do it by hand or use a spreadsheet or code. Please describe or show your work.

b. If each character above is equally likely, what's the Entropy?

c. Now, consider process B that generates two characters uniformly at random every two seconds to produce a two-character message, e.g. ZY or CC. What's the entropy of each two-character message? What's its relationship to your answer in part b, and why?
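The three parts above can be checked numerically. A minimal Python sketch, using the probabilities from the part (a) table and a 64-symbol alphabet (26 + 26 + 10 + 2) for parts (b) and (c):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Part (a): per-character probabilities from the table
# (10 digits, 26 uppercase letters, 28 remaining symbols: a-z, $, %).
probs_a = [1/16] * 10 + [1/(8*26)] * 26 + [1/(4*28)] * 28
assert abs(sum(probs_a) - 1.0) < 1e-12  # sanity check: it is a distribution
h_a = entropy(probs_a)

# Part (b): all 64 characters equally likely.
n = 26 + 26 + 10 + 2
h_b = entropy([1/n] * n)           # = log2(64) = 6 bits

# Part (c): two independent uniform characters; entropies of
# independent symbols add, so this is 2 * h_b.
h_c = entropy([1/n**2] * n**2)     # = log2(64^2) = 12 bits

print(f"(a) {h_a:.4f} bits, (b) {h_b} bits, (c) {h_c} bits")
```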

(2) Consider a physical door lock with a 16-character keypad (0-9 and A-F). The lock stores a single 4-character password which a user must enter to unlock the door. Alice is in charge of generating passwords and programming the lock. Bob wishes to guess the password but has no information other than the general mechanism Alice uses to generate the passwords.

Calculate Bob's Expected Uncertainty (Entropy) in the password under each of Alice's following approaches.

a. Alice is numerophobic, so she chooses uniformly random letters only... no numbers.

b. Alice chooses each character by pulling it out of a hat and replacing it before drawing the next. (Cryptographically random.)

c. Alice flips a fair coin 4 times to generate a number between 0 and 15. We will call this number s. The first character is the hexadecimal representation of s. The second character is the hexadecimal representation of (2s + 1) mod 16. The third is the hexadecimal representation of (3s + 2) mod 16, and the fourth is the hexadecimal representation of (s + 15) mod 16.

d. Alice chooses four-character l33t-speak words at random that are representable using 0-9 and A-F... see http://www.datagenetics.com/blog/march42013/index.html for a list of such words.

e. (This one you cannot directly calculate.) Alice chooses words as in d, but instead of picking uniformly at random, she picks words familiar to her. What happens to the resulting entropy in comparison to d) and b)?
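For parts a through c, the entropy can be computed directly by counting equally likely passwords. A Python sketch of those counts (part d would need the word list, and part e is, as stated, not directly calculable):

```python
import math

# (a) 4 characters drawn uniformly from the 6 letters A-F: 6^4 equally
# likely passwords.
h_a = 4 * math.log2(6)             # roughly 10.34 bits

# (b) 4 characters drawn uniformly, with replacement, from all 16
# characters: 16^4 equally likely passwords.
h_b = 4 * math.log2(16)            # = 16 bits

# (c) the entire password is determined by s (4 fair coin flips, so s is
# uniform on 0..15); count the distinct passwords that can result.
passwords = set()
for s in range(16):
    digits = (s, (2*s + 1) % 16, (3*s + 2) % 16, (s + 15) % 16)
    passwords.add("".join(format(d, "X") for d in digits))
h_c = math.log2(len(passwords))    # 16 distinct, equally likely -> 4 bits

print(f"(a) {h_a:.2f} bits  (b) {h_b:.0f} bits  (c) {h_c:.0f} bits")
```

Note that in (c) the first character alone (the hex digit of s) already distinguishes all 16 possibilities, so the other three characters add no uncertainty.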

(3) Grab several paragraphs (at least 200 characters) of plaintext from the article http://programming.oreilly.com/2013/10/security-after-death-trust.html.

Use the C code at http://www.ece.iastate.edu/~daniels/hist.c to generate a histogram of the characters in that text. (Note: you can use ideone.com to run the C code if you don't have a compiler handy.) Copy or import the resulting histogram data into a spreadsheet and graph the histogram as a bar chart.
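If a C compiler isn't handy, an equivalent character histogram is a few lines of Python (a stand-in sketch, not the hist.c program itself; the sample string below is hypothetical):

```python
from collections import Counter

# Paste your text here (or read it from a file); this is a stand-in sample.
text = "Grab several paragraphs of plaintext from the article."

counts = Counter(text)
for ch, n in sorted(counts.items()):
    print(f"{ch!r}\t{n}")  # tab-separated: easy to paste into a spreadsheet
```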

Use the site https://sharkysoft.com/vigenere/1.0/ to encrypt those same paragraphs with the following keys: C, THISKEY, THISLONGKEYALSO.

Turn in: Generate all 4 histograms (plaintext and the 3 ciphertexts) and compute the entropy and index of coincidence for each histogram as well. What can you observe about the entropy of the ciphertext as the key length increases? Why? Also compare the entropy of the plaintext with that of the first ciphertext and explain.
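For the "compute the entropy and index of coincidence" step, a small Python helper that works directly on raw character counts (assuming the usual definition IC = sum of n_i(n_i - 1) / (N(N - 1)); the sample string is a placeholder):

```python
from collections import Counter
import math

def entropy_bits(counts):
    """Shannon entropy (bits per character) of the empirical distribution."""
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total)
                for n in counts.values() if n > 0)

def index_of_coincidence(counts):
    """Probability that two characters drawn from the text match."""
    total = sum(counts.values())
    return sum(n * (n - 1) for n in counts.values()) / (total * (total - 1))

# Stand-in text; run this on each of your 4 histograms instead.
counts = Counter("ATTACKATDAWN")
print(entropy_bits(counts), index_of_coincidence(counts))
```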

Problem (4): If you generate a Vigenère key as long as the message from a cryptographically random source, what would you expect the entropy of the resultant ciphertext to be? Why? Hint: what would you expect the histogram of the ciphertext to look like?
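One way to build intuition here is to simulate it. The sketch below assumes a 26-letter alphabet and a deliberately worst-case, zero-entropy plaintext (all one letter); with a uniformly random key as long as the message, each ciphertext letter is uniform, so the empirical entropy approaches log2(26) and the histogram flattens:

```python
import math
import secrets
from collections import Counter

plaintext = "E" * 100_000          # worst case: the plaintext has zero entropy
# Vigenere with a cryptographically random key as long as the message
# (i.e., a one-time pad over a 26-letter alphabet).
key = [secrets.randbelow(26) for _ in plaintext]
cipher = [(ord(p) - ord("A") + k) % 26 for p, k in zip(plaintext, key)]

counts = Counter(cipher)
h = -sum((n / len(cipher)) * math.log2(n / len(cipher))
         for n in counts.values())
print(f"ciphertext entropy ~ {h:.3f} bits; log2(26) = {math.log2(26):.3f}")
```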