cipher know-how… as it evolved


My familiarity with Borel sets, the Lebesgue “theory” of integration built on set theory, and the random processes pertinent to cipher making and cryptanalysis is slowly improving, to the point that I can go from the “hard” world of algebra requiring continuous modeling to the same modeling in the discrete world of graphs and groups – which assumes the same “advanced” algebraic architecture skill but fewer math skills.

A decent, math-centric, but not math-pontificating rendition of the hard stuff (for me) is

image

unknown PhD thesis.

image

…continuance, showing a probability measure being “calculated” by an integral.
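For my own notes, the pattern being shown is (my summary, not a quotation from the thesis): a probability measure is obtained by integrating a non-negative density against an underlying measure, over a Borel set:

    P(A) = \int_A f(x)\,\mu(\mathrm{d}x), \qquad A \in \mathcal{B}, \qquad f \ge 0, \qquad \int_\Omega f\,\mathrm{d}\mu = 1.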

 

image

…continuance, getting to transition functions, the composition of such functions (DES round function outputs), and elements such as “invariant probability measures” that can generate a semigroup of such transition matrices.
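To make the semigroup point concrete for myself, here is a minimal sketch (my own, not from the thesis; the 4-state matrix is invented) of the Chapman-Kolmogorov / semigroup property of transition matrices, and of an invariant probability measure left fixed by them:

    import numpy as np

    # An invented 4-state transition matrix (rows sum to 1), standing in for
    # the one-round "transition function" of some toy cipher component.
    P = np.array([[0.5, 0.2, 0.2, 0.1],
                  [0.1, 0.6, 0.2, 0.1],
                  [0.3, 0.1, 0.4, 0.2],
                  [0.2, 0.2, 0.2, 0.4]])

    # Semigroup property: composing s rounds then t rounds equals s+t rounds.
    s, t = 2, 3
    assert np.allclose(np.linalg.matrix_power(P, s) @ np.linalg.matrix_power(P, t),
                       np.linalg.matrix_power(P, s + t))

    # Invariant probability measure: the left eigenvector of P for eigenvalue 1,
    # normalised to sum to 1.  It is unchanged by applying further rounds.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi = pi / pi.sum()
    assert np.allclose(pi @ P, pi)
    print(pi)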

And we recall the same architectural framework (without all the Borel sets, etc.) applied to mean-sets (a particular notion of expectation).

image

http://www.math.columbia.edu/~thaddeus/theses/2009/mosina.pdf
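My reading of the mean-set construction, as a toy sketch of my own (the graph and the distribution below are invented; this just follows the “minimise the expected squared distance” recipe, not code from the thesis):

    from collections import deque

    # A toy undirected graph (a 6-cycle), standing in for a Cayley graph.
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
    nodes = range(6)
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def dist(src):
        """Breadth-first search distances from src to every node."""
        d = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in d:
                    d[w] = d[u] + 1
                    q.append(w)
        return d

    # An invented probability distribution on the nodes (a "sample").
    mu = {0: 0.4, 1: 0.3, 2: 0.1, 3: 0.1, 4: 0.05, 5: 0.05}

    # Weight function M(v) = sum_u d(v,u)^2 * mu(u); the mean-set is its argmin set.
    M = {v: sum(dist(v)[u] ** 2 * p for u, p in mu.items()) for v in nodes}
    mean_set = [v for v in nodes if M[v] == min(M.values())]
    print(M, mean_set)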

We also note the stronger theory presented by our known PhD thesis writer, concerning averaging and probabilities in a generator matrix … from which one can get a round function’s transition matrix.

image

unknown PhD thesis.
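The way I picture the generator-to-transition-matrix step, in a hedged sketch of my own (the generator Q below is invented): exponentiate a matrix whose off-diagonal entries are non-negative and whose rows sum to zero, and you obtain a proper transition matrix for each “time” t – a stand-in for a round function’s transition matrix.

    import numpy as np
    from scipy.linalg import expm

    # An invented generator matrix Q: off-diagonal entries >= 0, rows sum to 0.
    Q = np.array([[-0.9, 0.5, 0.4],
                  [0.2, -0.5, 0.3],
                  [0.1, 0.6, -0.7]])

    # Exponentiating the generator gives a transition matrix for each "time" t;
    # its rows are non-negative and sum to 1.
    for t in (0.5, 1.0, 2.0):
        P = expm(t * Q)
        assert np.all(P >= -1e-12)
        assert np.allclose(P.sum(axis=1), 1.0)
        print(t, P.round(3))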

We also recall the excellent tutorial on cryptanalysis at http://www.engr.mun.ca/~howard/PAPERS/ldc_tutorial.pdf, which clearly shows “non-theoretical” “difference table calculations” for both the linear and the differential cryptanalytic attack styles.

image

(Note: I really feel like I’m reading about modernized Colossus-era bulges and 32-counts here!)
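The “difference table calculation” itself is simple enough to reproduce. A sketch of my own below; the 4-bit S-box is the first row of DES S1, which I believe is the one the tutorial uses, but check against the paper:

    # 4-bit S-box: the first row of DES S-box S1 (assumed to match the tutorial's).
    SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
            0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

    # Difference distribution table: DDT[dx][dy] counts the inputs x for which
    # S(x) XOR S(x XOR dx) == dy.  Each row averages to 1 (16 inputs spread over
    # 16 output differences); the large entries are the bulges a differential
    # attack exploits.
    DDT = [[0] * 16 for _ in range(16)]
    for dx in range(16):
        for x in range(16):
            dy = SBOX[x] ^ SBOX[x ^ dx]
            DDT[dx][dy] += 1

    for dx, row in enumerate(DDT):
        print(f"{dx:2d}: " + " ".join(f"{c:2d}" for c in row))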

I struggled a little with the E and W of the otherwise excellent maximum likelihood framework paper of Murphy et al.:

image

http://www.isg.rhul.ac.uk/~sean/maxlik.pdf

but I got the feeling that they were alluding, with the choice of rho, pi, E and W, to a background body of theory, which MAY be that of

image

unknown PhD thesis writer
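I cannot reproduce their E and W with confidence, but the maximum-likelihood skeleton underneath is, as I understand it, just log-likelihood scoring of key guesses against observed counts. A generic, hedged sketch of that skeleton (the key set and distributions below are invented, not Murphy et al.’s notation):

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented setup: 3 candidate keys, each predicting a distribution over
    # 4 observable "characteristic" values; key 1 is the true one.
    predicted = np.array([[0.25, 0.25, 0.25, 0.25],
                          [0.40, 0.30, 0.20, 0.10],
                          [0.10, 0.20, 0.30, 0.40]])
    true_key = 1
    counts = rng.multinomial(2000, predicted[true_key])

    # Log-likelihood of the observed counts under each key's prediction;
    # the maximum-likelihood estimate is the best-scoring key.
    loglik = counts @ np.log(predicted).T
    print(loglik, "-> ML key guess:", int(np.argmax(loglik)))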

and of course, we don’t forget the wonderful tutorial style of Baez, who got us competent in stochastic reasoning (vs quantum reasoning) generally

image

http://math.ucr.edu/home/baez/stoch_stable.pdf

We did study the quantum harmonic oscillator late one night, listening to an Oxford professor. I understood very little of it, since I got lost in the quantum math manipulation that he was (also) drilling folks in. He was bored, and so was I.

But Baez successfully gets us to “think” in terms of exponentials, as a formalism for quite linear things involving rates of positive and negative change. He also de-mystifies all the conjugates and adjoints, giving them quite intuitive appeal now. Even the commutator is quite intuitive now – merely the difference between combinatoric perspectives on adding and subtracting (or multiplying and differentiating).
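The “exponentials for rates of positive and negative change” point, in the smallest example I could write down (my own sketch, not Baez’s code): a two-state master equation dp/dt = H p, whose solution p(t) = exp(tH) p(0) relaxes exponentially to equilibrium.

    import numpy as np
    from scipy.linalg import expm

    # Two-state master equation dp/dt = H p.  Columns of H sum to zero
    # (the "infinitesimal stochastic" condition); the rates in and out of each
    # state are the positive and negative change.
    rate_out, rate_in = 2.0, 1.0
    H = np.array([[-rate_out, rate_in],
                  [rate_out, -rate_in]])

    p0 = np.array([1.0, 0.0])            # start entirely in state 0
    for t in (0.0, 0.5, 1.0, 2.0, 5.0):
        p = expm(t * H) @ p0
        print(f"t={t}: p={p.round(4)}")  # relaxes exponentially to (1/3, 2/3)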

And he gets us to S-boxes, even:

image

http://math.ucr.edu/home/baez/stoch_stable.pdf, p53

We see now the linkup between the Hamiltonian and the stochastic generator (and exponential decays).

image

http://math.ucr.edu/home/baez/stoch_stable.pdf, p53
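The linkup, as I read it (a numerical check of my own, not taken from the notes): a self-adjoint Hamiltonian exponentiates to a unitary exp(-itH) that preserves quantum amplitudes, while an infinitesimal-stochastic generator exponentiates to exp(tH), which preserves total probability and produces the exponential decays.

    import numpy as np
    from scipy.linalg import expm

    t = 0.7

    # Quantum side: a self-adjoint (Hermitian) Hamiltonian gives a unitary
    # evolution exp(-i t H), which preserves the 2-norm of the state.
    H_q = np.array([[1.0, 0.5], [0.5, -1.0]])
    U = expm(-1j * t * H_q)
    assert np.allclose(U.conj().T @ U, np.eye(2))

    # Stochastic side: an infinitesimal-stochastic generator (columns sum to 0,
    # off-diagonal >= 0) gives exp(t H), which preserves the 1-norm (total
    # probability) and produces plain exponential decays.
    H_s = np.array([[-2.0, 1.0], [2.0, -1.0]])
    T = expm(t * H_s)
    assert np.allclose(T.sum(axis=0), 1.0)
    print(U.round(3), T.round(3), sep="\n\n")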

This is all very good, and I feel we are getting to the “heart” of cipher design. What’s more, by looking at the cryptanalysis angle, it all still reeks of Colossus and 1930s quantum math (which gives me confidence we are dealing with the fundamentals).

It’s all very well saying that there will be balance, etc., but now we are getting a mental model of how a bit-flipping machine (DES) actually accomplishes such balancing acts – and WHY this then creates a practical scheme for permuting “message classes”, all of which produces the required effect upon maximum likelihood calculations, etc.
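One way to check the “balancing act” by hand, in a sketch of my own (using the same 4-bit S-box as earlier; the biases it prints are exactly what a maximum-likelihood style linear attack feeds on):

    # Same 4-bit S-box as earlier (first row of DES S1; an assumption, not DES itself).
    SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
            0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

    def parity(x):
        return bin(x).count("1") & 1

    # Each single output bit is balanced: 8 zeros and 8 ones over all 16 inputs.
    for bit in range(4):
        ones = sum((SBOX[x] >> bit) & 1 for x in range(16))
        print(f"output bit {bit}: {ones} ones out of 16")

    # Linear approximation biases: for input mask a and output mask b, count how
    # often parity(a & x) == parity(b & S(x)).  A count of 8 means zero bias;
    # the deviation from 8 is the bulge a linear (maximum-likelihood) attack
    # accumulates across rounds.
    best = max(((a, b, sum(parity(a & x) == parity(b & SBOX[x]) for x in range(16)))
                for a in range(1, 16) for b in range(1, 16)),
               key=lambda item: abs(item[2] - 8))
    a, b, agree = best
    print(f"largest bulge: a={a} b={b} agreements={agree}, bias={(agree - 8) / 16:+.3f}")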

