My familiarity with Borel sets, the Lebesgue “theory” of integration built on set theory, and the random processes pertinent to cipher making and cryptanalysis is slowly improving, to the point that I can move from the “hard” world of algebra requiring continuous modeling to the same modeling in the discrete world of graphs and groups – which assumes the same “advanced” algebra architecture skill but demands less raw math skill.
A decent, math-centric, but not math-pontificating rendition of the hard stuff (for me) is
unknown PhD thesis.
…continuance, showing a probability measure being “calculated” by an integral
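To fix the idea for myself, here is a minimal numerical sketch of a probability measure being “calculated” by an integral – the mass of an interval under the standard normal density, integrated with the trapezoid rule. The density and interval are my own choices for illustration, not anything from the thesis.

```python
import numpy as np

# Standard normal density: the Radon-Nikodym derivative of the
# measure with respect to Lebesgue measure on the real line.
def pdf(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# P([a, b]) = integral of the density over [a, b],
# approximated here with the trapezoid rule.
def measure(a, b, n=100_000):
    x = np.linspace(a, b, n)
    f = pdf(x)
    dx = (b - a) / (n - 1)
    return np.sum((f[:-1] + f[1:]) / 2) * dx

p = measure(-1.0, 1.0)   # mass within one standard deviation
```

The same shape of calculation – density in, measure of a set out – is what all the Borel-set machinery is guaranteeing makes sense.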
…continuance, getting to transition functions, composition of such functions (DES round function outputs), and elements such as “invariant probability measures” that can generate a semigroup of such transition matrices.
And we recall the same architectural framework (without all the Borel sets, etc) applied to mean-sets (a particular notion of expectation).
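For the mean-set idea, here is a rough sketch of how I picture that notion of expectation working on a graph, with no Borel machinery at all: the mean-set of a sample is the set of vertices minimizing the total squared graph distance to the sample points. The graph and sample below are my own, purely for illustration.

```python
from collections import deque

# Adjacency list for a small undirected graph (a 6-cycle).
graph = {v: [(v - 1) % 6, (v + 1) % 6] for v in range(6)}

def bfs_distances(graph, start):
    """Graph distance from `start` to every vertex, by BFS."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in graph[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def mean_set(graph, sample):
    """Vertices minimizing the sum of squared distances to the sample."""
    weight = {}
    for v in graph:
        d = bfs_distances(graph, v)
        weight[v] = sum(d[s] ** 2 for s in sample)
    m = min(weight.values())
    return {v for v, w in weight.items() if w == m}

sample = [0, 1, 1, 2]   # a multiset of observed vertices
```

In Euclidean space this weight function is minimized at the ordinary mean; on a graph or group the minimizer can be a set, hence “mean-set”.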
We also note the stronger theory presented by our known PhD thesis writer, concerning averaging and probabilities in a generator matrix that … from which one can get a round function’s transition matrix.
We also recall the excellent tutorial on cryptanalysis at http://www.engr.mun.ca/~howard/PAPERS/ldc_tutorial.pdf, which clearly shows “non-theoretical” “difference table calculations”, for both the linear and differential cryptanalytical attack styles.
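Those “difference table calculations” are easy to reproduce in a few lines. A sketch for a 4-bit S-box – I’ve used the PRESENT cipher’s S-box here simply because it is small and well known; the tutorial’s own tables may use a different example:

```python
# PRESENT's 4-bit S-box, used here only as a convenient small example.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def difference_distribution_table(sbox):
    """DDT[a][b] counts inputs x with sbox[x] ^ sbox[x ^ a] == b."""
    n = len(sbox)
    ddt = [[0] * n for _ in range(n)]
    for a in range(n):
        for x in range(n):
            b = sbox[x] ^ sbox[x ^ a]
            ddt[a][b] += 1
    return ddt

ddt = difference_distribution_table(SBOX)
```

A large entry ddt[a][b] for a nonzero input difference a marks a differential an attacker can exploit; a well-designed S-box keeps all such entries small, which is exactly the “balancing act” discussed below.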
(I really feel like I’m reading about modernized Colossus-era bulges and 32-counts here, note!)
I struggled a little with the E and W of the otherwise excellent max likelihood framework paper of Murphy et al:
but I got the feeling that they were alluding, with the choice of rho, pi, E and W, to a background theory set, which MAY be that of
unknown PhD thesis writer
and of course, we don’t forget the wonderful tutorial style of Baez, who got us competent in stochastic reasoning (vs quantum reasoning) generally
We did study the quantum harmonic oscillator late one night, listening to an Oxford professor. I understood very little of it, since I got lost in the quantum math manipulation that he was (also) drilling folks in. He was bored, and so was I.
But, Baez successfully gets us to “think” in terms of exponentials, as formalism for quite linear things involving rates of positive and negative change. He also de-mystifies all the conjugates and adjoints, giving them now quite intuitive appeal. Even the commutator is quite intuitive now – as, merely, the difference in some combinatoric perspectives between adding and subtracting (multiplying or differentiating).
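A tiny computation makes that commutator intuition tangible. Take differentiation D and multiplication-by-x X acting on polynomials: the commutator [D, X] = DX - XD is just the identity (the Leibniz rule in matrix form). In a truncated basis 1, x, …, x⁴ the identity holds everywhere except in the corner the truncation clips off. This is my own illustration, not Baez’s exact example.

```python
import numpy as np

n = 5  # basis 1, x, x^2, x^3, x^4

# X: multiply by x (x^k -> x^{k+1}; x^4 falls off the truncated basis).
X = np.zeros((n, n))
for k in range(n - 1):
    X[k + 1, k] = 1.0

# D: differentiate (x^k -> k * x^{k-1}).
D = np.zeros((n, n))
for k in range(1, n):
    D[k - 1, k] = k

# The commutator [D, X] = DX - XD: the difference between the two
# orders of operation, which here is simply the identity matrix,
# up to the truncation artifact in the last diagonal entry.
C = D @ X - X @ D
```

The -4 in the last diagonal slot is purely the edge of the truncated basis; in the full infinite-dimensional picture [D, X] = 1 exactly.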
And he gets us to S-boxes, even:
We now see the linkup between the Hamiltonian and the stochastic generator (and exponential decays).
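In Baez’s convention, an infinitesimal stochastic operator H has non-negative off-diagonal entries and columns summing to zero, and exp(tH) is then a genuinely stochastic time evolution that decays exponentially to equilibrium. A minimal two-state sketch with numbers of my own choosing:

```python
import numpy as np

# Infinitesimal stochastic generator: off-diagonals >= 0, columns sum to 0.
# State 0 decays to state 1 at rate 1; state 1 returns at rate 2.
H = np.array([
    [-1.0,  2.0],
    [ 1.0, -2.0],
])

def evolve(H, t):
    """Compute exp(tH) via eigendecomposition (H is diagonalizable here)."""
    w, V = np.linalg.eig(H)
    return (V @ np.diag(np.exp(t * w)) @ np.linalg.inv(V)).real

# Columns of exp(tH) are probability distributions for every t >= 0,
# and the rate of decay to equilibrium is set by the nonzero eigenvalue.
U = evolve(H, 0.7)
equilibrium = evolve(H, 50.0) @ np.array([1.0, 0.0])
```

The e^(-3t) decay here is the stochastic counterpart of the oscillator’s e^(-iEt) phases – same exponential formalism, real decay instead of rotation (that gloss is mine, but it is the parallel Baez draws).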
This is all very good, and I feel we are getting to the “heart” of cipher design. What’s more, by looking at the cryptanalysis angle, it all still reeks of Colossus and 1930s quantum math (which gives me confidence we are dealing with the fundamentals).
It’s all very well saying that there will be balance, etc, but now we are getting a mental model of how a bit-flipping machine (DES) actually accomplishes such balancing acts – and WHY this then creates a practical scheme for permuting “message classes”, all of which produces the required effect upon maximum likelihood calculations, etc.