Looking at “Borel sets” as the ultra-complex 1930s way of describing what we might now call state machines, we can nonetheless leverage that complexity to look at DES – itself an inherently complex object of analysis. In some ways, cryptographic algorithms are the last holdout of the first “thought experiments” based on continuous dynamics, pre-quantization.

Let's guess that DES uses keys to generate transition functions – each taking one state in phase space to another. In general, semigroups are the construct that captures the relationship between a generative process and its operators (those particular functors that take some state s to the output state t). While each transition functor is itself an averaging procedure, the semigroup is a generative procedure – it calculates the functors that do the averaging.
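To make the semigroup reading concrete, here is a minimal toy sketch (not real DES – the mixing function `f`, the 8-bit halves, and the subkeys are invented for illustration): each key generates a transition function on the state space, the cipher is the composition of those functions, and composition is the semigroup operation.

```python
from functools import reduce

MASK = 0xFF  # toy state: a pair of 8-bit halves, nothing like real DES sizes

def f(r, k):
    # Arbitrary toy mixing function; a Feistel round is invertible regardless of f.
    return ((r * 7 + k) ^ (r >> 3)) & MASK

def round_fn(k):
    """Each key k generates a transition function on the state space."""
    def step(state):
        left, right = state
        return (right, left ^ f(right, k))
    return step

def inverse_round_fn(k):
    def step(state):
        left, right = state
        return (right ^ f(left, k), left)
    return step

def compose(fns):
    # The semigroup operation: composing transition functions in sequence.
    return lambda s: reduce(lambda st, fn: fn(st), fns, s)

round_keys = [0x3A, 0x91, 0x4C, 0xD7]  # invented subkeys
encrypt = compose([round_fn(k) for k in round_keys])
decrypt = compose([inverse_round_fn(k) for k in reversed(round_keys)])
```

The point of the sketch is only that the "cipher" is not a table but a product of generated operators, and inverting it means composing the inverse operators in reverse order.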

This makes a lot of sense for something like DES, in which tables necessarily cannot capture all the possible functions – so one MUST generate them. Of course, the generating procedure must itself be governed by “dynamics” that ensure the resulting functors maintain bounds, on average, and that a sequence of such functors defines a limiting function that always maintains those bounds.
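A back-of-the-envelope counting sketch (mine, via Stirling's approximation) shows why tabulation is out of the question: merely indexing an arbitrary permutation of 64-bit blocks would take a key of roughly 10^21 bits, whereas DES's 56-bit key can only ever generate a vanishingly small family of the possible functions.

```python
import math

BLOCK_BITS = 64
KEY_BITS = 56

# The number of permutations of the 2^64 block values is (2^64)!.
# Stirling: ln(n!) ~ n*ln(n) - n, which gives us log2 of that count.
n = 2.0 ** BLOCK_BITS
log2_all_permutations = (n * math.log(n) - n) / math.log(2)

print(f"bits to index an arbitrary permutation: ~{log2_all_permutations:.2e}")
print(f"bits DES actually has to select one: {KEY_BITS}")
```

So a 56-bit key picks out at most 2^56 permutations from a space whose size has on the order of 10^21 bits of entropy – generation under governing dynamics, not enumeration, is the only option.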

Now, in DES, the piece of super-secret “cryptographic” know-how that underlies all the above is that the dynamics are expressed in terms of the outputs of pairs of S-boxes, ensured to be “neighbors” by the action of the DES P-permutation. The action of the permutation, together with the S-box values, ensures a Hamiltonian condition on the space of all trajectories through GENERATING space, so that all possible constructions of output-wire relations (to inputs, for PAIRS) are “covered”.
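The neighboring claim can be checked directly against the public FIPS 46-3 tables: routing each S-box's four output bits through the P-permutation, and then through the E-expansion that feeds the next round, shows which S-boxes those wires reach. (The P and E tables below are the standard published ones; the routing function is a sketch of mine.)

```python
# DES P-permutation and E-expansion tables from FIPS 46-3 (1-indexed bit positions).
P = [16, 7, 20, 21, 29, 12, 28, 17, 1, 15, 23, 26, 5, 18, 31, 10,
     2, 8, 24, 14, 32, 27, 3, 9, 19, 13, 30, 6, 22, 11, 4, 25]
E = [32, 1, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 8, 9, 10, 11,
     12, 13, 12, 13, 14, 15, 16, 17, 16, 17, 18, 19, 20, 21,
     20, 21, 22, 23, 24, 25, 24, 25, 26, 27, 28, 29, 28, 29,
     30, 31, 32, 1]

def next_round_sboxes(sbox):
    """S-boxes (1..8) whose next-round inputs depend on the outputs of `sbox`."""
    out_bits = set(range(4 * sbox - 3, 4 * sbox + 1))  # its 4 bits before P
    after_p = {j + 1 for j, src in enumerate(P) if src in out_bits}
    # E output bit k (0-indexed) feeds S-box k//6 + 1 in the next round.
    return {k // 6 + 1 for k, src in enumerate(E) if src in after_p}

for s in range(1, 9):
    print(s, sorted(next_round_sboxes(s)))
```

Running this shows each S-box's four output wires reaching six different S-boxes in the next round – the connectivity-graph condition the paragraph gestures at, stated as a checkable property of P and E.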

DES is the culmination of the theory of SPNs, and thus reflects “high art” in that particular space of cryptographic algorithm design. Not only must the key schedule deliver dependencies between the input and output wires of neighboring S-boxes (enumerating fixed points within the sigma-algebra thus generated); the schedule must also fully enumerate all those dependencies, to ensure that every key bit influences the generative process.
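The every-key-bit claim is also checkable against the public key schedule: the per-round rotation amounts and the PC-2 selection table (both from FIPS 46-3) determine which of the 56 key bits enter each 48-bit round subkey, and tracking them across all 16 rounds shows the union covering the whole key. (The tables are standard; the tracking code is a sketch of mine.)

```python
# DES key-schedule tables from FIPS 46-3: per-round left-rotation amounts for
# the C and D halves, and the PC-2 table selecting 48 of the 56 key bits.
SHIFTS = [1, 1, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 1]
PC2 = [14, 17, 11, 24, 1, 5, 3, 28, 15, 6, 21, 10,
       23, 19, 12, 4, 26, 8, 16, 7, 27, 20, 13, 2,
       41, 52, 31, 37, 47, 55, 30, 40, 51, 45, 33, 48,
       44, 49, 39, 56, 34, 53, 46, 42, 50, 36, 29, 32]

def subkey_bit_labels():
    """For each round, the set of key-bit labels (post-PC-1 positions 1..56)
    that end up in that round's 48-bit subkey."""
    c, d = list(range(1, 29)), list(range(29, 57))
    rounds = []
    for s in SHIFTS:
        c, d = c[s:] + c[:s], d[s:] + d[:s]  # rotate each half left by s
        cd = c + d
        rounds.append({cd[p - 1] for p in PC2})
    return rounds

rounds = subkey_bit_labels()
print(len(set().union(*rounds)))  # all 56 key bits appear in some subkey
```

Each subkey omits 8 of the 56 bits, but the staggered rotations ensure that no bit is omitted from every round – which is exactly the full-enumeration property being claimed.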

At a more detailed level, the generative process enumerates the fixed points that then act as the local distance measure (with the cliques acting as a sharpening/blurring function). Moreover, the particular order in which the key bits are introduced influences the success of the limiting process. That process controls the mutual relationship of the conditional-expectation functors being generated (in sequence, obviously), so as to “continually” minimize the conditional-expectation set and thus maximize the likelihood that a given key stands in a one-to-one relationship with the associated plaintext and ciphertext.

As a side-effect of all this (and this is the beauty of the SPN design), one captures how much data would be required, in plaintext and ciphertext partitions, to drive an iterative algorithm seeking the solution to the maximum-likelihood problem – the aim being to ensure that the work of merely handling those classes is tantamount to the trivial case of brute-forcing all keys (which defines every plaintext partition to be of size 1: the plaintext itself).
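A standard counting sketch (mine, in the style of a unicity-distance argument, modelling a wrong key as a random permutation) makes the brute-force baseline concrete: with t known plaintext/ciphertext pairs, a wrong key survives all t checks with probability about 2^(-64t), so the expected number of surviving wrong keys among the 2^56 - 1 candidates drops below 1 after a single pair.

```python
KEY_BITS = 56
BLOCK_BITS = 64

def expected_spurious_keys(pairs):
    """Expected number of wrong keys still consistent with `pairs` known
    plaintext/ciphertext pairs, modelling each wrong key as a random permutation."""
    return (2 ** KEY_BITS - 1) * 2.0 ** (-BLOCK_BITS * pairs)

print(expected_spurious_keys(1))  # ~0.0039: one pair already isolates the key
```

In other words, the data requirement of the maximum-likelihood search is tiny; what the design aims to guarantee is that the work of exploiting that data is no better than exhausting the keys.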

Another, lesser beauty concerns the nature of the collection of local averaging functions, with their custom designer-measure that reacts dynamically to lessen correlations in the data. The SPN's design – the particular Hamiltonian's own generator CONDITIONS, as expressed in the substitution-permutation connectivity graph – ensures that each local averaging function being generated has a boundary, one that makes it hard for a random event to overcome the boundary's bias AGAINST the occasional genuinely random event having a non-local effect upon the averaging of a COLLECTION (now) of S-box pairs.

(see http://math.ucr.edu/home/baez/networks/networks_5.html)

Now, while all this nicely captures the “strength” of the cryptographic design against chosen- and known-plaintext attacks, it doesn't look at a world in which the attacker (i.e. the NSA itself, with its vendors in US cloud land) is compromising the key, and thus the key schedule, so as to attack that limiting theorem. Assume the NSA has pre-computed partitions of the key space so that there are related keys, identified by indirect attacks on the algorithm's particular computation of the order of fixed-point distancing measures. This gives them a way in to the sequence of round functions, much as middle-out attacks do. But there are advantages in pre-figuring a hidden Markov model that can predict which key bits are related, and which have the best chance of undermining the limiting function in a given unit of time.