## Linking Expander graphs to DES avalanche ideas

Having been away from 1950s cryptanalysis for a few weeks has helped, in the sense that we come to the material somewhat refreshed, able to see connections we had missed. In particular, we look at expander graphs, linking modern terminology to (1) Tunny-era statements and (2) Turing's "On Permutations" manuscript. The former reasons about "determinants" that might disappear (as one considers chains of Bayesian factors in the 'algebra of proportional bulges'), while the latter considers covering groups and quotient subgroups in a context relating such subgroups to a non-abelian parent group. In the first case we are interested in finding weak discriminators in an inner-product world; in the second, in a setting within which one can create coin-flipping operators for strong ciphering.

The so-called isoperimetric constant of a graph (here representing the theory of 1950s cryptorotor-era wheel wiring plans) is best thought of intuitively as conductance, that is, the reciprocal of resistance. (Conductance measures how well flow gets through: it is high when resistance fails to impede connectivity and fails to prevent causal dependency.)
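To make the isoperimetric constant concrete, here is a minimal sketch that brute-forces it for a tiny graph. The 6-cycle used here is purely illustrative (it is not in the source text), and brute force is only feasible for very small vertex sets:

```python
# Brute-force the isoperimetric (Cheeger) constant
# h(G) = min over subsets S with |S| <= |V|/2 of |boundary(S)| / |S|.
from itertools import combinations

def cheeger_constant(vertices, edges):
    """Exhaustively search subsets; feasible only for tiny graphs."""
    n = len(vertices)
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for S in combinations(vertices, k):
            s = set(S)
            # edges with exactly one endpoint in S form the boundary
            boundary = sum(1 for (u, v) in edges if (u in s) != (v in s))
            best = min(best, boundary / len(s))
    return best

cycle6 = list(range(6))
edges6 = [(i, (i + 1) % 6) for i in cycle6]
print(cheeger_constant(cycle6, edges6))  # an arc of 3 vertices cuts 2 edges: 2/3
```

A cycle is a poor expander: cutting it in half severs only two edges, so its constant shrinks as the cycle grows, whereas an expander family keeps this constant bounded away from zero.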

We are not concerned with the physics of electronic resistors heating up as electrons are impeded. Rather, we are concerned with a parallel concept set in the mathematics of uncertainty and improbability. To study this we need to grasp a modicum of engineering math.

Let’s recall the basics. The finite group gives us vertices, edges, generators and group operators, quickly allowing us vector spaces over a field. This all rapidly gives way to the adjacency matrix, a first linear operator that lets walks consistent with the algebra of the group become walks in the world of functions. If the group element is an Enigma-era sub-cycle such as (1 2 3), then the function associated with the sub-cycle is just the set of component mappings (1 → 2, 2 → 3, 3 → 1).
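A sketch of that step from cycle to linear operator: the component mappings of (1 2 3) become a permutation matrix, so that applying the group element is a matrix-vector product and repeated walks are matrix powers. The 0-indexing shift is mine:

```python
# The cycle (1 2 3) as a function, and as a permutation matrix so that
# group walks become linear algebra.
import numpy as np

cycle = {1: 2, 2: 3, 3: 1}           # component mappings of (1 2 3)

# permutation matrix P with P[dst-1, src-1] = 1 (columns indexed by source)
P = np.zeros((3, 3))
for src, dst in cycle.items():
    P[dst - 1, src - 1] = 1.0

e1 = np.array([1.0, 0.0, 0.0])       # indicator function of element 1
print(P @ e1)                        # -> indicator of element 2
print(np.allclose(np.linalg.matrix_power(P, 3), np.eye(3)))  # a 3-cycle: P^3 = I
```

Walking the cycle three times returns every element to itself, which the matrix identity P³ = I expresses in the function world.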

If the homomorphism is onto an L2(complex) space, then a vector made of 3 ordered complex numbers is mapped by the (ordered) components of the cycle, much as electricity was once sent through the wheel wirings of an Enigma machine. Rather than walking a wheel or a wheel cage with electricity, one now walks around nodes (each specified by a complex number) on the unit circle drawn in the Argand plane. The role of electricity is played by inductions.
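A sketch of those nodes on the unit circle, under my reading of the passage: place the three group elements at the cube roots of unity. The cyclic shift then acts on this complex vector by a pure phase rotation, i.e. the roots-of-unity vector is an eigenvector of the permutation matrix:

```python
# Nodes as cube roots of unity on the unit circle in the Argand plane;
# the cyclic shift rotates phases, making this vector an eigenvector.
import numpy as np

omega = np.exp(2j * np.pi / 3)             # primitive cube root of unity
v = np.array([1, omega, omega**2])         # one node per group element

P = np.roll(np.eye(3), -1, axis=0)         # a cyclic shift matrix
# P v = omega * v: walking the wheel multiplies every node by the same phase
print(np.allclose(P @ v, omega * v))
```

This is the character-theory fact hiding under the wiring metaphor: the irreducible representations of a cyclic group are exactly these phase vectors.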

Next, we summarize the differential operator on finite graphs. Just think about what you were taught at 14 when considering the tangent to a line. Now generalize, and see that the underlying idea allows us to measure flows, contrasting the vector space of edge-centric functions with that of vertex-centric functions. There is much more to be said here, pertinent to differential cryptanalysis, but there is so much to get to that we move on so we can focus on 1940s “proportions”!
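The edge-centric versus vertex-centric contrast can be sketched with the incidence matrix: it is the discrete differential, mapping a function on vertices to its differences along edges, and composing it with its transpose yields the graph Laplacian. The 4-cycle here is illustrative only:

```python
# The discrete differential on a finite graph: the incidence matrix B maps
# vertex functions to edge functions, and L = B^T B is the graph Laplacian.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 4-cycle
n = 4
B = np.zeros((len(edges), n))
for k, (u, v) in enumerate(edges):
    B[k, u], B[k, v] = -1.0, 1.0           # (df)(u,v) = f(v) - f(u)

L = B.T @ B                                # the Laplacian, acting on vertices
A = np.zeros((n, n))
for (u, v) in edges:
    A[u, v] = A[v, u] = 1.0

f = np.array([0.0, 1.0, 2.0, 3.0])         # a vertex function
print(B @ f)                               # its "derivative" along each edge
print(np.allclose(L, 2 * np.eye(n) - A))   # L = D - A for this 2-regular graph
```

The quadratic form f·Lf sums the squared edge differences, which is exactly the "flow energy" the prose alludes to.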

Thinking now in terms of walks on the abstract rotor wheel, namely the unit circle, we can consider the special case of that unit circle with 4 quadrants (pizza pieces). If the rightmost point is labeled 1, walk anti-clockwise until one reaches the point labeled i (uppermost). Keep going, and we get to −1 (leftmost), and then to −i (lowest). Obviously one can keep going and get back to 1, in what was a 4-step walk. At any point we have three choices of line to follow: i) proceed to the next quadrant, ii) go back to the previous quadrant, or iii) follow the quadrant-sectoring line (shared by neighboring quadrants) to the origin and beyond.

If our abstract ciphering wheel had 26 points rather than 4, obviously the complex numbers at each point on the unit circle would show a more obvious mix of real and imaginary contributions. Note that a walk need not be sequential; like the DES expander/permutation graph, it might involve 3 generators: back, forward, and across the wheel! Of course, we hinted at this above already.
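A sketch of that 26-point wheel as a Cayley graph, under my own reading: generators +1 (forward), −1 (back), and +13 for "across the wheel" (the +13 is an assumption, not stated in the text). Its eigenvalues then quantify expansion via the spectral gap:

```python
# A 3-regular Cayley graph on Z/26 with assumed generators {+1, -1, +13};
# the gap between the top two adjacency eigenvalues measures expansion.
import numpy as np

n, gens = 26, [1, -1, 13]
A = np.zeros((n, n))
for j in range(n):
    for g in gens:
        A[j, (j + g) % n] += 1.0   # symmetric: -13 = +13 mod 26

eig = np.sort(np.linalg.eigvalsh(A))[::-1]
print(eig[0])                      # the degree, 3, for a regular graph
print(eig[0] - eig[1])             # spectral gap: bigger gap, better expander
```

For a circulant graph like this the eigenvalues are sums of roots of unity over the generator set, which is where the "mix of real and imaginary contributions" of the 26 points reappears.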

Now we must think more like Turing as a teenager, interested in ‘light cones’. I now think of an inner product space as I’m sure he did at that age: as induced by the two vectors that define the geometry of the “ice-cream cone”, whose inner space is full of interesting stuff… that we put in the freezer. Within the bounded space of the now-crystallized ice cream, discrete branching occurs where ice crystals meet in a lattice, and this lets one consider time-line evolutions of (discrete) quantum mechanical systems. In this space, distances between branching events on the same time line are measured p-adically, as are similar points in relatively future or past timelines. We are interested in causal dependency, connectivity and reachability as our notion of “crypto distance”. And for that we need to consider Rayleigh quotients, which bring in norms.
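Since the Rayleigh quotient is about to do the real work, here is a minimal sketch of it for the graph Laplacian of a 3-vertex path (my example): the quotient of the Laplacian energy by the squared norm, which is zero on constants and recovers the eigenvalue on an eigenvector:

```python
# The Rayleigh quotient R(L, f) = <f, Lf> / <f, f>, which brings the norm
# into the picture and bounds a function's "energy" per unit of mass.
import numpy as np

def rayleigh(L, f):
    return (f @ L @ f) / (f @ f)

# Laplacian of the 3-vertex path 0 - 1 - 2
L = np.array([[1.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 1.0]])

print(rayleigh(L, np.array([1.0, 1.0, 1.0])))   # constant function: quotient 0
print(rayleigh(L, np.array([1.0, 0.0, -1.0])))  # an eigenvector: quotient is its eigenvalue, 1
```

Minimizing this quotient over functions orthogonal to the constants is exactly how the spectral gap, and hence conductance, is computed.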

In particular, we will focus on the ice cream cone whose tip is at the unit circle’s origin and one of whose vectors is the rightward line from the origin to the point 1. The other line that defines the ice cream cone’s space lies at a small angle above the first. Now, what is interesting is the relationship of that angle to the power-limited coding regime, to Colossus decibannage, and to products of eigenvalues raised to certain powers (thinking in terms of Enigma wheel-breaking and bigram-breaking cryptanalytic theory, and the theory of accurate convergence in Tunny rectangling based on products of “factors”).

The notion of a standard basis comes up next, getting us to the idea that functions can be expressed in polynomial form.

Next we bring in the eigenvalues of regular graphs, which allows us to focus on a special case of the polynomial world of calculating: that in which one basis function, L0 say, can be expressed in terms of all the other terms. For our purposes, which are not mathematical, we simply note that it is worth focusing on the very narrow ice cream cone discussed above, as it is a compressed world, the L0 space, that captures any other calculation.
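A sketch of the structural fact this leans on, on an illustrative 8-cycle: for a regular graph, the constant function is the top eigenvector, and everything else lives in its orthogonal complement, the mean-zero functions the text calls the L0 space:

```python
# For a k-regular graph the all-ones function is the top eigenvector;
# every other eigenvector sums to zero (lies in the "L0" space).
import numpy as np

n = 8                                    # an 8-cycle, 2-regular
A = np.zeros((n, n))
for j in range(n):
    A[j, (j + 1) % n] = A[j, (j - 1) % n] = 1.0

w, V = np.linalg.eigh(A)
top = V[:, np.argmax(w)]                 # eigenvector of the top eigenvalue (= 2)
print(np.allclose(np.abs(top), np.abs(top[0])))   # constant, up to sign

others = np.delete(V, np.argmax(w), axis=1)
print(np.allclose(others.sum(axis=0), 0))         # all mean-zero
```

Restricting attention to this mean-zero complement is what makes the "narrow cone" a compressed but complete world: the constant direction carries no discriminating information.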

So we now get to our point, which is to realize that in DES and the avalanche concept we are really dealing with the angle of the ice cream cone above. For when the angle is less than 2 cos θ, we are working in the power-limited coding world (see Forney), which is of course the crypto world. The special cases of the theory then kick in, so that there is causal dependency throughout the branching space from a first point to all other future branching points, and the math ensures that one has minimum conductance as the time horizon moves out. This ensures that each future event is dependent, which is what the avalanche process is all about. By limiting the initial angle of the wedge of the ice cream cone to the decibannage of a single bit, one gets the DES world, in which a change of 1 bit induces certain future changes, on average.
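The avalanche effect itself is easy to witness. This sketch uses SHA-256 as a stand-in for a DES round pipeline (a deliberate substitution; the text is about DES): flip one input bit and roughly half of the output bits change:

```python
# The avalanche criterion in action: a 1-bit input change should flip
# about half of the output bits of a well-mixed primitive.
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Hamming distance between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

msg = b"attack at dawn"
flipped = bytes([msg[0] ^ 0x01]) + msg[1:]      # flip a single input bit

d1 = hashlib.sha256(msg).digest()
d2 = hashlib.sha256(flipped).digest()
print(bit_diff(msg, flipped))                   # 1 bit changed going in
print(bit_diff(d1, d2), "of", len(d1) * 8)      # roughly half of 256 bits out
```

The "on average" in the prose matters: the expected output difference is 128 bits, with fluctuations of a few bits either way per input pair.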

We finally get to Turing’s model, in which a given wheel’s upright is raised to a power and put into a product of n other wheels treated similarly, all using the same alphabet (and cycles). Now we are in a yet more special case, in which the output value of the function is the same as the power (the number of walk steps). Now it makes sense that Turing requires the powers to sum to zero, as that is the same thing as saying that the values of f(·) sum to zero, which gives us that special eigenspace of the ice cream cone known as L0 (squared).

What was finally interesting was to see the relationship between a sequence of Tunny-era factors, represented in decibans, and products of eigenvalues of the adjacency/Laplacian linear operators (to some power), as products of wheels are considered. One immediately thinks of the single-bit case, in which the product gives a very high-precision discriminator of a particular walk through branch space.
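The bookkeeping behind "products of factors" can be sketched in a few lines (the factor values below are hypothetical): Bayesian factors multiply, so their deciban scores, ten times the base-10 logarithm, simply add, which is why a chain of wheel-by-wheel evidence accumulates into a single discriminator:

```python
# Tunny-era scoring: Bayesian factors multiply, decibans add.
import math

factors = [1.25, 1.25, 0.8, 1.6]                  # hypothetical evidence factors
decibans = [10 * math.log10(f) for f in factors]

product_score = 10 * math.log10(math.prod(factors))
print(product_score)            # score of the combined evidence
print(sum(decibans))            # same number, built additively
```

The same log trick is what relates a product of eigenvalues raised to powers to a sum of per-step contributions along a walk.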

Now it makes lots of sense that the DES Hamiltonians, creating a 3-degree expander family in the timeline of the ice cream cone’s inner product space (whose unit measure is that of a bit), include multiplicities, since this is what causes the cone’s space to be only ever self-expanding, fashioning a reasoning space for per-bit plaintext/keying dependencies that can nonetheless be reduced to floating-point numbers at high accuracy.

It also makes clear how distance from the constant function plays well with the relationship between the Laplacian of a function and the function itself, in fashioning the desired inner product space in the first place. Once suitably constrained, that space allows the differentials in edge space, when compared to weights associated with the output space, to be converted into Bayesian factors that either facilitate ciphering, the one-way function on plaintext, or facilitate cryptanalysis.