contrasting the X method of cryptanalysis with differential cryptanalysis


In Turing’s On Permutations manuscript, circa 1954, he makes an argument about sequences of continuous functions (in continuous spaces). This enables him to reach a conclusion about uniform “limiting” distributions. We now have a reasonable understanding of this theory – a theory that, we might recall, was developed by Newman, sitting in the Cambridge chair of wheel wiring theory.

Any science faculty undergraduate doing a fundamentals-of-crypto course learns about the need to resist X cryptanalysis (where X is really “linear”). Unlike differential-cryptanalysis attack methods, linear-X is countered using the method of expander graphs.
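To make that resistance requirement concrete, here is a minimal sketch of my own (nothing from the rotor era) of the quantity linear cryptanalysis actually exploits: the bias of the best linear approximation to a toy 4-bit S-box. The S-box values are purely illustrative.

```python
# Minimal sketch: worst-case linear-approximation bias of a toy 4-bit S-box.
# The S-box values are illustrative; any 4-bit permutation can be dropped in.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def parity(x: int) -> int:
    """XOR of the bits of x."""
    return bin(x).count("1") & 1

def max_linear_bias(sbox):
    """Largest |#matches/16 - 1/2| over all non-trivial input/output masks.

    A designer wants this worst-case bias as small as possible, so that no
    linear approximation of the S-box is usefully better than a coin flip."""
    n = len(sbox)
    worst = 0.0
    for a in range(1, n):            # input mask
        for b in range(1, n):        # output mask
            matches = sum(parity(a & x) == parity(b & sbox[x]) for x in range(n))
            worst = max(worst, abs(matches / n - 0.5))
    return worst

print("worst linear bias:", max_linear_bias(SBOX))
```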

Back in Turing’s day, graphs were not necessarily abstract mathematical objects. You could also have in front of you two rather physical and very real Enigma wheels and be required to decide: so how do I wire them up, possibly as a pair, to resist linear-X?

The theory of graphs, state machines, Turing machine walks through configuration spaces, evolution of quantum states, unitary representations/gates, and group theory generally, as taught to Turing mostly in the US (hint), all come into play.

Now, wheel wiring theory from the rotor era has to be one of the most closely guarded crypto secrets of all – at least up until 2000. At most, you saw folks discussing the flaws of the Enigma wiring plan; perhaps heard how the Russian M-125 machine addressed them to deny folks the Turing bombe attack, using the properties of isomorphs and alphabets split into two groups of 13 characters interspersed across a set of 41 pins/pads; otherwise discussed topics such as bipartite sets, symmetric groups, and how permutation groups *represent* other graphs/groups; and then considered more advanced topics such as Rayleigh quotients.

All in all, one had to get familiar with university-level math in norms and inner products, and then with arguments about averaging and closure! As I keep hinting, rather than study it all in the math department, go study it today by learning the applied math needed for quantum mechanics!

I just think of computer science as that branch of computable math that eventually drives what a compiler does – spit out lower-level instructions. In the world described above, this means we need “compiler-math” theory – whose concrete methods take in high-level language input (ideally in the notation of polynomials) and spit out long sequences of primitive adding, subtracting, and square-rooting calculations. I think of wheel wiring design, circa 1940, as requiring the use of early “compiler theory” – which of course is the ability of math to act on math!
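To show what I mean by a “compiler” that turns polynomial notation into primitive calculations, here is a toy sketch of my own (the instruction names are invented for illustration): Horner’s rule flattening a polynomial into a plain sequence of multiply and add steps.

```python
# Toy sketch: "compile" a polynomial into a flat sequence of primitive ops
# via Horner's rule. The instruction names (LOAD/MUL/ADD) are invented here.
def compile_poly(coeffs):
    """coeffs = [a_n, ..., a_1, a_0] for a_n*x^n + ... + a_1*x + a_0."""
    program = [("LOAD", coeffs[0])]          # acc := a_n
    for c in coeffs[1:]:
        program.append(("MUL", "x"))         # acc := acc * x
        program.append(("ADD", c))           # acc := acc + next coefficient
    return program

def run(program, x):
    """Interpret the primitive instruction sequence."""
    acc = 0
    for op, arg in program:
        if op == "LOAD":
            acc = arg
        elif op == "MUL":
            acc *= x
        elif op == "ADD":
            acc += arg
    return acc

prog = compile_poly([2, 0, -3, 1])           # 2x^3 - 3x + 1
print(prog)
print(run(prog, 5))                          # 2*125 - 15 + 1 = 236
```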

I now look at expander graph theory, and Wikipedia’s article is as awful as the best of them at making it all almost as intractable to the lay reader as the theory of wheel wiring. But it is that theory. And that’s all it is. In one good turn of phrase, the author writes:

The original motivation for expanders is to build economical robust networks (phone or computer): an expander with bounded valence is precisely an asymptotic robust graph with number of edges growing linearly with size (number of vertices), for all subsets.

http://en.wikipedia.org/wiki/Expander_graph
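For a hands-on feel of that sentence, here is a sketch (my own toy, using networkx and numpy; the graph is random, not any historical wiring) that builds a 3-regular graph on 26 vertices, one per alphabet letter, and reports its spectral gap, the number that makes “robust yet economical” precise.

```python
# Minimal sketch: spectral gap of a random 3-regular graph on 26 vertices.
# Degree, size and seed are arbitrary illustrative choices.
import networkx as nx
import numpy as np

d, n = 3, 26
G = nx.random_regular_graph(d, n, seed=1)

A = nx.to_numpy_array(G)                     # adjacency matrix
eigs = np.sort(np.linalg.eigvalsh(A))[::-1]  # eigenvalues, largest first

# For a d-regular graph the top eigenvalue is exactly d; the gap d - lambda_2
# measures how well the graph expands (a bigger gap means a better expander).
print("lambda_1 =", round(eigs[0], 3),
      " lambda_2 =", round(eigs[1], 3),
      " spectral gap =", round(d - eigs[1], 3))
```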

What we want, that is, is for the properties of (plant) growth to become uniform as the evolution process approaches the steady state of the limiting distribution – no matter which particular cells (of the plant tissue) we happen to be studying. Being less able, math types talk about vertices and subsets (rather than tissue types and the cells making up any region of tissue).

All we are doing is deciding how to wire up groups of pins. To this end, the theory helps us out by showing that certain wirings imply certain mathematical facts. Unfortunately, these require you first to re-wire your brain and think more like a computer scientist – happy to let calculations and numbering live not on paper but on the surface of a football – ALWAYS.

And that was Turing’s argument about continuity and uniformity. Footballs! For on the surface of the ball, when using a wiring-compiler to output long sequences of primitive operations that assume calculation happens only on its surface, certain theorems of meta-math come into play. And these just happen to address linear-X. It’s also our contention that they (shush) also address differential cryptanalysis.

One learns eventually that one wants the points of the crypto space – now that they are projected onto the surface of a football – to be as far away from constant functions as possible, meaning that it’s hard to build an approximating model (one that would undermine the security). Expander graph theory helps one understand that rather-blah undergrad examination testing point. Football markings, and their “straight line markings”, give anyone a solid intuition for the distinction between constant functions … and points within them.

Certain results concerning what happens when one represents wiring plans in the form of adjacency matrices enable one then to consider topics such as irreps. This just means taking a larger matrix and understanding how it may be composed of smaller matrices that allow matrix calculation to be a surrogate for that more abstract branch of math: polynomials. But don’t get alarmed, we don’t need to go beyond deciding how to wire up our Enigma wheels.
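A minimal sketch of the starting point, with a wiring I made up on the spot (no real rotor is intended): write the pin-to-pad wiring as a permutation, form its adjacency (permutation) matrix, and note that matrix powers then stand in for composing the wiring with itself, which is the sense in which matrix calculation becomes a surrogate for polynomials.

```python
# Minimal sketch: a wheel wiring as a permutation matrix.
# The wiring below is made up for illustration; real rotor wirings differ.
import numpy as np

wiring = [4, 9, 1, 7, 2, 6, 3, 8, 0, 5]      # pin i is wired to pad wiring[i]

n = len(wiring)
P = np.zeros((n, n), dtype=int)
for i, j in enumerate(wiring):
    P[j, i] = 1                              # column i carries pin i to pad j

# Passing a signal through the wheel twice is the matrix product P @ P;
# k passes is the k-th matrix power, so polynomials in P capture everything
# that composing the wiring with itself can do.
P2 = np.linalg.matrix_power(P, 2)
print("pin 0 after two passes:", int(np.argmax(P2[:, 0])))   # wiring[wiring[0]] = 2
```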

Certain adjacencies allow a certain kind of analysis, since they are models for calculation on the Riemann sphere – there, I’ve said it, and started to sound intellectual. In short, one looks at two groups of wirings on the wheel, or between wheels, and one counts up the groups. If the ratio is just so, then all the continuity/uniformity theory comes into play.

I think of this ratio as the Rayleigh quotient, which, if positive, always allows one to build computerized or manual compilers that output long sequences of … terms … that calculate on the surface of a ball. The notion of continuity and closure makes sense then, as the ball has no coordinate system … and any calculation is relative to any other!
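Here is the ratio spelled out as a sketch (the matrix is just the random regular graph from earlier, standing in for whatever adjacency matrix your wiring plan produced): the Rayleigh quotient of A at a vector x, and a crude search for its maximum over vectors orthogonal to the constant vector, which by Courant–Fischer is exactly the second eigenvalue the expander story cares about.

```python
# Minimal sketch: the Rayleigh quotient x.A.x / x.x for an adjacency matrix A,
# maximized over vectors orthogonal to the all-ones (constant) vector.
# The graph is the same illustrative random 3-regular graph as before.
import networkx as nx
import numpy as np

A = nx.to_numpy_array(nx.random_regular_graph(3, 26, seed=1))
n = A.shape[0]

def rayleigh(A, x):
    return float(x @ A @ x) / float(x @ x)

rng = np.random.default_rng(0)
best = -np.inf
for _ in range(20000):                       # crude random search, purely illustrative
    x = rng.standard_normal(n)
    x -= x.mean()                            # project out the constant direction
    best = max(best, rayleigh(A, x))

lam2 = np.sort(np.linalg.eigvalsh(A))[-2]    # the exact answer, for comparison
print("random-search maximum:", round(best, 3), " true lambda_2:", round(lam2, 3))
```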

The final point to get across is how one gets from arguments about continuity to uniformity. And that’s hard to say without getting fancy. All I can really say, simply, is that we are crypto-averaging! And what else is averaging but … averaging … which means that things eventually happen in a uniform manner – as all the differences add up and spread out.
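The simplest way I can show the “differences spread out” point is a sketch: start all the probability on one vertex of the same illustrative graph, repeatedly average each vertex’s share with its neighbours (keeping half in place so the process always settles), and watch the distance from the uniform distribution shrink.

```python
# Minimal sketch: repeated neighbour-averaging drives any starting distribution
# toward uniform. The graph is the same illustrative random 3-regular graph.
import networkx as nx
import numpy as np

G = nx.random_regular_graph(3, 26, seed=1)
A = nx.to_numpy_array(G)
W = 0.5 * (np.eye(26) + A / 3.0)             # lazy averaging: keep half in place

p = np.zeros(26)
p[0] = 1.0                                   # start concentrated on one vertex
uniform = np.full(26, 1 / 26)

for step in range(1, 13):
    p = W @ p                                # one averaging step
    if step % 3 == 0:
        dist = 0.5 * np.abs(p - uniform).sum()
        print(f"step {step:2d}: distance from uniform = {dist:.4f}")
```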

P.S. The fun part of X is that it is NOT that different in theoretical basis from differential cryptanalysis (though not if you listen to the American or Israeli schools). From the attack on Tunny onwards, with Colossus helping out or not, one needs to compute a certain form of the Rayleigh quotient, where one minimizes a set of maximized length differences induced by an electron wandering along one or more of the wires between the pins on an Enigma wheel. If one studies this, which leads on to Viterbi decoders, one sees that X and differential cryptanalysis are really the same thing. It is interesting to see the lengths to which the UK went to a) hide the topic of differential cryptanalysis, and b) hide how, from the rotor era onwards, cipher and coding design revolved around concocting those wiring plans that expressed well-known symmetric and dihedral groups, fashioned expander graphs, maximized distance ON AVERAGE, and all in all made the observer see only a uniform set of probabilities that did little to aid their guessing!

P.P.S. Like Turing, I never said the word eigenvalue. One doesn’t need to. Seeing the theory in linear algebra terms does actually help, however, with the more intricate study of differential geometry and, thereby, differential cryptanalysis.
