Turing bombe, Newman Colossus, Delilah cipher


The more I study what used to be really advanced mathematics (in 1930), the more I seem able to get inside the heads of the core UK WWII-era cryptographers. The wonders of video these days allow good teachers – as opposed to posturing tenure-wannabes – to show the likes of me the meaning of the now not-quite-so-advanced models. In subjects such as measure theory and graph theory, they take us back to how the original thinkers thought about things as they built up formalisms for quite straightforward intuitions – before the second wave of professional bombers hit and made formalism a means of excluding most of us.

And so it is that I now look at the Turing bombe as a mix of two traditions: a tuned, elimination-focused search for exceptional groups (and thus identification of the desired non-exceptional groups) based on ruling out whole classes of conjugates at a time; and projection theory – in which the menu of fixed points constrains the validity of the space of such algebras, so that the testing could be parallelized.
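For concreteness, here is a minimal Python sketch of that elimination principle on an invented 6-letter machine with one rotor. The wirings, the menu loop, and every name below are my own illustrative assumptions, not the historical bombe: the point is only that one contradiction refutes a whole class of hypotheses at once, and a surviving fixed point of the menu loop marks a candidate setting.

```python
# Toy sketch of bombe-style class-at-a-time elimination on an invented
# 6-letter "Enigma". All wirings and the menu loop are illustrative only.

ALPHABET = "ABCDEF"
N = len(ALPHABET)

ROTOR = [2, 4, 1, 5, 0, 3]        # arbitrary rotor wiring (a permutation)
REFLECTOR = [1, 0, 3, 2, 5, 4]    # pairwise swaps, so each scrambler is an involution

def scrambler(offset, c):
    """One Enigma-style scrambler: rotor at `offset`, reflector, rotor inverse."""
    x = (ALPHABET.index(c) + offset) % N
    x = REFLECTOR[ROTOR[x]]
    x = ROTOR.index(x)
    return ALPHABET[(x - offset) % N]

# A menu loop: crib positions whose edges form a closed cycle in the crib
# graph. The true offset must admit a stecker hypothesis that the composed
# scramblers map back to itself - a fixed point of the loop.
LOOP_STEPS = [0, 1, 2]

def offset_survives(offset):
    for guess in ALPHABET:            # hypothesised steckered entry letter
        x = guess
        for step in LOOP_STEPS:
            x = scrambler((offset + step) % N, x)
        if x == guess:                # loop closes: hypothesis consistent
            return True
    return False                      # every hypothesis in the class refuted

print("surviving offsets:", [o for o in range(N) if offset_survives(o)])
```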

Similarly, I now look at Colossus as a pure cryptographic computer, built around the very notion of measure theory. One sees the parallels: a pure event space (the space of all correlations), a sigma-algebra (the Colossus panel that specifies one or more (compound) Boolean expressions within an algebra), and a measurable function (the log-likelihoods, scored in decibans, for the sampling function). Colossus was not necessarily calculating with a measure that led to a pure probability space, but was calculating more generally to effect any desired measure space – one that COULD work as a custom calculus scheme (such as with Lebesgue measures), performing convolution. One sees the weight classes used in Colossus algorithms to measure those special non-linear decoders (i.e. cryptanalysis) that require the sink states of a lower weight class of source (input) bytes to land in the next weight class (a super-secret, even as late as 1998).
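To make that measure-theoretic reading concrete, here is a hedged Python sketch: a Boolean condition over a stream plays the measurable set, the empirical count plays the measure, and the score is a summed log-likelihood ratio in decibans. The bias value, stream lengths, and stream contents are my illustrative assumptions, not historical Colossus parameters.

```python
# Hedged sketch of deciban scoring in the Colossus style.
import math, random

random.seed(2)
BIAS = 0.55     # assumed P(condition holds | correct wheel setting)

def deciban_score(bits):
    """Summed log-likelihood ratio, in decibans, for 'condition == 0'."""
    hits = sum(1 for b in bits if b == 0)
    misses = len(bits) - hits
    return 10 * (hits * math.log10(BIAS / 0.5)
                 + misses * math.log10((1 - BIAS) / 0.5))

# Simulated tape runs: a correct setting skews the condition toward 0,
# a wrong one leaves it a fair coin.
correct = [0 if random.random() < BIAS else 1 for _ in range(5000)]
wrong = [random.getrandbits(1) for _ in range(5000)]

print("correct setting:", round(deciban_score(correct), 1), "db")
print("wrong setting:  ", round(deciban_score(wrong), 1), "db")
```

Run it and the correct setting piles up a large positive deciban total while the wrong one hovers near (or below) zero – which is exactly why a counting machine with a scoring threshold suffices.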

We can see from this history of ideas that Turing was perfectly capable of bridging the world of the bombe with the world of Colossus – each a type of calculating apparatus. While the first worked in group theory and local neighborhood-based averaging, aiming to leverage expectation measures as the basis for non-linear function estimation, the second allowed Markov chains and linear functionals (i.e. L1 spaces) over probability density functions to get to chaos-based mixers (and non-linear analyzers). From this one reached the world of cipher making (vs. cipher breaking), enabling complexity arguments that could then use conditional expectations as a basis. This would allow “stepwise” partitions of measures (e.g. weight classes) over the plaintext to work with the graph theory underlying a finite state machine acting as a “re-measuring” Markov chain. This would “refine” distance and so concentrate expectation such that even a little noise (presaging RSA) could then control the stochastic evolution, giving a cipher with now-provable security properties. The latter is important in the sense that an entirely deterministic DES-like machine can use its keys as a noise source that verily induces that complexity against probabilistic attacks.
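A minimal sketch of the “re-measuring” picture as I read it, with an invented chain: the states are Hamming-weight classes of a 4-bit word, the transition matrix is a lazy random bit-flip walk, and repeated steps contract the total-variation distance of any starting plaintext measure toward the stationary (binomial) measure. The chain and word size are my illustrative choices, not any historical design.

```python
# Hedged toy: a Markov chain over weight classes that "re-measures" a
# plaintext distribution, concentrating it toward the stationary measure.
from math import comb
import numpy as np

BITS = 4
W = BITS + 1                       # weight classes 0..BITS

# With prob 1/2 stay put, else flip one uniformly random bit: from weight w
# go down with prob w/BITS, up with prob (BITS - w)/BITS.
P = np.zeros((W, W))
for w in range(W):
    P[w, w] += 0.5
    if w > 0:
        P[w, w - 1] += 0.5 * w / BITS
    if w < BITS:
        P[w, w + 1] += 0.5 * (BITS - w) / BITS

pi = np.array([comb(BITS, w) for w in range(W)]) / 2 ** BITS  # stationary

mu = np.array([1.0, 0, 0, 0, 0])   # plaintext measure: all mass on weight 0
for step in range(10):
    tv = 0.5 * np.abs(mu - pi).sum()
    print(f"step {step:2d}  TV distance to stationary = {tv:.4f}")
    mu = mu @ P
```

The printed distances shrink geometrically – the “refining” of distance above – and the flip source stands in for the key-as-noise driving the stochastic evolution.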

It would be great if GCHQ research could go and find more of Turing’s OWN Delilah design documentation, whose style (rather than specific content) would, upon analysis, showcase how his ideas were evolving – both in general and when solving a particular engineering voice-privacy problem.
