## Vanishing Terms in Cryptanalysis

We finally understand some of the assumed background used in the Tunny cryptanalysis, circa 1944.

Chapter 1 – Introduction

Of course, we recall how Turing, in his On Permutations manuscript, went to some trouble to model parity. Recall our belief that his underlying mental model was an attempt to apply the quantum-mechanical framework: he used group theory to model faltungs, group theory to model a Hamiltonian, and group theory to create a coset code representing the algebra constrained by the special orthogonal group (of symmetries).

And we remember the Tunny authors being concerned about when matrix elements would vanish as they calculated their correlation bulges using faltungs/convolutions. Well, now we have the background!

When Tunny folk said:

http://www.ellsbury.com/tunny/tunny-130.htm

We should read this as saying that the “analytic” Colossus run applies a “selection rule” (with a dot-centric view of similarity and parity-ness). The delta is, in some sense, the entropy of the run, which of course is related to the bulge due to the correlations between the two bases (Chi1 and Chi2).

The original statement is of course rather clearer: “Another way of stating the result is that the hypothesis ΔΧ12 = dot is Θij pips up, where a pip is 10log10ζ decibans.”
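To make the deciban/pip arithmetic concrete, here is a minimal sketch (my own illustration, not the report’s method): evidence for the “dot” hypothesis is ten times the base-10 log of a likelihood ratio, and a sample whose dot-proportion is p carries a small per-letter bulge of evidence.

```python
import math

def decibans(p: float) -> float:
    """Evidence for 'dot' from one observation, in decibans:
    10 * log10 of the likelihood ratio P(obs | dot) / P(obs | cross)."""
    return 10 * math.log10(p / (1 - p))

# A sample with dot-proportion p has "bulge" 2p - 1; e.g. p = 0.55
# yields roughly 0.87 decibans of evidence per observation.
print(round(decibans(0.55), 2))  # → 0.87
```

The point is only that decibans, being logarithms, let evidence from independent observations be accumulated by simple addition.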

So, having represented the change-of-basis operator as a matrix (of probabilities), they then project the values onto a custom inner-product space, whose transformation of the coordinates of a wheel space into the coordinates of this inner space induces an averaging process.

http://www.ellsbury.com/tunny/tunny-130.htm

One sees how the operator/averaging distills the excess of evidence from the sample into a “summary sign” – which updates the current accumulator (of pips) recording the cryptanalyst’s belief about whether the wheel bit’s sign is truly a dot or a cross. One has to think of pips as having units of h-bar (to be figured NUMERICALLY, later).
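A hypothetical sketch of that accumulator idea (the names and numbers here are mine, purely illustrative): each run contributes pips to a per-bit score, and the sign of the running total is the current belief about the bit.

```python
# Hypothetical pip accumulator per wheel bit; the sign of the running
# total is the "summary sign" (dot = '.', cross = 'x').
def summarize(scores):
    """Map accumulated pip totals to a summary sign per wheel bit."""
    return ['.' if s > 0 else 'x' for s in scores]

accumulator = [0.0] * 5               # e.g. five wheel bits
evidence_runs = [                     # pips contributed by each run
    [2, -1, 3, 0.5, -4],
    [1, -2, 1, 1, -1],
]
for run in evidence_runs:
    accumulator = [a + e for a, e in zip(accumulator, run)]

print(summarize(accumulator))         # → ['.', 'x', '.', '.', 'x']
```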

Essentially, folks have symmetrized things (“and this is symmetrical with respect to the two wheels.”) using an actual numerical method involving coordinate transformation and convergence – much as we saw Susskind do, earlier, in theory.

While my math is still not good enough to really understand the formulae used in “accurate convergence”, I do see how proportions (in a custom inner space, based on non-linear relations) are involved:

“The precise interpretation of the pippages of the characters of a wheel, as the result of a crude convergence, is that they are proportional to the decibanages assuming the pattern of the other wheel to be certain. In practice the relationship between the pippages and the true decibanage (assuming the patterns to be substantially correct) is not linear (see R3 p 132).”

Folks assume that one vector (of wheel bits) is certain (i.e. an eigenvector associated with the eigenvalue), whereas the other vector (of the other wheel’s bits) is a projection of the first, with some proportion indicating how the two are linearly dependent. Here we see, of course, classic quantum-mechanical thinking, based on an advanced understanding of custom Hilbert spaces and symmetry operators.
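Here is a toy sketch of that alternating idea: treat one wheel’s pattern as certain, score the other wheel’s bits by summing their evidence and taking the sign, then swap roles and iterate. The setup (two ±1 patterns, observations that are their products, one corrupted entry) is entirely my own construction, not the report’s formulae.

```python
# Toy "crude convergence": observations are (one slightly corrupted)
# products of two +/-1 wheel patterns; alternately assume one pattern
# certain and re-score the other from its summed pippage.
x1 = [1, -1, 1, 1]             # hypothetical wheel-1 pattern (+1 = dot)
x2 = [1, 1, -1, 1, -1]         # hypothetical wheel-2 pattern
obs = [[a * b for b in x2] for a in x1]
obs[0][0] = -obs[0][0]         # one corrupted observation

sign = lambda v: 1 if v >= 0 else -1
guess1, guess2 = [1] * 4, [1] * 5    # start knowing nothing
for _ in range(3):
    # Score wheel 1 assuming wheel 2's pattern is certain...
    guess1 = [sign(sum(obs[i][j] * guess2[j] for j in range(5)))
              for i in range(4)]
    # ...then score wheel 2 assuming wheel 1 is certain.
    guess2 = [sign(sum(obs[i][j] * guess1[i] for i in range(4)))
              for j in range(5)]

print(guess1 == x1 and guess2 == x2)  # → True: both patterns recovered
```

Despite the corrupted entry, the alternation settles onto both true patterns within a couple of sweeps; in general such schemes recover the pair only up to a simultaneous sign flip of both wheels.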

We do understand a little about the accurate convergence “form” from the simpler example given for a special case:

http://www.ellsbury.com/tunny/tunny-132.htm

Here we get the feeling that f(k,1) for the general case is related to f(1,1) for the special case; perhaps suggesting – given the way that the special case is special (relating 1 bit to 1 bit) – that the k all along referred to a function relating k bits to 1 bit in the general case. It makes sense, then, that the pips/decibanage from k sources would go as delta to the k, and from k+1 sources as delta to the (k+1)…
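One way to make that delta-to-the-k guess concrete (my illustration, with an arbitrary factor): independent likelihood ratios multiply, so k sources each contributing a factor delta combine to delta**k, and their decibanages, being logarithms, simply add.

```python
import math

delta = 1.5                    # hypothetical per-source likelihood factor
k = 4
combined = delta ** k          # k independent sources multiply...
db_each = 10 * math.log10(delta)
db_total = 10 * math.log10(combined)
# ...so their decibanages add: db_total equals k * db_each.
print(abs(db_total - k * db_each) < 1e-9)   # → True
```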

One starts to see the “wave function” (of cryptanalysis) show itself:

http://www.ellsbury.com/tunny/tunny-144.htm

And clearly one sees a general art of cryptanalysis (based on quantum-correlation ideas) that is not specific to Tunny.

One sees how, building out from wheel bits, folks spent enormous energies on the correlation functions of lots of Tunny impulses. But, generally, one sees the classical convolution (in an assumed 32-dimensional space, for 2**5):

http://www.ellsbury.com/tunny/tunny-074.htm
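A sketch of such a convolution over the 32-letter teleprinter alphabet (my own toy, assuming letters add bitwise mod 2, so the convolution index is an XOR): a distribution slightly biased toward the all-dots letter keeps a residual bulge there after convolution with itself.

```python
# Faltung/convolution of two probability distributions over the
# 32-letter alphabet; letters combine by bitwise addition mod 2,
# so r[a ^ b] += p[a] * q[b].
def faltung(p, q):
    r = [0.0] * 32
    for a in range(32):
        for b in range(32):
            r[a ^ b] += p[a] * q[b]
    return r

biased = [2 / 32 if x == 0 else (30 / 32) / 31 for x in range(32)]
out = faltung(biased, biased)
print(out[0] > 1 / 32)    # → True: the bulge at 0 survives convolution
```

This is the sense in which bulges shrink but persist under convolution: the output stays a probability distribution, and a correlation in the inputs shows up as a (smaller) excess over uniform in the output.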

And one sees how it then applies to conditional statements, too.

So, in summary, we see some background (which also relates to how, if I were GCHQ or NSA distributing DES and wanting a backdoor, I would want one that manifests only once one knows how to introduce “just the right perturbation” facilitating a probabilistic detector for key bits, etc.).
