Tunny-era Rectangling and modern cryptanalysis through the eyes of modern physics


Let’s look at the quite illuminating crypto-analytical argument Susskind makes when talking about false vacua in physics, and compare it to what we imagine Turing’s and Newman’s conception of the convergence and rectangling process used for Tunny wheel breaking to have been. Let’s assume that both of them thought in the same formal terms as the physics Susskind discusses when formulating their cryptanalytical methods. Let’s also assume that this thinking was kept secret even from the senior cryptographers doing daily codebreaking work at GC&CS, who “didn’t need to know” the basis of the methods they were using.

Of course, Turing in the 1940s was more than “just aware” of physics formalisms using concepts such as future and past light cones, which model future and historic causality. He would have been familiar with relating such discrete notions to conic sections and hypergeometries, the continuous-space versions of the same theorems. He was, of course, used to applying a 2-adic numbering scheme, with before-the-point integer digits (the future) and after-the-point fractional digits (the past) characterizing flats (paths) in the graph. (Think of the point as the “current time”, giving a past and a future perspective on either side of it.) And he obviously had the know-how to do arithmetic on numbers expressed in this scheme. From our look at his On Permutations manuscript, we know Turing was modeling Markov chains and random walks in causal spaces with attractors, with Cayley graphs as his modeling apparatus.
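
To make the point-as-current-time picture concrete, here is a minimal sketch of such a labeling; the function and the encoding are my own illustration, not Turing’s actual notation:

```python
# Sketch: branch choices before the point label possible futures, digits
# after the point record the walker's past; the point itself marks "now".
def two_adic_label(future_bits, past_bits):
    """Render a position in a binary branching graph as a 2-adic-style
    string, with the point standing for the current time."""
    future = "".join(str(b) for b in future_bits) or "0"
    past = "".join(str(b) for b in past_bits)
    return future + "." + past

print(two_adic_label([1, 0, 1], [0, 1]))  # -> '101.01'
```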

Let’s now rewrite Susskind as if he were presenting Turing’s conception of cryptanalysis, noting how un-Turing-like his manner is! That is, we see Susskind at his best, showing his excellent technique of reading a mathematical argument and first grounding each symbol introduced, and each step performed, in an entirely plausible intuition. This contrasts with Turing’s own (albeit written) style of presentation.

So to Susskind. He introduces an almost purely thermodynamic world, one that seems to assume equilibrium is found only in states said to sit in one or another “minimum” (as modeled in his functional picture). In the Turing world, one thinks more in terms of instability; see the infamous http://flutuante.wordpress.com/2009/11/14/turing-instability-cannibals-and-missionaries/. Moving between minima takes some explaining, since there is resistance to such change. Susskind’s point is that one might break out of such thinking and realize that, as with cannibals and missionaries, there may be multiple attractors working on each minimum in parallel, influencing the dynamics.

His lecture goes on to introduce the space of possible evolutions (a 2-ary branching tree well known to any computer science student). He uses the term “boundary” for the set of possible leaves at the nth branching level.
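
A quick sketch of that boundary, assuming nothing beyond the 2-ary tree itself:

```python
from itertools import product

# The "boundary" at branching level n is just the set of all length-n
# branch-choice sequences: 2**n leaves for a 2-ary tree.
def boundary(n):
    return ["".join(bits) for bits in product("01", repeat=n)]

print(boundary(3))       # ['000', '001', ..., '111']
print(len(boundary(3)))  # 2**3 = 8 leaves
```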

Having modeled causality, and used p-adic numbering to allow entropy-influenced calculations of thermodynamic equilibrium that can consider both future and past impacts on a current minimum of assumed equilibrium, he relates the branching graph back to the attractors. The associated thermodynamic state may be found to have been in the minimum, or “color”, n earlier in the time evolution, even though a particular walker of such a multiverse is in color m now. Transitions of a walker’s state from color n to m (or the reverse) are then modeled, thereby introducing some dynamics.

Susskind seems to hint that there is considerable “inertia” working against changes of color, whose rate is modeled by gamma (for n to m).

These dynamics turn out, much as they did in Tunny cryptanalysis, to be related to the combinatorics, which leads naturally enough to the definition of probability functions. The latter consider changes of color (the leap from one minimum to another, overcoming the inertia). The world of cryptanalysis has a related process, in which the potential for change must overcome the “cryptanalyst’s belief threshold”. At a given discrete time (where time is discrete and equates to a p-adically labeled branching point), we get a probability function for being in color/minimum m, expressed as the ratio of the count of nodes actually in color m to the total number of nodes (events) that can take one of the colorings.
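
As a toy stand-in for that ratio (my own construction, with made-up rates), one can run many independent walkers that flip between the two colors and count the fraction sitting in color m at a given discrete time:

```python
import random

# Toy estimate of the probability of being in color m at discrete time t:
# flip independent walkers between colors n and m with small per-step
# rates, then take the ratio of walkers in m to the total population.
def p_of_color_m(t, walkers=10_000, gamma_nm=0.05, gamma_mn=0.02, seed=1):
    rng = random.Random(seed)
    colors = ["n"] * walkers          # everyone starts in minimum n
    for _ in range(t):
        for i, c in enumerate(colors):
            r = rng.random()
            if c == "n" and r < gamma_nm:
                colors[i] = "m"       # leap to the other minimum
            elif c == "m" and r < gamma_mn:
                colors[i] = "n"       # and occasionally back again
    return colors.count("m") / walkers

print(p_of_color_m(50))   # tends toward gamma_nm / (gamma_nm + gamma_mn)
```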

So we get to a differential equation, considering the same ratio but now allowing for a change to color n from color/minimum m at the next branching level. This we can read almost as in Tunny, when folks used similarity evidence gleaned by counting bits on a tape of cipher. In Tunny, such counts would work for or against a change in the cryptanalyst’s gut belief (that the hidden Markov variables representing the unknown cipher-wheel bit patterns have moved from one crypto-minimum to another).

As in Tunny math, we have a rate, gamma, of color changes (one for each count-for or count-against direction of change). This data can then be put in matrix/rectangle form, with Gnm representing the collation of all the individual gamma terms for a system of color changes, over all n and m.
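
A hedged sketch of that bookkeeping, with illustrative rates rather than anything from Tunny: collate the gammas into Gnm, fix the diagonal so each column sums to zero, and step the resulting rate equation forward:

```python
import numpy as np

# Off-diagonal entry G[to, from] holds the rate gamma of a from -> to
# change; each diagonal entry is then fixed so every column sums to zero
# (probability neither appears nor vanishes).
states = ["n", "m"]
gammas = {("m", "n"): 0.05,     # n -> m   (illustrative rates only)
          ("n", "m"): 0.02}     # m -> n
G = np.zeros((2, 2))
for (to_s, from_s), rate in gammas.items():
    G[states.index(to_s), states.index(from_s)] = rate
G -= np.diag(G.sum(axis=0))     # columns now sum to 0

P = np.array([1.0, 0.0])        # start wholly in color n
dt = 0.1
for _ in range(500):            # Euler steps of dP/dt = G @ P
    P += dt * (G @ P)
print(P)                        # approaches [0.29, 0.71], set by the rates
```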

So, Gnm is just the crypto rectangle formed by reading the cipher tape at certain periods. This reading creates a set of depths, allowing one to count the similarities of the bits at the tape positions picked out by those periods. Without knowing the underlying entropy driving the state of the machine producing the tape, 1945-era folks reasoned only in terms of the integer number of “pips” for or against dot similarity (where each pip would later be recast in a specific-entropy basis, found by cryptanalytical calculation).
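
As a rough sketch of the tallying, using the historical Chi1/Chi2 wheel lengths of 41 and 31 (simplified: the real procedure scored the delta’d combined impulse stream, and the tape here is random stand-in ciphertext):

```python
import random

# Each cell of the 41x31 rectangle accumulates "pips": +1 when the tape
# bit falling at position (i mod 41, i mod 31) is a dot (0), and -1 when
# it is a cross (1), i.e. evidence for or against dot similarity.
def rectangle(tape_bits, p1=41, p2=31):
    R = [[0] * p2 for _ in range(p1)]
    for i, bit in enumerate(tape_bits):
        R[i % p1][i % p2] += 1 if bit == 0 else -1
    return R

random.seed(0)
tape = [random.randint(0, 1) for _ in range(10_000)]  # stand-in ciphertext
R = rectangle(tape)
print(R[0][:8])  # pip counts in the first few cells of row 0
```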

Remembering that a Tunny rectangle for Chi1 and Chi2 was not even square (being formed from enumerations of two different periods), it’s interesting to note how Susskind decides to diagonalize his matrix, in an assumed entropy basis, to get to his desired symmetric form. The point of doing this really hits home later, when considering quantum correlations between two walks.
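
A sketch of the symmetrization trick as I read it: if the rate matrix satisfies detailed balance with respect to an equilibrium distribution pi (standing in for the assumed entropy basis), then conjugating by diag(pi)^(1/2) yields a symmetric form with real eigenvalues:

```python
import numpy as np

# If G satisfies detailed balance with respect to equilibrium pi, i.e.
# G[0,1] * pi[1] == G[1,0] * pi[0], then diag(pi)**-0.5 @ G @ diag(pi)**0.5
# is symmetric, so it has real eigenvalues and an orthogonal eigenbasis.
G = np.array([[-0.05, 0.02],
              [ 0.05, -0.02]])         # the rate matrix from before
pi = np.array([0.02, 0.05]) / 0.07     # equilibrium: G @ pi == 0
S = np.diag(pi ** -0.5) @ G @ np.diag(pi ** 0.5)
print(np.allclose(S, S.T))             # True: the symmetric form
print(np.linalg.eigvalsh(S))           # real: [-0.07, 0.0]
```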

In this form, he can apply his know-how in eigenspaces and the like, which gives us stochastic processes. But it gives us more, again setting the scene for reasoning about correlations. A color, or minimum, can be cast much as Tunny folks cast the world in which one of the two cryptovariables was all zeros (or all ones). He captures the notion that the largest eigenvalue (1) ties into a world with no time dependence (and thus a perfect symmetry, which he models).
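
A small check of that claim, using a toy column-stochastic transition matrix:

```python
import numpy as np

# A column-stochastic matrix always has eigenvalue 1, and the matching
# eigenvector is the stationary state: applying T to it changes nothing,
# which is the "no time dependence" world picked out above.
T = np.array([[0.95, 0.02],
              [0.05, 0.98]])           # columns sum to 1
vals, vecs = np.linalg.eig(T)
k = int(np.argmax(vals.real))
stationary = vecs[:, k].real
stationary /= stationary.sum()         # normalize to a probability vector
print(vals.real)                       # 1.0 and 0.93, in some order
print(stationary)                      # approx [0.286, 0.714]
print(T @ stationary)                  # the same vector back again
```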

where one must remember that his conservation-of-probability statement is about the nature of the underlying stochastic matrix, expressed back in terms of rates: each column of Gnm sums to zero, the diagonal entry cancelling the off-diagonal rates so that no probability leaks away.

Furthermore, he is interested in each eigenvector in the eigenspace corresponding to each eigenvalue. He wants to consider a delta.ij “combined eigenbasis” (read: Chi1, Chi2 in the Tunny cryptanalysis) that is itself formed from component eigenbases, one for each color change. (Again, read each “component” eigenbasis as each wheel bit.) So we get to a vector-space model, founded in intuitions about thermodynamics and, for cryptanalysis, human belief systems.
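
A hedged sketch of the delta.ij remark, using already-symmetrized stand-in matrices of my own: the combined eigenbasis is the Kronecker product of the component eigenbases, and its mutual overlaps come out as the identity, i.e. delta.ij:

```python
import numpy as np

# Each symmetric component form has an orthonormal eigenbasis; the combined
# basis for the pair (read: Chi1 and Chi2) is their Kronecker product, and
# the overlaps of the combined vectors give the identity matrix.
S1 = np.array([[-0.05, 0.03], [0.03, -0.07]])   # made-up symmetric forms
S2 = np.array([[-0.02, 0.01], [0.01, -0.04]])
_, V1 = np.linalg.eigh(S1)             # orthonormal columns
_, V2 = np.linalg.eigh(S2)
V = np.kron(V1, V2)                    # one combined vector per (i, j) pair
print(np.round(V.T @ V, 6))            # the 4x4 identity: delta.ij
```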

In particular, we get to look at a model in which the distance between two depths is considered, using a conformal coordinate system built out of his eigenbasis. He shows how to build a correlator that fashions a custom eigenbasis linking the two depths (as if they were the two extremes of a causality path for the states of the crypto machine producing the two depths). He shows how this produces a class of such correlator algebras, which we can see in terms of what the Tunny documents talk about, poorly, when combining decibannage from different (compatible) runs.

First, he teaches nicely how the tree can be thought of as an inner product space, with a “conformal”, coordinate-centric distance metric (that leverages his p-adic numbering scheme).
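
The metric itself is easy to state: two boundary points are as far apart as the level at which their paths diverge. A minimal sketch:

```python
# Two leaves of the tree are as far apart as the point where their branch
# choices first differ: d(x, y) = 2**(-k), with k the length of the common
# prefix. This is an ultrametric, the p-adic flavor of distance (p = 2).
def tree_distance(x, y):
    if x == y:
        return 0.0
    k = 0
    for a, b in zip(x, y):
        if a != b:
            break
        k += 1
    return 2.0 ** -k

print(tree_distance("0110", "0111"))   # late divergence, close: 0.125
print(tree_distance("0110", "1110"))   # divergence at the root, far: 1.0
```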

Next, he overlays two paths, considering where in the past (ur) they diverged from a common path, and works in terms of the distance metric between the two (relatively future) events, considered to be at time zero.

We then get to see how he builds a correlator out of two projections, onto each set of color changes that might have occurred along the future paths of the two cases. He then takes his rate equation, as projected, and uses his symmetric basis, indicating that he can relate the decibannage on the run (as Tunny would call it) to just the eigenvalues, indexed i, of just one of the cryptovariables, tuned up for the distance measure.
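
Here is a toy correlator in that spirit (my construction, not Susskind’s exact object): project each of the two boundary points onto “is in color m” and subtract the product of the marginals; the surviving correlation comes only from the shared segment of path, so it decays with the divergence depth as the subleading eigenvalue raised to 2u:

```python
import numpy as np

# Two boundary points whose paths split u steps back share history only up
# to the split; the connected correlation of "in color m" at the two ends
# decays like the subleading eigenvalue of T raised to the power 2u.
T = np.array([[0.95, 0.02],
              [0.05, 0.98]])           # column-stochastic, as before
pi = np.array([2/7, 5/7])              # its stationary state (eigenvalue 1)

def correlator(u):
    Tu = np.linalg.matrix_power(T, u)
    joint = sum(pi[s] * Tu[1, s] ** 2 for s in range(2))  # split color = s
    return joint - pi[1] ** 2

for u in (0, 2, 4, 8):
    print(u, correlator(u))            # shrinks by about 0.93**(2*u)
```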

It’s time now to go have a look at the detailed math discussed in the Tunny General Report, which seemed impenetrable until now.
