From Tunny-era cryptanalysis to modern American ciphers, suiting NSA's mission


In the papers on war-time Enigma attack methods, Turing discloses how he originally calculated odds factors. That is, for an assumed urn-style sequence of experiments (abstracting the guesses to be made about the unknown contents of the German naval Enigma bigram tables used for superenciphering the indicators), he compared the Bayesian factor for an experimental choice – in a sequence of choices that are NOT independent – with the factor for a random event. He then cast both in terms of odds, rather than probabilities. The resulting number can be put on the deciban scale, so one has an additive measure – easily summed by people thinking in terms of +20 or −40 (etc.) into an accumulator totaling the "evidence" collected from analyzing depths by walking sequences of fixpoints, repeats, or Tunny-era windows into the deChi.
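To make the arithmetic concrete, here is a minimal Python sketch of the deciban idea. The repeat rates in the loop are illustrative stand-ins, not Turing's actual Banburismus figures:

```python
import math

def decibans(bayes_factor: float) -> float:
    """Turing's deciban scale: 10 * log10 of the odds (Bayes) factor,
    so independent pieces of evidence add rather than multiply."""
    return 10.0 * math.log10(bayes_factor)

# Hypothetical scoring loop: each experiment contributes the factor
# P(observation | hypothesis) / P(observation | chance).  The repeat
# rates below are illustrative stand-ins, not sourced figures.
evidence = 0.0
for p_hyp, p_chance in [(1 / 17, 1 / 26), (1 / 17, 1 / 26)]:
    evidence += decibans(p_hyp / p_chance)

print(f"accumulated evidence: {evidence:+.1f} db")
```

The point of the logarithmic scale is exactly the accumulator: a clerk never multiplies odds, only adds or subtracts round numbers of decibans as each observation comes in.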

In the Tunny documentation, folk speak of the proportional bulge – a casting of probabilities in a form that suited reasoning about correlations and (quantum-style) coherencies within and amongst data streams such as plaintext, deChi, cipher, etc. In one sense the bulge is just the bias beyond 0.5. But its form allows algebraic manipulation, and allows a PB function to be defined on any expression – in much the same way that a probability measure is attached to the sigma-algebra of a measurable space modeling a probability space. From this the notion of the Colossus run is formed – a run being just one of the expressions that happen to produce marked bulges, ones that are detectable by Colossus-era counting and approximation processes. You have a general-purpose quantum gate, implementing quantum unitaries if you will, enabling various conditional evaluations to be calculated (much as in CNOT).
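As a sketch of why the bulge form is algebraically handy, here is the standard piling-up rule: the bulge of an XOR of independent biased bit streams is the product of the streams' bulges. The probabilities below are invented for illustration, not Tunny statistics:

```python
def bulge(p: float) -> float:
    """Proportional bulge: the bias of a probability beyond 1/2,
    scaled so that p = 1/2 gives 0 and p = 1 gives 1 (i.e. 2p - 1)."""
    return 2.0 * p - 1.0

def prob(b: float) -> float:
    """Inverse map: recover the probability from a bulge."""
    return (1.0 + b) / 2.0

# Piling-up: for the XOR of independent biased bit streams, the
# bulges multiply -- the algebraic rule that makes PB expressions
# composable.  (Illustrative probabilities, not sourced figures.)
p_plain_dot = 0.60   # assumed: P(plaintext impulse = dot)
p_chi_dot   = 0.55   # assumed: P(chi-stream impulse = dot)
b_combined = bulge(p_plain_dot) * bulge(p_chi_dot)
print(f"P(combined impulse = dot) ~ {prob(b_combined):.3f}")  # ~0.510
```

The multiplicative rule is what makes a run worth counting: compose the streams in an expression, predict the combined bulge, and check whether the counts confirm it.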

The evolution of the odds factors into the bulge is interesting. The way in which the very sequence of experiments is built into the formation of the odds factor generalizes, in Turing's mind, to a conditional-probability "operator", and to an algebra of operators including operators which are themselves operator-valued (think lambdas).
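One could render that "operator-valued operator" idea as higher-order functions. A speculative Python sketch – the names, likelihood tables, and priors are all invented for illustration:

```python
from typing import Callable, Dict

Dist = Dict[str, float]            # discrete distribution over hypotheses
Operator = Callable[[Dist], Dist]  # maps distributions to distributions

def condition_on(likelihood: Dict[str, float]) -> Operator:
    """Operator-valued operator ('think lambdas'): given a likelihood
    table P(evidence | hypothesis), return the Bayes-update operator."""
    def update(prior: Dist) -> Dist:
        unnorm = {h: prior[h] * likelihood.get(h, 0.0) for h in prior}
        z = sum(unnorm.values())
        return {h: v / z for h, v in unnorm.items()}
    return update

def compose(f: Operator, g: Operator) -> Operator:
    """Sequential experiments compose as operator multiplication."""
    return lambda d: f(g(d))

# Illustrative: two chained updates folded into a single operator,
# capturing a sequence of non-independent experimental choices.
prior = {"wheel_A": 0.5, "wheel_B": 0.5}
run = compose(condition_on({"wheel_A": 0.7, "wheel_B": 0.4}),
              condition_on({"wheel_A": 0.6, "wheel_B": 0.5}))
print(run(prior))
```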

What is further interesting about bulges is their algebraic rules of syntax – giving one a reasoning framework in which to express, and then work over, the quantum coherencies found between (Tunny-era) data streams. Of course, today one might be applying similar thinking to proving the Higgs boson (at an accuracy level unimaginable in 1945, but at a level that tells you something about modern cryptanalytic capabilities for "discerning" events – particularly if "placed" in compromised-at-birth American cipher equipment that comes with "detectable" events, if you have the know-how and the equipment to work at the required accuracy level).
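For a rough sense of what "the required accuracy level" costs: the number of samples needed to see a bulge b at a given significance grows like 1/b². A back-of-envelope Python sketch – the 5-sigma convention and the example bulges are my assumptions, not sourced figures:

```python
import math

def samples_needed(b: float, sigmas: float = 5.0) -> int:
    """Rough sample count to detect a bulge b at a given significance.
    Under the null, the dot-count's standard deviation is ~ sqrt(N)/2
    and the expected excess is b*N/2, so N ~ (sigmas / b)**2.
    A back-of-envelope sketch, not a calibrated statistical test."""
    return math.ceil((sigmas / b) ** 2)

print(samples_needed(0.05))     # ~10,000 characters: Colossus-era counting
print(samples_needed(0.0001))   # ~2.5e9 samples: needs modern equipment
```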
