Sean Murphy – on pairs and triples


Sean Murphy's paper on pairs and triples (http://www.isg.rhul.ac.uk/~sean/) is a good read.

image

image

image

Since there are 8 neighboring pairs, and each is subject to the arrival of 4 bits of subkey giving 2 bits (when XORed), is that how one gets 16 distributions? Should one think of the subkey bits as the catalytic arrival of external particles shot (in parallel) into the corresponding orbital of the atom?

It seems more likely that we have 8 rounds and 2 streams (odd and even parity). But, in either case, note how one treats the calculation as one “whole” system of constrained interactions, with each pair of S-boxes equating to a pair of neighboring orbitals. Let's never forget that 1940s-era Wynn-Williams started out making equipment to count particle arrival rates, using sampling and densities to then compute expectation curves.

So, we always have 2 common bits (per pair of S-boxes) and, in each case (from IBM disclosures), 1 key bit catalyses one S-box's data output and 1 catalyses the selection of the active permutation (of the other S-box). What matters to the observer is that their XOR shows up in the input (reflecting the group-theoretic generator constraints).
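Here is a minimal sketch (mine, not Murphy's), assuming only the standard published DES E expansion table: it lists, for each neighboring pair of S-boxes, the two R-bits that E feeds to both of them.

```python
# Sketch: show that each pair of neighboring DES S-boxes shares two
# input bits, courtesy of the E expansion duplicating half of R's bits.
# The table below is the standard DES expansion E (bit numbers 1..32).

E = [32,  1,  2,  3,  4,  5,
      4,  5,  6,  7,  8,  9,
      8,  9, 10, 11, 12, 13,
     12, 13, 14, 15, 16, 17,
     16, 17, 18, 19, 20, 21,
     20, 21, 22, 23, 24, 25,
     24, 25, 26, 27, 28, 29,
     28, 29, 30, 31, 32,  1]

# Split the 48 expanded positions into the 8 six-bit S-box inputs.
sbox_inputs = [set(E[6 * i:6 * i + 6]) for i in range(8)]

# Every neighboring pair (wrapping S8 back round to S1) shares 2 R-bits.
for i in range(8):
    j = (i + 1) % 8
    shared = sbox_inputs[i] & sbox_inputs[j]
    print(f"S{i + 1}/S{j + 1} share R-bits {sorted(shared)}")
```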

We should also recall that, for a vector space, ket = scalar1.subket1 + scalar2.subket2 + …, where the subket can be a partial difference (being itself linear) or a density term (https://yorkporc.wordpress.com/2013/01/21/ulam/). Thus one can think of the DES system, in any given keyed instance, as a vector space, with each of the 16 distributions being a subket, acting as a kind of basis in density land. Of course, the securing party's goal is to differentiate the streams against each of those 16 subspaces – to find that the space is maximally distant from all of them (in an SVG sense).
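To make the "maximally distant from all 16 subkets" idea concrete, here is a purely illustrative sketch: the 16 reference densities are random stand-ins (not the real DES distributions), and total variation distance stands in for whichever metric one actually prefers.

```python
# Sketch (illustrative only): treat each of 16 hypothesised "subket"
# densities as a reference, and score how far an observed stream's
# empirical density sits from each of them.
import random
from collections import Counter

def empirical_density(samples, support=range(16)):
    counts = Counter(samples)
    n = len(samples)
    return [counts[v] / n for v in support]

def total_variation(p, q):
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# 16 made-up reference densities (random, just to exercise the code).
random.seed(1)
refs = []
for _ in range(16):
    w = [random.random() for _ in range(16)]
    s = sum(w)
    refs.append([x / s for x in w])

observed = [random.randrange(16) for _ in range(10_000)]
p_hat = empirical_density(observed)

distances = [total_variation(p_hat, q) for q in refs]
print("closest reference:", min(range(16), key=lambda i: distances[i]))
print("min/max distance:", min(distances), max(distances))
```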

While it is true that, in a design sense, we have subkey bits constraining particular monomial terms (thinking of the 2 or the 3 or the 4 neighbors as the monomial), let's not forget the Hamiltonian-level constraints that “govern” the differentiating evolution of the system – as represented by the inner space created by E, the S-box values and P. This group-theoretic world, of the 3-cycles “generating” the design space, is the transducer that keeps the differencing (of densities) forward-looking and converging to an almost-everywhere limit – the thing that secures against the 1940s-era probabilistic attack.

One sees the key bits as a specification of the starting state of the evolution of the generator (of paths through the spaces). This makes it sensible to see the Hamiltonian cycles – built into E, S and P – as that which ensures all possible trajectories could be generated.

We also have to remember the era of DES – when folk expected (in secret) to be changing out the S-boxes, so the hardware was reusable. And of course one sees that in other computation-graph designs, where the box values are functions of a derived keystream. Thus, the role of the S-box in the path generator does not affect the determinism of the system's evolution. It may, of course, affect the limiting function (which may not bring the transducer sufficiently close to SNRnorm to interdict the probabilistic attack on 16 subkey bits). Of course, I can see how guessing bits within the 16 can lead to an ever-increasing attack benefit, as evidence accrues – looking at the DES backdoor as one in which the goal is to figure out the 16 (after which brute-forcing the rest is easy). One has to assume, in 1940s thinking, that as usual it is the statistics of the source plaintext that facilitate that guessing.
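As a back-of-envelope check on that work-factor claim (my arithmetic, on the assumption in the paragraph that 16 of the 56 key bits can indeed be learned statistically):

```python
# Back-of-envelope sketch: if statistical evidence pins down 16 of the
# 56 DES key bits, the residual brute-force space collapses from 2**56
# to 2**40.
full_space     = 2 ** 56
guessed_bits   = 16
residual_space = 2 ** (56 - guessed_bits)

print(f"full key space:      2^56 = {full_space:,}")
print(f"after 16 bits known: 2^40 = {residual_space:,}")
print(f"speed-up factor:     2^16 = {full_space // residual_space:,}")
```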

Now, since “such” effort has gone into designing a probabilistic density evolution (based on an energy-transducer concept), let's look at the local functions:

image

image

Can we look at this table as a Bethe lattice, which is now a (generated from key/data) “meta-instruction” – responsible for now imposing cliques? Doesn't it just “feel” like an averaging filter designed to sharpen/blur (when used in convolution)? Can we see DES as an n-stage convolutional codebook generator – much as Forney taught the theory?

image

http://en.wikipedia.org/wiki/Convolutional_code
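To make the "n-stage convolutional codebook" analogy concrete, here is a minimal textbook rate-1/2 convolutional encoder (constraint length 3, the usual generators 7 and 5 octal). This is standard coding theory, not a claim about DES itself.

```python
# Minimal rate-1/2 feedforward convolutional encoder: each input bit is
# shifted into a small register, and two parity bits are emitted, one per
# generator polynomial.
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)   # keep last k input bits
        out.append(bin(state & g1).count("1") % 2)    # parity against g1
        out.append(bin(state & g2).count("1") % 2)    # parity against g2
    return out

print(conv_encode([1, 0, 1, 1, 0]))
```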

Anyways, getting back to facts, we see the outline of the algorithm used in Colossus – expressed in non-Colossus-era language (no faltungs, no bulges, no Tunny-specific differential trials linking motor to psi, …). We do have their equivalents, however.

Remembering that Ei is a count (or a multiple of counts…)

image

image

image

The sampling is a product of 255 terms, 1 for each possible output value. Since we have generated a table whose elements can be counted to give the incidence of individual output values, we can now invert the idea: since we have a probability measure for any given value (in terms of k, the signal that 2 key bits are the same, as in Tunny wheel patterns), we get the probability of a given output expressed in terms of the higher incidence of that output biasing its impact on the sampling probability. (Remember the tag line of this blog!)
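A small sketch of that counting-to-measure inversion, with a made-up table standing in for the real one: tabulate incidences, turn them into a probability measure, then score a sample as a product of per-value probabilities (done in log space so the product does not underflow).

```python
# Sketch: incidence counts -> probability measure -> product-form
# sampling likelihood for an observed sample.
import math
import random
from collections import Counter

random.seed(2)
table = [random.randrange(256) for _ in range(100_000)]    # stand-in table

counts = Counter(table)
total = sum(counts.values())
prob = {v: c / total for v, c in counts.items()}           # incidence -> measure

sample = [random.choice(table) for _ in range(64)]
log_likelihood = sum(math.log(prob[v]) for v in sample)    # the product, in logs
print("log-likelihood of sample:", round(log_likelihood, 2))
```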

Anyways, think back to 1920… when measures and probability were more in vogue when doing counting:

https://yorkporc.wordpress.com/2013/01/19/a-turing-text-book/

Since this is a binary-choice problem in stats – which, in Turing's mind, is a means to address the probability of a state transiting to the next between two path options – we then move from probability reasoning to stats:

image

image

and we see how reasoning about (-1)^i uses the resulting sign to add or subtract (as we saw in Tunny “advanced convergence” theory):

image

image

We saw this, earlier, in Tunny:

https://yorkporc.wordpress.com/2013/01/18/fried-fish/
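A toy version of that add-or-subtract scoring, on two made-up binary streams: agreement contributes +1, disagreement contributes -1, so the running total is the excess of agreements over disagreements that Colossus-style counts accumulate.

```python
# Sketch of the (-1)^i style scoring: the sign decides whether each
# comparison adds to or subtracts from the running count.
import random

random.seed(3)
a = [random.randrange(2) for _ in range(1000)]
b = [x ^ (random.random() < 0.4) for x in a]   # correlated stream, ~60% agreement

score = sum(1 if x == y else -1 for x, y in zip(a, b))
print("excess of agreements over disagreements:", score)
```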

This evolves, much as with Colossus counting, into a counting process for possible outputs, where one is now interested in using SD tails, etc., to find the max likelihood and so on. As we learned from Forney, however, it's more fun where two curves cross (one for p, one for 1-p), allowing one to build in reasoning about error minimization and SNR when decoding.
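A sketch of that crossing-curves picture, using binomial likelihoods under p and 1-p for a made-up p: the crossing point gives the max-likelihood decision threshold, and the tail beyond it gives the error probability.

```python
# Sketch: for k agreements out of n, compare the likelihood under p with
# the likelihood under 1-p.  The curves cross at k = n/2, which is the
# ML decision threshold; summing the tail gives the probability of error.
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 100, 0.6
threshold = next(k for k in range(n + 1)
                 if binom_pmf(k, n, p) > binom_pmf(k, n, 1 - p))
error = sum(binom_pmf(k, n, p) for k in range(threshold))   # decide wrongly
print("decision threshold:", threshold)
print("probability of error:", round(error, 4))
```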
