backdooring DES: a concept


Let’s combine our previous post with an older missive that looked at the three “permutation-generating” Hamiltonians of the DES “master permutation and S-box design”, at https://yorkporc.wordpress.com/2013/02/12/des-permutation-and-keyplaintext-dependency/

We know that Tunny-era cryptanalysis leveraged conditional expressions with multiple propositions, working through a search space looking for a contradiction. When looking for that contradiction, one may choose to hunt for the all-false case in an n-ary OR truth table. That is the case which gives one an n-fold product expression whose probabilities multiply. Recasting such a product of probabilities as bulges, via the bilinear transform, we get ½(1 + π₁·π₂·…·πₙ). This is the Tunny theorem cited as a chain of unreliable witnesses…
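
A minimal sketch of that product-of-bulges arithmetic, assuming independent “witnesses” each correct with probability pᵢ (the function names here are my own, purely illustrative):

```python
from functools import reduce

def bulge(p):
    """Bilinear recasting of a probability: p = 1/2 * (1 + b), so b = 2p - 1."""
    return 2 * p - 1

def chain_probability(probs):
    """The Tunny 'chain of unreliable witnesses': the chained probability is
    1/2 * (1 + b1 * b2 * ... * bn), so the bulges multiply and evidence decays."""
    product = reduce(lambda acc, p: acc * bulge(p), probs, 1.0)
    return 0.5 * (1 + product)

# Three witnesses, each right 70% of the time: the chain is right only ~53.2%.
print(chain_probability([0.7, 0.7, 0.7]))  # ~0.532
```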

So let’s put the “permutation generator” of DES to work, realizing that its job is to formulate a world in which only one ciphertext can be the solution of a special n-ary OR-based multiplexer truth table, parameterized by the plaintext and key.
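
To make that “only one solution row” property concrete, here is a toy enumeration of my own (not DES itself): an n-ary OR is false on exactly one row of its truth table, the all-false assignment, so a table built from OR terms can pin down a unique solution:

```python
from itertools import product

def or_truth_table(n):
    """Truth table of the n-ary OR over all 2^n input rows."""
    return {bits: int(any(bits)) for bits in product((0, 1), repeat=n)}

def rows_with_value(table, value):
    """All rows taking the given value; for OR and value=0, exactly one row."""
    return [bits for bits, v in table.items() if v == value]

print(rows_with_value(or_truth_table(3), 0))  # [(0, 0, 0)] -- the unique solution row
```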

LDPC code generators help us showcase this idea set, and how DES-class permutation generators work. At each DES round, the goal of the “permutation-generation” phase is to fashion a particular LDPC “stub” relationship set. The interplay of this intermediate output with the next round then enables the subkey-schedule generation process to impose a conditional ordering relationship.
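
A sketch of what an LDPC-style “stub” relationship set looks like as code, with a hypothetical toy parity-check matrix standing in for whatever a round would actually generate:

```python
import numpy as np

# Hypothetical toy parity-check matrix: each row is one sparse parity relation
# that an intermediate bit-vector must satisfy (a stand-in "stub" set).
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
], dtype=np.uint8)

def satisfies_relations(word, H):
    """A word is consistent with the relation set iff H.word == 0 (mod 2)."""
    return not np.any((H @ word) % 2)

print(satisfies_relations(np.array([1, 1, 1, 0, 0, 1], dtype=np.uint8), H))  # True
print(satisfies_relations(np.array([1, 1, 0, 0, 1, 1], dtype=np.uint8), H))  # False
```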

While the permutation generator ensures that we are developing a multi-bit/particle trajectory process that never backtracks and whose evolution lines never cross, the multi-round process with conditionality ensures that any two intermediate outputs have mutual dependency relationships that are being driven, at an exponentially decreasing rate, away from each other. The combination of these two processes is calculating and re-calculating a maximum distance between any two intermediates, where the dimensionality of the space in which distance is measured expands by one as each round hands off to the next. This induces the affine-plane property, resistant to linear cryptanalysis, in which any output particle/point is at maximal distance from any constant function.
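
“Maximal distance from any constant/affine function” is the standard nonlinearity measure; a brute-force sketch for a 3-variable toy function (my example, not a DES component):

```python
from itertools import product

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def affine_tables(n):
    """Truth tables of all 2^(n+1) affine functions (a . x) xor c over GF(2)^n."""
    points = list(product((0, 1), repeat=n))
    for a in product((0, 1), repeat=n):
        for c in (0, 1):
            yield [(sum(ai * xi for ai, xi in zip(a, x)) + c) % 2 for x in points]

def nonlinearity(table, n):
    """Minimum Hamming distance from the function to the nearest affine function."""
    return min(hamming(table, aff) for aff in affine_tables(n))

points = list(product((0, 1), repeat=3))
f_table = [(x0 & x1) ^ x2 for (x0, x1, x2) in points]  # f = x0*x1 xor x2
print(nonlinearity(f_table, 3))  # 2 -- the maximum possible on 3 variables
```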

Now what we can expect the NSA cryptographers to have done, via the key-schedule design, is to engineer the injection of a subtle bias into the process, so that a savvy cryptanalyst can know that the intermediates are, RELATIVELY, at a maximum distance to CERTAIN affine subplanes at each of the several rounds, with the relevant constant functions being defined by the subkey-selection process. And we can assume that, just as there is a master equation that correctly generates the LDPC code generators at each round, there is also a “master affine space” generator that will, if you know the trick, give you an advantage. It enables you to know how the (probability) distance properties between intermediate results and constant functions WOULD evolve, as the very notion of the distance measure is progressively expanded.
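
One way to read “master affine space” is as a designer-known set of linear masks carrying exploitable bias. A sketch that scans a toy 3-bit S-box (hypothetical, not a DES S-box; here, inversion in GF(2^3)) for its strongest linear approximation:

```python
def parity(x):
    return bin(x).count("1") % 2

SBOX = [0, 1, 5, 6, 7, 2, 3, 4]  # toy 3-bit S-box: inversion in GF(2^3)

def bias(a, b, sbox):
    """Bulge of the linear approximation parity(a & x) == parity(b & S(x))."""
    agree = sum(parity(a & x) == parity(b & sbox[x]) for x in range(len(sbox)))
    return 2 * agree / len(sbox) - 1

# Scan every nontrivial mask pair: a designer who knows which (a, b) wins
# knows where intermediates sit RELATIVE to that affine subplane.
best = max(((a, b, bias(a, b, SBOX)) for a in range(1, 8) for b in range(1, 8)),
           key=lambda t: abs(t[2]))
print(best)  # (1, 1, -0.5): a relation holding 25% (equivalently 75%) of the time
```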

Of course, to guard against differential trail-based cryptanalysis, the very same subkey schedule must also correctly generate dependency relations. The design thus has to meet two goals: make the permutation generator resist differential cryptanalysis, while engineering a back door into the linear-cryptanalysis defenses.
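
The differential side of that tradeoff is summarized by a difference distribution table; a sketch over the same toy S-box as above:

```python
from collections import Counter

SBOX = [0, 1, 5, 6, 7, 2, 3, 4]  # same toy 3-bit S-box as above

def ddt(sbox):
    """Difference distribution table: for each input XOR a, how often each
    output XOR b occurs. Flat, small counts starve differential trails."""
    return {a: Counter(sbox[x] ^ sbox[x ^ a] for x in range(len(sbox)))
            for a in range(1, len(sbox))}

for a, counts in ddt(SBOX).items():
    print(a, dict(counts))  # no count exceeds 2: this toy S-box is maximally flat
```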

What this means, assuming you are good at guessing certain key bits, is that you can apply your “master affine plane” knowhow, since it is “tuned in” to the “general” subkey-scheduling design of DES: YOU GET an edge when guessing which key-bit trajectory evolution is correct in any intermediate round. A correct guess of one DES key bit at the outset progressively amplifies your ability to guess further key bits with certainty. And one may need, via the usual contamination of network adapters, drivers, etc., only a few initial key bits to prime the avalanche process.
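
A schematic of that amplification on a one-round toy cipher (everything here is hypothetical scaffolding, not DES): counting how often the biased relation from the scan above holds over many plaintext/ciphertext pairs reveals one linear bit of the key, and each recovered bit sharpens the next guess:

```python
import random

SBOX = [0, 1, 5, 6, 7, 2, 3, 4]   # same toy S-box; bias(a=1, b=1) = -0.5
KEY = 5                            # the secret toy key under attack

def parity(x):
    return bin(x).count("1") % 2

def encrypt(p, key=KEY):
    return SBOX[p ^ key]

def recover_key_parity(pairs, a=1, b=1, sbox_bias=-0.5):
    """Matsui-style counting: parity(a & p) == parity(b & c) exhibits the S-box
    bias, with its sign flipped when parity(a & key) == 1. The observed sign
    therefore yields parity(a & key), with confidence growing in len(pairs)."""
    agree = sum(parity(a & p) == parity(b & c) for p, c in pairs)
    observed = 2 * agree / len(pairs) - 1
    return 0 if (observed > 0) == (sbox_bias > 0) else 1

pairs = [(p, encrypt(p)) for p in (random.randrange(8) for _ in range(200))]
print(recover_key_parity(pairs))   # almost surely 1 == parity(1 & KEY)
```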

The subkey-scheduler concept of DES is core to its backdooring, evidently. On the one hand, it has to work with the P+E+S process of the master equation to ensure that distance is maximized when generating ciphertext. On the other, it has to work asymmetrically, allowing a most-likely predictor of a particular set of distance-generating evolutions to cryptanalyze a set of depths, such as those produced by DES-CBC.
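
For reference, the subkey scheduler’s skeleton is public: two 28-bit key halves rotate by a fixed per-round schedule between the PC-1 and PC-2 bit selections (the selection tables are omitted here for brevity):

```python
# The published DES per-round left-rotation schedule for the C and D key halves.
SHIFTS = [1, 1, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 1]

def rotl28(half, n):
    """Rotate a 28-bit integer left by n bits."""
    return ((half << n) | (half >> (28 - n))) & ((1 << 28) - 1)

def half_schedule(c, d):
    """Yield the (C, D) half pair that feeds PC-2 at each of the 16 rounds."""
    for s in SHIFTS:
        c, d = rotl28(c, s), rotl28(d, s)
        yield c, d

# The rotations total 28 bits over 16 rounds, so the halves return to their
# start position, and every key bit visits most round positions: the kind of
# asymmetric reuse this post speculates the designers could tune.
print(len(list(half_schedule(0x1234567, 0x89ABCDE))))  # 16 subkey-half pairs
```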

Then you have your friends in the IETF bias the telematic standards so there are lots of depth-producing CBC-mode security services.
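
The “depth” point is easy to see in miniature. A toy CBC sketch (one-byte blocks, a deliberately worthless XOR “cipher”, a fixed IV, all hypothetical) shows equal plaintext prefixes producing equal ciphertext prefixes across messages:

```python
def toy_block_encrypt(block, key):
    """Stand-in block cipher: a single-byte XOR (insecure, illustration only)."""
    return block ^ key

def cbc_encrypt(blocks, key, iv):
    out, prev = [], iv
    for b in blocks:
        prev = toy_block_encrypt(b ^ prev, key)  # chain each block into the next
        out.append(prev)
    return out

KEY, IV = 0x5A, 0x00  # a fixed IV is what manufactures the depths
print(cbc_encrypt([0x41, 0x42, 0x43], KEY, IV))  # [27, 3, 26]
print(cbc_encrypt([0x41, 0x42, 0x99], KEY, IV))  # [27, 3, 192]: shared prefix = depth
```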

All in all, one needs the exploit in the cipher, the random number generator, etc., to give one the “start”, just as was the intended output of Tunny rectangling and convergence. That process showed that wheel-breaking, from cipher, could leverage wheel bits from the rectangling convergence process even while some of the signs were still wrong. There was enough information to work as a detector and amplifier for the next phase of the process, which is all one needed/needs.
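
A schematic of that rectangling convergence, assuming the standard alternating sign re-estimation (my simplification of the Tunny procedure): even with a fifth of the evidence cells corrupted, most wheel-bit signs come back, which is all the detector/amplifier needs.

```python
import random

def converge(rect, iters=10):
    """Alternately re-estimate the row signs and column signs that best explain
    a +/-1 evidence rectangle (schematic Tunny-style convergence)."""
    rows, cols = len(rect), len(rect[0])
    y = [1] * cols
    for _ in range(iters):
        x = [1 if sum(rect[i][j] * y[j] for j in range(cols)) >= 0 else -1
             for i in range(rows)]
        y = [1 if sum(rect[i][j] * x[i] for i in range(rows)) >= 0 else -1
             for j in range(cols)]
    return x, y

true_x = [random.choice((-1, 1)) for _ in range(12)]
true_y = [random.choice((-1, 1)) for _ in range(10)]
rect = [[xi * yj * (1 if random.random() > 0.2 else -1) for yj in true_y]
        for xi in true_x]  # rank-one signal with ~20% of cells flipped

x, _ = converge(rect)
hits = max(sum(a == b for a, b in zip(x, true_x)),   # the rectangle's global
           sum(a == -b for a, b in zip(x, true_x)))  # sign is unknowable
print(hits, "of", len(true_x), "row signs recovered")
```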
