dyadic models in number theory: would the NSA *want* to suppress results?



http://terrytao.wordpress.com/2007/07/27/dyadic-models/

If the NSA and similar agencies had historically wanted a veto over certain results of number theory research, to which areas would that veto have applied?

It would have to be those results that reduce complexity, by moving a problem from non-dyadic into dyadic space.

After all, that’s what Colossus actually did.

To attack the hard problem underpinning RSA, what one wants is an Enigma-bombe-type design: massively parallel, yet, as with the bombe, a machine that eliminates regions of the search space rather than solving the problem outright.
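A toy sketch of that elimination principle (an assumed illustration, nothing Enigma-specific: the "cipher" here is just a Caesar shift, and the constraint is a known plaintext/ciphertext pair, a "crib"):

```python
# Bombe-style search: instead of solving for the key, ELIMINATE every
# candidate key that contradicts a known constraint, and keep whatever
# survives. Toy cipher: a Caesar shift over A-Z.

def encrypt(plaintext, shift):
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) for c in plaintext)

def surviving_keys(crib_plain, crib_cipher):
    # Keep only the shifts that do NOT contradict the crib.
    return [k for k in range(26) if encrypt(crib_plain, k) == crib_cipher]

cipher = encrypt("WETTERBERICHT", 7)
print(surviving_keys("WETTERBERICHT", cipher))  # 26 candidates shrink to [7]
```

Each candidate test is independent, which is exactly why this style of attack parallelizes so naturally.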

Assume you own a football field of CPUs, each the size of a remote control, each with 400 cores operating at 2MHz, and with even more impressive I/O to a common memory bus spanning a petabyte of shared RAM.

What would one do with it to “reduce the search space”?

Presumably the answer lies not in attacking one factoring problem at a time, but in attacking a whole class at once, so that work on any one instance benefits the other vectors of attack, until a phase shift occurs and a “condensation” event happens in the complexity dynamics.

I suppose the obvious move is: in parallel, compute a random sample of RSA keys yourself, so that you have some “fixed points” in your search group.
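A minimal sketch of that "fixed points" idea (hypothetical illustration with toy key sizes, not a real key generator): produce RSA-style moduli whose factorizations you already know, so a search over a whole class of moduli has known instances to calibrate against.

```python
import random

def is_prime(n):
    # Trial division; fine for toy-sized primes only.
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def random_prime(lo, hi, rng):
    while True:
        p = rng.randrange(lo, hi)
        if is_prime(p):
            return p

def sample_known_moduli(count, rng):
    # Each entry is (modulus, (p, q)): a "fixed point" with known factors.
    out = []
    for _ in range(count):
        p = random_prime(1000, 5000, rng)
        q = random_prime(1000, 5000, rng)
        out.append((p * q, (p, q)))
    return out

rng = random.Random(42)
for n, (p, q) in sample_known_moduli(3, rng):
    print(n, "=", p, "*", q)  # known factorizations anchor the search
```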

Now, given that a binary number was modeled even in 1900 as a binary tree of coset leaders from nested subgroups, assume that an RSA number is represented in such a data structure. What’s the bet that certain areas of that data structure are shared with other instances of the problem? Can we imagine a large memory system that recalls, in a meta-graph, which problem instance has historically evolved the state in a shared area? And can we assume that if, perchance, enough problem instances make enough passes through such an area, it might undergo a “condensation” moment that benefits many instances of the search?
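A speculative sketch of the shared-tree idea (all names here are hypothetical): store each number as a bit-path in one trie, so that instances with a common bit-prefix share nodes, and let each node record which problem instances have passed through it, a crude version of the "meta-graph" of historical visits.

```python
class Node:
    def __init__(self):
        self.children = {}     # bit (0 or 1) -> Node
        self.visitors = set()  # which problem instances touched this node

def insert(root, n, instance_id, bits=16):
    # Walk the trie along n's bits, most-significant first, tagging each node.
    node = root
    for i in range(bits - 1, -1, -1):
        b = (n >> i) & 1
        node = node.children.setdefault(b, Node())
        node.visitors.add(instance_id)
    return node

def shared_depth(root, ids):
    # How deep do ALL the given instances travel together?
    node, depth = root, 0
    while True:
        nxt = [c for c in node.children.values() if ids <= c.visitors]
        if not nxt:
            return depth
        node, depth = nxt[0], depth + 1

root = Node()
insert(root, 0b1010110000000001, "instance-A")
insert(root, 0b1010110011110000, "instance-B")
print(shared_depth(root, {"instance-A", "instance-B"}))  # -> 8 shared levels
```

Work done in the shared top eight levels by either instance is, structurally, work done for both.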

One thing I note about American crypto professionals is just how far they stay away from the search algorithms of the Enigma era. As Rivest said at the ACM Turing event, it’s likely that large mathematical treatises (using the same kind of reasoning style as the On Permutations manuscript) exist on Enigma-era bombe work, complementing the folksy recollections of hut-N people. So why, today, are there not a hundred papers using that set of tools, treated as a research area with the benefit of modern “computational aids”? Probably because it’s far too close to the truth for them to risk their reputations.

The fun part of reading the material on dyadic models was noting just how 1930s-ish it all is, being an abstract computing/calculating model in its own right. Furthermore, the parallels between a Turing-machine “computable number” and the representation of a dyadic fraction are obvious, should a machine program be working on computable numbers (what was probably the classified portion of the Turing work) more than on computable sequences. That dyadic representations facilitate hyperplanes (and all the tools of linear algebra) obviously parallels the work on the Tunny break, and obviously exploits the computable-sequences aspect. And, finally, that dyadic representations can work in a heat/diffusion-type process, with heat operators, and that this leverages conditional probabilities, is very interesting.
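Two of the correspondences above can be made concrete in a few lines (a generic sketch, not taken from the linked post): a binary sequence determines a dyadic fraction in [0, 1), and averaging over dyadic blocks, the conditional expectation with respect to the dyadic sigma-algebra at a given scale, acts like a discrete heat/diffusion step.

```python
def dyadic_value(bits):
    # Interpret a bit sequence as the dyadic fraction 0.b1 b2 b3 ... in binary.
    return sum(b / 2 ** (i + 1) for i, b in enumerate(bits))

def dyadic_average(values, level):
    # Average over dyadic blocks of length 2**level: the conditional
    # expectation of the signal given the dyadic intervals at that scale.
    size = 2 ** level
    out = []
    for start in range(0, len(values), size):
        block = values[start:start + size]
        mean = sum(block) / len(block)
        out.extend([mean] * len(block))
    return out

print(dyadic_value([1, 0, 1]))             # -> 0.625 (= 1/2 + 1/8)
print(dyadic_average([1.0, 0.0, 0.0, 2.0], 1))  # -> [0.5, 0.5, 1.0, 1.0]
```

Iterating `dyadic_average` at coarser and coarser levels smooths the signal exactly the way repeated heat steps would, which is the conditional-probability structure the post alludes to.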

About home_pw

Computer Programmer who often does network administration with focus on security servers. Sometimes plays at slot machine programming.
This entry was posted in webid.