We tend to think of ACE as the computing engine designed by Alan Turing (and others). But, knowing Turing, it might have had a second meaning – particularly for someone trained in algorithm complexity (pre-war) and cipher complexity (during the war).

Coset enumeration is known to be something Turing worked on. Now we know that it's a fundamental problem in complexity analysis, one likely to be a "study" for those familiar with group theory, limit theory, function spaces, and measure theory (and the mix thereof in formulating cipher-based mixers using Enigma/Typex-style wheels/drums).
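To pin down what coset enumeration even is, here is a minimal modern sketch (my own, not anything from Turing's papers): for a permutation group, the right cosets of a point stabilizer correspond one-to-one with the orbit of that point, so a breadth-first search over the generators enumerates them. The group and generators below are arbitrary illustrative choices.

```python
# Coset enumeration sketch: for a permutation group G = <gens> acting
# on points 0..n-1, the cosets of the stabilizer of a point correspond
# one-to-one with the orbit of that point under the generators.
# (Illustrative example only; the generators are chosen arbitrarily.)

def enumerate_cosets(gens, point, n):
    """Return coset representatives (as permutation tuples) of
    Stab(point) in <gens>, via breadth-first orbit search."""
    identity = tuple(range(n))
    reps = {point: identity}            # orbit point -> coset representative
    frontier = [point]
    while frontier:
        new_frontier = []
        for p in frontier:
            rep = reps[p]
            for g in gens:
                q = g[p]                # image of p under generator g
                if q not in reps:
                    # compose: first apply rep, then g
                    reps[q] = tuple(g[rep[i]] for i in range(n))
                    new_frontier.append(q)
        frontier = new_frontier
    return reps

# Example: G = S_4, generated by a transposition and a 4-cycle.
gens = [(1, 0, 2, 3),       # swap 0 and 1
        (1, 2, 3, 0)]       # 4-cycle (0 1 2 3)
cosets = enumerate_cosets(gens, 0, 4)
print(len(cosets))          # 4 cosets, since |S_4| / |Stab(0)| = 24/6 = 4
```

This orbit view is the simplest case of the general coset-enumeration problem (Todd–Coxeter works from a presentation instead), but it already shows the flavor: the cosets are discovered by walking a graph whose edges are the generators.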

What is interesting is that there is an "element" of public-key'ness in the nature of the problem. This comes in the sense that only if one knows a particular trapdoor "presentation" does the problem become amenable to computation satisfying known-time or known-memory requirements; faced without that particular presentation, the same problem does not.
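For readers who want the trapdoor intuition in concrete form, here is a toy modern analogy (RSA-style, with tiny numbers I picked for illustration – nothing from Turing's own setting): extracting e-th roots mod n takes one modular exponentiation if you know the factored "presentation" n = p·q, and reduces to search if you only know n.

```python
# Toy trapdoor illustration (a modern analogy, purely illustrative):
# computing e-th roots mod n is easy given the "presentation" n = p*q,
# and hard (here: brute-force search) given only n. Numbers are tiny.

p, q = 101, 113
n = p * q                       # public modulus
e = 3                           # public exponent, coprime to (p-1)*(q-1)

m = 4243                        # a "message"
c = pow(m, e, n)                # anyone can compute this forward direction

# With the trapdoor (p, q), inversion is one modular exponentiation:
d = pow(e, -1, (p - 1) * (q - 1))
assert pow(c, d, n) == m

# Without the trapdoor, we are reduced to searching the whole space:
recovered = next(x for x in range(n) if pow(x, e, n) == c)
assert recovered == m
```

At toy scale the search is instant, of course; the point is only the asymmetry of work factor with and without the presentation.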

We do know that Turing was able to put together several tools of modern science and math. He was able to model a crypto wheel as a graph with generators, making Cayley graphs (and induced algebras). He was familiar with pullback relations between a graph algebra that acts as a cover for the coset graph. We know he was familiar with modeling an Enigma wheel as a convolution, with the graph acting on probability densities. And we know he could argue in terms of representation theory, being able to cast the action of a specifically non-abelian algebra in terms of discrete Fourier invariants. We also know he was familiar with the need to perform a search to find "those conditions" that produced certain types of mixers, in which the parameters could control how fast a random walk would converge to the stationary distribution. We know, too, that along this line of analysis he saw the opportunity to apply automorphism group theory as part of a scheme for relabeling nodes. Finally, we know he could apply semi-direct products – using a cycle graph to induce random permutations that generate the desired quantum randomness in the algorithm.
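The walk-as-convolution picture can be made concrete. Below is a small sketch of my own (the group S_3 and the generating set are arbitrary choices, not Turing's): one step of the walk convolves the current probability distribution on the group with the uniform measure on the generators, and repeated steps converge to the stationary (here uniform) distribution – the "mixer" behavior described above.

```python
# Random walk on a Cayley graph as repeated convolution: the group is
# S_3 (permutation tuples), with a symmetric generating set. Each step
# convolves the distribution with the uniform measure on the generators.
# (Group and generators are illustrative choices.)

from itertools import permutations

elements = list(permutations(range(3)))     # the 6 elements of S_3
index = {g: i for i, g in enumerate(elements)}

def compose(g, h):
    """Permutation composition: apply h, then g."""
    return tuple(g[h[i]] for i in range(3))

# Symmetric generating set: a transposition plus a 3-cycle and its inverse.
gens = [(1, 0, 2), (1, 2, 0), (2, 0, 1)]

def step(dist):
    """One convolution step of the walk."""
    new = [0.0] * len(elements)
    for i, g in enumerate(elements):
        for s in gens:
            new[index[compose(s, g)]] += dist[i] / len(gens)
    return new

dist = [0.0] * len(elements)
dist[index[(0, 1, 2)]] = 1.0                # start at the identity
for _ in range(50):
    dist = step(dist)

# After many steps the walk is close to uniform on the group.
print(max(abs(p - 1 / len(elements)) for p in dist))
```

How fast that maximum deviation shrinks is exactly the parameter question in the text: it is governed by the second-largest eigenvalue of the walk, which the choice of generators controls.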

In short, it's very likely that Turing knew how to search out conditions in which a certain amount of actual entropy could be used in a discrete quantum computation, modeled in discrete graph theory as above, to induce an expander graph delivering the desired conductance of a network. Furthermore, it's very likely that Turing, in classified papers, was able to analyze particular setup parameters in terms of their space/time implications for algorithm complexity – unless one happened to know a trapdoor that truncates the work factor. It seems likely that Turing was able to reason about the effect of signal-processing transforms, effected as matrices, upon norms.
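The spectral side of the expander/conductance claim can also be illustrated. The sketch below (the 3-regular circulant graph is an arbitrary small example of mine) computes the spectral gap of the normalized adjacency matrix; Cheeger's inequality then brackets the conductance between gap/2 and √(2·gap), which is the sense in which a large gap certifies good mixing.

```python
# Spectral gap and conductance sketch: for a d-regular graph, the gap
# between the top two eigenvalues of the normalized adjacency matrix
# bounds the conductance via Cheeger's inequality.
# (The graph is an arbitrary small 3-regular circulant on 8 vertices:
# each vertex i is joined to i-1, i+1, and i+4 mod 8.)

import numpy as np

n = 8
A = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1, i + 4):
        A[i, j % n] = 1

P = A / 3.0                                 # normalized (random-walk) matrix
eig = np.sort(np.linalg.eigvalsh(P))[::-1]  # eigenvalues, descending
gap = eig[0] - eig[1]                       # spectral gap; eig[0] == 1

# Cheeger's inequality:  gap/2 <= conductance <= sqrt(2*gap)
print(gap, gap / 2, np.sqrt(2 * gap))
```

An expander family is precisely one where this gap stays bounded away from zero as the graph grows, so the walk mixes in logarithmically many steps – the "desired conductance" property named above.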