One thing Turing was known for, in the first computing period to follow WWII, was his way of showing how to add large numbers with the digits written out, unusually, least significant digit first. The carry rule still propagated from the low-order digit onward, so the arithmetic remained correct. Of course, any first-year computer science student studies the ones'-complement and two's-complement “representations” – the encodings of signed integers that make them amenable to addition by digital electronics. The study of all this has a much deeper background, however, arising from the historical interplay between the art of machine-based cryptanalytical methods and the study of the limits of machine computability – a historical topic still largely undiscussed, and probably treated formally even today as if the theory were still a “secret weapon”.
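The two's-complement arithmetic the student meets can be sketched in a few lines. This is a minimal illustration only, not a description of any historical machine; the word width and function names are my own, and bits are stored least significant first to echo the digit order described above:

```python
# Illustrative sketch: two's-complement encoding of signed integers in a
# fixed width, with addition via a carry that propagates from the
# least significant bit.

WIDTH = 8  # assumed word width, for illustration only

def to_twos_complement(n, width=WIDTH):
    """Encode a signed integer as a two's-complement bit list, LSB first."""
    return [(n >> i) & 1 for i in range(width)]

def add_bits(a, b, width=WIDTH):
    """Add two LSB-first bit lists; the carry moves from the low-order end."""
    carry = 0
    out = []
    for i in range(width):
        s = a[i] + b[i] + carry
        out.append(s & 1)
        carry = s >> 1
    return out  # the final carry is discarded, i.e. arithmetic mod 2**width

def from_twos_complement(bits):
    """Decode an LSB-first two's-complement bit list to a signed integer."""
    width = len(bits)
    n = sum(b << i for i, b in enumerate(bits))
    return n - (1 << width) if bits[-1] else n

# -7 + 3 == -4 under 8-bit two's complement
print(from_twos_complement(add_bits(to_twos_complement(-7), to_twos_complement(3))))
```

The same carry loop handles positive and negative operands alike, which is exactly what makes the representation attractive for a simple adder circuit.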

The presenter of the video at https://www.youtube.com/watch?v=vdjYiU6skgE does a good job of stating what folks knew in the 1930s about the mathematical toolkit of “techniques” underlying cryptanalytical computing. The cryptanalytical “math secrets” of the WWII period also concerned this “theory” of computing/cryptanalysis – founded, as we see from the video presentation, in projective geometry, p-adic distance metrics, special relativity (for timebases), conformal field ideas, Cauchy sequences, and notions of causality between regions of certain geometric spaces. The use of such apparatus to represent large numbers had a simple initial objective: express numbers and fractions at high accuracy so as to finesse Colossus-style processing of the correlations to be found in noisy signaling channels. This would have enabled cryptanalysts, even in 1947, to hold, say, 200-digit-accuracy decimals in the flexible 1946-era drum memory used by the first Manchester computers. The resulting adders could then do p-adic arithmetic – or two's-complement arithmetic (when p = 2). And “this was the secret”.
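The claimed link between p-adic arithmetic and two's complement can at least be illustrated for p = 2: working modulo 2**k, each integer – and even a fraction like 1/3 – has a residue whose low-order bits stabilize as k grows, which is the truncated 2-adic expansion. A hedged sketch (the precision k = 16 and the names are mine, not from the video):

```python
# Illustrative only: the 2-adic reading of two's complement.
# Arithmetic modulo 2**k is 2-adic arithmetic truncated to k bits.

K = 16
MOD = 1 << K

def low_bits(n, k=K):
    """Low-order k bits of n modulo 2**k, least significant bit first."""
    r = n % (1 << k)
    return [(r >> i) & 1 for i in range(k)]

# -1 is the all-ones pattern ...1111 at every precision:
print(low_bits(-1))

# Subtraction agrees with two's-complement hardware: (-7 + 3) mod 2**k
print(low_bits(-7 + 3) == low_bits(-4))

# Even a fraction such as 1/3 exists 2-adically: the inverse of 3 mod 2**k.
third = pow(3, -1, MOD)   # modular inverse; requires Python 3.8+
print((third * 3) % MOD)  # recovers 1, so `third` really behaves as 1/3
```

Raising k gives more 2-adic digits of accuracy, which is the sense in which long high-precision operands – the “200-digit decimals” of the passage – fit the same adder.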

But p-adics are more fundamental to cryptanalysis than mere two's complement – they are the “raw theory” of branching. When the Yanquis claim that only their own first research machines realized the modern (i.e. Turing) “stored program concept”, based on a tape-stored instruction indicating a branch, we might ask whether that was only one form of the underlying branching theory – not that you will hear such a question posed by the NSA-crypto (and therefore American) historians.

P-adics should be thought of as Turing was taught them in Newman’s “special” topology classes – as the nesting of certain geometric relations, and then the mapping of the nested spaces onto one-dimensional lattices and the tree structures underlying space-filling curves – where the branching of such trees is rather intuitive. Branching in this sense – through implied spaces – is rather more “algorithmic” than merely having some stored-program processor recognize the ‘conditional jump’ bit pattern on a tape!
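One way to make the “nesting” picture concrete is the p-adic valuation: the integers sit in a p-branching tree, and two integers share a branch to depth v exactly when p**v divides their difference – which is what the p-adic metric measures. A sketch under that reading (the function names are mine, and this is my gloss on the passage, not Newman’s lectures):

```python
# Illustrative sketch of the p-adic valuation and metric.

def vp(n, p):
    """p-adic valuation: the exponent of p dividing n (infinite for 0)."""
    if n == 0:
        return float("inf")
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def p_adic_distance(a, b, p):
    """|a - b|_p = p**(-vp(a - b)): small when a and b share a deep branch."""
    if a == b:
        return 0.0
    return p ** (-vp(a - b, p))

# 2 and 10 differ by 8 = 2**3, so they are 2-adically close:
print(p_adic_distance(2, 10, 2))   # 0.125
# 2 and 3 differ by 1, so they sit on branches that split at the root:
print(p_adic_distance(2, 3, 2))    # 1
```

Under this metric, “closeness” means agreement in many low-order base-p digits – the tree-nesting intuition in numerical form.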

The art of cryptanalysis is now, and was in Turing’s day, all about having computing models that no one thinks you even might have… let alone thinks you have reduced to practice (and certainly not made into cost-effective technologies). Iterative algorithms for updating Colossus-style calculations of conditional probabilities – teasing out the most likely candidates for solving some crypto puzzle – all have their roots in discrete Schrödinger-equation calculations, hyperbolic surfaces, and conformal coordinates, and they leverage symmetric matrices to update the convergence-based Newtonian “root-finding algorithms” as more and more depths of information were fed into the graph search.
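The “root-finding algorithms” invoked here are presumably Newton-style iterations, refining an estimate as each new increment of information arrives. As a minimal generic sketch of that iteration – nothing recovered from Colossus-era practice, and the tolerance and names are my own:

```python
# Illustrative Newton's-method sketch: repeatedly correct an estimate x
# by the step f(x)/f'(x) until the correction is negligible.

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x -> x - f(x)/df(x); stop when the step falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of x**2 - 2: the iterates converge quadratically toward sqrt(2).
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0))
```

The relevant feature for an iterative search is that each pass reuses all prior refinement: feeding in another “depth” of data only moves the estimate, it never restarts it.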