Having seen how his work on an imaginary computing machine helped develop the digital computer by classifying the part of mathematical space such an apparatus could decide, Turing turns to proposing an imaginary biological machine. This machine is concerned with letting the decidability of computable numbers support the simulation of non-decidable processes – of the kind found in biology. Computers can “think” (like brains) … if they can simulate that which they cannot decide.
It’s kind of hard to summarize what the author already summarized so well in his opening paragraph – a fragment of which we now quote:
This biological machine builds on Turing’s earlier imaginary machine in the sense that it explores using numerical algorithms, running on the first digital computers, to let equations model the physical properties of the simulated machine; and it explores how structures in the programming of the machine could support the geometrical aspects of that machine too, applying decoding algorithms that leverage reverse probabilities to emulate backwards reasoning. Matrix-based and automata-based coding and decoding methods were thus built on the imaginary computing device in order to let it emulate the new imaginary biological machine. We can guess that that era’s mathematical understanding of (de)coding was applied, letting structures decode based on the weights of sequences, or code using generative (linear) programming methods (see the sketch below).
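To make that coding-theory flavor concrete, here is a minimal sketch – ours, not anything in Turing’s text – of both halves of the picture: coding with a generator matrix (“generative linear programming”) and decoding by the weights of sequences, using the little [7,4] Hamming code as the toy example.

```python
# Toy illustration (ours, not Turing's): encoding with a generator
# matrix over GF(2), and decoding a received word by minimum Hamming
# weight of the difference -- "weights of sequences" in miniature.
import itertools
import numpy as np

# Generator matrix of the [7,4] Hamming code, in systematic form.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def encode(msg):
    """Codeword = msg . G (mod 2): the generative, linear step."""
    return np.mod(np.array(msg, dtype=np.uint8) @ G, 2)

def decode(received):
    """Brute-force nearest-codeword decoding by Hamming weight."""
    best_msg, best_wt = None, 8
    for msg in itertools.product([0, 1], repeat=4):
        wt = int(np.sum(encode(msg) ^ received))
        if wt < best_wt:
            best_msg, best_wt = msg, wt
    return best_msg

cw = encode([1, 0, 1, 1])
cw[2] ^= 1                 # flip one bit "in transit"
print(decode(cw))          # -> (1, 0, 1, 1): the single error is corrected
```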
What he says in his own words is rather clearer (and somewhat more accurate!)
Considering whether what Turing had learned about coding theory in his GC&CS and other WWII security “research” days could be related to what he has to say about genes and rates of reactions, we note what he says:…
Obviously, the biological reality of genes is only a representational artifact – for Turing’s purposes of postulating a new imaginary machine. If we look beyond Turing’s own writing and consider the commentators, we see a nice explanation of what’s going on (stripped of its history of association with coding theory, rotated-upright generators, cyclic code generator matrices, roots of unity, etc.). We get to the modern conception… of “Turing instability”.
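For reference, the modern textbook statement (in today’s notation, not Turing’s own) runs roughly as follows. Linearize a two-morphogen reaction-diffusion system about its homogeneous steady state:

$$\partial_t u = f(u,v) + D_u \nabla^2 u, \qquad \partial_t v = g(u,v) + D_v \nabla^2 v.$$

Writing $f_u, f_v, g_u, g_v$ for the Jacobian entries at that state, the state is stable in the absence of diffusion when $f_u + g_v < 0$ and $f_u g_v - f_v g_u > 0$; yet with unequal diffusion rates a band of spatial wavenumbers grows whenever

$$D_v f_u + D_u g_v > 2\sqrt{D_u D_v \,(f_u g_v - f_v g_u)},$$

and the fastest-growing wavenumber sets the length-scale of the resulting pattern.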
We recall how in DES we saw length-scales being applied, built on the geometry of the stretched time *line* formed by the two parallel processes built into the Feistel cycle, acting in collaboration with the perturbations of the key (amplified by the expansion register’s own dynamics). Slowly, a unique distance measure forms that is a function of the plaintext; and a conjugate measure forms up from the key, also as a function of the plaintext. The net result, as explored in difference-set studies, is a characteristic that contrasts local with remote effects – producing a unique mixer, which nonetheless exhibits the properties of its class.
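For readers who want the Feistel shape in front of them, a minimal skeleton – a toy of ours, not DES itself, whose round function with its expansion, S-boxes and permutation we omit – showing the two parallel half-block processes, with the round key perturbing one half on every cycle:

```python
# Minimal Feistel skeleton (a toy, not DES): two half-blocks evolve in
# parallel, with the round key perturbing one half through the round
# function F on each cycle. Decryption runs the keys in reverse.
def feistel_encrypt(left, right, round_keys, F):
    for k in round_keys:
        left, right = right, left ^ F(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, F):
    for k in reversed(round_keys):
        left, right = right ^ F(left, k), left
    return left, right

# Deliberately weak toy round function on 32-bit halves.
F = lambda half, key: ((half * 0x9E3779B9) ^ key) & 0xFFFFFFFF

keys = [0xDEADBEEF, 0x0BADF00D, 0xCAFEBABE, 0x12345678]
ct = feistel_encrypt(0x01234567, 0x89ABCDEF, keys, F)
print(feistel_decrypt(*ct, keys, F))   # -> the original two halves
```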
Open research on PDEs estimating large-growth Fibonacci sequences would not have gained Turing any friends in the period in question – given its applicability to codes and ciphers, and given the methods of numerical simulation it was using (not dissimilar to the UK work on simulating the evolution of nuclear explosions, also being calculated on the same class of digital computer).
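As a quick numerical aside (our illustration, not Turing’s program): the ratios of successive Fibonacci numbers converge to the golden ratio, the constant that keeps turning up in phyllotaxis measurements.

```python
# Successive Fibonacci ratios F(n+1)/F(n) converge to the golden ratio
# phi = (1 + 5**0.5) / 2 ~= 1.6180339887, the divergence-angle constant
# of phyllotaxis.
a, b = 1, 1
for _ in range(15):
    a, b = b, a + b
    print(b / a)          # 2.0, 1.5, 1.666..., ... -> 1.6180339887...
```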
The net result of growth-constrained reaction-diffusion on the sequence of Fibonacci ratios is clearly illustrated here:
In short, if the local/far transport theory holds, then symmetry is perturbed along a predictable (though probabilistically-defined) evolution path, leading to the decision to spot two points (say) … from which two cylindrical legs grow from the cylindrically/circularly symmetrical embryo (much as two leaf stems spot off from the nicely cylindrical main stem). The subsequent spotting of four fingers from the arm “stem” is the nth evolution of the sequence, with fingers logically spotting off from the “cylinder” of the wrist, etc.
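To watch that spotting happen numerically, here is a minimal 1D reaction-diffusion sketch on a ring – our stand-in, using the well-known Gray–Scott model with a standard spot-forming parameter choice, not anything from Turing’s paper – in which a small local perturbation of the homogeneous state condenses into separated peaks:

```python
# 1D Gray-Scott reaction-diffusion on a ring: a local perturbation of
# the uniform state condenses into separated "spots" (peaks of v) --
# a crude stand-in for the symmetry-breaking described above.
import numpy as np

N, steps = 200, 10000
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065   # standard spot-forming regime

u = np.ones(N)
v = np.zeros(N)
mid = slice(N // 2 - 5, N // 2 + 5)
u[mid], v[mid] = 0.5, 0.25                # small local perturbation

lap = lambda z: np.roll(z, 1) + np.roll(z, -1) - 2 * z  # periodic Laplacian

for _ in range(steps):                    # explicit Euler, dt = dx = 1
    uvv = u * v * v
    u += Du * lap(u) - uvv + F * (1 - u)
    v += Dv * lap(v) + uvv - (F + k) * v

# Count the separated spots (contiguous runs where v is elevated).
spots = np.flatnonzero(np.diff((v > 0.2).astype(int)) == 1).size
print("separated spots:", spots)
```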
OK, we see the general reasoning style – in which numerical algorithms computing PDEs allow the limited computing machine to emulate a machine that it cannot “represent” directly. Math comes to the rescue as an intermediate language of expression, and relaxation algorithms tuned up for approximating differential equations let the machine do work that emulates nature – once that, too, is reduced to discreteness. In this sense the machine is limited to what the brain does, and has the power of the brain – having the same fundamental limits and possibilities.
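Since “relaxation” is doing real work in that sentence, here is a sketch of the simplest such routine – Jacobi relaxation for a discretized Laplace equation, the kind of workhorse iteration the early machines ran (our toy, not Turing’s code):

```python
# Jacobi relaxation for Laplace's equation u'' = 0 on [0, 1] with fixed
# ends: each sweep replaces every interior value by the average of its
# neighbours, until the grid "relaxes" to the solution (here, a straight
# line between the two boundary values).
import numpy as np

u = np.zeros(51)
u[0], u[-1] = 0.0, 1.0              # fixed boundary values

for sweep in range(20000):
    new = u.copy()
    new[1:-1] = 0.5 * (u[:-2] + u[2:])   # each point -> neighbour average
    done = np.max(np.abs(new - u)) < 1e-8
    u = new
    if done:
        break

print(sweep, u[25])                 # midpoint has relaxed toward 0.5
```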
So where is Turing going in his later manuscript? Sidelined at Manchester (as the folks in the core are interested in the computer, firmware and libraries, and only peripherally in the application of computations), what is his thinking? How important is the math (to pass the review standards of a math journal, say)?
Anyway, those questions we can table pending additional reading. For now, we can simply collect basics:
which gives context to modeling phase as treated on the complex plane, as taken from Turing’s own published paper:
Now, our own leap may not seem particularly intuitive (yet). First, we set the scene:
and then focus on how we can link A and A’ in the complex world to correlation matrices in the DES/AES world, where plaintext == homogeneity (in the phyllotaxis world).
Now, the analysis in the Deamon paper goes on to show how the Tunny-era sign function is a basis – of the WHT. In the world of cryptanalysis, folks are interested in the dependencies between states, as “measured” by a custom measuring stick that is “correlation aware”. Probably on the basis of group-function parity analysis, folks had figured out by 1945 that one can relate the weights of codes to the sign (a.k.a. n-dimensional parity) function of the code in question.
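To pin down that sign-function/weight link in modern notation (our reconstruction, not the 1945 formulation): the Walsh–Hadamard spectrum of a Boolean function’s ±1 sign form consists of its correlations with every parity (linear) function, and each coefficient determines the weight of the function XORed with that parity.

```python
# Walsh-Hadamard spectrum of a Boolean function's +/-1 "sign" form.
# Each coefficient W[a] = sum_x (-1)^(f(x) XOR a.x) is the correlation
# of f with the parity (linear) function a.x, and gives the Hamming
# weight of f XOR a.x via:  weight = (2^n - W[a]) / 2.
def wht(vals):
    """Fast Walsh-Hadamard transform (length must be a power of two)."""
    vals, h = list(vals), 1
    while h < len(vals):
        for i in range(0, len(vals), 2 * h):
            for j in range(i, i + h):
                x, y = vals[j], vals[j + h]
                vals[j], vals[j + h] = x + y, x - y
        h *= 2
    return vals

n = 3
f = lambda x: ((x & 1) & ((x >> 1) & 1)) ^ ((x >> 2) & 1)  # toy: x0*x1 XOR x2
signs = [(-1) ** f(x) for x in range(2 ** n)]
W = wht(signs)
for a, w in enumerate(W):
    print(f"parity {a:03b}: correlation {w:+d}, weight of f^parity = {(2 ** n - w) // 2}")
```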
Now, for a Turing, who is perfectly at home moving between real, complex and WHT spaces, the kinds of rotation operations he was doing earlier (in the permutation-group representation of any group) correspond to the dyadic shift operation – once the world of Boolean operations has been transformed into the dyadic world via the WHT. But this is not enough – since that is an entirely ‘linear’ concept set.
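That dyadic-shift/rotation correspondence can be checked directly – and the check also shows the linearity just complained about: XOR-shifting a sequence merely multiplies each WHT coefficient by a sign, exactly as a cyclic rotation multiplies Fourier coefficients by a phase. A self-contained sketch (re-defining the wht helper so the snippet stands alone):

```python
# Dyadic-shift theorem: if t[x] = s[x ^ shift], the spectra satisfy
# T[a] = (-1)^(shift . a) * S[a] -- the dyadic analogue of
# "cyclic rotation = phase factor" in the ordinary Fourier world.
def wht(vals):
    vals, h = list(vals), 1
    while h < len(vals):
        for i in range(0, len(vals), 2 * h):
            for j in range(i, i + h):
                x, y = vals[j], vals[j + h]
                vals[j], vals[j + h] = x + y, x - y
        h *= 2
    return vals

n, shift = 3, 0b101
seq = [3, -1, 4, 1, -5, 9, 2, -6]          # arbitrary test sequence
S = wht(seq)
T = wht([seq[x ^ shift] for x in range(2 ** n)])
assert all(T[a] == (-1) ** bin(a & shift).count("1") * S[a]
           for a in range(2 ** n))
print("dyadic shift <-> per-coefficient sign flip: verified")
```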
Applied now to phyllotaxis and PDEs, remember how we saw (in the complex plane) an evolution of A from A’ (given a phase change), much like we saw in the permutations paper an evolution of the Rf (a composite of many Rfs, once normed)? If the condensation effect happens on the foundational permutation group, one may assume (via correlation matrices and weights, and distributions of weights in the support space) that there is an equivalent condensation effect in the phyllotaxis world. There are a couple of “phase transitions” in the works…
He explains this via the metaphor of the cannibals and missionaries, and the evolution of the change operator that “condenses” into patterns of near and far transportation of localized changes. Once the food runs out (in what would otherwise be a runaway condition), the system still finds an equilibrium, limiting generational growth. In short, a leg (once budded) only grows so long (up to its max, in a ~12-year-old). Thereafter, we are in a space Turing doesn’t address (where the organism shifts its fundamental basis … from child to adult).
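The equilibrium in that metaphor has a standard minimal model – logistic growth, where the very term that drives the early runaway is throttled as the “food” is consumed. A sketch of ours (a gloss on the metaphor, not Turing’s equations):

```python
# Logistic growth: dx/dt = r*x*(1 - x/K). The term driving the early
# exponential runaway (r*x) is throttled as the "food" is consumed,
# so growth self-limits at the carrying capacity K.
r, K, dt = 0.9, 1.0, 0.01
x = 0.01
for _ in range(3000):                 # explicit Euler out to t = 30
    x += dt * r * x * (1 - x / K)
print(round(x, 4))                    # -> ~1.0: equilibrium, growth stops
```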