Von Neumann algebras – or subroutines!



Things are getting worrying. Whereas once the above text would have seemed like nonsense, I can now almost hear the writer (and his passion for the topic). What’s more, I can start to hear the message – despite all the formal terminology about rather abstract “constructive” computational models. Math is fun (when delivered in essay form).

For context, let’s quote:



We can build an intuitive interpretative model – one that uses simple computer-programming analogies.

“If a Lie group G acts (smoothly) …” means G is a program, like the one running in the page you are probably looking at.

“its Lie algebra acts by differential operators” means the program has certain subroutines named A, B, C, … that each compute a particular formula – probably expressed as some horrendous expression. Perhaps the expression is a statistical function, getting at a (differential) density.
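To make the analogy concrete, here is a minimal sketch (mine, not the quoted author’s) in Python. The rotation group SO(2) plays the role of the “program” G, and its single “subroutine” is the generator L = x ∂/∂y − y ∂/∂x, realized numerically by central differences. The names `rotate`, `generator_action`, and the sample function `f` are all invented for illustration.

```python
import numpy as np

def rotate(theta, p):
    """A group element R_theta of SO(2) acting on a point p in R^2."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([c * p[0] - s * p[1], s * p[0] + c * p[1]])

def f(p):
    """A sample smooth function on M = R^2 (a stand-in for C^infinity(M))."""
    return p[0] ** 2 * p[1]

def generator_action(g, p, h=1e-6):
    """The Lie algebra 'subroutine': L = x d/dy - y d/dx, via central differences."""
    dgdx = (g(p + [h, 0]) - g(p - [h, 0])) / (2 * h)
    dgdy = (g(p + [0, h]) - g(p - [0, h])) / (2 * h)
    return p[0] * dgdy - p[1] * dgdx

p = np.array([1.0, 2.0])
h = 1e-6
# Differentiate f along the group orbit at theta = 0 ...
orbit_derivative = (f(rotate(h, p)) - f(rotate(-h, p))) / (2 * h)
# ... and apply the generator directly: both give the same number.
direct = generator_action(f, p)
```

Differentiating along the group orbit and applying the generator directly agree – which is the whole content of “the Lie algebra acts by differential operators”.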

The thing that these are “acting” on … is the data set – an input file of numbers, in all probability. Of course the file has a format, the numbers have types, there are ranges, there is a theory of the “file format”. The file may even have its own algebra (e.g. the “SQL tuple-centric” relational algebra). But at the end of the day it’s just a stream of numbers, whose inner format the program has to know!

The process of acting on “the space C^{\infty}(M)” just expresses the “file format” and the model behind the numbers within. M is the underlying type of the numbers and the “inner format” governing the stream, and C is the complex form of stream-readable terms that folks have decided to use (because, contrary to the intuitions of a 14-year-old doing school math, complex analytic forms are actually easier to work with, as are polar numbers and the exponential forms of trig!). (What does the C notation refer to, by the way? It’s probably not the complex-number type, given the font.)

There is, then, an infinite-length stream of such readable terms – so don’t bother programming with finite-size buffers! Think more in terms of “convolution” programming, in which you may well maintain state(s) from which you generate your own stream(s), which then interact, term by term, with terms from the common input stream. If it helps to think in terms of convolutional encoders and turbo-code stream generators – built from finite state machines, with state spaces represented using shift registers and feedback loops (and the shared meta-state of the loops as a group) – then do so!
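As a toy illustration of that “convolution programming” picture (my sketch, not anything from the quoted text): a 4-bit Fibonacci LFSR serves as the finite-state “stream generator”, and its output is combined term by term with a common input stream by XOR. The names `lfsr` and `convolve_stream` are invented for this example.

```python
from itertools import islice

def lfsr(state, taps):
    """A Fibonacci LFSR: a finite state machine (shift register plus
    feedback loop) that yields one output bit per step, forever."""
    state = list(state)
    while True:
        out = state[-1]
        feedback = 0
        for t in taps:
            feedback ^= state[t]
        state = [feedback] + state[:-1]  # shift right, feed back on the left
        yield out

def convolve_stream(input_bits, keystream):
    """Combine the common input stream with a generated stream, term by term."""
    for b, k in zip(input_bits, keystream):
        yield b ^ k

# With taps (2, 3) this 4-bit register cycles through all 15 nonzero
# states before repeating (a maximal-length sequence).
keystream = lfsr([1, 0, 0, 0], taps=[2, 3])
message = [1, 0, 1, 1, 0, 0, 1, 0]
combined = list(convolve_stream(message, keystream))
```

XOR is its own inverse, so convolving the combined stream again with a freshly seeded copy of the same generator recovers the original message stream.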

That ‘operators are the “infinitesimal generators”’ which can “give conserved quantities” concerning “the evolution of a quantum system on M” just means that your set of subroutines works cooperatively (each feedback loop producing one computed stream, with a particular set of shift-register feedbacks inducing a particular cyclic finite group, perhaps). That is, the XOR’ed output of the collected substreams might, in fact, represent a (computed) higher-level algorithmic result. Perhaps the result is a “representation” (my term) of a matrix that can itself transform some (other) input bit-vector into an output, which the inverted matrix (in its “computed form”) can transform back into a value identical to the original input bit-vector! We are used to thinking of a matrix as a kind of program, too! We have a program computing a program – or, rather, a codebook whose generated codewords define a higher-level codebook (which is assuredly how Turing thought about all this, though he kept such “simple” thinking secret)!
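The “matrix as a program, with an inverse that undoes it” idea can be sketched directly over GF(2) (again my illustration; `gf2_matvec` and `gf2_inverse` are invented names): an invertible bit-matrix maps an input bit-vector to an output, and its inverse – computed here by Gauss-Jordan elimination – maps the output back.

```python
def gf2_matvec(M, v):
    """Apply the bit-matrix 'program' M to the bit-vector v over GF(2)."""
    return [sum(m & x for m, x in zip(row, v)) % 2 for row in M]

def gf2_inverse(M):
    """Invert a square bit-matrix over GF(2) by Gauss-Jordan elimination
    on the matrix augmented with the identity."""
    n = len(M)
    A = [row[:] + [int(i == j) for j in range(n)] for i, row in enumerate(M)]
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r][col])  # assumes M invertible
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                A[r] = [a ^ b for a, b in zip(A[r], A[col])]
    return [row[n:] for row in A]

M = [[1, 1, 0],
     [0, 1, 1],
     [0, 0, 1]]                          # an invertible bit-matrix: the "forward program"
v = [1, 0, 1]
w = gf2_matvec(M, v)                     # transform the input bit-vector
back = gf2_matvec(gf2_inverse(M), w)     # the inverse transforms it back: back == v
```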

So, we have a manifold (data set!), a complex representation (the type of the datums in the data set), and algebras (subroutines) of a group (program) computing a set of intermediate streams that convolve with the input stream to give an interim combination (a representation of a matrix “program”) that is itself a “data-program” converting inputs to outputs (or vice versa).

In the case he is interested in, the math is an embodiment of the computational model only when one picks certain groups (so that symmetries in nature’s way of handling energy define special-case rules governing diffusion in random or quantum walks).

That the operator algebras (subroutines) can exist outside the group (the program) is known as… a “library” of interesting/useful stuff. In our terms, certain finite groups are more useful than others! In Turing’s (GCCS-era) thinking, there is a set of cyclic “base” groups that it’s worth keeping in mind as one builds layered codeword/code-generating systems.

We have to remember that von Neumann and Turing – while doing pure math applied to cryptanalysis – were very much focused on using the only tools they had to define what we now call programming/computer science. It’s tempting to assume they were engaged in some “higher plane” of pure Platonic research known only to those with math brains. But it’s more likely they were just using (higher) math “apparatus”, and known special cases of “constructive proofs”, because those were all the theoretical tools they had to define what we now call “making software”.

One has to realize that Turing’s study of the Riemann zeta function was more about studying random walks (for 1930s cryptanalysis-theory purposes) than about finding this or that algorithm. Similarly, he studied different manifolds because of their properties as self-defining structures (having intrinsic coordinate systems, yada yada). He is interested in quantum algorithms (1920s physics; a 1950s “held secret” cryptanalysis method, rediscovered publicly in 1988), and in particular in the extension of the central limit theorem to the quantum data model as a “computational method” for asymptotic computation – working on streams all along. The study of Enigma is a side-line (to this intellectual work) – but, along with Colossus (leveraging Newman’s deeper understanding of topology-driven algorithms, as applied to the Tunny break and the design of the Colossus cryptanalysis computer), it helps show the groundedness of all the theoretical modeling.

Now the question we are left with is this: was the Phi (Q?) function Turing uses all along merely a reference to the Phi map being discussed in this topic area – that abstract map which, for any algebra, allows one to define some instance or other of an “infinitesimal symmetry” (and thus capture, very generally, the nature of Hamiltonian dynamics, including those found in DES… and permutation-based crypto generally)?



While we tend to read what Turing was writing in the sense of Markov chains (and the way in which 1940s cryptanalysis teased out the marginals from conditional probabilities in communication channels to control the search space of keys), we have to remember JUST HOW FASCINATED Turing was with mathematical physics. That is, he’d be perfectly happy taking a pure-math argument about Hamiltonians and time-independent motion from the physics domain and casting the result into the specialized domain of cryptanalysis and coding.


About home_pw

Computer programmer who often does network administration with a focus on security servers. Sometimes plays at slot machine programming.
This entry was posted in coding theory, crypto, early computing.