http://qip2011.quantumlah.org/scientificprogramme/abstract/1002.2419p.pdf (slide show)

http://qip2011.quantumlah.org/scientificprogramme/movie.php?id=1002.2419 (slide show with context…)

http://arxiv.org/abs/1002.2419 (paper version)

The paper does a great job of putting into simple geometric diagrams the notion of an eigenspectrum, and the change of “representation” that occurs when one adopts the quantum apparatus of analysis.

This is actually a return to the 1930s – when folks were far more comfortable “modeling” on the complex (unit) circle.

Perhaps it’s inappropriate to project this material back onto interpreting what’s going on in Turing’s “On Permutations” manuscript, but using this (excellent) minimal set of concepts we can do so. The parallels are quite amazing.

First, I’ll take it as undoubted that Turing had mastered graph theory and the theory of halting, and that he had likewise mastered the central limit theorem. In this topic area, rather than define a Turing machine that interprets graphs and halting on “marked states” as a number whose binary fraction recurs forever (once one has reached the stopping/halting state), we just work with the graphs themselves. The random walk, using normal cbits, allows us to model a stationary distribution – and capture homogeneity. Where Turing said K is the constant density, folks here say: call it 1; it’s the largest eigenvalue in the spectrum. Much as in linear cryptanalysis, every other component density making up the overall stationary distribution sits (on the “eigenvalue scale”) at some distance from K (or 1). In cbit space, one is projecting onto the constant vector.
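A minimal sketch of that cbit-space picture (my own toy graph, not one from the paper): repeatedly applying a stochastic transition matrix projects any starting distribution onto the eigenvalue-1 eigenvector – the stationary, “constant density” that Turing would call K.

```python
# Row-stochastic transition matrix P for a random walk on a 3-node
# complete graph: P[i][j] = Pr(move from node i to node j).
P = [
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
    [0.5, 0.5, 0.0],
]

def step(dist, P):
    """One step of the walk: dist' = dist @ P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Power iteration: every other eigenvalue has magnitude < 1, so its
# component decays and only the eigenvalue-1 (stationary) part survives.
dist = [1.0, 0.0, 0.0]          # start concentrated on node 0
for _ in range(100):
    dist = step(dist, P)

print([round(x, 6) for x in dist])   # -> [0.333333, 0.333333, 0.333333]
```

Since this graph is regular, the stationary distribution is uniform – the “constant vector” the text mentions.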

When one now uses a quantum walk (instead of a random walk), one is thinking in terms of qbits (rather than classical bits, or cbits). Now we need unitary processes (i.e. reversible processes). Of course, Turing built such a model out of a sequence of Enigma-style rotors, considering the sequence of several inputs and outputs in a chain, to give him his model of a unitary process.
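A sketch of that rotor analogy (my own illustration, not Turing’s actual construction): each rotor is a permutation, every permutation matrix is unitary, and a chain of rotors multiplies their matrices – and a product of unitaries is again unitary, i.e. reversible.

```python
def perm_matrix(perm):
    """Permutation -> 0/1 matrix with a 1 at (i, perm[i])."""
    n = len(perm)
    return [[1.0 if j == perm[i] else 0.0 for j in range(n)]
            for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def is_unitary(U, tol=1e-9):
    """Real case: U is unitary iff U^T U = I."""
    n = len(U)
    UtU = matmul(transpose(U), U)
    return all(abs(UtU[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(n) for j in range(n))

rotor1 = perm_matrix([1, 2, 3, 0])   # a cyclic shift
rotor2 = perm_matrix([2, 0, 3, 1])   # some other hypothetical wiring
chain = matmul(rotor1, rotor2)       # two rotors in sequence

print(is_unitary(chain))   # -> True
```

The reversibility is literal: the inverse of the chain is just the rotors run backwards (each transposed).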

Now what is interesting about the modern material is that the author constructs a “channel” – an “interpolation” weighted by the probability that either the marked density (P′) or the unmarked density (P) is in operation. The joint density (of the channel) can then be modeled as π(s), and one can project onto this “state vector”.
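The interpolation itself is just a convex blend of the two transition matrices – a sketch under my own notation (the paper’s construction is more elaborate): P(s) = (1 − s)·P + s·P′, where P′ makes the marked nodes absorbing. Each P(s) is still stochastic, so it carries its own stationary density π(s).

```python
def interpolate(P, P_marked, s):
    """P(s) = (1 - s) * P + s * P_marked, entrywise."""
    n = len(P)
    return [[(1 - s) * P[i][j] + s * P_marked[i][j] for j in range(n)]
            for i in range(n)]

# Unmarked walk on a 2-node graph, and its marked version in which
# node 1 is absorbing (the walk halts there).
P  = [[0.5, 0.5],
      [0.5, 0.5]]
Pm = [[0.5, 0.5],
      [0.0, 1.0]]

Ps = interpolate(P, Pm, 0.5)
# A convex mix of stochastic matrices is stochastic: rows still sum to 1.
print([round(sum(row), 6) for row in Ps])   # -> [1.0, 1.0]
```

At s = 0 one has the free walk; at s = 1 the fully absorbing one; sliding s traces out the channel’s family of stationary densities.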

Now, it’s highly intuitive that the density of U and that of M are a bit like the applied maths we did at 16 – the sin() and cos() contributions – allowing projection distances to be calculated.

Now, “thinking in phase space”, we can leave behind the cbit analogy and just THINK in terms of phase angles – since we just MOVE TO a measure of the set (of marked/unmarked vectors) which is in the eigenvalue basis. Now the angular rotation from the stationary distribution of the marked covering graph is the “distance” of the projection onto the current state of the joint density of this channel.
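The sin()/cos() intuition above can be made concrete (my own toy numbers): the “distance” between the current state and the stationary density is just an angle – normalise both vectors, take their overlap, and read off the cosine.

```python
import math

def angle(u, v):
    """Angle between two vectors via their normalised overlap."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(x * x for x in w))
    return math.acos(dot / (norm(u) * norm(v)))

# Uniform stationary state over 4 nodes (as amplitudes, unit length)
# versus a walk parked entirely on one node:
pi_state = [0.5, 0.5, 0.5, 0.5]
e0       = [1.0, 0.0, 0.0, 0.0]

theta = angle(pi_state, e0)
print(round(math.cos(theta), 6))   # overlap with stationary state -> 0.5
```

The cos of the angle is the projection onto the stationary density; the sin is the component orthogonal to it – exactly the two “contributions” from school trigonometry.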

Then we see what we saw in Turing, considering the quadratic relationship. Turing himself was actually focused on making a code (rather than on the corresponding cryptanalysis problem). So we see him using particular properties of normal subspaces to create a particular set of marked nodes, and he wants to show that, indeed, in superposition space there is a uniform probability of any state evolving, at some limiting distribution of his linear functional.

OK – that essay gets 10 for content, 2 for style!