

[image: https://terrytao.wordpress.com/2011/12/06/254b-notes-2-cayley-graphs-and-kazhdans-property-t/]

U^r is a generator.

[image: http://turingarchive.org/viewer/?id=133&title=27]

So for Turing all the elements are generators (since the normalizing factor is 1/h, where h is the order of the group).
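A minimal check of that reading, in Python (the group Z_5 is my illustrative choice, nothing from the paper): in a cyclic group of prime order, every non-identity element generates the whole group.

```python
# Every non-zero element of Z_p (p prime) generates the whole group.
p = 5
for g in range(1, p):
    orbit = sorted({g * k % p for k in range(p)})
    print(g, orbit)   # each g sweeps out all of {0, 1, ..., p-1}
```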

[image]

Now we see k/d, the density, and we see compact support.

[image: http://turingarchive.org/viewer/?id=133&title=28]

This gives decay in convolution space (and thus expanders, in 1954 or earlier).

We can take mean f and mean g to be indirect references to a scaled L1 norm, with the choice of 1 tantamount to requiring a stochastic process (the sum of the column entries is 1, once scaled for the rank). His adjacency matrix is really a weighted digraph!
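As a sketch of that reading (the 4-vertex graph is hypothetical, not from the paper): scale each row of a 0/1 adjacency matrix by the degree, so the rows sum to 1 and the adjacency becomes a stochastic transition matrix, i.e. a weighted digraph.

```python
import numpy as np

# Hypothetical 0/1 adjacency matrix of a small 2-regular digraph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# "mean f = 1" reading: divide each row by its degree so rows sum to 1,
# turning the 0/1 adjacency into a Markov (stochastic) transition matrix.
P = A / A.sum(axis=1, keepdims=True)
assert np.allclose(P.sum(axis=1), 1.0)   # the stochastic condition
print(P)
```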

[image: http://turingarchive.org/viewer/?id=133&title=28]

When mean f is 1 (i.e. we have a stochastic process), we get decay, because the wheel performs an act of convolution. However, there would be no decay (in the norms) if the plaintext (nth-round) input already satisfied the Markov condition. The input data must not already be close to uniform (else convolution has no effect).
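A small numerical sketch of both halves of that claim (the 5-cycle is my choice of graph, not one of his wheel groups): a distribution far from uniform decays towards uniform under repeated convolution with a stochastic walk matrix, while an already-uniform input is left unchanged.

```python
import numpy as np

# Simple random walk on a 5-cycle (odd, hence non-bipartite, so it mixes).
n = 5
P = np.zeros((n, n))
for i in range(n):
    P[i, (i + 1) % n] = P[i, (i - 1) % n] = 0.5

u = np.full(n, 1.0 / n)            # the uniform distribution
f = np.eye(n)[0]                   # "plaintext" concentrated on one vertex

for step in range(6):
    print(step, round(np.linalg.norm(f - u), 4))  # norm decays every step
    f = f @ P                      # one convolution/averaging step

print(np.allclose(u @ P, u))       # already-uniform input: no decay left
```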

[image: http://turingarchive.org/viewer/?id=133&title=28]

If we take a two-step walk (a power of 2), note that the neighborhood over which the averaging happens expands at the square of the power. This is his point! It's a VERY INDIRECT way of talking about expander graphs, reasoning purely geometrically and topologically.
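To see the neighborhood growth concretely, here is a sketch (the Cayley graph of Z_63 with generators ±1, ±8 is my stand-in, not one of his wheel graphs): squaring the adjacency operator doubles the walk length, and the neighborhood being averaged over expands with every squaring, roughly squaring in size on a good expander until it saturates.

```python
import numpy as np

# Cayley graph of Z_63 with generator set {+-1, +-8} (illustrative only).
n = 63
A = np.zeros((n, n), dtype=int)
for i in range(n):
    for g in (1, -1, 8, -8):
        A[i, (i + g) % n] = 1

# Repeated squaring: a 1-step walk becomes 2, 4, 8 steps.  The support of
# row 0 is the neighbourhood vertex 0 averages over; watch it expand.
M, steps = A.copy(), 1
for _ in range(4):
    print(steps, int(np.count_nonzero(M[0])))  # (walk length, ball size)
    M, steps = np.sign(M @ M), steps * 2       # square the walk operator
```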

[image: http://turingarchive.org/viewer/?id=133&title=29]

Here, K is a k-decay accumulator, calculated over the plaintext (of the round state). Q is the neighborhood of the (left-recursive) elements of the plaintext not yet considered in the decay-accumulator calculation. Take "domain of definition" to mean that as one walks backwards through the graph, each possible plaintext sequence (not yet considered) traces out a particular subset of the graph's vertices, and there are many of these. The point, though, is that they overlap, being powersets.

Now it makes sense to say that the Q range over the group product (of the powerset elements, or the possible walks through the graph based on Sn), where a walk's components (as one takes more recursive steps) are the elements of the graph product.

[image: http://turingarchive.org/viewer/?id=133&title=30]

We can see K(x) as the cumulative decay to date, in which any additional walk step adds (in the log sense) an alpha amount of additional decay, alpha being the reciprocal of the expansion of the correlation measure of Q, the Borel sets (a group, recall). These are measured against uniform, with 1 being close (since H1 is balanced) and 0 being far (it's the spectral basis, recall).

So the function of the (self-adjoint) H1 is to walk backwards (through left-recursive nestings of plaintext) and help calculate a numeric proportion-decay (alpha, or a quantization of the spectral gap). But it's a restrictive walk, through some but not all of the possible sequences of k-ary symmetric generators, remembering that the power of a generator in a word/walk equates to its particular wheel/upright instance being rotated relative to the preceding and succeeding wheel/diagonal/stecker. In some sense, the "generator" is reflected in the setting of the wheel: the relative rotation at setting time.
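Here is one way to make that alpha concrete (my gloss, not his notation: alpha as the second-largest eigenvalue modulus of the walk matrix, so the spectral gap is 1 - alpha, and t walk steps contribute t·log(alpha) of decay).

```python
import numpy as np

# Walk matrix of the illustrative Z_63 Cayley graph used above.
n = 63
P = np.zeros((n, n))
for i in range(n):
    for g in (1, -1, 8, -8):
        P[i, (i + g) % n] = 0.25

eigs = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
alpha = eigs[1]                          # per-step contraction factor
print("alpha:", round(alpha, 4), "gap:", round(1 - alpha, 4))

# Each walk step multiplies the distance-to-uniform bound by alpha,
# i.e. adds log(alpha) of decay "in the log sense":
t = 10
print("bound after", t, "steps:", round(alpha ** t, 6))
```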

As we know, the main cryptographic function of an expander graph, relative to mixers, is to act as an entropy conductor, in a "network of channels" specified by a graph whose design offers minimal "resistance" at any particular point, in the RMS sense. We also see how he has a model of conditional entropy and conditional probabilities, expressed geometrically as ratios such as Norm2(R.f.n.* k) / Norm2(k), where the denominator is the absolute spectral density of the graph itself, contrasted (to form an "operator norm") with the entropy of the operator's action.
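Reading Norm2(R.f.n.* k) / Norm2(k) as a Rayleigh-style operator-norm ratio (my interpretation of the notation, reusing the same illustrative walk matrix), the sketch below applies the operator to a mean-zero test vector and compares 2-norms; the ratio is bounded by the contraction factor alpha above.

```python
import numpy as np

n = 63
P = np.zeros((n, n))
for i in range(n):
    for g in (1, -1, 8, -8):
        P[i, (i + g) % n] = 0.25

rng = np.random.default_rng(0)
k = rng.standard_normal(n)
k -= k.mean()                     # project off the uniform (DC) component

# Norm ratio before/after one application of the walk operator:
print(round(np.linalg.norm(P @ k) / np.linalg.norm(k), 4))   # <= alpha < 1
```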

Now one can have three of these machines in sequence, acting as a zig-zag! This allows one to reason in terms of conditional probabilities, and it gets one to a (practical, probable construction of) Shannon's probabilistic algorithms for capacity-achieving codes, back in 1944. It also allows one to invert the process and search for needles in haystacks, using much the same process as the Colossus method when attacking Tunny. One can also see how it's used in Turingismus, with an alternative Bayesian heuristic weighting ("feedback") guiding backtracking-based searches of key spaces.
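A sketch of the "three machines in sequence" idea (a plain composition of three identical illustrative walk operators, not the formal zig-zag product, which needs a two-graph construction): the contraction factors multiply, so three rounds give roughly alpha cubed.

```python
import numpy as np

def cayley_walk(n, gens):
    """Walk matrix of a Cayley graph on Z_n (illustrative helper)."""
    P = np.zeros((n, n))
    for i in range(n):
        for g in gens:
            P[i, (i + g) % n] += 1.0 / len(gens)
    return P

P1 = P2 = P3 = cayley_walk(63, (1, -1, 8, -8))
seq = P3 @ P2 @ P1                           # three machines back to back

lam2 = lambda M: np.sort(np.abs(np.linalg.eigvals(M)))[-2]
print(round(lam2(P1), 4), round(lam2(seq), 4))   # ~alpha vs ~alpha**3
```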

One can look at the need for H1 to have exponents that sum to zero as the qudit-world equivalent of having a balanced binary sequence (in the qubit world); thus H1 tests the plaintext in much the same way that convolving with the Walsh-Hadamard transform tests a sequence (against all possible balanced sequences) and puts it into unitary form. Turing is just working in qudits, again, rather than qubits.
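A minimal sketch of that test in the qubit world (the 8-term sequence is my example): the zeroth Walsh-Hadamard coefficient of a ±1 sequence vanishes exactly when the sequence is balanced, and the other coefficients are its correlations against the remaining (balanced) Walsh sequences.

```python
import numpy as np

def wht(v):
    """Naive Walsh-Hadamard transform via the Sylvester construction."""
    H = np.array([[1.0]])
    while H.shape[0] < len(v):
        H = np.block([[H, H], [H, -H]])
    return H @ v

seq = np.array([1, -1, -1, 1, 1, 1, -1, -1])   # balanced: entries sum to 0
coeffs = wht(seq)
print(coeffs[0] == 0)   # True: balanced, zero overlap with the constant row
print(coeffs)           # each entry: correlation with one Walsh sequence
```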

If one is trying to create a sequence generator (for use with a voice coder), then one sees how -1 raised to the power of the inner product of the plaintext vector with a Walsh-Hadamard row gives as output a set of -1s and 1s, each term being unrelated to the next. This is the part he avoids talking about (to dodge the military censor).
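A sketch of that generator (the 3-bit key w is hypothetical, my own example): output -1 raised to the inner product, mod 2, of each plaintext vector x against w; the result is a balanced ±1 stream, one row of the Hadamard matrix.

```python
import numpy as np

w = np.array([1, 0, 1])   # hypothetical key vector, not from the paper
xs = [np.array([(i >> b) & 1 for b in range(3)]) for i in range(8)]

# (-1)^(x . w mod 2): a +-1 output stream, one Walsh row of the 8x8 Hadamard.
stream = [(-1) ** int(x @ w % 2) for x in xs]
print(stream)             # balanced: equal numbers of +1 and -1
```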

