## Turing, invariants, fano planes and shift-generators

http://yorkporc.wordpress.com/2011/10/05/7point/

http://www.turingarchive.org/viewer/?id=133&title=12

Looking at the above, we see how Turing was thinking about the fano plane, having generated it from a 7-element wheel.

Projecting the group theory onto the binary field, we can look at states corresponding to weight 1,

states with weight 2

states with weight 3

Now Turing’s point was that, for his 7-element upright, he had identified two permutations (unitaries) that would generate the fano plane. The point of showing the “invariants involved” is that after conjugation the distance-measure is the same as before conjugation (and corresponds to a shift of the digits of the measure, rather like a Colossus shift-register). For this wheel wiring, he is figuring that he can generate a fully connected graph, giving him reversibility (necessary for decoding). He also then gets to apply quantum theorems concerning the properties leading to the average (expectation value) of any Hermitian operator – measuring the states from ANY angle (coordinate system, direction).
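As a hedged illustration of the shift idea (a standard construction, not Turing’s actual wiring): the seven lines of the fano plane can be generated by cyclically shifting a perfect difference set around a 7-point wheel, and the whole line set is invariant under that shift.

```python
from itertools import combinations

# Sketch: generate the fano plane's 7 lines by shifting the planar
# difference set {0, 1, 3} around a 7-point wheel (mod 7).
base = {0, 1, 3}
lines = [frozenset((p + s) % 7 for p in base) for s in range(7)]

# Invariant: every pair of the 7 points lies on exactly one line,
# and the line set is preserved by the shift x -> x + 1 (mod 7).
for a, b in combinations(range(7), 2):
    assert sum(1 for line in lines if {a, b} <= line) == 1
shifted = {frozenset((p + 1) % 7 for p in line) for line in lines}
assert shifted == set(lines)
```

The shift-invariance is the “colossus shift-register” flavour: stepping the wheel permutes the lines among themselves, leaving the incidence (distance) structure unchanged.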

http://www.turingarchive.org/viewer/?id=133&title=12

We see that once he reflects on how shift-enabled-invariants “generate” groups (of fully connected graphs) he adjusts his notation of the points, aligning his final point (g) with his center.

http://www.turingarchive.org/viewer/?id=133&title=17

We can assume he knows that some point sets are independent and others dependent, and that this is a weaker condition than orthogonality – about which we get to say more, later.

For now, we simply note what he says:

http://www.turingarchive.org/viewer/?id=133&title=12

For each of the constraints, he shows that each generates a reversal of the distance (giving him the ability to form expressions of the form A.U.A⁻¹). Note that he is also interested in the invariants of the normalizer of his rotation operator R. This gives him the means to calculate the number of exceptional wiring plans (those that retain the fully-connectedness properties of the resulting walks, even after conjugation/stepping).

But at least I can now understand the following argument (and why it’s being made):

http://www.turingarchive.org/viewer/?id=133&title=18

establishing how the alternating group plays into the theorem.

Posted in crypto

## tunny era rectangling contrasted with quantum entangling

We know that folks operationalizing the Tunny break, circa 1944, are performing “rectangling” – mostly by hand processes. In them, a person takes a matrix of values, each of which represents a small oscillator that cooperates with all the others, so as to represent the dynamics of a system.

http://www.first-quantum.net/symposium/2010/pdf/koashi_summerSchool2010.pdf

They all interact, and any change in one induces waves of change in others – due to the field effect. The process requires that the user induce change from one direction. For Tunny, this means introducing the “start” of the wheel bits of the first wheel χ1 (figured by skeletons, etc.). Then one calculates its impact when transformed by the transforming matrix, and an inner product is calculated to update the projection – which updates the other direction’s wheel bits (the second wheel χ2). The transformation expresses the correlation known to occur, on AVERAGE, between the two directions (or coordinate systems). It produces an action that embodies that bias, updating the approximation of χ2.
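The back-and-forth χ1/χ2 update can be sketched as an alternating sign-projection on the rectangle matrix. This is a toy model with assumed dimensions (the 41- and 31-bit chi wheel lengths) and an assumed noise model, not the historical hand process:

```python
import numpy as np

# Toy rectangle: a rank-1 chi1 x chi2 correlation pattern plus noise
# (41 and 31 are the chi1/chi2 wheel lengths; the noise model is assumed).
rng = np.random.default_rng(1)
chi1_true = rng.choice([-1, 1], size=41)
chi2_true = rng.choice([-1, 1], size=31)
R = np.outer(chi1_true, chi2_true) + 0.5 * rng.standard_normal((41, 31))

# Start from a chi2 guess that is only partly right (some bits flipped).
chi2 = chi2_true * rng.choice([1, 1, 1, -1], size=31)
for _ in range(10):
    chi1 = np.sign(R @ chi2)    # inner-product update of chi1 from chi2
    chi2 = np.sign(R.T @ chi1)  # then update chi2 from chi1 (reverse direction)

# the iteration settles when no bit changes: the correlation has been
# transferred back into the wheel-bit guesses (up to a global sign)
s = 1 if (chi1 @ chi1_true) > 0 else -1
assert np.array_equal(chi1, s * chi1_true)
assert np.array_equal(chi2, s * chi2_true)
```

The alternation is the “induce change from one direction, then the other” step; convergence is the point at which the rectangle has given up its redundancy.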

http://www.ellsbury.com/tunny/tunny-033.htm

http://www.ellsbury.com/tunny/tunny-039.htm

Now, we can assume that the writers of the Tunny report were not aware of Turing’s and Newman’s other life (on nuclear bomb projects). Thus, folks reason in the terms of statistical logic, perhaps largely unaware that such is only a specialization of quantum-mechanical logic.

When folks discuss THEIR view on biases, they use language such as “weighted average of factors”. We can recast this – probably more as Turing thought of it – as the amplitudes of the various basis states in superposition, each state corresponding to a wheel bit in a coordinate system “tuned up for the Tunny geometry”. This allows us to contrast the first use of “proportional bulge” ½(1 + ζi) with its more general form – from quantum mechanics

http://www.ellsbury.com/tunny/tunny-041.htm

http://www.first-quantum.net/symposium/2010/pdf/koashi_summerSchool2010.pdf

where for 1, we have P0

http://www.first-quantum.net/symposium/2010/pdf/koashi_summerSchool2010.pdf

Similarly, when Tunny stats people talk of a chain of such handoffs (), we can think of it as a co-dependent entanglement – or a composite system (where Alice is ΔP given ΔD, and Bob is ΔD, below)
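One classical core of such a chain is that proportional bulges multiply under the XOR of independent streams (the “piling-up” identity of the P.B. algebra). A sketch with made-up bias values:

```python
from fractions import Fraction

def bulge(p):
    # proportional bulge beta, defined by P(bit = 0) = (1 + beta) / 2
    return 2 * p - 1

# two independent binary streams, with assumed (not historical) biases
p1, p2 = Fraction(3, 5), Fraction(7, 10)
# probability that the XOR (mod-2 sum) of the two bits is 0
p_xor = p1 * p2 + (1 - p1) * (1 - p2)
# the bulge of the chained handoff is the product of the individual bulges
assert bulge(p_xor) == bulge(p1) * bulge(p2)
```

Each handoff in the chain attenuates the bulge multiplicatively, which is why long chains leave only a faint – but still exploitable – bias.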

http://www.ellsbury.com/tunny/tunny-041.htm

http://www.ellsbury.com/tunny/tunny-043.htm

and

(a) The Problem: Recovery of ΔP from ΔD.

It has been pointed out in 22H that the expected letter count of ΔD can be obtained from that of ΔP by means of the equations

(X1)

The problem of solving these equations for P.B. (ΔP) given P.B. (ΔD) led to the ‘algebra of proportional bulges’.

Now, folks also indicate a quantum basis, when saying

http://www.ellsbury.com/tunny/tunny-043.htm

This we interpret as: the proportions of each i’th p (q) are the components of the complex vector of quantum mechanics (a weighted set of basis vectors). And the “overlap” concerns the measurement theory in effect, since overlaps in cryptanalytical thinking generally (see earlier memos) alter how much the odds of the event are worth, depending on how large the (enigma) overlap of 2 tapes was – since the overlap distance “degrades” the weights of certain fixpoints/repeats.

Folks also say, making a “continuous, or infinitesimal, change” argument for quantum system evolution under a unitary,

If one has a continuous sequence of possible theories depending on a parameter x, it often happens that one has very little knowledge about the prior probabilities of the theories.

http://www.ellsbury.com/tunny/tunny-044.htm

So, the process of rectangling starts with the assumption that the initial matrix elements, each a particular oscillator rotating at a particular frequency, are derived from an analysis of many depths. That is, for each period of a wheel in the ciphertext, assume that the characters from each such period are overlaid and then summed, leading to the initial oscillator weightings. As the Tunny report says, this matrix is created by diagonalizing a two-dimensional walk on the set of depth tapes.

Now review a result of quantum mechanics, in which a matrix is responsible for casting a value in one basis into another (basis). ⟨φ1|M|φ2⟩ should be read as the projection of the Tunny-matrix-M-modified |φ2⟩ onto ⟨φ1|.

Now consider the contrary of the usual interpretation (that ⟨M⟩ is the expected or average value of lots of trials, for the measurable M). Consider instead that M is the sum of all said trials’ knowhow. When that sum of all that “history” is filtered by looking at just ⟨φ1|, the resulting filter can be applied to |φ2⟩. Looking at the initial filtering of the history itself is a kind of transformation (one that induces a particular action); one captures the unitary-inversion of the correlation between ⟨φ1| and |φ2⟩.

So imagine a process in which M acts on χ1, using an (inter-χ impulse set) correlation-specific transform, and then the result is measured, using χ2. This induces an inner product calculation that updates each bit of χ2, as one takes the row of “confident” χ1 components through the rectangle. Then one reverses the process, adding it (a reverse process) to the sum. The net result is a quantum-simulation combination of ⟨φ1|M|φ2⟩ + ⟨φ2|M|φ1⟩, where the second term might well be, rather, its adjoint.
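The two directions can be checked in plain real linear algebra (a generic illustration, not the historical arithmetic): for a real M, running the “filter” in the reverse direction is the transpose (the real-case adjoint), and summing both directions picks out the symmetric part of M.

```python
import numpy as np

# Generic real-matrix sketch of the two directions of the handoff.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
phi1 = np.array([1.0, 0.0, 0.0, 0.0])   # bra <phi1|
phi2 = np.array([0.0, 1.0, 0.0, 0.0])   # ket |phi2>

m12 = phi1 @ M @ phi2                   # <phi1|M|phi2>
m21 = phi2 @ M @ phi1                   # the reverse direction
# reverse direction = matrix element of the transpose (adjoint, real case)
assert np.isclose(m21, phi1 @ M.T @ phi2)
# the two-way sum sees only the symmetric part of M
assert np.isclose(m12 + m21, phi1 @ (M + M.T) @ phi2)
```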

But the point is clear. We can see rectangling as derived from a classical basis-change operation, inverted. For a guessed set of chi wheel bits, for each of χ2 and χ1, consider them as imperfect measurements of M. Then use data from χ2, first, as an operator on M, and re-measure χ1, inducing an action. Repeat in the other direction. Repeat until there is no change in χ2 and χ1 upon measurement-action – since all the redundancy and correlation has been transferred back to the chi wheel bits, which now better reflect the depth information.

ok. we are left with a couple of open questions.

First, consider the following:

This gives P.B.*(ΔP) in terms of P.B.*(ΔD) and P.B.*(ΔΨ’) and hence P.B.(ΔP) in terms of P.B.(ΔD) and P.B.(ΔΨ’). The process is not as laborious as it sounds in virtue of the rather simple interpretation of an F.T. For example if Θ is the T.P. letter J or vector (1, 1, 0, 1, 0) and if F is a P.B. function, taken as P.B.(ΔD) for definiteness, we have

since F is assumed to be a P.B. function.

Thus P.B.*(J) = P.B.(ΔD1+2+4 = .)

Using the support bits of the vector (those 1 bits), one can view a given datum as the Fourier transform of a proportional bulge operator. What does this tell us about the equivalence with quantum mechanics?
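The F.T. reading can be sketched concretely: P.B.*(θ) is the Walsh/Fourier coefficient of the 5-bit letter distribution at the character θ, and for θ = J = (1, 1, 0, 1, 0) that coefficient is exactly the bulge of the parity over the support impulses 1, 2, 4. The distribution below is a toy stand-in, not Tunny data:

```python
import numpy as np
from itertools import product

# theta is the T.P. letter J as a 5-bit vector; its support is impulses 1, 2, 4
theta = (1, 1, 0, 1, 0)
rng = np.random.default_rng(2)
p = rng.random(32)
p /= p.sum()                  # a toy probability distribution over 5-bit letters

# Walsh/Fourier coefficient of the distribution at the character theta
coeff = 0.0
letters = list(product([0, 1], repeat=5))
for idx, x in enumerate(letters):
    dot = sum(t * b for t, b in zip(theta, x)) % 2   # parity on the support bits
    coeff += p[idx] * (-1) ** dot

# the same number is the proportional bulge of the event
# "impulses 1 + 2 + 4 sum to dot", i.e. P.B.(Delta-D_{1+2+4} = .)
p_even = sum(p[idx] for idx, x in enumerate(letters)
             if (x[0] + x[1] + x[3]) % 2 == 0)
assert np.isclose(coeff, 2 * p_even - 1)
```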

Second, Susskind gives us a feeling for the “two” oscillators for the average of a given measure.

see 1h 12m

What is the cryptanalytical significance?

Posted in crypto

## enigma repeat rates – 4dummies vs 4professionals

In the version of the enigma cryptanalysis story given out by the UK for general consumption (which means it’s full of deceptions), we learn from the following the sense of the story:

http://stoneship.org.uk/~steve/banburismus.html

In short, folks count repeats per amount of overlap, comparing the count against the expected value of the count for a random case.

In the more professional version of this story, less likely to be full of old-man-itis and military deception attempts, we learn that the story is far more about quantum experiments – and general experimental method.

http://yorkporc.wordpress.com/2012/09/29/expectation-from-inner-product/

Now, the “magic” of repeat rates is just a simple statistical statement – about repeated observations.

From this, we can get to the scoring system of banburismus – based on empirically accurate predictive expectations – that, WHEN augmented with Bayesian logics, supports measure theory and a rather pure approach to quantum entanglement theories about measurements in composite systems.
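The Bayesian scoring core can be sketched as a log Bayes factor over repeat counts. The repeat rates here are assumed round numbers (plaintext-on-plaintext near 1/17, random near 1/26), not the classified tables:

```python
import math

# Banburismus-style scoring sketch (assumed parameters, not historical values):
# texts in true depth repeat more often than randomly aligned texts.
KAPPA_DEPTH, KAPPA_RANDOM = 1 / 17, 1 / 26

def score_decibans(overlap, repeats):
    # likelihood ratio of "true depth" vs "random overlap", in decibans
    lr = ((KAPPA_DEPTH / KAPPA_RANDOM) ** repeats
          * ((1 - KAPPA_DEPTH) / (1 - KAPPA_RANDOM)) ** (overlap - repeats))
    return 10 * math.log10(lr)

# more repeats than the random expectation (100/26 ~ 3.8) favours a true depth
assert score_decibans(100, 8) > 0
assert score_decibans(100, 2) < 0
```

Because the score is a log of a likelihood ratio, evidence from successive overlaps simply adds, which is what made hand tallying workable.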

Posted in coding theory

## Turing’s global “quantum” measurement process for cryptanalysis

Our marked up version of the slide shown below indicates how we learned to think like TURING  – about quantum cryptanalysis. For the first time, we feel like we are cracking his secret code. This is not the secret of the classified process later revealed as Enigma-style banburismus  (http://stoneship.org.uk/~steve/banburismus.html) or Fish-centric rectangling. Rather, it is the secret of how his mind worked in getting to such definitions – based on a rather pure and very theoretical training he received in his Cambridge math courses. While Turing clearly was immersed in the physical sciences math of the day, one should note how he does NOT apply it – to cryptanalysis. Rather, he applies the underlying logic of that math.

4-2. Distinguishability

What we see above, in a style of Dirac notation that Turing did NOT use, is a formal model of a composite system in which the 2 sub-systems are dependent. The left-hand sub-system, denoted A, can be seen as the “measuring” apparatus, for a variety of measures. (Yes! That’s measures as in “Measure Theory”.) In particular, it will measure the “scores” of bigrams, trigrams, tetragrams, etc. (just as in Enigma cryptanalysis) found on the “input” Turing tapes, given a pointer to a current tape position. And the reason one needs the apparatus to support a “variety” of measures is because, in Enigma’s particular manifold and geometry, there may be different scoring tables for each “overlap” (for certain n-grams).

Being a general apparatus, the measure-capable subsystem “A” can adapt to different scoring tables, as required. The right-hand system, labeled B, is entangled with this “measuring potential” – and the measuring apparatus. One can think of B as the rotors, in the rotor cage, whereas A is the indicator and other keying setup. The relevance here is that we can perform a “global unitary” operation on the composition, upon whose completion the measuring apparatus is prepared to do its job. It is only through the mystery of the global unitary specifically acting on a composition that measure-potentials – revealed by A – are actually reduced from potential to the actuality of a pointer moving on a dial.

Looking at the formalism, we see the “degree” of entanglement-potential characterized by use of the k variable. Remember, the state is simply being decomposed into a weighted set of k basis states. And this role of k is interesting, as it turns up in Turing’s own writings on the topic – albeit with different modeling goals. It’s worth pointing out just how k’ness is used across several mathematical concepts. There is a common k’ness notion, that is.

Let’s look at Turing’s most obvious ‘k’. Think of this K function as the second subsystem, B, which implies that the first subsystem A is modeled by Turing’s φ(n)(.)

https://yorkporc.wordpress.com/2013/11/29/mapping-turing-onto-quantum-walks-and-pianos/

We note that in changing notation from g() to K(), Turing is indicating that k is a probability distribution (related to the g that it replaces, in that g was the “input distribution” while K is the entropy of the input distribution). Furthermore, K applied to the 2-character sequence (y1 x) is the product of a scalar alpha with the function K applied to (y) alone. We have a quantum system, in that a given distribution can be decomposed as a weighted distribution of more basic distributions. If x is the current input byte on the Turing tape and y is one or more of the preceding characters (in the bigram, trigram, tetragram, …), then alpha (for 1 y character) is the result of measuring the predictability of y in guessing x. It contains the “information” – uncertainty, mutual information, … that is. It is the conditional probability of y given x. The global unitary reveals – by considering the dependency between B and A – the uncertainty of A that is unmeasurable by local operation only.

Now why did Turing move from using g() to K()? We can guess that it relates to a subtle point concerning complexity – when doing acts of ‘distinguishing’. In his arguments about norms and continuity, Turing fails to model K directly as an upper bound upon the norms resulting from applying a transformation to a Hilbert space and its wave functions. He does use it indirectly, though, when pointing out that the ratio of the norms is 1 when considering the limiting or ‘kth’ compression (contraction of the angle between the ever more indistinguishable vectors towards zero) in a CONTINUOUS space. There will always be a minimum angle, for continuous spaces, beyond which we cannot compress and further refine our measurements of the information content. (One should think of the Shannon limit here, without using that American intellectual product otherwise.)

Turing links the certainty/complexity bound on distinguishability of states in a Hilbert space – expressed with a k when considering vectors – with his function K(), when now considering actions on Turing tape characters. The limit of the inner product may well be a distribution whose character is uniform. But note that the mere fact that there is a certainty/complexity bound does not stop uniformity occurring! It means that upon reaching uniformity one has reached the maximum potential of having certainty in the distinguishing power, in a cryptanalytical sense.

Turing indirectly uses K in the sense of eigenvalues, too. But, being trained in 1930s topology by original thinkers like Newman, he thinks and reasons geometrically, in terms of complex geometry (Bloch spheres). In just a 2-d complex vector space he knows he is really just dealing with k points on the circle, which represent points in phase space over which his (measurement) system’s dynamics can act. And of course, it just so happens that he can now model the dynamics on such a point geometry using a Cayley graph, based on a suitable “circular” group whose non-Abelian nature allows one to capture the dynamics in (classical physics) phase space. For Turing, he simply recasts the k points as k terms in a permutation group whose factors are expressed in cycles/circles. The elements are isomorphic – in representation theory – to the eigenvalues that the system can measure, which in turn are points in the phase space; all of which allows him mentally to move from classical to quantum mechanics reasoning models. He thus has k elements in his group, over which his group operator can act. And the resulting actions are analogues of quantum mechanics in complex vector spaces – but using real fields (which simplifies out all the adjoint operators, complex-conjugate worlds, etc.).

Of course, his geometric circle of phase points is just a reasoning model for something practical: an enigma wheel, which can rotate/step and can also be reflected. Between points, group elements, permutation groups that give us easy-to-calculate conjugates, rotation and reflection transformations, and invariancy of distances, we seem to have group algebras in the making! This all allows Turing to allude to using symmetry and asymmetry for cryptanalytical measurement purposes. We can even reach a bit and see how multiplicities were applied, looking at the permutation group’s automorphisms – those multiple eigenfunctions for a given point, all of which allow for certain (i.e. no uncertainty) measurements.

Let’s now link the two pictures. Turing’s φ is subsystem A, and the α weighting of a basis vector – attached to the prefix of x – is the value of φ().

This is all backed up later when we see how each k’th component of the locally-unmeasurable A is recast as a probability weighting of the projection-operator form of the k’th basis vector. That is, something of the form |φ⟩⟨φ| – which of course is a group element product, to Turing, in his H1 group – that reduces in value space to α1α2, since we have only real values to worry about (eliminating ordering issues, or conjugates). We have to remember that a projection operator is… just a geometric projection (just torches shining upon objects in the center of the room, while considering the resulting shadows falling on the corners of rooms, formed of two walls… known as the ket and bra!). Turing helps out a lot here, since H1 is probably a hint symbol indicating first that it’s a subgroup (of H, as stated) and also an entropy-style measurement (in an L1 space) – where entropy is a kind of uncertainty measure. And, furthermore, he specifically tells us that φ is, in any case, a projection onto a 1-d line, giving a real α as required and thus a sequence of 2 α terms under homomorphism and the group product that become simple multiplication of real factors… which of course were used (when augmented with a Bayesian value logic) to calculate the scoring tables used when scoring the “predictive power” of bigrams, trigrams, … during enigma “cryptanalysis”.

This analysis is reinforced when you consider the very nature of projective spaces, in which necessarily P² = P (giving idempotency). But more than these facts, the space reduces the domain of the factor to 0 and 1 only. Fully indistinguishable, or certainly distinct. We recall how in the limit Turing argues that the value is 1 (i.e. indistinguishable, being as far from orthogonal (cos 90° = 0) as possible), from which fact he can derive claims about uniformity of the distribution (for all values in his coset that have as little conditional probability from/to each other as it is possible to have).
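The P² = P point can be checked directly for a 1-d projector |φ⟩⟨φ| (a generic real example, nothing Turing-specific): it is idempotent, and its only eigenvalues are 0 and 1.

```python
import numpy as np

# A 1-d projector |phi><phi| is idempotent (P @ P == P) and its only
# eigenvalues are 0 and 1 -- "fully indistinguishable, or certainly distinct".
phi = np.array([0.6, 0.8])          # unit vector, real amplitudes
P = np.outer(phi, phi)              # |phi><phi|

assert np.allclose(P @ P, P)                                  # idempotency
assert np.allclose(sorted(np.linalg.eigvalsh(P)), [0.0, 1.0])  # spectrum {0, 1}
```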

Our final k concerns Turing’s example, based on the symmetric quaternion Cayley table, with its identity and 3 symbols (i, j, k) and their signed alternatives. Remember that H1 uses just the elements {1, k, –1, –k}. It’s our conjecture that Turing is saying that we have 4 combinations, which aligns with a Hadamard transformation that changes one basis into a superposition basis, with a suitable weight for each symmetric ({1, –1} or {k, –k}) or anti-symmetric pairing (e.g. {1, –k}, {–1, k}).
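The closure of H1 = {1, k, –1, –k} can be checked concretely; representing k as the 2×2 matrix diag(i, –i) is one standard quaternion-group choice (our assumption, not Turing’s notation):

```python
import numpy as np

# k represented as diag(i, -i); then 1, k, -1, -k are its powers k^0..k^3
k = np.array([[1j, 0], [0, -1j]])
I = np.eye(2)
powers = [np.linalg.matrix_power(k, n) for n in range(4)]

assert np.allclose(powers[2], -I)     # k^2 = -1
assert np.allclose(k @ powers[3], I)  # k * k^3 = k^4 = 1: cyclic of order 4, closed
```

So H1 is the cyclic subgroup generated by k inside the quaternion group, which is what makes the “4 combinations” bookkeeping work.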

Ok. We see a clear and pretty convincing link between his Cambridge training, enigma analysis, and what we know about scoring (for enigma menu construction) and Tunny-era rectangling. All of which tells us to go find out all about what Newman, the hidden figure, was doing – in secret.

Posted in crypto

## word screen, newly installed from office365 (#1 of the 5 authorized devices to have installation…)

note how it works, when one uses the non-custom *.onmicrosoft.com name form.

Image | Posted on

## types of representations

1. Basic rules of quantum mechanics

Thinking of Turing’s 1-d projection, whose value is 1…., having already been labeled as positive (and real).

What a simple diagram does, that yards of theoretical papers did not!

I think the unitary is part of the normal, except the product is real and positive (and 1). That is, complex numbers whose product turns out to be complex too, but with zero imaginary part (i.e. effectively real).

So what is the space of unitary and projection operators called (ignoring their intersection)? Can we assume that the unitary – in complex coordinates – is the analogue of the projection space, with the particular property that the value is the identity? That is, we have two ways of expressing idempotency (in the real space, and the adjoint spaces)?

Looking at that intersection, did Turing – having projected – intimate that the process was unitary and THUS 1 (given a choice of 0 or 1)?

Is the point of 0 versus 1 to signal wholly indistinguishable for 1, whereas otherwise (not wholly certain, the whole class of which gets assigned 0)? See http://en.wikipedia.org/wiki/Projection_(linear_algebra)
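One concrete fact behind these questions: a matrix that is both a projection (P² = P) and unitary must be the identity, since a unitary is invertible and multiplying P·P = P by P⁻¹ forces P = I. A sketch:

```python
import numpy as np

# The identity is the only matrix that is both idempotent and unitary.
P = np.eye(3)
assert np.allclose(P @ P, P)                    # idempotent
assert np.allclose(P.conj().T @ P, np.eye(3))   # unitary

# A non-identity projection is NOT unitary: it loses rank, hence information.
Q = np.diag([1.0, 1.0, 0.0])
assert np.allclose(Q @ Q, Q)                    # still idempotent
assert not np.allclose(Q.conj().T @ Q, np.eye(3))  # but not unitary
```

On this reading, landing on “1” after projecting is exactly the case where nothing was lost – the unitary/projection intersection.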

Posted in crypto

## Lie groups, algebras, Angular Momentum and Turing’s generators

One of the things we have been learning about is the “degeneracy-busting” power of symmetry operators. That is, should there be multiple eigenkets (certain states, within a superposition space) that have the same eigenvalue (and, for the energy operator in particular, the same energy), we can use Lie algebras to bust these correlations.

In crypto, this is what we would want to be doing! Think of it as busting the hidden correlations hiding within the data model that allow eigenvectors to be considered “same”. We want them all “different”.

http://turingarchive.org/viewer/?id=133&title=05a

In the section of his manuscript on permutations concerned with using search processes to find exceptional groups (no annoying cycles within the group), Turing wishes to establish that the desired group property holds once the generators are spread across several enigma wheels in a cage, each contributing a (tensored) generator term to a multi-component wave function. And let’s not forget his big-picture argument, either: that the normal subgroup (acting as generator) is the SYMMETRY group (or the alternating group). Hmm…

At this point, we think in the terms taught to us by the physicists. That is, our symmetry group “exists” to help “generate” sequences of symmetry operations that bust the degeneracy. And we recall how a Lie algebra – with its product of two generator terms (each being a functional symmetry-operation element of the group) – has appropriate properties that don’t interfere with the rotational invariance of certain underlying systems, but still allow certain classes of calculation. In particular, we saw one class of calculation that enabled enumeration through the ordered set of eigenkets of a certain eigenvalue.

An important element of the “algorithm” setting up the Lie algebra consisted of calculating each of the new (non-commutative) generators and showing that the product of any two – be they original, calculated, or some linear combination – was closed. That is, it identified another of the generator set.
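That closure step can be checked for the textbook angular-momentum generators (spin-½ here; a generic physics example, not Turing’s groups): the commutator of any two is a linear combination of the set itself.

```python
import numpy as np

# Angular-momentum generators (Pauli matrices / 2): the commutator of any
# two lands back in the span of the set -- the Lie algebra is closed.
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

def comm(a, b):
    return a @ b - b @ a

assert np.allclose(comm(sx, sy), 1j * sz)   # [Jx, Jy] = i Jz
assert np.allclose(comm(sy, sz), 1j * sx)
assert np.allclose(comm(sz, sx), 1j * sy)
```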

We can think of this generate-the-generators phase as a way of calculating an instance of the degeneracy-busting algorithm (or custom-operator). Since the net result is a function of the group used in the baseline, we have n of them.

What we notice about the case Turing diagrams, above, is that we see an example of two generator-operations (that induce actions). If we continue the metaphor that the (α0 α1) cycle is the “coin” of the quantum random walk (whose impetus induces actions) and (α2 α-2) are the eigenvectors of the energy operator, then we note how a geometric “rotation” gives (α-3 α-2) as the new impulsing/clocking mechanism. (Just rotate the largest field-numbering line clockwise a bit.) And if you now reflect, the second cycle becomes (α-1 α-5).

This seems just a little unlikely to be coincidence!

He seems to be showing, without saying it as usual, that his exceptional group has invariant properties, such that one can rotate and reflect…

I think we should go read a 1930s book on Lie algebras, and note how “exceptional” was used (and for what purposes, in those early gedanken experiments).

Posted in crypto

## Cryptome goofed redact

Nothing goofed. Names removed for actual legal procedural reasons – to remove non-factual elements and the basis for challenge (since opinions are leading).

Normal  legal (not national security) use of secrets

What you can ask is who first redacted. I.e. did natsec redact in order to bias the outcome of the trial – an actual natsec secret (covering up the illegality or inequity)?

Posted in coding theory

## looking at DES sbox diffuser as quantum system with hadamard mixer

We have seen how to use the Hadamard transform to make a coin flip into an operator that induces a shift operation. And we saw how Turing, back in 1954 or earlier, had a shift operator for a unitary swap (called an upright in GCCS terminology) – comparing the inter-point distances before and after conjugation, noting that the “invariant distance” before related to that after by a shift of the digits. Let’s put the 2 notions together, in the context of understanding DES.

Fortunately, someone seems to help us. For we recall that the P() function of DES basically encodes 3 graphs: the first acting (by diffusion) on the sbox to the left (of current), the last acting (by diffusion) on the sbox to the right of current, and the middle one (we assume) acting as a way of putting the two previous results into superposition – ready for the next round.

The following explanation does better!

We see the H transform applied not to put a coin-flip state into superposition but to the n-1 phase (sbox), the n+1 phase (sbox) and the current phase (sbox) at the current time. Let’s assume that it acts similarly to the coin flip – putting the three phases (before, current and after) into superposition. Let’s note how this allows one to transform the wave equation itself, such that we now have a statement of what the current position will be (having diffused out, to a couple of degrees in various directions) at the next point in time. We think of this as a special coin that, post diffusion and loss of energy, takes the now-lesser energy state and rejiggers it back… into the superposition form suited to the next round of calculation.

We see how the “operator form of Hadamard” that acted on a single classical random bit has become a more-evolved Hadamard (for want of a missing term): with now the left and the right forms of Phi acting on state 0 (and state 1). That is, we have the before-after-sbox coin, as a class of randomness that will act as a compound-generator, inducing a compound-action on the compound-hamiltonian of the boolean/bent functions stored within the sboxes.

Posted in crypto, DES

## mapping Turing onto quantum walks and pianos

To a degree, we were able to map Turing’s On Permutations manuscript model onto expander graphs. But it didn’t quite work (no matter how we forced interpretations). Do we do better if we map things onto quantum random walks instead (recalling that expander graphs and quantum walks are related, in any case)?

One author does a good job for us of laying out the model:

http://arxiv.org/abs/1201.4780

We see that the H – as an operator – is specified as a set of projection terms (one for each element of the 2×2 matrix). We clearly see the 4 terms (and the sign bias in the last term).

Other authors prefer to present the coin state as two values (head and tail, logically), each one being one of the 2 values of the polarized basis. They then have us think of H as that which puts such a singular value (of heads, say) into a superposition of computational-basis states, instead. This change of basis gives rise to the intuition that we are, upon applying the operator, going to be “refining the combinatorics of the wave function”, so it evolves to its next set of combinatorics – as the components of the function construct or destruct to “generate” the next wave function.

We have what group theory calls an “action” – a functional property, built upon the “impulse of the coin flip”. We can view said flip as a finger perhaps striking a “piano key” – which induces its hammer to strike the associated “string” – which produces sound pulses – whose space-compression waves hit neighboring strings made at resonant frequencies – which also start to vibrate given the impulse from the inbound wavelet – which induces harmonics – whose compression waves add to the sound “mix”.

The author does a better job at relating the 4 terms of the “H operator” to the “2 values” of the superposition that results. We see below how the 2-term superposition acts on each “k’th” component of the current wave function, which includes the generated terms and the indirectly generated terms (the harmonics). Below, we see a harmonic-induced wavelet. Even though a coin flip of tails (say) has induced state 1 to move to 2, and the other flip to heads (say) moves –1 round further to –2, we also have to remember that there are ‘harmonic’ k’th components of Phi.1, too. The coin flips (in superposition) also act on the ‘state 0 harmonic’ (since a series of flips inducing +1 or –1 actions could collectively result in our token being currently at harmonic 0, as the generators cancel), with (+1) + (–1) = 0.

http://arxiv.org/abs/1201.4780
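The coin-then-shift step, with its cancelling “harmonics”, can be sketched in a few lines (a generic 1-d Hadamard walk, not the paper’s exact notation):

```python
import numpy as np

# 1-d Hadamard quantum walk: a 2-d coin state tensored with position.
# Each step applies the Hadamard coin, then a coin-conditioned shift.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
N = 21                                # positions -10..10
psi = np.zeros((2, N), dtype=complex)
psi[0, N // 2] = 1.0                  # start at the origin, coin "heads"

for _ in range(5):
    psi = H @ psi                     # put the coin into superposition
    shifted = np.zeros_like(psi)
    shifted[0, 1:] = psi[0, :-1]      # heads component moves right (+1)
    shifted[1, :-1] = psi[1, 1:]      # tails component moves left (-1)
    psi = shifted

prob = (abs(psi) ** 2).sum(axis=0)
assert np.isclose(prob.sum(), 1.0)    # unitary evolution preserves the norm
```

After an odd number of steps only odd offsets carry amplitude – the even-offset “harmonics” have cancelled, which is the (+1) + (–1) = 0 effect described above.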

Perhaps we can now go back to the Turing model, and see his H1 “generator” as the coin, where the 4 terms (1, k, –1, –k) are thought of as the H projection operators (1 is …, –1 is …, etc. for k and –k). This is generated by k, to the nth power, of course.

Now Turing models K(y,x) as a scalar multiple of K(x) – which immediately makes one think of eigenvalues and eigenspaces as alternative coordinate systems, when thinking “directionally”. But Turing says no such thing (unless it’s implied by the conventions attached to symbols). What we do see is…

http://turingarchive.org/viewer/?id=133&title=29

a series of measures of y being the (2) “probability amplitudes” of the (4) projection terms of the H1 group, once reduced to 1-d. Of course in L2 space we need the squares of the 2 amplitudes to sum to 1, as required by Turing, meaning that root(2/3) and root(1/3) would be our H1 amplitudes if the following adapted picture was our 1-d projection:

http://arxiv.org/abs/quant-ph/0303081

Now recall that our re-modeling of Turing’s Phi(y) is based on his statement that y is the previous 2 characters of the plaintext (two memory states of his Turing machine tape, that is). It is the relative randomness (entropy) that acts “as the Coin”. So it’s not a Hadamard coin, but a defined coin-operator whose state is defined by the “2 previous positions” on the tape. Doing this type of “memory” thing is of course an entirely plausible and crypto-ish thing to do (being seen in practice in the go-backs in the 1943-era Tunny machine). The coin is a very “quantitative” source of randomness!

ok. so if the 2 plaintext-memory cells are controlling the furtherance of the walk from the current position (on the Turing tape!) we do have a very “conditional branching” concept!

So, we know that Turing is using H1 to give himself a quotient/factor logic. He wants to be able to think in terms of the cosets (equivalence classes), allowing several plaintexts to map onto a single generated value and causing the compression in combinatoric space for each round, while at the same time allowing the H-superposition and entanglement to spread and diffuse the probabilities through the harmonics – making the resulting distribution of terms within the evolving wave function a custom density of the plaintext, even though the resulting range values tend toward being uniformly distributed.

While this interpretation has a lot going for it, we have to see how Turing says it, when stating

http://turingarchive.org/viewer/?id=133&title=30

It could be that when “g(x) is constant throughout each coset of H1”, given H1 = 1 in its projected amplitudes, this means that – at this nth round – the input characters have equal probabilities (1/n, that is), resulting from the fact that their mutual distances (as points in L2 point-space) are now all the same.

ok. the final part of the puzzle, since that interpretation of Phi(y) is rather more convincing than my last attempt, is to consider the sum of the powers of the generator being zero. We can look at that semantically, as being one of three things. First, it’s just the condition that ensures that the rules of normal subgroups hold, giving closure under conjugation. Secondly, it’s a capture of the more semantic notion that we are interested in viewing the walk from the perspective of the net effect of all generator actions being, happenstance, the net-zero case. Thirdly, since the two rotations around the uprights’ swap can be seen as basis changes, we can see the enforcement of zero as being the final and only “measurement” performed on the final wave equation.

In piano terms, having struck many keys while the sustaining pedal is held down (allowing harmonics and superpositions to build in the sound field), “measurement” is what happens when you suddenly remove the sustain. What the “ear hears” out of the sudden change in the cacophony – the tonal center – is the (tonal) measurement. Presumably, one wants that no tonal center is sensed, by the ear and its supporting billions (in my case) of very well-trained neural networks in the brain, set to do recognition of tonal centers.

So let’s summarize the thoughts about the system.

The function of H1 is to act as a 2-bit register. It’s the result of the one-way function that takes another character from the plaintext and conjugates the current register by a function of that value, resulting in the storage of the calculated value (1 of the 4 possible values) in the register.

To calculate the function (of the next plaintext character), we have said character act as input to a set of wheels, each of which is coded to add first the I (generator), and then second the J (generator). The wheels are individually rotated, but in a manner so that the powers sum to zero, such that the node at which the walk lands when translating the H1 subgroup identifies a particular coset, within which all the elements’ probability amplitudes are eventually equal.

The conjugate value is stored in the register.

Since the system is working with only 8 values, we can perhaps assume any one wheel can process 3 streams at a time, having 3 lots of 8 pins = 24 pins (out of 26, on a typical enigma wheel). Perhaps the quantization of the analogue value is stored in 3 levels, each of 1 bit, and thus there are 3 H1 registers (one per quantization level).

So the actual output stream would consist of the states to which the walk had proceeded, at that point in the input stream; but one notes that the wave function guiding selection of the output value is evolving, as a function of H1. Thus we get an operator-based averaging process (that acts on the probability amplitudes, related as proportions), but we also get a directed walk that is a function of the plaintext, where the function is evolving the terms in the wave function, forming up the Dirac oscillations that characterize the final density.

ok, so we get to exploit quantum logic: the quantum channel uses conjugation, but the distinguishability only gets worse as the inner-product space compresses and the norm of the compressed space gets close to zero (as the alpha in cos(alpha) between the two vectors goes to zero).

4-2. Distinguishability

Now, whereas the quantum random walk is modeled as a tensored product of the coin and position vectors, Turing’s system feels more like a group ring, where the 2-bit H1 helps generate a unique codebook. But the point is that the system (i.e. the wave function) is continually evolving, and guiding which output wire is active. After a burn-in period, there is no predictability between two ciphertext chars, unless one has the trapdoor that allows one to regenerate the evolution of the functional/wave function.

In his summary of quantum mechanics lecture (which is rather clearer than the original course!), Susskind nicely introduces unitary operators working on wave functions parameterized by time – distinguishing them from Hermitian operators (such as x and P) and the world of representations based either on position or on momentum, upon which such operators induce actions.

See minute 30…

He also nicely characterizes just what an inner-product space is, in terms of distinguishability. He nicely also motivates what a wave function is – and what a linear functional is (when it transforms the function itself). He makes it clear that state-space – in the world of computer science – is phase space, whose eigenvalues just happen to align nicely with the “points” on the unit circle, which sound awfully like the pins/pads on the enigma wheel!

Posted in crypto

## NSA BOUNDLESS INFORMANT Explicated

http://cryptome.org/2013/11/nsa-boundless-informant-explicated.htm

Largely bullshit. Cryptographic search, repurposed for metadata, leverages semantic web models of data organization, not federated sql queries. One now uses statistics to guide the search, leveraging machine learning algorithms (in which NSA is expert, particularly when supported by custom hardware – in which manufacture NSA is considerably more expert than GCHQ).

Posted in coding theory

## Absorbing vs vanishing

What’s the relationship between the defining condition of an expander graph (the spectral gap staying bounded away from zero) and “a non-vanishing probability” of escaping from an absorbing state (in a quantum walk)?

Posted in coding theory

## Relating Turing’s use of norm (statistics) to eigenvalues

http://terrytao.wordpress.com/2010/01/09/254a-notes-3-the-operator-norm-of-a-random-matrix/

One recalls how in quantum random walks (on the cryptowheel treated as a circle group) the expansion and the mixing time are not dependent on the second eigenvalue alone, but on all of them – which are subsumed in the operator norm.
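A quick sketch of that point (a generic spectral fact, not something from Tao’s post): for the simple walk on the cycle C_n the eigenvalues are cos(2πj/n), and the operator norm on the non-constant space can be set by the most negative eigenvalue rather than by the second-largest one:

```python
from math import cos, pi

def cycle_walk_spectrum(n):
    """Eigenvalues of the simple random walk on the cycle C_n."""
    eigs = [cos(2 * pi * j / n) for j in range(n)]
    second_largest = max(eigs[1:])           # the usual "second eigenvalue"
    op_norm = max(abs(e) for e in eigs[1:])  # norm off the constant space
    return second_largest, op_norm

second, norm = cycle_walk_spectrum(9)
# for odd n the norm comes from the eigenvalue near -1, i.e. cos(pi/9),
# which exceeds the second-largest eigenvalue cos(2*pi/9)
```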

Turing uses the correlation of plaintext as his quantum coin, to refine his graph-theoretic measure of its information (number of quanta).

For the quantum walk on his cycle graph, he can calculate the expectation value of a walk step between 2 nodes at Hamming distance 1.

Posted in coding theory

## intuition in crypto of being “orthogonal” to all constant functions

Studying quantum random walks has shown us that the “function of a coin” is to help choose a direction for the walk. If you are at node x, having come from ‘x, then the coin can help choose the next from the x1, x2, … edges, each one associated with a member of the (symmetric) generator set.

When Turing and others discuss the compression of the lattices underlying point spaces using round functions they characterize the notion of ‘x (the node FROM which the token came before landing on x) as the space of constant functions. That is, that volume of current space in which the coin has no effect, in the future. They then characterize the spaces orthogonal to the constant space. That is, further, the volumes that are not currently constant and in which a coin toss might affect the evolution of the system, in the future.

For the Cayley graph and its generators – which define the edges that might be followed as a token goes on a random walk – we can think about differential cryptanalysis. If each edge is just one of the possible “directions” that lead to the subspace resulting from partially differentiating the cryptosystem in that direction, then the set of edges over which the token might travel can be seen as the d-ary coin. The toss of such a coin lands on one of the d sides and thus selects the associated direction edge on the graph. The future of the token belongs to the associated subspace – into which the token now moves. That is, the action of the coin is to choose an outgoing edge of the graph and – due to the “de-correlating effect” of what is an act of measurement – to influence the resulting distribution seen in the future.
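A classical toy version of that d-ary coin (my own sketch, on the Cayley graph of Z_8 with an assumed symmetric generator set): evolving the exact distribution shows the coin’s edge choices driving it toward uniform:

```python
def step(dist, gens):
    """One exact step of the uniform-coin walk on Z_n (n = len(dist))."""
    n = len(dist)
    new = [0.0] * n
    for x, p in enumerate(dist):
        for g in gens:                  # the d-sided coin: pick a direction
            new[(x + g) % n] += p / len(gens)
    return new

n, gens = 8, [1, -1, 2, -2]             # symmetric: closed under negation
dist = [1.0] + [0.0] * (n - 1)          # token starts at node 0
for _ in range(50):
    dist = step(dist, gens)
# dist is now numerically uniform: 1/8 at every node
```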

We also know that resistance to differential cryptanalysis comes from ensuring that there is little or no memory of which choices were taken, as the next round compresses the lattice of its own point space – with its newly-concocted set of directions pointing to its partially-differentiated subspaces. No memory – in any sense – takes the form of ensuring that one has maximum (mutual) distance from any linear structure. This results in there being no predictability in the output value (v), given the “relatively-constant” input value (c). Said quantitatively, the output bits that flip should do so with a 50% chance, given an impetus provided by a flip in the input.

We can look at each distribution in polar thinking, as a compression of the circles (representing the distribution of weights assigned to each generator, in the configuration space resulting from applying each stage of the lattice, Rfn). During “functional space” compression, the mutual “geometric” distances are altered, tending to a uniform mutual distance. Folks are simply applying a property of the L2 norms – which distinguishes them from L1 norms.

It is “lemma (a)” that induces this particular quality, in Turing’s mental model. And note how he thinks in “modern” quantum terms, implying one should be measuring correlations (i.e. wave functions cohering or destructing) in different volumes of space.

http://turingarchive.org/viewer/?id=133&title=29

It depends on the mean weight of the (now stochastic) operator’s outgoing edges for each stage of the lattice being 1, that is. This induces what would be a growth rate in the correlations of the “plaintext-operator”

to stay consistent with the correlations of the (non-measured) functional, i.e.

as and when, for a two-step walk on a Cayley graph whose first-step generator was some a, now considered to be constant-space c

Posted in crypto

## VRSN volume, in year of raider plays

http://www.nasdaq.com/symbol/vrsn/interactive-chart (max year option)

In the last year, this contraction of the volume is really punishing the (excessive) shorting of VRSN, when aligned with the repurchase program that also takes up supply.

Buffett’s main motto is: release the latent value of an undervalued company (by using pure capital power to offset speculation power, or similar).

Posted in VRSN

## Turing’s quantum random walk

Let’s see if we can relate alpha, below

http://turingarchive.org/viewer/?id=133&title=29

to the earlier work of the paper expressed in permutation group theory

http://turingarchive.org/viewer/?id=133&title=05a

http://turingarchive.org/viewer/?id=133&title=03

We look at a “small wheel’s” 2-element rotation (α0 α1) – under the conjugating effect of R powers – as a means of inducing an arbitrary motion of the “bigger wheel” of s elements, in which element s moves to s+1 (for each s, obviously). This should feel a little like the world of semi-direct products of groups. Or, better still, the world of quantum computing.

We see, in Dirac notation, Turing’s concept. S is Turing’s action, too; taking a compound state to something that satisfies the “possibilities specified by” the equation S, given. That is, view it as a specification of random-walking space, for a token wandering around Turing’s (α2 α-2) ring of points (see his diagram, above) under the action induced by a random coin being thrown and coming up 0/1 (i.e. his “core” cycle (α0 α1)).

The typesetting of S is awful – helping make quantum theory appear hard and awfully mathematical (when it’s not!). S is only saying what Turing was saying (50 years earlier!). Or more fully: a coin is thrown, coming up heads. If so, a sequence of motions happens – the nominal token on i moves to i+1 as the large wheel rotates anti-clockwise, which can alternatively be expressed as making the point labeled –2 become –1 (as it goes around the possible labels, adding one). If the coin comes up tails, motion happens from i to i-1 (in which –1 would become –2, and –2 would become…. +2). Of course, being a wheel, the “collective motion” of the space happens as a “sum of” several “parallel” motions (a collective set of re-labellings).

S says some more, which is VERY Turing-like. The very pure-math “branching conditional” between the impulse and the motion is represented by the blue cross. That the impulse induces a set of “parallel-processed” relabelings (point movements) is represented by the Sigma over i (yes, this is NOT a numerical counting-sum, but a “specification of parallelism”), i.e. …. Dirac notation helps, since i (bra) is the input state, and i+1 is the output state (ket) as one performs the walking step. One doesn’t need diagonalized matrices, now!
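That S can be written down concretely. A sketch (my rendering of the standard conditional shift, with assumed labels H/T for the coin): S permutes the (position, coin) basis states, so it is trivially reversible/unitary:

```python
def S(state, n):
    """Conditional shift on a ring of n points:
    heads moves the token +1, tails moves it -1 (mod n)."""
    pos, coin = state
    return ((pos + 1) % n, coin) if coin == 'H' else ((pos - 1) % n, coin)

n = 4
basis = [(p, c) for p in range(n) for c in 'HT']
image = [S(s, n) for s in basis]
# S maps the basis onto itself (a permutation matrix, hence unitary),
# and the labels wrap around the ring rather than falling off an end
```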

Of course, what is an automorphism… but one of the set of exactly these relabeling functions (of the points on the ring)?

The red plus in S means that the labels of the wheel can move over time per the rule on the left of the plus – a(n unmentioned) number of positions in which the point labeled i becomes i+1 (i.e. the +1 generator was applied, giving a clockwise relabeling of points). “Or” the rule on the right occurs, in which the contrary wheel rotation occurs, relabeling backwards (from i to i-1) – based on the –1 generator. In some sense we could have an entire sequence of some forward and some backward movements, leading to the current “state” of the labels, which is the notion of (red) plus.

Hey, alternatively we can think of a token walking the points, similarly – so a “random walk”. It’s better to think in terms of point re-labelings, however, since then conjugations and automorphisms “explain” and “calculate” the random walk of the particle/token in more “modern” quantum computing theory (circa 1954!). A particle moves forward a step… when – alternatively – the descriptor-labels move backwards – RELATIVELY. (Remember his 1920s–30s math training… when relativity was hot!) And of course several such re-labelings can be composed, using the red plus, which is of course the automorphism GROUP.

Ok this is lovely – since lots of really badly taught pure math is actually something entirely intuitive. if you have the intuitive model (a wheel motion), the formal modeling language is trivial!

Now, later we see Turing talk about the coin function being the subgroup H1 – where H1 can have different definitions – according to the types of biases one wants of a (multi-dimensional) coin, doing n-dimensional flipping. With just a classical coin (with 2 states), we get 1 and k (and –1 and –k) – when H1 is the quaternion group. The 1 term gives us the +1 and -1 generators, which induce the actions as desired. The k term nominally gives the +k and –k “generators” – which MAY BE what happens when you have the –1 generator act on –2 (when k=2 or -k=-2): it does not give –3 but induces –2 to flip over to 2 (since 2 follows –2, around the ring). Which aligns PERFECTLY with the sign-flipping nature of the quaternion group as you go around a particular line.

Ok I get it (a year later).

Boy, you are crap at explaining things, Turing. Or, rather, you assume context – probably per the “oral paper reading” culture infamous in the Cambridge tutorial system – in which you can assume a “reading context” missing from the paper itself. I had a boss once trained in that, and it was infuriating until you got used to it (at which point it became a superb way for advanced folks to interact).

so his 1-d projection of the “walk specifier” Phi()*Phi()*… is really nothing more than an allusion to such as…

now K makes sense, being a random walk… inducing what he probably wanted – a stream generator for use in an additive cipher. So, is this the Delilah key stream generator? Is this the guts of the wheel-wirings “design theory” of ECM (when attempting to produce a random stepping sequence for its polyalphabetic wheels)?

Essentially, he is “taking the history” of the plaintext and using each character to pick an automorphism that rotates his labels, inducing conjugations (in a more “refined” manner than enigma does it). Once projected onto a frequency distribution, the group theory ensures that the choice of re-labellings eventually converges to uniform selection of leaves. Or, for a quaternion basis, H1 cycles, creating a factor group of s=2. I think of the sign change of 1 to –k (say) as a clock pulse from low to high… which maps onto the S, nicely. If the clock is in a Hadamard mixed state, then the action of the H1 “memory/history” terms induces the pulse (before returning to being balanced), which conditionally induces the dynamics, which take into consideration the memory of the plaintext – rather like a Tunny goback2onV, etc – which tells us a little of what the designers of Tunny may have been aiming for! It also makes sense now why Turing, particularly if the work was post-Tunny, would be modeling such a thing.

So he has a function generator for automorphism expressions, cued off the plaintext, which eventually uniformly chooses the mapping of the current plaintext to its output. Thus, there is perhaps no “additive” of this “generated stream” to a plaintext stream – which would have been REALLY hard with the limited electronics of Delilah. He has a block cipher of 1 char, rather like Tunny and Hagelin! It’s just that his “tunny motor” is entirely the plaintext – which aligns with the theories of the day for auto- (name I’ve forgotten) crypto designs. It also aligns with step-and-go and other similar means to conditionally step (or sample) a stream to produce an additive.

Posted in Computers and Internet, early computing, quantum

Stealing knowhow from the article at http://www.cloudidentity.com/blog/2013/07/30/securing-a-web-api-with-windows-server-2012-r2-adfs-and-katana/ that indicates how to make the ADAL library talk to the (oauth2) endpoints of an ADFS server, let’s see if we can make the project we built in the last memo talk to our own oauth endpoints, instead.

First, we change the webAPI startup, to invoke ADFS-specific middleware in the pipeline (and point it at our own metadata endpoint, nicely present in options). Spying using fiddler shows the app does download the metadata (and configure itself, presumably).

Of course, this basically configures the security token handler, given the metadata endpoints, names and certificates. It’s up to the OWIN pipeline to invoke said handler (a subclass of the JWT handler, no doubt). We seem to be using “Katana” code presented here, whose options class does seem to allow us to specify our own “provider” (not that we do so, yet).

considering the sample of

we amend our WCF button event handler:

ok, so what we learn is that we get exceptions on invoking AcquireToken during the “validation” of the addressed provider. The overloaded method is not supported! (or something equally vague). If one changes one’s address to put “/adfs” on the end (requiring our endpoint to have the path “https://ssoportal.rapmlsqa.com/adfs….”), it stops whining about that (at least).

So, to emulate adfs and talk to an ADAL client, tell said client an address of an oauth endpoint with /adfs after the domain name. Oh, and host the handlers on the page path adfs uses for its oauth endpoints too (since the client libraries assume these).

I wonder if I plugged in my own “provider” if it might use some cue word of my own, if detected in the address of the endpoint, to decide how to complete the paths? Or, perhaps, one doesn’t need the “Cue word” approach at all, if one specifies ones own provider on the context object since it defines its own “authenticationType” (PetersProvider)

we see this, for use later…

Looking at the Katana code smells very much like a certain person’s coding style (and one sees lots of patterns, in common). Thus, to invoke my own “provider”, perhaps I ought NOT to be even using the ActiveDirectory subclass and extension methods, which are no doubt “per-provider” wrappers.

Anyways, I think I’m getting confused between using ADAL on the client and OWIN in the server. This will all fall out in the wash later, by doing some simple coding experiments.

## Playing with OWIN and OAUTH when talking to our own custom STS (via office ws-fedp proxying)

http://msdn.microsoft.com/en-us/magazine/dn463788.aspx

Lets do the project, using the instructions as given.

We create the WebAPI project:

And we duly authenticate (to the wizard in Visual Studio 2013, recall, using an OAUTH handshake)

Since we have done this before, the first dialog recalls which of our identities names we have used before (and which one is default). Then we see a rendering of the IDPs HTML-based login challenges:

Now that we have a webAPI server hosted under IIS Express, whose pipeline is set to evaluate incoming JWT tokens, we build the client, per the instructions. First we logon to the windows azure portal (whose UI for AD allows us to specify a webAPI client’s security and addressing parameters). Note, we successfully logon using our (netmagic/office) credentials and can see a directory, furthermore:

https://manage.windowsazure.com/rapmlsqa.com#Workspaces/ActiveDirectoryExtension/Directory/bcbf53cf-af9a-4584-b4c9-6d8b01b3781d/directoryQuickStart

We go to the applications tab, and invoke the Add function. Given a choice of application types, we indicate that we are adding one of our own (that we are developing). This contrasts with the other choice (to pick an app already pre-built, by others). We then just follow instructions:

The values are

• a5010986-7b23-42fb-8223-b78035bdd102 (for oauth client identifier), and
• https://rapmlsqa.com/myWebAPItestclient (for the nominal web resource to which the oauth handshake will direct some of the server-issued messages, and which the client can espy and capture before they land there).

Now we have to connect this client to the RP (a webAPI) we created earlier. Simply following instructions, we select the webAPI by name, and implicitly bind the client shown to that RP.

TestAppForWebAPI

a5010986-7b23-42fb-8223-b78035bdd102

I create the WPF project using a standard visual studio 2013 wizard, exactly as explained, adding a button to the main window, and its event handler. The body I end up with – before trial – is this:

“master programmers” do their usual demonstration of “American exceptionalism” – not lying by omission this time, but none the less “omitting a few details”. So, FIRST add async to the method of the event handler. Second, add a reference to the system.net.http package. As instructed, replace values for one’s own case: webAPI appName, client id and redirect, and the API’s actual SSL port.

Running multiple projects does get something expected:

After logon, we hit dialogs and exceptions that seemed to indicate SSL certificate errors. So we turn them off…

```
ServicePointManager.ServerCertificateValidationCallback += delegate(
    Object obj, X509Certificate certificate, X509Chain chain, SslPolicyErrors errors)
    {
        // for local testing only: accept any server certificate
        return true;
    };
```


While cert dialogs still appear, we do get what seems like a successful trial:

The next task is to change the OAUTH endpoints that this code uses!

Posted in oauth, office365

## weights, hamming cubes and random walks

http://arxiv.org/abs/quant-ph/0303081

one way to project walks on graphs onto 1 dimension, and still have the weights add up to 1:

we saw this general idea when learning about coverings – especially when seeing how zigzag expanders use the wreath product, in which the net effect requires one to squish the “direction-giving” graph.

Posted in crypto

## Q as Phi (φ), conditional probabilities

http://turingarchive.org/viewer/?id=133&title=30

Let’s take the Q function above as φ, and α as the conditional probability between y and x.

Since conditional probabilities can multiply, and each such element is the output of φn(y), we see the rules of closure applying.

We also see Turing contrasting the y (left-recursive) terms of the input with x, where we should think of H1 contributing …(y) and H contributing …(x).

Now it makes sense that “the domain of definition” of φ is H1, thinking of it as an expression of the subgroup (formed from its generators) then to be conjugated by y or x (selecting a coset). (Hmm, this doesn’t feel quite right, but it’s along these lines.) That is, we can divide y up into subsequences however we wish, so long as the elements are members of the H1 coset. We have a generative representation of y, that is, in terms of the subgroup.

Now, we recall from the first half of the manuscript that he gave an example of calculating a word-norm. There he conjugated a given upright, and then calculated its invariant. Could we see φ as being similar, in that we get a coset within a subgroup of H1 (selected/conjugated by y)?

After all, Turing went to quite some trouble to show how, when searching for non-exceptional groups, the invariants were applied.

Considering that in the limit of this operator sequence (and thus φn) there is no uncertainty, the conditional probability factor that y contributes would be 1. Thus it also makes some sense that the entropy K of x would be constant for each term in the coset (i.e. 1/4 in the case of the quaternion group using i and j as generators).
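A sketch of where that 1/4 comes from (my enumeration, under the usual presentation of the quaternion group, and only my reading of the passage): generating Q8 from i and j and factoring by the centre {±1} leaves exactly four cosets:

```python
# Quaternion units as (sign, axis); AX gives axis products, e.g. i*j = k.
AX = {('1','1'):(1,'1'), ('1','i'):(1,'i'), ('1','j'):(1,'j'), ('1','k'):(1,'k'),
      ('i','1'):(1,'i'), ('j','1'):(1,'j'), ('k','1'):(1,'k'),
      ('i','i'):(-1,'1'), ('j','j'):(-1,'1'), ('k','k'):(-1,'1'),
      ('i','j'):(1,'k'), ('j','i'):(-1,'k'),
      ('j','k'):(1,'i'), ('k','j'):(-1,'i'),
      ('k','i'):(1,'j'), ('i','k'):(-1,'j')}

def qmul(a, b):
    """Multiply two signed quaternion units."""
    (sa, xa), (sb, xb) = a, b
    s, axis = AX[(xa, xb)]
    return (sa * sb * s, axis)

i, j = (1, 'i'), (1, 'j')
elems, frontier = {(1, '1')}, [(1, '1')]
while frontier:                       # close {i, j} under multiplication
    e = frontier.pop()
    for g in (i, j):
        p = qmul(e, g)
        if p not in elems:
            elems.add(p)
            frontier.append(p)

# Q8 has 8 elements; modulo the centre {+1,-1}, the axes give 4 cosets
cosets = {axis for sign, axis in elems}
```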

Posted in crypto

## projecting roots of unity onto 1 dimension

Yesterday, I thought of Turing’s Q function in terms of providing Borel-type subsets (or open intervals) to a group engine – all in all modeling the wandering through a combinatoric space via left-recursion.

But it’s also possible to look at things in terms of roots of unity.

We know that we can always swap our thinking module out, when doing this kind of work, and start “thinking” in polar and complex planes. If we see Turing drawing a wheel with –1, 0, 1, 2 terms… and then drawing a (–1,1) wheel, we might be tempted to see him thinking in terms of roots of unity – and group algebras thereof.

Take the ith roots (at 2πi over n) as points on the unit circle. Assume an even number, and draw a line between points that cuts the axis at 90 degrees, at point x=t. Look at this distance t from the origin as a “1-dimensional” measure of those ith roots.

If you rotate the circle (under the action of the smaller wheel, wreath-product style), not only do the ith roots get new names (viewing each ith root as a generator, which may be positive or negative in a symmetric generator set), it also “shifts their spectrum” on that 1-d line – which is of course germane to Tunny skills: reading the output of the approximate counting done by Colossus, to see if the identified setting for the newly-guessed impulse stream is supported by the right spectral weights, in the right places, looking at the combined spectrum.
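A quick numeric sketch of that shift (my construction): take the x-coordinates of the n-th roots of unity as the 1-d “spectrum”; rotating the wheel one step shifts which root sits over which projected value:

```python
from math import cos, pi

def projections(n, shift=0):
    """x-coordinates of the n-th roots of unity, after `shift` wheel steps.

    Conjugate roots j and n-j share a vertical chord that cuts the
    x-axis at cos(2*pi*j/n), so these ARE the chord intercepts t."""
    return [cos(2 * pi * (j + shift) / n) for j in range(n)]

base = projections(8)
rotated = projections(8, shift=1)
# the projected spectrum has shifted by one position around the wheel
```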

One now starts to think a little of the bent functions in an sbox, each of which must work with the OTHER bent functions forming the sbox, to ensure that they are mutually supportive when doing their “one-way function” thing.

We also have to think “more like Tunny”, realizing that the distribution at t = (π/4)/n is the uniform distribution on the hypercube walk – remembering that in the Colossus process folks were doing spectral scanning for each eigenvector with weight 1. This is pertinent to modern quantum computing of cryptanalysis and its associated speedup, leveraging the “continuity” of the infinite impulse response to drive a quantum search – rather than punt and wave hands.

Posted in crypto

## Turing and Tunny wheel breaking cryptanalysis (via norms and averaging)

http://turingarchive.org/viewer/?id=133&title=28

It’s tempting to talk about spectral gaps, expander graphs and modern notions as if Turing had a secret understanding of these topics. But we would be deluding ourselves. What Turing had was a relatively simpler understanding, which happens to show that the fancy modern stuff can also be cast in terms of the simpler models. When “thinking like a cryptanalyst” and “using the data-mining toolbox of the GCHQ cryptanalyst” available on their high-end workstations with custom accelerator boards, we have to remember just what their predecessors were doing, when attacking enigma and tunny with fewer mechanical aids than folks use today.

When we look at the above, imagine that g is a tunny wheel, with unknown pattern. Wheel breaking is about finding that pattern! It summarizes a data set – as a sequential set of bits, each of which has a relative weight – much like any histogram. We are counting – by approximation – how many times the value of that bit of the wheel concurs with the values found in the data set, for the associated position on the wheel-derived impulse stream.

In the lemma we are given, we see Turing considering two or three wheel bits – in a “local function” with “local effects” – and their mutual relationships. These are analyzed using operator norms, where for a mean=1 world (normalized to unitary, that is) the “compression” of the information expressed by norm2(g) can be contrasted with the compression of the information by the Rf operator, norm2(R.f). Of course, this is just a ratio, with the “data” of g() being the numerator of the quotient. Or, once one gets back to using group theory, a factor group of H/H1.

Now Turing knows, from standard math, that the average of the sum of the next few “local terms” from the data stands in a particular relation to the sum of averages. And so he generalizes this. He knows that “rules of wheel construction” mean that there are local relationships between pairs of wheel bits – which are therefore correlated. A “decoder-detector” doing operator-averaging can thus be testing for these correlates.

When we see him modeling v and c, he is just looking at a “local horizon” of 3 wheel bits (or two changes, that is). And note his lemma, expressed in conditional entropy or “mutual information” terms. The “operator norm” is indistinguishable from the norm of the data being averaged – a case that would occur if the markov conditions were ever to hold (there being no correlations between the wheel bits). Of course, that’s not reality. But it’s just a modeling technique.

Turing can later create a sequence of Rf terms, noting that the larger the “local sampler” gets, the smaller the correlations become – as reflected by comparing the “measure” for the whole wheel’s bits under averaging to the measure of all of them but 1, say. The ratio is then 1 – meaning they are not correlated, since there is no (fractional) proportion of compression.

While the above may be his point in the paper (since he was not in cryptanalysis mode, but keystream-generating mode, where one WANTS no correlations…), the thinking always takes us back to Tunny, and the reverse process of wanting correlations. When applied to Turingismus, one considers the averaging of the next few bits, given the sampled data, knowing that local effects should show up and act as predictors/detectors. Thus one can guess bit 3, assuming bits 1 and 2, and look to the data set to see what its value should be.
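The guess-bit-3-from-bits-1-and-2 step can be sketched as a simple context table (entirely invented data, just to show the shape of the predictor):

```python
from collections import Counter, defaultdict

def build_predictor(bits):
    """Map each (bit1, bit2) context to the majority value of the bit that follows."""
    table = defaultdict(Counter)
    for i in range(len(bits) - 2):
        context = (bits[i], bits[i + 1])
        table[context][bits[i + 2]] += 1
    # Keep the most common follower for each context.
    return {ctx: c.most_common(1)[0][0] for ctx, c in table.items()}

# Toy sampled data with a strong local effect (a repeating 0,1,1 motif).
stream = [0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
predictor = build_predictor(stream)
guess = predictor[(0, 1)]   # guessed bit 3, given bits 1 and 2
```

On real depths the table would be noisy, and the majority vote becomes a Bayesian-flavoured weighting rather than a certainty.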

With this, we can “capture” the general mental model of wheel breaking processes, circa WWII – be they tunny or otherwise.

Of course the General Report on Tunny goes into far more detail when considering pairwise mutual information. And, rather than being a Turingismus predictor based on branch analysis, it’s more of a classical power analysis, driving errors to the margins of distributions so that this “custom norm” (a point distinguisher, at the end of the day) can find the needle (the key) that shows up the non-random “signature” – without deluding oneself too much that it’s not a false positive due to nature’s inherent randomness. That is, one has an improbability-averaging system.

When you see Turing talk pure math and use terms like “condensation points”… just think of a Tunny wheel bit being condensed out of the information from all the depths. Then it’s easier! If you want, you can even think of H1 (later in the analysis) as having the role of the Tunny motor – the “factor” that induces periodicity into the Chi stream, under appropriate averaging. Of course, the designers intended it to have the exact opposite effect (inducing aperiodicity) – not realizing the mutual-information effect.

Of course, the same is true in DES, when one considers what the particular DES-designed subkey schedule is doing to that graph evolution! It’s inducing dependencies too – presumably detectable if one has the right “averaging operator”.

Anyways:

What he could be saying above is that while it’s true that the distinguishing power becomes less-resolving as one tends to n, recall that k (the entropy-defining distribution for the data set) is what it is. If one takes the uniform distribution and averages it, putting it in ratio with the uniform distribution, of course one has an “infinite” impulse response, or a continuous function (compared with the discrete cases considered while not [quite] “at uniformity”). The “Dirac-delta” fiction can then be called upon, allowing one to indicate that the operator-averaging applied to the k-data distribution, correlated with the k distribution itself, is 1 – meaning we have a one-way function.

Strange to see how he thought about modern concepts! And his conceptions are somewhat more useful than the dumbed-down American crypto theory given in academia (i.e. MIT).

Posted in crypto

## constant entropy in Turing’s paper

http://turingarchive.org/viewer/?id=133&title=30

Since g was changed to K in other parts of the text, I keep wondering if I should be changing it above, too. It makes no sense to me as the original g(a) (the proportion of inputs that are a). Note how Turing (originally, before he changed funky-g to K) seemed to be stroking the g carefully – much as one strokes h-bar.

K – as entropy – IS increasing (being uncertainty induced by the injection of randomness during path walking and generator selection).

In this case, Q seems to be almost a delta-like function for tensoring – suggesting ways of including or excluding terms of a general sequence.

After all, it makes sense that we can choose an element of H1, and cycle it till one hits the identity group element. And, it makes sense that Q(of that cycled element) has value 1, in 1d measure.

Then big-G of x (i.e. K(x)) would be constant through each coset, being 1/4 in his quaternion example.

If this holds, then the next paragraph starts to make SOME sense. For then one is interested in the conditions that make translated cosets, where the translate comes from H (not H1), but the conjugated term from H1 is still in H1. The condition that the generator exponents sum to zero seems to be a REQUIREMENT for this H1-and-H coset regime (when H1 is normal).
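The quaternion example can be checked mechanically. Below is a sketch (my own encoding of Q8, not Turing's notation): H1 = ⟨k⟩ = {1, k, −1, −k} is a normal, index-2 subgroup of the quaternion group, so there are exactly two cosets of 4 elements each – matching the constant value 1/4 per element on each coset.

```python
# Quaternion units: i*j = k, j*k = i, k*i = j, and i*i = j*j = k*k = -1.
# Elements are (sign, unit) pairs, e.g. (-1, 'k') is -k.
MUL = {('i', 'j'): ('k', 1), ('j', 'i'): ('k', -1),
       ('j', 'k'): ('i', 1), ('k', 'j'): ('i', -1),
       ('k', 'i'): ('j', 1), ('i', 'k'): ('j', -1),
       ('i', 'i'): ('1', -1), ('j', 'j'): ('1', -1), ('k', 'k'): ('1', -1)}

def mul(a, b):
    (sa, ua), (sb, ub) = a, b
    if ua == '1':
        return (sa * sb, ub)
    if ub == '1':
        return (sa * sb, ua)
    u, s = MUL[(ua, ub)]
    return (sa * sb * s, u)

Q8 = [(s, u) for s in (1, -1) for u in ('1', 'i', 'j', 'k')]
H1 = [(1, '1'), (1, 'k'), (-1, '1'), (-1, 'k')]   # subgroup generated by k
coset = sorted(mul((1, 'i'), h) for h in H1)       # the other coset, i*H1
```

The two cosets partition Q8, and H/H1 is cyclic of order 2 – the "cyclic of order s" situation the paper describes.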

When he then talks about the factor group (H/H1) being cyclic of order s, it now makes some sense:

http://turingarchive.org/viewer/?id=133&title=30

especially when his quaternion table example then goes on to note that when generated by k, one gets a 2-cycle in the evolution of the limiting function.

I can see how, for a coset with 2 terms in a non-abelian group, the direction of the product gets one to 4 terms (with k and adjoint-k) being added to j and i. And of course, we know that (once the burn-in is complete) one gets to a constant K for each coset. Each coset seems to be one line in the 7-point geometry.

But we can now see how k (and k’) generates H1:

http://turingarchive.org/viewer/?id=133&title=31

We do seem to be cycling through the 4 cases of a pair (that happen to be called I, J, and their complements). One thinks of a 4-case truth table… each row of which identifies a line in the geometry, waiting for a generator to translate the point along the line in some direction or other (with a quantum bit flip).

Posted in early computing

## maniac, enigma bombe, atlas and 1954

A Short History of Markov Chain Monte Carlo- Subjective

We see from Turing’s On Permutations paper that massive amounts of non-electronic-computation-based MC could be occurring before 1954 (by definition, since that’s when Turing died). Obviously, it could be occurring as early as 1941, or more likely 1946, when lots of Turing bombes are looking for “other purposes”. Remember, we don’t know when that manuscript was written (though I suspect it’s late 30s, for reasons of Turing’s 1920s writing style and its evolution over time).

While it’s true that by the 1950s, on Manchester computers, Turing is modeling such processes as were being modeled in the mentioned MANIAC computing efforts using his floating-point registers, it’s also clear that he could have been doing much the same thing with high-speed bombes, a decade earlier.

Turing would have been perfectly placed to have taken 1944-era nuclear-fission simulation work and applied bombes as calculating devices, calculating MC on graphs.

Posted in early computing, enigma

## tt

https://terrytao.wordpress.com/2011/12/06/254b-notes-2-cayley-graphs-and-kazhdans-property-t/

Ur is a generator

http://turingarchive.org/viewer/?id=133&title=27

So for Turing all elements are generators (since the normalizer is 1/h, the order of the group).

Now we see k/d, the density. And we see compact support:

http://turingarchive.org/viewer/?id=133&title=28

Giving decay on convolution space (and thus expanders, in 1954 or earlier).

We can take mean-f and mean-g to be indirect references to a scaled L1 norm, and the choice of 1 as tantamount to requiring a stochastic process (the sum of column entries is 1, once scaled for the rank). His adjacency is really a weighted digraph!

http://turingarchive.org/viewer/?id=133&title=28

When mean-f is 1 (i.e. we have a stochastic process), then we have decay, due to the wheel doing an act of convolution. However, there would be no decay (in the norms) if one already has the Markov condition in the plaintext (nth-round) input. The input data must not yet be close to uniform (else convolution has no effect).
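The decay claim is easy to demonstrate with a toy doubly stochastic matrix (my own example, not from the paper): each averaging step shrinks the distance from uniform, while the uniform distribution itself is a fixed point – exactly the "no effect at uniformity" case.

```python
import math

# Circulant averaging on a 4-cycle: each state mixes with its two neighbours.
# Doubly stochastic, so uniform is the stationary distribution.
P = [[0.5, 0.25, 0.0, 0.25],
     [0.25, 0.5, 0.25, 0.0],
     [0.0, 0.25, 0.5, 0.25],
     [0.25, 0.0, 0.25, 0.5]]

def step(dist):
    """One convolution/averaging step: dist times P."""
    return [sum(dist[i] * P[i][j] for i in range(4)) for j in range(4)]

def dist_from_uniform(dist):
    """L2 norm of the component orthogonal to the uniform distribution."""
    return math.sqrt(sum((p - 0.25) ** 2 for p in dist))

d = [1.0, 0.0, 0.0, 0.0]          # far from uniform
norms = []
for _ in range(5):
    norms.append(dist_from_uniform(d))
    d = step(d)
```

The `norms` sequence decreases strictly: that shrinkage rate is the decay the blog keeps pointing at.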

http://turingarchive.org/viewer/?id=133&title=28

If we take a two-step walk (power of 2), note that the neighborhood for the averaging is expanding at the square of the power. This is his point! It’s a VERY INDIRECT way of talking about expander graphs, reasoning purely geometrically and topologically.
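A small sketch of the neighborhood-expansion point, on an invented Cayley-style graph (Z mod 31 with generators +1 and −1): repeatedly squaring the walk length doubles the radius of the averaging neighborhood, so the reachable ball grows with each squaring.

```python
# Ball growth under walk-length squaring on the cycle graph Z_31,
# generators +1 and -1 (a deliberately simple, non-expander example).
n = 31

def neighbours(v):
    return [(v + 1) % n, (v - 1) % n]

def ball(radius):
    """Vertices reachable from 0 in at most `radius` steps."""
    reached = {0}
    for _ in range(radius):
        reached |= {w for v in reached for w in neighbours(v)}
    return reached

sizes = [len(ball(2 ** p)) for p in range(4)]   # radii 1, 2, 4, 8
```

On a true expander the ball would grow exponentially in the radius rather than linearly as here; that faster growth is what makes the averaging neighborhood cover the whole graph in few squarings.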

http://turingarchive.org/viewer/?id=133&title=29

Here, K is a k-decay-accumulator – being calculated over the plaintext (or round state). Q is the neighborhood of the (left-recursive) elements of the plaintext not yet considered in the decay-accumulator calculation. Take “domain of definition” to mean that, as one thinks backwards in the walks through the graph, for each of the possible plaintext sequences (not yet considered), one is walking through particular subsets of the graph’s vertices – and there are many of these. But the point is that they overlap, being powersets.

Now, it makes sense to say that the Q are over the group product (of the powerset elements, or the possible walks through the graph based on Sn), where a walk’s components (as one takes more recursive steps) are the elements in the graph product.

http://turingarchive.org/viewer/?id=133&title=30

We can see K(x) as the cumulative decay to date, in which any additional walk step adds (in the log sense) an alpha amount of additional decay – the reciprocal of the expansion of the correlation measure of Q, the Borel sets (a group, recall). These are measured against uniform, with 1 being close (since H1 is balanced) and 0 being far (it’s the spectral basis, recall).

So the function of the (self-adjoint) H1 is to walk backwards (through left-recursive nestings of plaintext) and help calculate a numeric proportion-decay (alpha, or a quantization of the spectral gap). But it’s a restrictive walk, through some but not all the possible sequences of k-ary-symmetric generators – remembering that the power of a generator in a word/walk equates to its particular wheel/upright instance being rotated, relative to the preceding and succeeding wheel/diagonal/stecker. In some sense, the “generator” is reflected in the setting of the wheel – the relative rotation at setting-time.

As we know, the main function in crypto of an expander graph, relative to mixers, is to be an entropy conductor, in a “network of channels” specified by the graph, whose design offers minimal amounts of “resistance” at any particular point, in the RMS sense. We also see how he has a model of conditional entropy and conditional probabilities, expressed geometrically as ratios such as Norm2(R.f.n.*k) / Norm2(k) – where the denominator is the absolute spectral density of the graph itself, contrasted (to form an “operator norm”) with the entropy of the operator’s action.
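That norm ratio can be computed directly for a tiny example (my illustration, using the random walk on the triangle graph K3): apply the averaging operator to a function orthogonal to the constants and compare the two L2 norms – the ratio is the contraction factor, and 1 minus its supremum is the spectral gap.

```python
import math

# Random walk operator on K3 (each vertex averages its two neighbours).
P = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]

def apply_op(f):
    return [sum(P[i][j] * f[j] for j in range(3)) for i in range(3)]

def norm2(f):
    return math.sqrt(sum(x * x for x in f))

f = [1.0, -0.5, -0.5]                    # orthogonal to the constant function
ratio = norm2(apply_op(f)) / norm2(f)    # contraction factor = |lambda_2| = 0.5
```

Constant functions pass through unchanged (eigenvalue 1); everything orthogonal to them is contracted, which is the "decay" the operator-norm ratio measures.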

Now one can have 3 of these machines in sequence, acting as a zig-zag! This allows one to reason in terms of conditional probabilities, and it gets one to a (practical, probable construction of) Shannon’s probabilistic algorithms for capacity-achieving codes, back in 1944. It also allows one to invert the process and search for needles in haystacks, using much the same process as the Colossus method when attacking Tunny. One can also see how it’s used in Turingismus, with an alternative Bayesian heuristic weighting “feedback” to guide backtracking-based searches of key spaces.

One can look at the need for H1 to have exponents that sum to zero as being equivalent, in qudit-world, to having a balanced binary sequence (in the qubit world); thus H1 tests the plaintext in much the same way that convolving with WHD tests a sequence (against all possible balanced sequences), putting it into unitary form. Turing is just working in qudits, again, rather than qubits.

If one is trying to create a sequence generator (for use with a voice coder), then one sees how –1 to the power of the inner product of the plaintext vector against WHD gives as output a set of –1 and 1 values, each term being unrelated to the next. This is the part he avoids talking about (to avoid the military censor).
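The (−1)-to-the-inner-product construction is just the Walsh–Hadamard character set; a quick sketch in the qubit (binary) case shows that distinct masks give mutually orthogonal ±1 sequences:

```python
def walsh_row(a, nbits):
    """The +/-1 sequence (-1)^<x, a> as x runs over all nbits-bit vectors."""
    return [(-1) ** bin(x & a).count('1') for x in range(2 ** nbits)]

rows = [walsh_row(a, 3) for a in range(8)]

# Any two distinct rows are orthogonal: their dot product is 0,
# i.e. the sequences are uncorrelated.
dot = sum(u * v for u, v in zip(rows[1], rows[2]))
```

Orthogonality is exactly the "each term unrelated to the next" property one wants from a keystream; Turing's qudit version would use d-th roots of unity in place of ±1.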

Posted in early computing

## geometric conceptions of groups (and walks of graphs from groups); Heisenberg group

http://terrytao.wordpress.com/2010/07/10/cayley-graphs-and-the-geometry-of-groups/

In Turing’s case, his ei=(-1)  or ei=(+1) are ei=(-1…-d)  or ei=(+1…+d) – since he is thinking in qudits rather than qubits.

The Heisenberg group, properly explained, also makes clear how metrics can differ – by choice of generating set (and therefore degree of the graph).

Geometric Aspects of the Heisenberg Group

Posted in early computing

## degree, K, eigenspectrums, distributions of dependency measures

Turing’s use of u, leading to K-ness

http://turingarchive.org/viewer/?id=133&title=29

Since other parts of the argument have g = f, and one sees the Rayleigh quotient being applied to a space that is orthogonal to the constant functions, we can see Turing focusing on the rate of mixing (i.e. using the spectral gap). But he doesn’t reason in terms of matrices or the eigenspectrum. He reasons purely in terms of functional analysis, normed spaces, and metrics – in which, of course, the Cambridge of the time was very specialized.

Normed spaces and continuity, and K:-

http://en.wikipedia.org/wiki/Continuous_function_(topology)#Continuous_functions_between_topological_spaces

But we have to recall the context – that one is measuring the degree of dependency (between output and input bit flows, in power-walks).

http://citeseerx.ist.psu.edu/viewdoc/summary;jsessionid=D6B7C23D1381B2D954B1E1BF1A6EE45C?doi=10.1.1.46.3685

Furthermore, one sees the pithily stated property of that which leads to protection against differential cryptanalysis:

http://citeseerx.ist.psu.edu/viewdoc/summary;jsessionid=D6B7C23D1381B2D954B1E1BF1A6EE45C?doi=10.1.1.46.3685

When Turing discusses the wheel function in character terms (a.b-1), he is really discussing the difference between the two sides of the wheel (i.e. output to input).

The general theory of mixers (as Turing would have been thinking of such things) is in Rapidly Mixing Markov Chains: A Comparison of Techniques. One sees Good’s coalescence thinking come into play, from the Tunny era, AND one sees expander graphs and conductance (based on conditional probabilities of transitions between spaces):

AND

One sees how expander graphs (which ensure there are no “capacity bottlenecks”) play into this now – influencing the “rapidity” of the mixing (since maximum flow is arranged for).

Posted in crypto

## Sbox, affine, etc

Check out this file on Box: https://app.box.com/s/ad7pfbpag5jfpu5eefeq

We see a simple focus on one property of an sbox: distance from all affine (linear) boolean functions, as characterized by any one bit of difference in the input mask affecting 1 bit of the output mask at 50 per cent – so that approximating the input/output correlation is complex.
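The "distance from all affine functions" property is the standard nonlinearity measure, computable from the Walsh spectrum. A sketch (my own toy functions on 3 bits, not from the linked file): an affine function has nonlinearity 0, while the 3-bit majority function sits at distance 2 from every affine function.

```python
def walsh_coeff(truth_table, a, n):
    """Sum of (-1)^(f(x) XOR <a, x>) over all n-bit inputs x."""
    return sum((-1) ** (truth_table[x] ^ bin(x & a).count('1') % 2)
               for x in range(2 ** n))

def nonlinearity(truth_table, n):
    """Hamming distance to the nearest affine function: 2^(n-1) - max|W|/2."""
    max_w = max(abs(walsh_coeff(truth_table, a, n)) for a in range(2 ** n))
    return 2 ** (n - 1) - max_w // 2

# An affine function (x0 XOR x2): nonlinearity 0.
linear = [bin(x & 0b101).count('1') % 2 for x in range(8)]
# Majority-of-3-bits: a nonlinear function, at distance 2 from all affine ones.
majority = [0, 0, 0, 1, 0, 1, 1, 1]
```

Sbox output coordinates are judged bit-by-bit this way; high nonlinearity across all output masks is what frustrates linear approximation.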

This distinguishes it from avalanche and propagation.

Posted in coding theory

## T1

http://www.encyclopediaofmath.org/index.php/Automorphic_function

Turing’s H1 is still mysterious. It’s almost a reference to several things.

In crypto, it’s a reference to a function being distinguished from those functions on connected domains that can be expressed algebraically (vs geometrically, say).

In boolean theory, it seems like the truth table based on two bits (and their 4 cases).

In cayley graph quaternion thinking, boolean 4 cases give way to two bits whose names are k and -k. (Or 1 and k, I’m not really sure.)

In automorphic function thinking, we are talking about generating a field (k), one of rational functions.

And when we think of matrices with rational functions on the diagonal, we think of convolutional encoders able to look back a couple of delay lines (and their 4 cases).

Posted in coding theory

## Turing and chemistry, Peter and electronics

Turing as a kid, while being precocious at math and the mathematical arguments found in the modern physics of the day, was playing mostly with chemistry. And we see perhaps what toy he had…to play with.