Likelihood, frequency and information

Turing understood the critical differences between the terms in the title. And he understood how those differences bear on the design of cryptanalysis methods.

We have already seen how, in affine combination spaces, one can calculate likelihood functions in terms of log addition; that is, calculate in the algebra of likelihoods (rather than in the algebra of frequency probabilities).
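A minimal sketch of that point (my own illustration, not from any of the papers discussed): multiplying the likelihoods of independent observations is the same as adding their logs, so one can work entirely in the additive algebra.

```python
import math

# Illustrative sketch: with independent observations, multiplying
# likelihoods is addition in log space, so one can calculate in the
# algebra of (log) likelihoods instead of multiplying frequencies.
likelihoods = [0.2, 0.5, 0.1]

product = math.prod(likelihoods)                 # 0.2 * 0.5 * 0.1 = 0.01
log_sum = sum(math.log(p) for p in likelihoods)  # log(0.2) + log(0.5) + log(0.1)

assert abs(math.exp(log_sum) - product) < 1e-12
```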

Any gambler should know that in a run of similar results there is, despite the run, no memory property in the frequencies observed. Thus likelihood is clearly distinguished from frequency.
But what is information, and how did Turing apply it to cryptanalysis, knowing it to be yet further distinguished from both probabilities and likelihoods?

The answer lies in the Monte Carlo example. While an unbiased but unbalanced roulette wheel may offer exploitable information about the observed frequencies (the outcomes), it still does not change the likelihood of any particular event.

So what is the exploitable value of information?

In the roulette wheel case, it tells you how long you must play the runs before the unbalancedness will show up in the statistics, and improve your return.
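A toy simulation of why run length matters (my own illustration; the 0.52 imbalance is an assumed figure): each spin is memoryless, yet the imbalance surfaces in the long-run frequencies.

```python
import random

# Hypothetical unbalanced wheel: 'red' comes up with probability 0.52.
# Every spin is independent (no memory), but the imbalance becomes
# statistically visible once the run is long enough.
random.seed(0)
P_RED = 0.52  # assumed mechanical imbalance

def red_frequency(spins):
    return sum(random.random() < P_RED for _ in range(spins)) / spins

short_run = red_frequency(10)        # noise dominates
long_run = red_frequency(100_000)    # the imbalance shows up
assert abs(long_run - P_RED) < 0.01
```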

The property tells one something about the underlying mechanisms generating the likelihoods. It tells one about the underlying group and its symmetries (or lack thereof) used to generate a run, rather than about properties of the run itself. In roulette wheel terms, it tells about the unbalancedness of the wheel's mechanics.

In modern cipher design one learns about those Boolean functions which are both balanced and nonlinear (being at as great a distance as possible from the affine combination functions).
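That distance from the affine functions can be computed with the Walsh-Hadamard transform; here is a self-contained sketch (illustrative code, using the standard 4-bit function x1·x2 ⊕ x3·x4 as the example):

```python
# Nonlinearity of a Boolean function: 2^(n-1) minus half the largest
# absolute Walsh-Hadamard coefficient, i.e. the Hamming distance to
# the nearest affine function.
def walsh_hadamard(truth_table):
    w = [(-1) ** v for v in truth_table]  # work with (-1)^f(x)
    h = 1
    while h < len(w):
        for i in range(0, len(w), 2 * h):
            for j in range(i, i + h):
                a, b = w[j], w[j + h]
                w[j], w[j + h] = a + b, a - b
        h *= 2
    return w

def nonlinearity(truth_table):
    n = len(truth_table).bit_length() - 1
    return 2 ** (n - 1) - max(abs(c) for c in walsh_hadamard(truth_table)) // 2

# f = x1*x2 XOR x3*x4 is maximally nonlinear on 4 bits (nonlinearity 6),
# though such bent functions are not balanced; real designs trade the two off.
bent = [((x >> 3 & 1) & (x >> 2 & 1)) ^ ((x >> 1 & 1) & (x & 1))
        for x in range(16)]
assert nonlinearity(bent) == 6
```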

Or, in pure information science terms: there must be no information in the run about the generators of the likelihoods.

And just as there must be no information flow from the product about the design, so there must be no further meta-information from the design about the designers (and their intellectual background). The secret of the cipher is not in the algorithm but in how the design impacts complexity.

The easiest way to think about information is to think about the complexity of the machine needed to compute the probability space. This was known by the 1930s: folks designed ciphers to need more compute power than was available, as measured by the information properties.

Now apply that to 1970s DES, where engineers designed in resistance to only a certain quantity of mechanical compute.

The last point to note is that the tractability of nonlinear mappings of spaces (ones that appear to resist differential cryptanalysis) can depend on whether they are generated by linear-like vector spaces (keyed by cryptographic parameters). Just as the exponential map links probability spaces to likelihood spaces, so Lie groups are related to Lie algebras through the exponentiation of (Lie algebra) matrices.
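The last sentence can be made concrete in the simplest case, so(2) → SO(2): exponentiating the algebra generator by its power series yields the rotation group element (a self-contained numeric sketch, not anything from the paper).

```python
import math

def mat_mul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(m, terms=30):
    """Matrix exponential via the truncated power series sum of m^k / k!."""
    result = [[1.0, 0.0], [0.0, 1.0]]  # identity
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[term[i][j] / k for j in range(2)] for i in range(2)]
        term = mat_mul(term, m)
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

# exp of the so(2) generator t*[[0,-1],[1,0]] is the SO(2) rotation
# [[cos t, -sin t], [sin t, cos t]]: the Lie algebra exponentiates
# onto the Lie group.
t = 0.7
g = mat_exp([[0.0, -t], [t, 0.0]])
assert abs(g[0][0] - math.cos(t)) < 1e-9
assert abs(g[1][0] - math.sin(t)) < 1e-9
```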

Posted in coding theory

Quaternion group’s simple representations as combinations of pairs of bits

I’ve never understood, till now, the cryptographic significance of Turing introducing (in his On Permutations paper) the non-abelian quaternion group.

If you look at the four 1-dimensional representations of the quaternion group, one has all four combinations of 2 bits.
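A sketch of that bit-pair labelling (my own illustration): the four 1-dimensional irreps of Q8 all kill the center {±1}, so they are exactly the four characters of Q8/{±1} ≅ Z2 × Z2, one per pair of bits.

```python
from itertools import product

# Each pair of bits (b1, b2) labels the 1-dim irrep sending the images
# of the generators i, j to (-1)^b1, (-1)^b2; elements of Q8/{±1} are
# written as i^a j^b with a, b in {0, 1}.
def character(b1, b2):
    return {(a, b): (-1) ** (b1 * a + b2 * b)
            for a, b in product((0, 1), repeat=2)}

chars = [character(b1, b2) for b1, b2 in product((0, 1), repeat=2)]

# Character orthogonality: distinct irreps sum to 0 over the group,
# an irrep against itself sums to the group order, 4.
for c1 in chars:
    for c2 in chars:
        total = sum(c1[g] * c2[g] for g in c1)
        assert total == (4 if c1 == c2 else 0)
```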

When wheel breaking a Tunny wheel, with its constraints on patterns that influence which bit value in the sequence follows its prior, it’s crucial to cast such bit pairings and their biases in abstract terms, for which the quaternion group’s 1-dimensional representations fit perfectly.

One has to recall how differential cryptanalysis was conceived then (vs now).
Deltaing allowed convolutions of the “hermitian (quadratic) form” to amplify the bias, allowing reinforcement of biases when (delta’ed) streams are added.

They thought of it all as a faltung (vs a generalized quadratic form), for which 2 binary digits are a special case.
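The bias reinforcement can be sketched with the piling-up identity (modern terminology, not the 1940s faltung language): if two independent streams each favour 0 with probability 1/2 + ε, their XOR favours 0 with probability 1/2 + 2ε².

```python
def xor_zero_prob(p1, p2):
    """P(a XOR b = 0) for independent bits with P(a=0)=p1, P(b=0)=p2."""
    return p1 * p2 + (1 - p1) * (1 - p2)

# Two streams each hitting 0 with probability 0.55 (bias 0.05):
# their sum (XOR) still leans toward 0, with probability
# 1/2 + 2 * 0.05^2 = 0.505, a fainter but still countable bulge.
assert abs(xor_zero_prob(0.55, 0.55) - 0.505) < 1e-12

# An unbiased stream wipes the bias out entirely.
assert xor_zero_prob(0.5, 0.9) == 0.5
```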

Now, if one were charged with finding machinery to assist in wheel breaking, then being done manually at high time cost, noting that a system of suitably wired wheels (wired according to the quaternion group algebra) could represent an ideal set of likelihoods would give rise to just such a machine: suitably biased, a detector of correct wheel patterns.

That it all applied to Enigma key/wheel breaking (as well as Tunny) is still a secret!
Now one can see the algebra of bulges as an arithmetic of semisimple irreps, expressed in the quaternion group basis.

That the group’s geometry happens to align with an affine simplex, allowing limiting convergence of sequences of good guesses, is also another secret.

Posted in coding theory

Hermitian vs quadratic forms

I’ve now understood more of Turing’s design in his On Permutations paper.

It’s common to use Q as the letter for positive definite forms. One then uses it to average over the group.
There are many Q, since the domain can run over 1 letter back in the ciphertext, or 2, or 3… as his lemma was pointing out (very obliquely). They are all the same, since it’s the invariance one is relying on.
I also now see the basis for his argument why some generators are 1 while the rest are 0.
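A sketch of averaging a form over a group to get invariance (my reading of the device, with a cyclic group standing in for Turing’s and assumed random weights):

```python
import random

# Start from a non-invariant positive definite diagonal form Q and
# average it over G = cyclic shifts of coordinates; the averaged form
# Qbar takes the same value on every G-orbit. Invariance is the point.
random.seed(1)
N = 3
weights = [random.uniform(0.5, 2.0) for _ in range(N)]

def Q(x):
    return sum(w * xi * xi for w, xi in zip(weights, x))

def shift(x, k):
    return x[k:] + x[:k]

def Qbar(x):
    return sum(Q(shift(x, k)) for k in range(N)) / N

x = [1.0, 2.0, 3.0]
for k in range(N):
    assert abs(Qbar(shift(x, k)) - Qbar(x)) < 1e-9
```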

Posted in coding theory

How the Simplex is a Vector Space | The n-Category Café
Nice easy thinking about relating vector spaces to the probability simplex.

Same as McCullough, but easier!

Posted in coding theory

Calculating Nonlinearity of Boolean Functions with Walsh-Hadamard Transform – B-sides

Even simpler presentation

Posted in coding theory

Wish I’d known this before reading the Tunny report.

At the same time, it’s fascinating to see how folks in 1940 arrived at the same theory.
The rest of the series looks just as interesting.

Posted in coding theory

Why do physicists despise Clifford algebra?

Because it opens their field to a hundred million open-minded computer science students, eager to unify.

Put Clifford algebra into the heart of year-two CS math, and it’s game over.

Posted in coding theory

Clifford Algebra: A visual introduction – slehar

Sunday reading. So well written.

Posted in coding theory

Easy stats

Posted in coding theory

simplex geometry

How I’ve been thinking of the Fano plane.

Posted in coding theory

Wish there were more papers written like this!

Posted in coding theory

Linear transformations and norm – Mathematics Stack Exchange

Turing’s fbar is 1

Posted in coding theory

Second opinion rules (in CA); NV won’t be much different.

Posted in coding theory

Page 14 is almost exact Turing

Now we know why!

Posted in coding theory

Excellent intuitions about why, for Hilbert spaces and field embeddings, one is interested in constant functions on cosets.

Contrasts nicely with the stats motivation that is the background to the (Colossus-aided) counting attack on the Tunny cipher (and certain aspects of Enigma too).

Posted in coding theory

House intel committee: Partisan split over canceled open hearing –

So does a US president have the power and/or authority to “arrange” for the “incidental collection” of every collectible communication of US person #99?


And yes, GCHQ assist is that “arrangement”.

Posted in coding theory

StarWind V2V Converter – Free Tool from StarWind! –

Posted in coding theory

Quaternion formalism and physics, 1880-1940

Posted in coding theory


Does the action of the averaging operator factorize the dimension of the wheel into the subspaces of the wheel writings?

Posted in coding theory

Colossus c(l) osets vs statistics

It’s now clear to me that while Good et al. thought of the attack on Tunny in purely statistical terms, Newman and Turing thought of it in terms of topology (and log-linear multinomials).

We have yet to see a Newman/Turing analysis (or topology applied to Enigma, Tunny, or the Italian heybern).

I see why it would still be seen as sensitive.

Posted in coding theory

Metadata on trump

When I wrote my last memo I didn’t know that it was part of a wider maelstrom.

It’s clear that the truisms of all-American (cum British) lying abound, with definitional lying at the fore.

In my era, pre-9/11, via two layers of Anglo-American (unofficially well-coordinated) small contractors, each agency would have the other’s small contractor do metadata collection on the other’s citizens.

This all went away post-9/11, when each agency could do it officially (whereas before, metadata collection was legally grey).

What was retained, post-9/11, was the apparatus of greyness.

Beware Trump, with his CIA and NSA versions of praetorian guards. Now beholden to the new emperor, the old guard may be lynched; such is the nature of raw, naked (super-secret) power.

Posted in coding theory

GCHQ spying on Trump

“based on the information available to us, we see no indications that Trump Tower was the subject of surveillance by any element of the United States government either before or after Election Day 2016.”

As I recall, when deniability is required, that’s when NSA induced GCHQ to do the spying on Americans (those in contact with foreign targets, of course).

Trump must be up to speed by now on how and what the NSA does!?

He won’t be as adept as Obama. But give him time. The absolute power of secret spying will either tame his dictatorial impulses or get him rapidly impeached.

Posted in coding theory

Turing’s On Permutations focus on mean = 1

Finally I understand the meaning of the mean being 1.

I’d figured a while ago that he was interested in the shuffle around 1.

But now we get the eigenvector explanation (which is better than the shuffle!).

Any distribution is the sum of 1/n of the 1-eigenvector plus a superposition of weighted others.
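A sketch with a toy doubly stochastic matrix (my own numbers, chosen only to illustrate): the uniform vector is the 1-eigenvector, and the rest of any distribution lives in the decaying complement.

```python
N = 4
# A symmetric doubly stochastic matrix (rows and columns sum to 1).
M = [[0.4, 0.3, 0.2, 0.1],
     [0.3, 0.4, 0.1, 0.2],
     [0.2, 0.1, 0.4, 0.3],
     [0.1, 0.2, 0.3, 0.4]]

def apply(mat, p):
    return [sum(mat[i][j] * p[j] for j in range(N)) for i in range(N)]

# The uniform distribution u = (1/n, ..., 1/n) is fixed: eigenvalue 1.
u = [1.0 / N] * N
assert all(abs(x - 1.0 / N) < 1e-12 for x in apply(M, u))

# Any distribution = u + (p - u); iterating M leaves u alone and
# shrinks the superposition of the other (sub-unit) eigenvectors.
p = [0.7, 0.1, 0.1, 0.1]
for _ in range(50):
    p = apply(M, p)
assert all(abs(x - 1.0 / N) < 1e-6 for x in p)
```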

Posted in coding theory

Constant functions, in Turing op paper

Professional version of my similar drivel at

Posted in coding theory

One-d reps of groups, per Turing almost exactly


This is the second paper with a modern presentation of two of the argumentation devices Turing used in On Permutations. First, we finally understand why he wanted to establish that the math power of the d generator is always positive 1. And second, we see why his Lemma A is concerned with a power of four.

See paper for other examples of both argumentation devices.

Posted in coding theory

See 1.3

Posted in coding theory

CiteSeerX — GROUP THEORY IN CRYPTOGRAPHY

See 3.4, log signatures.

Posted in coding theory

Ewens distribution on the symmetric group – Groupprops

Posted in coding theory

On Permutations and Lie algebras

If we reanalyze Turing’s final argument, can we say that he is testing whether a given representation is one that forms a group, given a Lie algebra and its generator?

He really says that G has a subgroup H, which in turn has a centralizer subgroup H1.

See also K:

Generalization to Borel sets

This distribution can be generalized to more complicated sets than intervals. If S is a Borel set of positive, finite measure, the uniform probability distribution on S can be specified by defining the pdf to be zero outside S and constantly equal to 1/K on S, where K is the Lebesgue measure of S
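A concrete instance of the quoted construction (my own example): S = [0, 1] ∪ [2, 3] has Lebesgue measure K = 2, so the uniform pdf is 1/2 on S and zero elsewhere.

```python
# Uniform distribution on the Borel set S = [0,1] ∪ [2,3]:
# pdf = 1/K on S, where K = Lebesgue measure of S = 2, and 0 outside.
K = 2.0

def pdf(x):
    in_S = (0.0 <= x <= 1.0) or (2.0 <= x <= 3.0)
    return 1.0 / K if in_S else 0.0

assert pdf(0.5) == 0.5   # inside S
assert pdf(1.5) == 0.0   # in the gap between the intervals
# total mass: each unit-length interval contributes 1 * (1/K) = 1/2
assert (1.0 * pdf(0.5) + 1.0 * pdf(2.5)) == 1.0
```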


Posted in coding theory

The PDF for Lie groups and algebras for physicists
Lets me see two Turing arguments.

Posted in coding theory