Period doubling due to nonlinear feedback is standard fare. But now advance your thinking and view the nonlinear component of DES (and any Feistel cipher) as composing two digital streams to create a nonlinear mathematical medium, for fabricating the theoretical Boolean function that maximizes nonlinearity.
View the outer bits of the nibble-width S-box as a third input to the medium, one subject to phase inversion; like a mathematical Kerr effect it doubles and triples the balancedness, which drives the number of times the inner bits are inverted. That is data-driven computation of the symmetric and asymmetric spin similarity and difference, applied to mixing the round function.
In Feistel-analysis terms, we are amplifying the rate of confusion per bit in proportion to the information content.
It is tempting to see this as a repelling process. But once you add the wider Hamiltonian cycles of the permutation-and-substitution stages to the balancedness generator, you get more of a chaotic attractor for those (theoretical) bent functions.
Now, Turing already taught us to generate custom operator norms that evolve with the plaintext's information content. Using the similarity/dissimilarity basis of the quaternion algebra to tease out the information, and using it to generate a data-aware averaging norm that is a mask against both linear and differential comparators, one gets self-evolving computation of an operator norm that generates a bent function.
With the Feistel network and the DES-era SPN, the complexity goes beyond the limit available with a simple sequence of rotors with quaternion-group wiring. One now gets to tangent spaces of much finer granularity.
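A minimal sketch of the shape being described: one Feistel round in which a toy 4-bit S-box (a made-up example table, not a DES S-box) is the sole nonlinear component. The structure composes the two half-streams through the round function F without ever needing to invert the S-box.

```python
# Toy one-round Feistel step. SBOX is a made-up 4-bit permutation standing
# in for the cipher's nonlinear component (it is NOT a DES S-box).
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

def feistel_round(left, right, subkey):
    """One Feistel round on a pair of 4-bit halves."""
    f = SBOX[(right ^ subkey) & 0xF]   # the only nonlinear step
    return right, left ^ f             # swap halves; XOR F into the left

def feistel_round_inv(left, right, subkey):
    """Inversion reuses F itself -- the S-box is never inverted."""
    f = SBOX[(left ^ subkey) & 0xF]
    return right ^ f, left

L, R = 0x3, 0xA
L2, R2 = feistel_round(L, R, subkey=0x9)
assert (L2, R2) == (0xA, 0x6)
assert feistel_round_inv(L2, R2, subkey=0x9) == (L, R)
```

Iterating this round is what lets the purely linear swap-and-XOR skeleton accumulate the S-box's nonlinearity across the whole block.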
Who are the four knights in America?
NSA head – Henry's spymaster (domestic and foreign)
CIA head – Henry's assassin and torturer
FBI head – Henry's secret police (for nobles)
IRS head – Henry's tax collector
Given Washington's intentional creation of a slaving state, one should see North America in medieval terms (Canada and modern Mexico aside).
Speaking as a vassal.
For a high-end brand doing excellent chiropractic care, it's sad.
Sad as in Scientology sad.
Don't be duped about the corporate practices in (non-insurance) billing.
It was interesting how for six weeks folks lied (badly, for the most part). Finally a matured sailor fessed up, a bit like the Scientologist finally admitting to the theory of aliens taking over the planet (in order to work a Family First insurance non-billing scam).
Did I say that I love Family First's chiro methods? I can even live with the hard sell (it's not worse than orthopedics' hard sell). I cannot live with the intent to deceive on the off-plan insurance cases, though.
Amazing that a profitable brand has not found it the right time to ban the internal practice, though.
I'm not the likeliest fool doing crypto. But when you go to a uni whose CS dept (of which you are a member) is part of a distinguished stats dept, and half the fathers of your co-students work at GCHQ (or the Singapore equivalent), then it's no small wonder you get "invested" in cryptanalysis.
“Thus, the Fisher information may be seen as the curvature of the support curve (the graph of the log-likelihood).
Near the maximum likelihood estimate, low Fisher information therefore indicates that the maximum appears “blunt”, that is, the maximum is shallow and there are many nearby values with a similar log-likelihood. Conversely, high Fisher information indicates that the maximum is sharp.”
So what dat all mean, I say?
That support is just log-likelihood is the first takeaway.
That the max has a couple of variant cases is the next. Read carefully.
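A numerical sketch of the quoted curvature reading, using a Bernoulli log-likelihood (my example, not from the quote): the observed Fisher information is the negative second derivative at the maximum, and it grows with sample size — the difference between a blunt and a sharp maximum.

```python
import math

def log_likelihood(p, heads, n):
    """Bernoulli log-likelihood for `heads` successes in `n` tosses."""
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

def observed_fisher(p, heads, n, h=1e-4):
    """Negative second derivative of the log-likelihood, numerically."""
    ll = lambda q: log_likelihood(q, heads, n)
    return -(ll(p + h) - 2 * ll(p) + ll(p - h)) / h ** 2

# The MLE is heads/n = 0.5 in both cases; only the curvature differs.
blunt = observed_fisher(0.5, heads=5, n=10)      # ~ 40: shallow maximum
sharp = observed_fisher(0.5, heads=500, n=1000)  # ~ 4000: sharp maximum
assert sharp > 50 * blunt
```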
Turing understood the critical differences between the terms in the title. And he understood the relevance of those differences to the design of cryptanalysis methods.
We already saw how, in affine combination spaces, one can calculate likelihood functions in terms of log addition. That is, calculate in the algebra of likelihoods (rather than in the algebra of frequency probabilities).
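Concretely (a sketch in my own notation): independent likelihoods multiply, so their logs add — which is also the basis of Turing's deciban scoring, where independent pieces of evidence accumulate by addition.

```python
import math

def log_lik(obs_probs):
    """Joint log-likelihood of independent observations: logs add."""
    return sum(math.log(p) for p in obs_probs)

def decibans(likelihood_ratio):
    """Turing's deciban: 10 * log10 of a likelihood ratio."""
    return 10 * math.log10(likelihood_ratio)

# Adding in log space equals multiplying in probability space ...
assert math.isclose(math.exp(log_lik([0.9, 0.8, 0.95])), 0.9 * 0.8 * 0.95)
# ... and independent pieces of evidence add in decibans.
assert math.isclose(decibans(2) + decibans(5), decibans(10))
```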
Any gambler should know that in a run of similar results there is, despite the run, no memory property in the frequencies observed. Thus likelihood is clearly distinguished from frequency.
But what is information, and how did Turing apply it to cryptanalysis – knowing it to be yet further distinguished from both probabilities and likelihoods?
The answer lies in the Monte Carlo example. While an unbiased but unbalanced roulette wheel may offer exploitable information about the frequencies observed (the outcomes), it still does not change the likelihood of any particular event.
So what is the exploit value of information?
In the roulette wheel case, it tells you how long you must play the runs before the unbalancedness will show up in the statistics … and improve your return.
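A back-of-envelope sketch of that run length (my formula — standard sampling statistics, not anything from the post): a frequency bias of size b hides under sampling noise of size sqrt(p(1-p)/n), so it only "shows up" once n is roughly z² · p(1-p) / b².

```python
def trials_to_detect(bias, base=1 / 37, z=3.0):
    """Rough number of spins before a pocket's frequency bias clears
    z standard deviations of sampling noise."""
    p = base + bias
    return int(z ** 2 * p * (1 - p) / bias ** 2)

# A pocket landing 1/30 of the time instead of the fair 1/37:
spins = trials_to_detect(1 / 30 - 1 / 37)   # on the order of thousands of spins
# Halving the bias roughly quadruples the required run.
assert trials_to_detect(0.01) > 3 * trials_to_detect(0.02)
```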
The property tells one something about the underlying mechanisms generating the likelihoods. It tells one about the underlying group and its symmetries (or lack thereof) used to generate a run, rather than about properties of the run itself. In roulette-wheel terms, it tells about the unbalancedness of the wheel's mechanics.
In modern cipher design one learns about those Boolean functions which are both balanced and nonlinear (being at a distance as far as possible from, specifically, the affine combination functions).
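A sketch of how "distance from the affine functions" is computed in practice, via the Walsh–Hadamard spectrum (standard definitions, not specific to this post): nonlinearity is 2^(n-1) − max|W|/2, and W(0) = 0 is exactly balancedness — which is why bent functions (flat spectrum, maximal nonlinearity) are never balanced, and real designs trade between the two.

```python
def walsh_spectrum(f, n):
    """Walsh-Hadamard transform of a Boolean function's truth table f."""
    w = [1 - 2 * f[x] for x in range(2 ** n)]    # map {0,1} -> {+1,-1}
    step = 1
    while step < len(w):                         # in-place butterfly
        for i in range(0, len(w), 2 * step):
            for j in range(i, i + step):
                a, b = w[j], w[j + step]
                w[j], w[j + step] = a + b, a - b
        step *= 2
    return w

def nonlinearity(f, n):
    """Hamming distance from f to the nearest affine function."""
    return 2 ** (n - 1) - max(abs(v) for v in walsh_spectrum(f, n)) // 2

# f = x1*x2 XOR x3*x4 is bent: as far from affine as a 4-bit function gets.
bent = [((x >> 3 & 1) & (x >> 2 & 1)) ^ ((x >> 1 & 1) & (x & 1))
        for x in range(16)]
assert nonlinearity(bent, 4) == 6        # the n = 4 maximum
assert walsh_spectrum(bent, 4)[0] != 0   # ... but bent is never balanced
```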
Or, in pure information-science terms: there must be no information from the run about the generators of the likelihoods.
And just as there must be no information flow from the product about the design, so there must be no further meta-information from the design about the designers (and their intellectual background). The secret of the cipher is not in the algorithm but in how the design impacts complexity.
The easiest way to think about information is to think about the complexity of the machine needed to compute the probability space. Knowing this in the 1930s, folks designed ciphers to need more compute power than was available – as measured by the information properties.
Now apply that to 1970s DES, where the engineers designed resistance to only a certain quantity of mechanical compute power.
The last point to note is that the tractability of nonlinear mappings of spaces (ones that appear to resist differential cryptanalysis) can depend on whether they are generated by linear-like vector spaces (keyed by cryptographic parameters). Just as the exponential map links probability spaces to likelihood spaces, so Lie groups are related to Lie algebras through the exponentiation of (Lie-algebra) matrices.
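A small numerical sketch of that last link (generic math, my own example): exponentiating a Lie-algebra element — here a skew-symmetric 2×2 matrix in so(2) — by its power series lands in the group SO(2), a rotation.

```python
import math

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(A, terms=30):
    """Matrix exponential via its power series: sum of A^k / k!."""
    result = [[1.0, 0.0], [0.0, 1.0]]            # identity = A^0 / 0!
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, A)]
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

theta = math.pi / 2
A = [[0.0, -theta], [theta, 0.0]]   # element of the Lie algebra so(2)
R = mat_exp(A)                      # element of the Lie group SO(2)
# exp carried the algebra element to a quarter-turn rotation matrix.
assert abs(R[0][0]) < 1e-9 and abs(R[1][0] - 1.0) < 1e-9
```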
I've never understood, till now, the cryptographic significance of Turing introducing the non-abelian quaternion group in his On Permutations paper.
If you look at the four 1-dimensional representations of the quaternion group, one has all four combinations of 2 bits.
When wheel-breaking a Tunny wheel, with its constraints on patterns that influence which bit value in the sequence follows its prior, it's crucial to cast such bit pairings and their biases in abstract terms, for which the quaternion group's 1-dimensional representations fit perfectly.
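A check of the 2-bit claim (standard character theory, sketched in Python): Q8's four 1-dimensional representations factor through Q8/{±1}, sending the generators i and j independently to ±1 — and reading those signs as bits enumerates exactly the four combinations of 2 bits.

```python
import itertools

# Each 1-dim representation of Q8 is fixed by its signs on i and j;
# it must send -1 to +1, so k = ij gets the product of the two signs.
reps = []
for si, sj in itertools.product([1, -1], repeat=2):
    reps.append({'1': 1, '-1': 1, 'i': si, '-i': si,
                 'j': sj, '-j': sj, 'k': si * sj, '-k': si * sj})

# The four sign patterns on (i, j) are exactly the four 2-bit values.
bit_pairs = {(int(r['i'] < 0), int(r['j'] < 0)) for r in reps}
assert bit_pairs == {(0, 0), (0, 1), (1, 0), (1, 1)}
```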
One has to recall how differential cryptanalysis was conceived then (vs now).
Delta-ing allowed convolutions of the "Hermitian (quadratic) form" to amplify the bias, allowing reinforcement of biases when the (delta'ed) streams are added.
They thought of it all as a Faltung (a convolution, vs a generalized quadratic form), for which 2 binary digits are a special case.
Now, if one were charged with finding machinery to assist in wheel breaking – then done manually, at high time cost – one would note that a system of suitably wired wheels (wired according to the quaternion group algebra) could represent an ideal set of likelihoods and, suitably biased, act as a detector of correct wheel patterns.
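A toy sketch of why delta-ing pays (my simulation, not the actual Colossus attack): a wheel-like stream whose bits tend to repeat has a delta stream biased toward 0, while the delta of pure noise stays flat — and counting that bias over a long run is the detector.

```python
import random

random.seed(7)

def delta(bits):
    """First difference of a bit stream: b[t] XOR b[t+1]."""
    return [a ^ b for a, b in zip(bits, bits[1:])]

def sticky_stream(n, stick=0.6):
    """Toy stand-in for a patterned wheel: bits repeat 60% of the time."""
    out = [random.randint(0, 1)]
    for _ in range(n - 1):
        out.append(out[-1] if random.random() < stick else out[-1] ^ 1)
    return out

wheel = sticky_stream(10_000)
noise = [random.randint(0, 1) for _ in range(10_000)]

wheel_bias = delta(wheel).count(0) / (len(wheel) - 1)   # ~ 0.6
noise_bias = delta(noise).count(0) / (len(noise) - 1)   # ~ 0.5
assert wheel_bias > 0.55 > noise_bias > 0.45
```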
That it all applied to enigma key/wheel breaking (as well as tunny) is still a secret!
Now one can see the algebra of bulges as an arithmetic of semisimple irreps, expressed in the quaternion-group basis.
That the group's geometry happens to align slightly with an affine simplex, allowing limiting convergence of sequences of good guesses, is also another secret.
I've now understood more of Turing's design in his On Permutations paper.
It's common to use Q as the letter for positive definite forms. And one then uses it to average over the group.
There are many Q, since the domain can be over 1 letter back in the ciphertext, or 2, or 3 … as his lemma was pointing out (very obliquely). They are all the same, since it's the invariance one is relying on.
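A sketch of that averaging-for-invariance point (my own toy example): averaging a positive definite form Q over a finite group of domain shifts produces a form that no longer depends on which offset you start from — the invariance that makes "all the Q the same".

```python
def shift(v, s):
    """Cyclic shift of coordinates -- the group acting on the domain."""
    return v[s:] + v[:s]

def form(M, v):
    """Evaluate the quadratic form Q(v) = v^T M v."""
    n = len(v)
    return sum(M[i][j] * v[i] * v[j] for i in range(n) for j in range(n))

def averaged_form(M, v):
    """Average Q over all cyclic shifts of the argument."""
    n = len(v)
    return sum(form(M, shift(v, s)) for s in range(n)) / n

M = [[2.0, 0.5, 0.0],          # a positive definite, shift-sensitive form
     [0.5, 1.0, 0.0],
     [0.0, 0.0, 3.0]]
v = [1.0, 2.0, -1.0]

assert form(M, v) != form(M, shift(v, 1))                    # Q itself cares
assert averaged_form(M, v) == averaged_form(M, shift(v, 1))  # average doesn't
```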
I also now see the basis for his argument for why some generators are 1 while the rest are 0.
Nice easy thinking about relating vector spaces to the probability simplex.
Same as McCullough but easier!
Even simpler presentation
Wish I’d known this before reading the tunny report.
At the same time, it’s fascinating to see how folks in 1940 arrived at the same theory
The rest of the series looks just as interesting.
Why do physicists despise Clifford algebra?
Because it opens their field to a hundred million open minded computer science students, eager to unify.
Put Clifford algebra into the heart of year-two CS math, and it's game over.
Sunday reading. So well written
How I've been thinking of the Fano plane.
Wish there were more papers written like this!
Second-opinion rules (in CA); NV won't be much different.
Page 14 is almost exactly Turing.
Now we know why!
Excellent intuitions about why, for Hilbert spaces and field embeddings, one is interested in constant functions on cosets.
Contrasts nicely with the stats motivation behind the (Colossus-aided) counting attack on the Tunny cipher (and certain aspects of Enigma too).
So does a US president have the power and/or authority to "arrange" for the "incidental collection" of every collectible communication of US person #99?
And yes, GCHQ assistance is that "arrangement".
Quaternion formalism and physics, 1880-1940
Does the action of the averaging operator factorize the dimension of the wheel into the subspaces of the wheel writings?
It's now clear to me that while Good et al. thought of the attack on Tunny in purely statistical terms, Newman and Turing thought of it in terms of topology (and log-linear multinomials).
We have yet to see a Newman/Turing analysis (or topology applied to either Enigma or Tunny, or the Italian Hebern).
I see why it would still be seen as sensitive.
When I wrote my last memo I didn’t know that it was part of a wider maelstrom.
It's clear that the truisms of all American (cum British) lying abound, with definitional lying at the fore.
In my era, pre-9/11, via 2 layers of Anglo-American (unofficially well-coordinated) small contractors, each agency would have the other's small contractor do metadata collection (on each other's citizens).
This all went away post-9/11, when each agency could do it officially (whereas before, metadata collection was legally grey).
What was retained, post-9/11, was the apparatus of greyness.
Beware Trump, with his CIA and NSA versions of a praetorian guard. Now beholden to the new emperor, the old guard may be lynched; such is the nature of raw, naked (super-secret) power.
“based on the information available to us, we see no indications that Trump Tower was the subject of surveillance by any element of the United States government either before or after Election Day 2016.”
As I recall, when deniability was required, that's when NSA induced GCHQ to do the spying on Americans (on those in contact with foreign targets, of course).
Trump must be up to speed by now on how and what NSA does!?
He won’t be as adept as Obama. But give him time. The absolute power of secret spying will either tame his dictatorial impulses or get him rapidly impeached.