## slides-of-loci/README.md at master · agostontorok/slides-of-loci · GitHub

Posted in coding theory

## Course materials

http://www-bcf.usc.edu/~tbrun/Course

Posted in coding theory

## Candy Crush’s Acquisition And What It Means For Mobile Gaming Companies — Mahesh VC

Posted in coding theory

## What is a Pre-money valuation and Post-money valuation?

Posted in coding theory

## Trump says he has no ‘tapes’ of Comey conversations – The Washington Post

Which makes no statement about whether someone else does.

It’s like me saying there are no tapes of my phone and I made none, knowing full well the NSA tapes everything on my phone (and did the taping).

But “I” don’t do it.

Classic American disinformation

Posted in coding theory

## Turing’s quaternion-group wired rotor and Clifford algebra of grade 2

document

Posted in coding theory

## How Do You Value A Three-Year-Old Company With A 100-Year Ambition?

Posted in coding theory



## http://blog.indeed.com/2017/01/10/video-game-labor-snapshot/

Posted in coding theory

## Fanciful DES

Period doubling due to nonlinear feedback is standard fare. But now advance your thinking and view the nonlinear component of DES (and any Feistel cipher) as composing two digital streams to create a nonlinear mathematical medium, for fabricating the theoretical Boolean function that maximizes nonlinearity.

View the outer bits of the nibble-width S-box as a third input to the medium that is subject to phase inversion, as a mathematical Kerr effect that doubles and triples the balancedness, which drives the number of times the inner bits are inverted. That is data-driven computation of the symmetric and asymmetric spin similarity and difference, applied to mixing the round function.

In Feistel-analysis terms, we are amplifying the rate of confusion per bit in proportion to the information content.

Tempting to see this as a repelling process. But once you add the wider Hamiltonian cycles of the P and S boxes with the balancedness generator, you get more of a chaotic attractor for those (theoretical) bent functions.

Now Turing already taught us to generate custom operator norms that evolve with the plaintext information content. Using the similarity/dissimilarity basis of the quaternion algebra to tease out the information, and using it to generate a data-aware averaging norm that masks both linear and differential comparators, one gets a self-evolving computation of an operator norm that generates a bent function.
With the Feistel network and the DES-era SPN, the complexity goes beyond the limit available with a simple sequence of rotors with quaternion-group wiring. One now gets to tangent spaces of much finer granularity.
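
Since the Feistel structure is doing the work in this post, a minimal sketch may help. This toy network is my own illustration — the 8-bit halves, the squaring round function, and the key list are invented for the example and are nothing like the real DES S-boxes — but it shows how each round XORs a nonlinear function of one stream onto the other, and why the structure inverts for free:

```python
# Toy Feistel network (illustrative only -- not DES; the round function
# and sizes are invented for this sketch).

def round_fn(half: int, key: int) -> int:
    """A toy nonlinear round function on 8-bit halves (not a real S-box)."""
    x = (half + key) & 0xFF
    return ((x * x) >> 2) & 0xFF  # squaring makes it nonlinear over GF(2)

def feistel_encrypt(block: int, keys: list) -> int:
    """16-bit block; each round XORs f(right) onto the left, then swaps."""
    left, right = (block >> 8) & 0xFF, block & 0xFF
    for k in keys:
        left, right = right, left ^ round_fn(right, k)
    return (left << 8) | right

def feistel_decrypt(block: int, keys: list) -> int:
    """Same structure run backwards; note f need not be invertible."""
    left, right = (block >> 8) & 0xFF, block & 0xFF
    for k in reversed(keys):
        left, right = right ^ round_fn(left, k), left
    return (left << 8) | right

keys = [0x3A, 0xC5, 0x71, 0x1E]
ct = feistel_encrypt(0x1234, keys)
assert ct != 0x1234
assert feistel_decrypt(ct, keys) == 0x1234
```

The point is that only the round function carries the nonlinearity; the composition of the two streams is plain XOR.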

Posted in coding theory

## Trump and Comey are no Henry and Becket (opinion) – CNN.com

Who are the four knights in America?

NSA head – Henry’s spymaster (domestic and foreign)

CIA head – Henry’s assassin and torturer

FBI – Henry’s secret police (for nobles)

IRS – Henry’s tax collector

Given Washington’s intentional creation of a slaving state, one should see North America in medieval terms (Canada and modern Mexico aside).

Speaking as a vassal.

Posted in coding theory

## Beware family first chiropractors

For a high-end brand doing excellent chiropractic care, it’s sad.

Don’t be duped by the corporate practices in (non-insurance) billing.

It was interesting how for six weeks folks lied (badly, for the most part). Finally a matured sailor fessed up, a bit like the Scientologist finally admitting to the theory of aliens taking over the planet (in order to work a Family First insurance non-billing scam).

Did I say that I love Family First’s chiro methods? I can even live with the hard sell (it’s not worse than the orthopedics hard sell). I cannot live with the intent to deceive in the off-plan insurance cases, though.

Amazing that a profitable brand has not found it the right time to ban the internal practice, though.

Posted in coding theory

## Fisher info

I’m not the likeliest fool doing crypto. But when you go to a uni whose CS dept (of which you are a member) is part of a distinguished stats dept, and half the fathers of your co-students work at GCHQ (or the Singapore equivalent), then it’s no small wonder you get “invested” in cryptanalysis.

“Thus, the Fisher information may be seen as the curvature of the support curve (the graph of the log-likelihood).

Near the maximum likelihood estimate, low Fisher information therefore indicates that the maximum appears “blunt”, that is, the maximum is shallow and there are many nearby values with a similar log-likelihood. Conversely, high Fisher information indicates that the maximum is sharp.”

So what does all that mean, I say?

That support is just log-likelihood is the first takeaway.
That the max has a couple of variant cases is the next. Read carefully.
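
The quoted curvature reading can be checked numerically. Here is a sketch of my own, using a Bernoulli model (the model choice and function names are mine, not from the quote): the observed Fisher information is minus the second derivative (curvature) of the log-likelihood at the MLE, and the same observed frequency with more data gives a sharper maximum:

```python
import math

def bernoulli_loglik(p: float, successes: int, n: int) -> float:
    """Log-likelihood of `successes` heads in n Bernoulli(p) trials."""
    return successes * math.log(p) + (n - successes) * math.log(1 - p)

def observed_info(successes: int, n: int, h: float = 1e-4) -> float:
    """Observed Fisher information: minus the numerical second derivative
    (curvature) of the log-likelihood at the MLE p_hat = successes / n."""
    p = successes / n
    ll = lambda q: bernoulli_loglik(q, successes, n)
    return -(ll(p + h) - 2 * ll(p) + ll(p - h)) / h ** 2

# Same observed frequency (60% heads), different sample sizes:
blunt = observed_info(6, 10)       # shallow maximum, low information
sharp = observed_info(600, 1000)   # sharp maximum, high information
assert sharp > blunt
# Analytically, I(p_hat) = n / (p_hat * (1 - p_hat)) for the Bernoulli model
assert abs(blunt - 10 / 0.24) < 0.01
```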

Posted in coding theory

## Likelihood, frequency and information

Turing understood the critical differences between the terms in the title. And he understood the relevance of those differences to the design of cryptanalysis methods.

We already saw how in affine combination spaces one can calculate likelihood functions in terms of log addition. That is, calculate in the algebra of likelihoods (rather than in the algebra of frequency probabilities).
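
A minimal illustration of that algebra of likelihoods (my own sketch, not from the post): likelihoods of independent observations multiply, so log-likelihoods add — and the log form survives runs long enough to underflow the product:

```python
import math

# Likelihoods of independent observations multiply; log-likelihoods add.
probs = [0.2, 0.7, 0.1, 0.9, 0.3]

product = 1.0
for p in probs:
    product *= p
log_sum = sum(math.log(p) for p in probs)
assert abs(math.exp(log_sum) - product) < 1e-12

# For a long run the product underflows to zero while the log sum stays
# finite, so the algebra of likelihoods is also the numerically usable one.
many = [0.5] * 2000
assert math.prod(many) == 0.0
assert abs(sum(math.log(p) for p in many) - 2000 * math.log(0.5)) < 1e-6
```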

Any gambler should know that in a run of similar results there is, despite the run, no memory property in the frequencies observed. Thus likelihood is clearly distinguished from frequency.
But what is information, and how did Turing apply it to cryptanalysis — knowing it to be yet further distinguished from probabilities and likelihoods both?

The answer lies in the Monte Carlo example. While an unbiased but unbalanced roulette wheel may offer exploitable information about the frequencies observed (the outcomes), it still does not change the likelihood of any particular event.

So what is the exploit value of information?

In the roulette wheel case, it tells you how long you must play the runs before the unbalancedness will show up in the statistics, and improve your return.

The property tells one something about the underlying mechanisms generating the likelihoods. It tells one about the underlying group and its symmetries (or lack thereof) used to generate a run, rather than run properties. In roulette wheel terms, it tells about the unbalancedness of its mechanics.
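
As a rough sketch of that “how long must you play” question — my own back-of-envelope normal-approximation sizing; the 52% wheel and the function names are invented for the example:

```python
import math
import random

random.seed(0)
BIAS = 0.52  # an unbalanced wheel: "red" comes up 52% of the time

def spins_needed(delta: float, z_alpha: float = 2.0,
                 z_beta: float = 5.0) -> int:
    """Rough normal-approximation sample size for a frequency bias `delta`
    to clear a z_alpha significance line with z_beta sigmas to spare."""
    return math.ceil(((z_alpha + z_beta) * 0.5 / delta) ** 2)

n = spins_needed(BIAS - 0.5)                  # ~30k spins for a 2% edge
reds = sum(random.random() < BIAS for _ in range(n))
threshold = 0.5 + 2.0 * 0.5 / math.sqrt(n)    # 2-sigma line for a fair wheel
assert reds / n > threshold  # the unbalancedness has shown up
```

A 2% edge needs on the order of tens of thousands of spins before it clears the noise of a fair wheel; halve the edge and the required run quadruples.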

In modern cipher design one learns about those Boolean functions which are both balanced and nonlinear (being at a distance as far as possible from affine combination functions specifically).

Or, in pure information-science terms: there must be no information from the run about the generators of the likelihoods.

And just as there must be no information flow from the product about the design, so there must be no further meta-information from the design about the designers (and their intellectual background). The secret of the cipher is not in the algorithm but in how the design impacts complexity.

The easiest way to think about information is to think about the complexity of the machine needed to compute the probability space. Knowing this in the 1930s, folks designed ciphers to need more compute power than was available — as measured by the information properties.

Now apply that to 1970s DES, where engineers designed resistance to only a certain quantity of mechanical compute.

The last point to note is that the tractability of nonlinear mappings of spaces (that appear to resist differential cryptanalysis) can depend on whether they are generated by linear-like vector spaces (keyed by cryptographic parameters). Just as the exponential map links probability spaces to likelihood spaces, so Lie groups are related to Lie algebras through the exponentiation of (Lie) matrices.
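
The exponentiation claim can be made concrete in a few lines (a sketch; pure-Python 2x2 matrices to keep it self-contained): exponentiating a skew-symmetric matrix — an element of the Lie algebra so(2) — lands exactly on a rotation, an element of the Lie group SO(2):

```python
import math

def mat_mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(A, terms=30):
    """Matrix exponential by its power series: exp(A) = sum_k A^k / k!."""
    result = [[1.0, 0.0], [0.0, 1.0]]  # identity
    power = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mul(power, A)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(2)]
                  for i in range(2)]
    return result

# A skew-symmetric matrix: an element of the Lie algebra so(2)
theta = 0.75
A = [[0.0, -theta], [theta, 0.0]]

# Its exponential lies in the Lie group SO(2): rotation by theta
R = mat_exp(A)
assert abs(R[0][0] - math.cos(theta)) < 1e-12
assert abs(R[1][0] - math.sin(theta)) < 1e-12
```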

Posted in coding theory

## Quaternion group’s simple representations as combinations of pairs of bits

I’ve never understood, till now, the cryptographic significance of Turing introducing the non-abelian quaternion group in his On Permutations paper.

If you look at the four 1-dimensional representations of the quaternion group, one has all four combinations of 2 bits.
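
That bit-pair reading can be checked directly (my own sketch; the element and function names are invented): the four 1-dimensional representations of Q8 factor through Q8/{±1} ≅ Z2 × Z2, so each is determined by a sign on i and a sign on j — 2 bits — with the sign on k forced to be their product:

```python
from itertools import product

def qmul(p, q):
    """Quaternion product on integer 4-tuples (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

ONE, I, J, K = (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)

def neg(q):
    return tuple(-c for c in q)

Q8 = [ONE, neg(ONE), I, neg(I), J, neg(J), K, neg(K)]

def character(bit_i, bit_j):
    """The 1-dim representation picked out by a pair of bits: a sign on i
    and a sign on j; the sign on k is forced (k = ij), and -1 maps to +1."""
    si, sj = (-1) ** bit_i, (-1) ** bit_j
    signs = {ONE: 1, I: si, J: sj, K: si * sj}
    return lambda q: signs[q] if q in signs else signs[neg(q)]

# All four 2-bit combinations give genuine homomorphisms Q8 -> {+1, -1}
for bi, bj in product([0, 1], repeat=2):
    chi = character(bi, bj)
    for p, q in product(Q8, repeat=2):
        assert chi(qmul(p, q)) == chi(p) * chi(q)
```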

When wheel breaking a Tunny wheel, with its constraints on patterns that influence which bit value in the sequence follows its prior, it’s crucial to cast such bit pairings and their biases in abstract terms, for which the quaternion group’s 1-dimensional representations fit perfectly.

One has to recall how differential cryptography was conceived then (vs now).
Delta-ing allowed convolutions of the “Hermitian (quadratic) form” to amplify the bias, allowing reinforcement of biases when (delta’ed) streams are added.

They thought of it all as a Faltung (a convolution, vs a generalized quadratic form), for which 2 binary digits are a special case.

Now, if one were charged with finding machinery to assist in wheel breaking — being done manually but at high time cost — then noting that a system of suitably wired wheels (wired according to the quaternion group algebra) could represent an ideal set of likelihoods would give rise to it, when suitably biased, as a detector of correct wheel patterns.

That it all applied to Enigma key/wheel breaking (as well as Tunny) is still a secret!
Now one can see the algebra of bulges as an arithmetic of semisimple irreps, expressed in the quaternion group basis.

That the group’s geometry happens to align slightly with an affine simplex, allowing limiting convergence of sequences of good guesses, is also another secret.

Posted in coding theory

I’ve now understood more of Turing’s design, in his On Permutations paper.

It’s common to use Q as the letter for positive definite forms. And one then uses it to average over the group.
There are many Q, since the domain can be over 1 letter back in the ciphertext, or 2, or 3… as his Lemma was pointing out (very obliquely). They are all the same, since it’s the invariance one is relying on.
I also now see the basis for his argument for why some generators are 1 while the rest are 0.
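
The averaging trick itself is easy to demo (a sketch of my own, using the four-element rotation group of the plane rather than anything from the paper): average any positive definite form over the group and the result is invariant under every group element:

```python
# Averaging a positive definite form over a finite group to get an
# invariant one. Group: the four 90-degree rotations of the plane.

def apply(M, v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def Q(v):
    """An arbitrary positive definite form: Q(v) = 2 v0^2 + v0 v1 + v1^2."""
    return 2 * v[0] ** 2 + v[0] * v[1] + v[1] ** 2

R90 = ((0, -1), (1, 0))
group = []
g = ((1, 0), (0, 1))
for _ in range(4):          # {I, R90, R180, R270}
    group.append(g)
    g = tuple(tuple(sum(R90[i][k] * g[k][j] for k in range(2))
                    for j in range(2)) for i in range(2))

def Q_avg(v):
    """Average of Q over the group: a group-invariant form."""
    return sum(Q(apply(g, v)) for g in group) / len(group)

v = (3.0, -1.0)
for g in group:
    # invariance: Q_avg(g v) == Q_avg(v) for every g, by reindexing the sum
    assert abs(Q_avg(apply(g, v)) - Q_avg(v)) < 1e-9
```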

Posted in coding theory

## How the Simplex is a Vector Space | The n-Category Café

https://golem.ph.utexas.edu/category/2016/06/how_the_simplex_is_a_vector_sp.html
Nice easy thinking about relating vector spaces to the probability simplex.

Same as McCullough but easier!

Posted in coding theory

## Calculating Nonlinearity of Boolean Functions with Walsh-Hadamard Transform – B-sides

Even simpler presentation
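
For reference, the calculation itself is short (a sketch using the standard fast WHT, with the bent function x0·x1 ⊕ x2·x3 as my test case): nonlinearity is 2^(n-1) minus half the largest absolute Walsh coefficient:

```python
def walsh_hadamard(truth_table):
    """Fast WHT of (-1)^f(x) over all linear masks, in O(n * 2^n)."""
    w = [(-1) ** bit for bit in truth_table]
    size = len(w)
    step = 1
    while step < size:
        for i in range(0, size, 2 * step):
            for j in range(i, i + step):
                w[j], w[j + step] = w[j] + w[j + step], w[j] - w[j + step]
        step *= 2
    return w

def nonlinearity(truth_table):
    """NL(f) = 2^(n-1) - max_a |W_f(a)| / 2."""
    n_vars = len(truth_table).bit_length() - 1
    return 2 ** (n_vars - 1) - max(abs(c)
                                   for c in walsh_hadamard(truth_table)) // 2

# The bent function f(x) = x0 x1 XOR x2 x3 hits the bound 2^(n-1) - 2^(n/2-1)
tt = [(x & 1) * ((x >> 1) & 1) ^ ((x >> 2) & 1) * ((x >> 3) & 1)
      for x in range(16)]
assert nonlinearity(tt) == 6

# A linear function is at distance 0 from itself, so nonlinearity 0
lin = [(x ^ (x >> 1)) & 1 for x in range(16)]
assert nonlinearity(lin) == 0
```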

Posted in coding theory

## http://www.tcs.hut.fi/Studies/T-79.5501/2007SPR/lectures/boolean.pdf

Wish I’d known this before reading the Tunny report.

At the same time, it’s fascinating to see how folks in 1940 arrived at the same theory.
The rest of the series looks just as interesting.

Posted in coding theory

## http://faculty.luther.edu/~macdonal/GA&GC.pdf

Why do physicists despise Clifford algebra?

Because it opens their field to a hundred million open-minded computer science students, eager to unify.

Put Clifford algebra into the heart of year-two CS math, and it’s game over.

Posted in coding theory

## Clifford Algebra: A visual introduction – slehar

Posted in coding theory

## Easy stats

http://www.math.uah.edu/stat/expect/Properties.html

Posted in coding theory

## simplex geometry

http://www.umsl.edu/~fraundorfp/ifzx/simplexGeometry.html

How I’ve been thinking of the Fano plane.

Posted in coding theory

## http://www.math.vt.edu/people/brown/doc/731.pdf

Wish there were more papers written like this!

Posted in coding theory

## Linear transformations and norm – Mathematics Stack Exchange

http://math.stackexchange.com/questions/21429/linear-transformations-and-norm

Turing’s fbar is 1

Posted in coding theory

## http://www.dhcs.ca.gov/services/ccs/cmsnet/Documents/thiscomputes155.pdf

Second opinion rules (in CA); NV won’t be much different.

Posted in coding theory

## http://www1.spms.ntu.edu.sg/~frederique/lecture9ws.pdf

Page 14 is almost exactly Turing.

Now we know why!

Posted in coding theory

## http://www.math.cornell.edu/~dcollins/math4310/QuotientVectorSpaces.pdf

Excellent intuitions about why, for Hilbert spaces and field embedding, one is interested in constant functions on cosets.

Contrasts nicely with the stats motivation that is the background to the (Colossus-aided) counting attack on the Tunny cipher (and certain aspects of Enigma too).

Posted in coding theory