And, in particular, consider furthermore:
Now wasn’t Turing teaching us, for the Hamming-weight/code-centric 7-point geometry and its Steiner triple system of constraints and projections, that one worries about commuting with T == 4 (where T is the number of rotors supplying the generators)?
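As a concrete anchor for the 7-point geometry, here is a minimal sketch (standard textbook point-labelling, not anything from Turing) checking that its seven lines really do form a Steiner triple system: every pair of the seven points lies on exactly one line.

```python
from itertools import combinations

# The seven lines of the 7-point geometry (Fano plane),
# points labelled 1..7; each line is a triple of points.
FANO_LINES = [
    (1, 2, 3), (1, 4, 5), (1, 6, 7),
    (2, 4, 6), (2, 5, 7), (3, 4, 7), (3, 5, 6),
]

def is_steiner_triple_system(lines, points):
    """Every pair of distinct points must lie on exactly one triple."""
    for pair in combinations(points, 2):
        covering = [ln for ln in lines if set(pair) <= set(ln)]
        if len(covering) != 1:
            return False
    return True

print(is_steiner_triple_system(FANO_LINES, range(1, 8)))  # True
```

Note the counting that makes it work: 7 lines of 3 points give 7 × 3 = 21 point-pairs, exactly C(7, 2) = 21, so "exactly one line per pair" is at least numerically possible.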
Now that we understand how Cayley graph expanders work (and how the family keeps its family trait), Turing’s focus on the T == 4 case makes sense. One needs to consider the context: wanting to enumerate the commutators produced by the linear operator, and their components, in order to add them to the expander’s set of generators at each round and thus preserve the family trait.
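A toy sketch of that commutator bookkeeping, assuming nothing about Turing’s actual rotor wirings: represent each rotor as a permutation tuple, form [a, b] = a⁻¹b⁻¹ab, and fold the results back into the generator set (the permutations a and b below are arbitrary choices for illustration).

```python
def compose(p, q):
    """(p . q)(i) = p[q[i]]: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(p)))

def inverse(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def commutator(a, b):
    """[a, b] = a^-1 b^-1 a b; trivial exactly when a and b commute."""
    return compose(compose(inverse(a), inverse(b)), compose(a, b))

# Two toy "rotor" permutations on 4 points (the T == 4 of the text).
a = (1, 2, 3, 0)            # a 4-cycle
b = (1, 0, 2, 3)            # a transposition
identity = (0, 1, 2, 3)

gens = {a, b}
# Enlarge the generator set with all pairwise commutators.
enlarged = gens | {commutator(x, y) for x in gens for y in gens}
print(commutator(a, a) == identity)   # True: anything commutes with itself
print(sorted(enlarged))
```

Since a and b here do not commute, [a, b] is non-trivial and the enlarged set genuinely grows.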
When we consider that it is necessary, in a suitable semi-direct product (of one graph acting on another to transform the linear operator/functional), to compute the n-fold tensor product, it makes sense to see how Hadamard constructions work! This is particularly so when using Hadamard transforms to model non-linear sources (during cryptanalysis), or to support ciphering that not only transforms the norms at each round (driving the permutations and the changes of distance) but also retains, in the next round, the property that the now re-normed bit flows STILL act as a norm-inducing transform… still able to perpetuate that action property into the round after.
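For the Hadamard side, a minimal fast Walsh–Hadamard transform sketch, with its usual cryptanalytic reading: the spectrum of a Boolean function records its correlation with every linear function, and hence its non-linearity. The example function f and the non-linearity formula are standard textbook items, not taken from the discussion above.

```python
def walsh_hadamard(vec):
    """Fast Walsh-Hadamard transform (butterfly), O(n log n).
    Input length must be a power of two."""
    v = list(vec)
    h = 1
    while h < len(v):
        for i in range(0, len(v), h * 2):
            for j in range(i, i + h):
                x, y = v[j], v[j + h]
                v[j], v[j + h] = x + y, x - y
        h *= 2
    return v

# A 3-variable Boolean function f(x) = x0 XOR (x1 AND x2),
# truth table indexed by the integer with bits (x2 x1 x0).
truth = [0, 1, 0, 1, 0, 1, 1, 0]
signs = [(-1) ** t for t in truth]     # +-1 form
spectrum = walsh_hadamard(signs)
# spectrum[a] = correlation of f with the linear function a.x;
# non-linearity = (2^n - max |W(f)|) / 2.
nonlin = (8 - max(abs(w) for w in spectrum)) // 2
print(spectrum)   # [0, 4, 0, 4, 0, 4, 0, -4]
print(nonlin)     # 2 -- the maximum possible for 3 variables
```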
ok, I think I’m getting quite an intuitive mental model of how the avalanche effect is produced: how balance (of generators) plays its role, how the connectivity properties work, and how a set of bit flows can encode an operator, generate its replacement (while retaining the generative capacity), and also, as a side-effect, do what is desired: maximize non-linearity with respect to the plaintext.
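The avalanche effect itself is easy to measure empirically. A sketch using a toy 32-bit mixing function of my own choosing (a xorshift/multiply mixer, not any real cipher round): flip one input bit, count how many output bits flip; a good mixer averages about half the word.

```python
import random

def toy_mix(x):
    """A toy 32-bit mixing round (xorshift/multiply style).
    Purely illustrative -- not a real cipher round."""
    x &= 0xFFFFFFFF
    x ^= x >> 16
    x = (x * 0x45d9f3b) & 0xFFFFFFFF
    x ^= x >> 16
    x = (x * 0x45d9f3b) & 0xFFFFFFFF
    x ^= x >> 16
    return x

def avalanche(f, trials=2000, bits=32):
    """Average number of output bits flipped by one input-bit flip."""
    rng = random.Random(42)     # seeded, so the estimate is repeatable
    total = 0
    for _ in range(trials):
        x = rng.getrandbits(bits)
        b = 1 << rng.randrange(bits)
        total += bin(f(x) ^ f(x ^ b)).count("1")
    return total / trials

print(avalanche(toy_mix))   # ideally close to 16 (half of 32 bits)
```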
ok. So the world of weights is the hypercube. Add a point (at infinity) and we get projections such as the 7-point geometry. This begets (three) blocks (each a side/line of the triangle), which encode non-commutative algebras (like Turing’s quaternion example), since the three terms along each block have an innate ordering. All of this applies to the world of random walks in graph spaces, in which the walk happens to construct a “successor” operator whose expander property is retained.
All of this happens to be driving a measure-centric “topology”. We imagine the topology to be a series of local hills (on the contour map), where the contours express distance to neighboring (connected) vertices, and there is a local hill (with contours of neighboring averages) at EACH vertex. The expander property requires that the sum of all such neighboring averages be a constant multiple of the size of the generator set.
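That “sum of neighboring averages” condition is stated loosely here; the standard way to quantify expansion is the edge-expansion (Cheeger) constant h(G) = min over S with |S| ≤ n/2 of |∂S|/|S|. A brute-force sketch for the 3-cube, which is small enough to enumerate every subset:

```python
from itertools import combinations

def hypercube_edges(d):
    """Edges of the d-dimensional hypercube on vertices 0..2^d - 1;
    neighbors differ in exactly one bit."""
    n = 1 << d
    return [(v, v ^ (1 << i))
            for v in range(n) for i in range(d)
            if v < v ^ (1 << i)]          # each edge listed once

def edge_expansion(n, edges):
    """h(G) = min over nonempty S, |S| <= n/2, of |boundary(S)| / |S|."""
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for S in combinations(range(n), k):
            s = set(S)
            boundary = sum(1 for u, v in edges if (u in s) != (v in s))
            best = min(best, boundary / k)
    return best

print(edge_expansion(8, hypercube_edges(3)))   # 1.0 for the 3-cube
```

The minimum is attained by a 4-vertex face of the cube (4 boundary edges over 4 vertices); no subset does better, by the hypercube’s edge-isoperimetric inequality.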
Let’s guess that this ultimately ALSO imposes the Markov quality: no information is shared between a round’s output (which is also the next transform-to-be) and the input to the round (which is the output of the previous-but-one round). This all induces permutations that re-balance runs of 1s and 0s in 2d space (recalling that a measure works over a set of measured intervals and, too, the complement of the “spaces between” those intervals), and evolves the output (the next transform operator, recall) toward being fully connected. We want the stationary state to be reached, meaning there is no information to be had in the ciphertext bit structure. This is because the “system of measures” has coalesced with the infinitesimal of the group.
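The “no information left” endpoint is just the mixing of the walk: the distribution converges to the stationary (here uniform) distribution, after which the output says nothing about the start vertex. A sketch on the lazy random walk on the 3-cube, measuring total-variation distance to uniform:

```python
def lazy_walk_step(dist, d):
    """One step of the lazy random walk on the d-cube:
    stay put with prob 1/2, else move along a uniformly random edge."""
    n = 1 << d
    new = [0.0] * n
    for v, p in enumerate(dist):
        new[v] += p / 2
        for i in range(d):
            new[v ^ (1 << i)] += p / (2 * d)
    return new

def tv_to_uniform(dist):
    """Total-variation distance from the uniform distribution."""
    n = len(dist)
    return sum(abs(p - 1 / n) for p in dist) / 2

d = 3
dist = [1.0] + [0.0] * 7        # start concentrated at vertex 0
for _ in range(20):
    dist = lazy_walk_step(dist, d)
print(tv_to_uniform(dist))      # tiny: the walk has mixed to uniform
```

The laziness (the 1/2 holding probability) is what kills the hypercube’s −1 eigenvalue and guarantees convergence; the residual distance decays like (2/3)^t here.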
Hmm. Did I just write that (drivel)?