Rauch’s paper broaches communication theory by linking series to transforms. It feels a little like the presentation by Gallager, who explained the engineering principles of communication in a similar order. One starts with continuous Fourier series, addresses discreteness, brings in Fourier transforms, worries about L2 convergence, and ends up with a series of numbers (that project onto a sinc(t) cyclic shifter)!
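That pipeline, from a series of numbers back to a continuous signal via shifted sinc pulses, can be sketched in a few lines (a toy Python/numpy illustration; the function name and test signal are mine, not from either paper):

```python
import numpy as np

def sinc_reconstruct(samples, t, T=1.0):
    """Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - nT)/T).
    np.sinc is the normalized sinc, sin(pi x)/(pi x)."""
    return sum(x_n * np.sinc((t - k * T) / T) for k, x_n in enumerate(samples))

# A signal sampled below the Nyquist limit is recovered exactly at the
# sample instants (and, with enough samples, between them too).
T = 1.0
n = np.arange(32)
samples = np.cos(2 * np.pi * 0.1 * n * T)   # frequency 0.1 < Nyquist (0.5)
assert abs(sinc_reconstruct(samples, 5.0, T) - samples[5]) < 1e-9
```

Note how each sample "owns" one shifted sinc pulse, which is the cyclic-shifter picture in miniature.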

See Fourier Series, Integrals, and Sampling From Basic Complex …

When looking at DES, we were contemplating pseudo-random means of engineering t-resilient functions. In the field of convolutional code encoders, based on relatively simple finite-state-machine ideas not __un__known to Turing (!), we see how the design of the state machine can so bias the sequences that minimum distances can be guaranteed. In the world of Tunny cryptanalysis and the scoring of repeat bits (see Small), we saw how folks modeled scored probabilities in custom (analytic) cryptographic channels. Let’s see how folks say all this properly!
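The finite-state-machine flavor of a convolutional encoder is easy to exhibit; here is a minimal sketch of the textbook rate-1/2, constraint-length-3 encoder with the classic (7,5) octal generators (my own toy implementation, not tied to any paper above):

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder: a 2-bit shift register (constraint
    length 3) with generator taps g1, g2 (the classic (7,5) octal pair).
    Each input bit emits two output bits; the FSM structure biases the
    codeword set so distinct messages stay far apart (free distance 5 here)."""
    state = 0
    out = []
    for b in bits:
        reg = (b << 2) | state                 # newest bit plus 2-bit state
        out.append(bin(reg & g1).count("1") % 2)   # parity of tapped bits
        out.append(bin(reg & g2).count("1") % 2)
        state = reg >> 1                       # shift register: drop oldest bit
    return out

# The impulse response has weight 5 -- the code's free distance.
assert conv_encode([1, 0, 0]) == [1, 1, 1, 0, 1, 1]
```

The "bias" the text mentions is visible directly: a single input 1 is smeared into five output 1s, so any two distinct message streams diverge by at least that much.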

The same citation applies to what follows, unless otherwise noted.

First, our intuition about leveraging both cyclic codes and non-commutativity seems right (since both were featured in Turing’s On Permutations manuscript, with its use of the octonions, the Fano plane, parity constraints, and automorphisms of the Fano plane).

The notion of ideals needs some background, particularly concerning the relationship between generators and the normal and dual spaces:

wikipedia

Further (really well written) tutorial information on ideals and coding generators can be found at Cyclic Error-Correcting Codes.
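To make the ideal picture concrete: a cyclic code is the ideal generated by a polynomial g(x) dividing x^n - 1, and the cyclic shift of a codeword is just multiplication by x. A toy GF(2) sketch, using the (7,4) Hamming code (helper names are mine):

```python
def polymul_mod(a, b, n):
    """Multiply binary polynomials a, b (lists of GF(2) coefficients, lowest
    degree first) in GF(2)[x]/(x^n - 1): exponents fold around mod n."""
    out = [0] * n
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[(i + j) % n] ^= bj
    return out

# The (7,4) cyclic Hamming code is the ideal generated by g(x) = 1 + x + x^3,
# which divides x^7 - 1 over GF(2).  Encoding = multiplication by g(x).
n, g = 7, [1, 1, 0, 1]                 # 1 + x + x^3
msg = [1, 0, 1, 0]                     # 1 + x^2
cw = polymul_mod(msg, g, n)
# Rotating the codeword = multiplying by x, which stays inside the ideal:
shifted = cw[-1:] + cw[:-1]
assert shifted == polymul_mod(polymul_mod(msg, [0, 1], n), g, n)
```

That last assertion is the whole point of "cyclic": the ideal is closed under multiplication by x, so the shift of any codeword is again a codeword.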

But, getting back to convolutional (or correlational) codes, we see a variant of what we saw in Turing’s code-centric On Permutations manuscript (recalling that we also read about 1930s sigma-algebras in general).

Of course, in Turing’s infamous Enigma work, shifts of cycles (between the compartments of boxes in a steckering relation) were of paramount importance, requiring custom measures (in decibans, of course).
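Those deciban measures are just additive log-likelihood scores. A minimal sketch (the probabilities below are illustrative, not historical figures):

```python
import math

def decibans(p_hypothesis, p_chance):
    """Weight of evidence in decibans, Turing's Banburismus unit:
    10 * log10 of the likelihood ratio for the hypothesis vs. chance."""
    return 10 * math.log10(p_hypothesis / p_chance)

# Illustrative numbers: if a letter repeat occurs with probability 1/17
# under a "depth" hypothesis vs. 1/26 by pure chance, each observed repeat
# contributes about +1.8 db, and scores from independent observations
# simply add up until a decision threshold is crossed.
per_repeat = decibans(1 / 17, 1 / 26)
total = 20 * per_repeat      # 20 repeats accumulate additively
assert 1.8 < per_repeat < 1.9
```

The additivity is exactly what made the unit practical for hand scoring.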

Now what is that mapping? It says that a polynomial (with n terms, and a throwaway placeholder z) maps to an automorphism “function,” parameterized by z and sigma. z is the specification of the delay circuit (a multi-tap shift register acting as a custom multiplier over polynomial fields). Sigma ranges over the possible re-labelings of the nodes of, say, the 7-point Fano plane, where the set of admissible re-labelings is limited by the mutual-dependency relation built into each “curve” (line) of the “design.” For rationale and intuition concerning the relevance of such constraint “networks,” just think of Turing’s bombe menus, or of a Colossus cryptographer evaluating the 32-count to see whether the sub-patterns are consistent with the dual-space constraint set.
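The “admissible re-labelings” idea can be made concrete: fix a labeling of the Fano plane and test whether a given renaming of the seven points maps lines to lines. A toy sketch (the difference-set labeling below is one common convention, not necessarily Turing’s):

```python
from itertools import permutations

# One standard labeling of the Fano plane's 7 lines over points 0..6
# (the difference-set construction: translates of {0, 1, 3} mod 7).
LINES = {frozenset(((0 + k) % 7, (1 + k) % 7, (3 + k) % 7)) for k in range(7)}

def is_automorphism(sigma):
    """A relabeling sigma of the points is admissible iff it maps every
    line to a line -- the mutual-dependency constraint of the design."""
    return {frozenset(sigma[p] for p in line) for line in LINES} == LINES

# The cyclic shift p -> p+1 (mod 7) respects the difference-set lines...
assert is_automorphism([(p + 1) % 7 for p in range(7)])
# ...but an arbitrary transposition generally does not.
assert not is_automorphism([1, 0, 2, 3, 4, 5, 6])
# Counting all admissible relabelings recovers the group order 168.
count = sum(is_automorphism(list(s)) for s in permutations(range(7)))
assert count == 168
```

Out of 5040 relabelings of seven points, only 168 survive the line constraints: that is the “limited potential set” in action.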

In webby terms, the automorphism keeps changing the names of the references, whose identities are stable despite the re-naming. An ontology instance (the Fano plane) constrains the graph-rewriting.

Note now how, when renaming graphs, the world of constrained re-naming of nodes can itself have an algebra, and that algebra can itself have properties, which is what folks mean by sigma-cyclicity. The renaming (changing the address map of the bits in a vector) is a bit like the pulse shifts in PAM!
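For plain cyclic shifts, that algebra of renamings is easy to exhibit (a toy sketch; the helper name is mine):

```python
def shift(v, k):
    """Cyclically rename the bit positions of v by k -- the 'address map'
    change discussed above, analogous to a pulse delay in PAM."""
    k %= len(v)
    return v[-k:] + v[:-k]

v = [1, 0, 1, 1, 0, 0, 0]
# Renamings compose additively: shifting by 2 then 3 equals shifting by 5...
assert shift(shift(v, 2), 3) == shift(v, 5)
# ...and shifting by the vector length is the identity, so the shifts form
# a cyclic group of order len(v) acting on the codeword.
assert shift(v, 7) == v
```

The group structure is the "property of the algebra of renamings" in its simplest form; sigma-cyclicity generalizes it beyond pure rotations.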