note the unusual formulation; the very notion of log is tied to the generator. Remember this is late-1980s NSA culture, when Viterbi decoders (for cryptanalysis) were "hot". Calculating logs in a "discrete graph" would be interesting, to get to capacity using soft decoding. Of course, one wants to get to capacity because of its side-effects on non-linear approximators, or ciphers seeking to maximise non-linear differentiation in order to induce key-dependent complexity.
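On "log being tied to the generator": a minimal sketch of that point, using brute-force discrete log in a small cyclic group (the prime 23 and the generators 5 and 7 are my illustrative choices, not from the original text). The same group element has a different "log" depending on which generator you index against.

```python
# Sketch: in a cyclic group the "log" of an element only exists relative
# to a chosen generator. Brute-force discrete log mod a small prime.

def dlog(base, target, p):
    # Find x with base**x % p == target, by exhaustive search.
    acc = 1
    for x in range(p - 1):
        if acc == target:
            return x
        acc = (acc * base) % p
    return None

p = 23            # small prime; 5 and 7 both generate (Z/23Z)*
h = 18            # one fixed group element
x5 = dlog(5, h, p)
x7 = dlog(7, h, p)
assert pow(5, x5, p) == h and pow(7, x7, p) == h
assert x5 != x7   # same element, different "log" under a different generator
```

Here `x5 == 12` while `x7 == 18`: the log is not a property of the element alone but of the (element, generator) pair.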
Reminds me of generators that are commutators, where the commutator terms are [d/dx, Phi()], as in the Einstein field equations.
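For the record, the commutator [d/dx, Phi] has a standard closed form when Phi acts by multiplication: applying it to a test function f and using the product rule,

```latex
\left[\frac{d}{dx},\,\Phi\right] f
  = \frac{d}{dx}\bigl(\Phi f\bigr) - \Phi \frac{df}{dx}
  = \Phi' f + \Phi f' - \Phi f'
  = \Phi' f,
\qquad\text{so}\qquad
\left[\frac{d}{dx},\,\Phi\right] = \Phi'.
```

That is, the commutator of differentiation with a multiplier is just multiplication by the multiplier's derivative; this is the basic calculation behind treating such couplets as generators.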
As we know, when Forney's gap to capacity is almost zero, one gets an expander/compressor in which the very collision of bit paths provides a way of implementing the commutator, using an SPN (vs. a CPU). One is looking at things almost as if one is differentiating on a curved surface, where one needs that particular couplet of commutator components to ensure the expander property is inherited: it is the very collisions between the curved-space differentiator and the transition function that induce the familiar trait inheritance.
It makes some sense now why Feistel's original cipher induced random permutations (in the quantum expander sense) by letting XOR (differentiation) create the "collisions" whose resulting perturbation of the field in some sense generates the "instructions" of this "cipher computing machine".
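The structural point about Feistel and XOR can be shown concretely: a toy Feistel network (my own minimal sketch, not Feistel's actual Lucifer construction) is a permutation of the block space, and stays invertible even when the round function itself destroys information, precisely because XOR is its own inverse.

```python
# Toy Feistel network: the XOR-based round structure yields an invertible
# permutation even though the round function f is not itself invertible.

def feistel_encrypt(left, right, round_keys, f):
    # Each round swaps halves and XORs in f of one half:
    # (L, R) -> (R, L ^ f(R, k))
    for k in round_keys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, f):
    # Undo the rounds in reverse order; XOR cancels itself.
    for k in reversed(round_keys):
        left, right = right ^ f(left, k), left
    return left, right

def f(x, k):
    # Deliberately non-invertible round function on 16-bit halves.
    return ((x * 31 + k) ** 2) & 0xFFFF

keys = [0x1234, 0xBEEF, 0x00FF]
ct = feistel_encrypt(0xDEAD, 0xC0DE, keys, f)
pt = feistel_decrypt(*ct, keys, f)
assert pt == (0xDEAD, 0xC0DE)
```

The design point: no constraint at all is placed on `f`, so all the "computation" can be pushed into the round function while the XOR skeleton guarantees the whole map is a permutation.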
I think we have to understand Turing reduction better (as he thought of it, not as modern theory casts it).