rods to stems to congruences to GOST to complexity


We are now used to the idea that walking the rods allows one to determine the initial rod points and rod positions that define the initial condition of a set-up Enigma machine – the actual setup in terms of the window-visible external setting and the internal setting. Each of the rods in sequence is the encoder/decoder for the wheel (at some position); but it is also the label of a state in the state transition graph, whose input alphabet is 26-ary. We have an m-configuration and a symbol, that is. Tracing out a run of the m-configurations yields a sequence of configurations, of course, where a configuration is just the evaluation of the machine for the next input character, traced.
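To make that concrete, here is a minimal sketch in Python of one wheel's rod at a given position, viewed as the per-character substitution, and a traced run of configurations. The wiring, the stepping rule, and the function names are invented for illustration; this is not a historical rotor.

```python
# Minimal sketch: one wheel's "rod" at a position, as the per-character
# substitution, plus a traced run of (m-configuration, symbol, output) steps.
# The wiring below is invented, purely illustrative.

import string

ALPHABET = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # example permutation, not a real rotor

def rod(position):
    """The substitution the wheel performs when offset by `position` -
    the column one would read off a rod square for that position."""
    def encode(ch):
        i = (ALPHABET.index(ch) + position) % 26
        return ALPHABET[(ALPHABET.index(WIRING[i]) - position) % 26]
    return encode

def trace(text, start=0):
    """Trace a run: each configuration is (m-configuration, input, output)."""
    position = start
    for ch in text:
        position = (position + 1) % 26        # the wheel steps before encoding
        yield (position, ch, rod(position)(ch))

for configuration in trace("HELLO"):
    print(configuration)
```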

We can easily see how one can describe the Enigma not as explained in the Prof's Book, but in terms of Turing machines. The machine is walking the cosets of the right-hand wheel, as a function of a walk of the cosets generated by the other wheels and of the sequence of input symbols to date. One can model the turnover sequence of the wheels as a coset decomposition chain for the space as a whole, with the left-hand wheel being the most significant digit in an "Enigma" counter.
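As a rough illustration of that counter reading, here is a minimal sketch of three wheels stepping like a mixed-radix odometer, with the left-hand wheel as the most significant digit. The turnover points are made up, and the real machine's double-stepping anomaly is ignored.

```python
# Minimal sketch: Enigma-style wheel stepping as a mixed-radix counter.
# Turnover points are invented; the real machine's double-stepping is ignored.

ALPHABET = 26

def step(positions, turnovers):
    """Advance (left, middle, right) wheel positions by one key press.

    The right wheel always steps; a wheel sitting at a turnover position
    carries into the wheel to its left, like an odometer digit.
    """
    left, middle, right = positions
    if right in turnovers[2]:
        if middle in turnovers[1]:
            left = (left + 1) % ALPHABET
        middle = (middle + 1) % ALPHABET
    right = (right + 1) % ALPHABET
    return (left, middle, right)

# Walk a few key presses from an arbitrary starting setting.
positions = (0, 0, 24)
turnovers = ({16}, {4}, {25})   # hypothetical carry positions
for _ in range(4):
    positions = step(positions, turnovers)
    print(positions)
```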

Sequencing walks through generated cosets is pretty "natural", as Cameron explains in the post excerpted below. His computable sequences play the same role as the rods of an Enigma wheel. 1950s crypto liked to chain several such rod-square generators (or their Fibonacci-world equivalent "rod" generators), clicking over a larger space of multi-dimensional Fibonacci counters using coset decomposition (group partitioning) ideas.

[Image: excerpt from Cameron's post, linked below.]

http://cameroncounts.wordpress.com/2012/06/21/fibonacci-numbers-2/

But it's Cameron's next post on the topic that strikes me as even more crypto-relevant. Having introduced the "successor()"-centered "representation" of the sequence generator, he shows the linkage between a generalized three-element sequence of succ() terms – n-1, n, n+1 – and relates this form of generator to machines that evolve an initial state to a desired state, generating the sequence of group operations along the way.

[Image: excerpt from Cameron's follow-up post, linked below.]

http://cameroncounts.wordpress.com/2012/06/29/fibonacci-numbers-3/

The A-to-the-power-n concept is a cyclic group element, in a group world built to "grow" in a geometric progression that is nonetheless countable.
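One concrete way to read that: package the three-term recurrence as a matrix A, so the n-th term is read off A^n, which repeated squaring computes in logarithmically many group multiplications. A minimal sketch, using the standard Fibonacci matrix purely as an example of the general pattern:

```python
# Minimal sketch: the relation F(n+1) = F(n) + F(n-1) packaged as a matrix A,
# so the n-th term is read off A**n.  Repeated squaring keeps the geometric
# growth countable in O(log n) multiplications.

def mat_mul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def mat_pow(a, n):
    result = [[1, 0], [0, 1]]                # identity
    while n:
        if n & 1:
            result = mat_mul(result, a)
        a = mat_mul(a, a)
        n >>= 1
    return result

A = [[1, 1], [1, 0]]                         # the recurrence as a group element

def fib(n):
    return mat_pow(A, n)[0][1]               # F(n) sits in the off-diagonal entry

print([fib(n) for n in range(10)])           # 0, 1, 1, 2, 3, 5, 8, 13, 21, 34
```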

Now what interested me was not the result, but the structure of his demonstration – building on the sigma function and a couple of "neighboring" recurrence examples, then a generalization that turns relationships between terms of the generated computable sequence into a relationship in the m-configuration generator world that has the same effect.

Don't we see Turing playing with the same ideas, but using lattice groups that can deal with the particular "engineering" of the geometry of the plant stem, as the surface being modelled?

In any case, stripped of particular geometries, Turing's core idea seems to be that a chemical medium governed by his reaction-diffusion differential equations can encapsulate the act of computing a cyclic group element's self-multiplication, and then multiplication by that result (stored in the grown stem, of course) and the next term in the congruential series.
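For flavour only, here is a minimal sketch of one explicit Euler step of a 1-D two-morphogen reaction-diffusion system on a ring. The parameters and reaction terms are illustrative, not Turing's actual phyllotaxis equations.

```python
# Minimal sketch: explicit Euler steps of a 1-D two-morphogen reaction-diffusion
# system on a ring.  Kinetics and parameters are illustrative only, not
# Turing's phyllotaxis model.

import math

N, dt, dx = 64, 0.01, 1.0
Du, Dv = 1.0, 0.5                      # diffusion coefficients

u = [1.0 + 0.01 * math.sin(2 * math.pi * i / N) for i in range(N)]
v = [1.0 for _ in range(N)]

def laplacian(field, i):
    return (field[(i - 1) % N] - 2 * field[i] + field[(i + 1) % N]) / (dx * dx)

def step(u, v):
    new_u, new_v = u[:], v[:]
    for i in range(N):
        reaction_u = u[i] - u[i] ** 3 - v[i]     # made-up kinetics
        reaction_v = u[i] - v[i]
        new_u[i] = u[i] + dt * (Du * laplacian(u, i) + reaction_u)
        new_v[i] = v[i] + dt * (Dv * laplacian(v, i) + reaction_v)
    return new_u, new_v

for _ in range(100):
    u, v = step(u, v)
print(min(u), max(u))                   # the concentration profile after 100 steps
```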

The relevance of his "biological" computing *medium* is not its acting as a particular lattice generator itself. It's the ability to compute lattices generally, regardless of their parameters. And once one can do that, one can encode and decode in the discrete world – which allows one to execute a Turing machine, of course…

Well, that makes some sense. I'm a glorified plant, and I can execute a Turing machine, since by definition it is only that which a human computer can step through. And a human computer is only a glorified plant, subject to chemical concentrations governed by diffusion operators.

So… that local neighboring successive-ness idea ties nicely into the notion of the Feistel cycle causing pattern formation at a distance. We can combine that with what we know from one model (GOST): that a mere 6 cycles of that cycle generator can leverage mere xor as a source of non-linear input. Put together, we know that linear-cryptanalysis techniques are unable to model such a generated state analytically – its complexity has "changed class", such that it is no longer amenable to constraint-satisfaction solving. One sees the notion of "phase transitions" coming into play, in the macro-condensation sense that complexity itself has transitioned. We do have a new stationary wave; it's just not one operating on the plane of probability mass functions. It's one that alters the notion of limits, and suggests a world of limit operators for different such complexity-changing stationary wave types – probably necessary to capture the notion of replication. One can imagine a series of such stationary waves (a bit like Fourier harmonics) stripping down the space to its core – much as we work with logs to compute entropy.
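To make the "Feistel cycle" concrete, here is a toy sketch of the generic Feistel structure: a handful of rounds of a made-up round function already spreads each input bit across the whole block. The round function and subkeys are invented; this is the generic pattern only, not GOST's actual round.

```python
# Toy sketch of a generic Feistel structure.  The round function and subkeys
# are made up for illustration; this is not GOST's round function.

MASK32 = 0xFFFFFFFF

def round_function(half, subkey):
    # Illustrative mixing: modular add, then a rotation to stir the bits.
    x = (half + subkey) & MASK32
    return ((x << 11) | (x >> 21)) & MASK32

def feistel_encrypt(block, subkeys):
    left, right = block >> 32, block & MASK32
    for k in subkeys:
        left, right = right, left ^ round_function(right, k)
    return (left << 32) | right

subkeys = [0x9E3779B9, 0x7F4A7C15, 0x85EBCA6B, 0xC2B2AE35, 0x27D4EB2F, 0x165667B1]
print(hex(feistel_encrypt(0x0123456789ABCDEF, subkeys)))   # six rounds of mixing
```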

Well, that was fun. A limit is a stationary notion tied to differentiation, and one can differentiate the differentiating operators to produce a sequence of limit types that yield a "stationary limit-wave" which computes entropy at its own meta-limit – and by doing so learns to replicate – because the act of computing something's reduction to its entropy *is* the act of replication. So, "by storing", "one replicates".

I must eat my greens, which some poor plant photosynthesized from the sun's energy and "stored away" for the winter, so it could replicate itself next year. Then, perhaps, I will stop thinking like this and get back to work. Perhaps then I can evolve into a plant floating in a sea of other bits of dead plant, and unroot myself.

Hmm.
