In one of his “theoretical minimum” lectures, Susskind gives what he regards as an elementary presentation of the link between Boolean algebra and quantum mechanics. He shows that there is an ordering to the evaluation of a (non-exclusive) OR expression, and that this ordering has an impact on certain kinds of evaluations. In his case, the evaluation logic is subject to the probabilistic rules of quantum mechanics, which impose conditionality. Of course, conditional expressions are second nature to anyone who has studied computer science!
Susskind teaching at Stanford
Similar ideas occur back in Tunny days, too!
Good et al teaching about Tunny-era cryptanalysis
In the attack on the Tunny cipher, using Colossus to count, the folks writing the General Report on Tunny give lots of examples (and some apparently very poorly written-up “theory”) of the methods of their attack. While in general it’s about calculating the specific entropy per bit of the cipher depths under attack, what is most interesting is how they use experiment design, and specifically designs involving ordered combinations of experiments, to figure out certain critical values … that ultimately give the detector behind a “most likely” decision procedure for guessing the settings/wheels of the ciphertexts under evaluation.
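To make “specific entropy per bit” concrete, here is a hedged sketch (my own illustrative names and numbers, not the report’s notation): the Shannon entropy of a single cipher-stream bit whose probability of being 0 is p.

```python
import math

def bit_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) bit source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(bit_entropy(0.5))   # 1.0 -- an unbiased bit carries a full bit
print(bit_entropy(0.55))  # a little under 1.0 -- a biased bit leaks information
```

The gap below 1.0 is exactly the kind of small statistical “bulge” the counting attacks exploit.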
Getting back to Susskind, he elaborates the relationship between non-commutative algebras (the spin algebra, discovered in the 1930s and famously known to and studied by Turing et al.) and statistical inference. He shows, by example, how the ordering of experiments matters WHEN earlier experiments have an impact on later ones. In math terms, we would say that the events being measured in a Borel algebra have some mutual dependency. One is measuring vectors in large-dimensional spaces (crypto depths!) that are specifically NOT orthogonal!
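A minimal sketch of that non-commutativity, using two of the spin (Pauli) matrices; written with plain 2×2 lists to stay dependency-free:

```python
# Two spin (Pauli) matrices multiplied in both orders: XZ != ZX,
# the algebraic fingerprint of "order of experiments matters".
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

X = [[0, 1], [1, 0]]   # Pauli X (bit flip)
Z = [[1, 0], [0, -1]]  # Pauli Z (sign flip)

print(matmul(X, Z))  # [[0, -1], [1, 0]]
print(matmul(Z, X))  # [[0, 1], [-1, 0]] -- the order changes the result
```
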
Contrary to elementary teaching about quantum mechanics, one *can* get useful information out of systems that are not formed only of orthogonal vectors. Indeed, one can perhaps characterize “cryptanalysis” as the art of doing JUST that. Perhaps one can then characterize cipher design as the art of making it as hard as possible (WITHOUT THE KEY) to perform *that* class of cryptanalysis!
As showcased in the attack on Tunny using their “non-commutative algebra of proportional bulges”, one can discover particular orderings of experiments (i.e. counts on Colossus) that drive the probabilities down. In the case of a binary OR expression, only one of the four cases in the truth table (false, false) yields false, so that case has a probability proportional to 1 in 4 (that is, 25%). Applying more causally-connected experiment designs drives 3-ary, 4-ary, n-ary conditional expressions to exponentially diminishing proportions with each additional term, which helps isolate potential crypto solutions from non-solutions. That is, we have a generalized search based on the “statistical method”.
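The exponential shrinkage can be sketched directly (a hedged illustration with independent, unbiased inputs assumed for simplicity): for an n-ary OR, only the all-false input, 1 case out of 2^n, yields false.

```python
# For an n-ary OR of independent unbiased bits, only the all-false
# input gives false, so its probability is (1/2)^n -- each additional
# term halves it again.
def all_false_probability(n, p_false=0.5):
    return p_false ** n

for n in range(2, 6):
    print(n, all_false_probability(n))
# 2 0.25    (the 1-in-4 false,false case for binary OR)
# 3 0.125
# 4 0.0625
# 5 0.03125
```
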
In Tunny, it’s the co-dependency of the expressions underlying the various streams that combine to make certain elements of the dechi stream (those visible in the clear, at certain motor positions) that gives the required non-commutativity.
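An illustrative sketch of the resulting detector (synthetic data, not real Tunny streams; the bias figure is assumed for illustration): a co-dependent combination of streams leaves a small bias, a “bulge”, toward 0 that a Colossus-style count exposes.

```python
import random

random.seed(1)
N = 10_000
P_DOT = 0.55  # assumed bias toward 0 ("dot"); the real figure varies

# synthetic stream standing in for a combined dechi statistic
stream = [0 if random.random() < P_DOT else 1 for _ in range(N)]
dots = stream.count(0)
bulge = dots - N // 2
print(dots, bulge)  # the count sits well above N/2, exposing the bias
```
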
It’s hard to read the Tunny report and see its underlying logic, it being written in archetype terminology. But if you place it all in the timeline of quantum mechanics, while also recalling just how well versed the 1930s-trained intellectuals doing 1940s cryptanalysis were in the “new” math that quantum mechanics required, their choice of archetype wording starts to make sense.
Above, we see (now) how Turing and co. had indeed formed up an early fuzzy logic, designed to give a maximum-likelihood decoder.
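A hedged sketch of such a maximum-likelihood decision rule (all names and numbers here are hypothetical, not from the report): given a Colossus-style dot count for each candidate wheel setting, score each by its log-likelihood under “this setting is right” (dot probability p1) versus “wrong” (p0 = 1/2), and pick the setting with the largest score.

```python
import math

def log_likelihood_ratio(dots, n, p1=0.55, p0=0.5):
    """Score a dot count: right-setting hypothesis (p1) vs wrong (p0)."""
    crosses = n - dots
    return (dots * math.log(p1 / p0)
            + crosses * math.log((1 - p1) / (1 - p0)))

# hypothetical counts over n positions for three candidate settings
counts = {"setting A": 5010, "setting B": 5490, "setting C": 4970}
n = 10_000
best = max(counts, key=lambda s: log_likelihood_ratio(counts[s], n))
print(best)  # setting B -- the count closest to the biased hypothesis wins
```
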
In general, the averaging process is summarized “almost in passing”.