quantum Boolean algebra and Colossus-era cryptanalysis


In one of his “Theoretical Minimum” lectures, Susskind goes through what he regards as an elementary presentation of the link between Boolean algebra and quantum mechanics. He shows that there is an ordering to the evaluation of a (non-exclusive) OR expression, and that this ordering has an impact on certain types of evaluations. In his case, the evaluation logic is subject to the probabilistic rules of quantum mechanics, which impose conditionality. Of course, conditional expressions are now second nature to those who have studied computer science!
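As a small aside on that “second nature” point, short-circuit evaluation in most programming languages already makes OR order-sensitive in a mundane way: the second operand is evaluated only conditionally on the first. A minimal Python illustration (mine, not from the lecture):

```python
# Short-circuit OR: the second operand is evaluated only when the
# first is false, so the ordering decides which "experiments" run.
def noisy(name, value):
    print(f"evaluating {name}")
    return value

print(noisy("a", True) or noisy("b", True))   # "b" is never evaluated
print(noisy("b", False) or noisy("a", True))  # now both run, in the other order
```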


https://www.youtube.com/watch?feature=player_detailpage&v=a6ANMKRBjA8#t=2266

Susskind teaching at Stanford

 

Similar ideas crop up back in the Tunny days, too!


http://www.alanturing.net/turing_archive/archive/t/t04/TR04-005.html

Good et al. teaching about Tunny-era cryptanalysis

In the attack on the Tunny cipher, using Colossus to count, the authors of the General Report on Tunny give lots of examples (and some apparently very poorly written-up “theory”) of the methods of their attack. While in general it is about calculating the specific entropy per bit of the cipher depths under attack, what is most interesting is how they use a method of experimental design — specifically, designs involving ordered combinations of experiments — to figure out certain critical values that ultimately give the detector forming the basis of “a most likely decision procedure” for guessing the settings/wheels of the ciphertexts under evaluation.
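To pin down what that scoring looks like, here is a compact reconstruction in modern notation (the labels β and w are mine; the report’s own symbols differ). A probability is written via its “proportional bulge” over the chance level of one half, and evidence from each count is accumulated in decibans, Turing-style:

```latex
% My reconstruction; the report's own symbols differ.
p = \tfrac{1}{2}\,(1 + \beta)
\qquad \text{(proportional bulge } \beta \text{)}

w = 10 \log_{10}
    \frac{P(\text{count} \mid \text{setting correct})}
         {P(\text{count} \mid \text{setting wrong})}
\qquad \text{(weight of evidence, in decibans)}
```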

Getting back to Susskind: he elaborates the relationship between non-commutative algebras (the spin algebra, discovered in the 1930s and famously known to and studied by Turing et al.) and statistical inference. He shows, by example, how the ordering of experiments matters WHEN earlier experiments have an impact on later ones. In mathematical terms, we would say that the events being measured in a Borel algebra have some mutual dependency. One is measuring vectors in large-dimensional spaces (crypto depths!) that are specifically NOT orthogonal!
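As a concrete check on the spin-algebra point, here is a minimal sketch (mine, not Susskind’s): the Pauli spin matrices fail to commute, and the nonzero commutator is exactly what makes the order of the corresponding measurements matter.

```python
# The Pauli spin matrices do not commute, so measurement order matters.
import numpy as np

sx = np.array([[0, 1], [1, 0]])
sz = np.array([[1, 0], [0, -1]])

commutator = sx @ sz - sz @ sx
print(commutator)  # [[0, -2], [2, 0]]: nonzero, hence non-commutative
```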

Contrary to elementary teaching about quantum mechanics, one *can* get useful information out of systems that are not formed only of orthogonal vectors. Indeed, one can perhaps characterize “cryptanalysis” as the art of doing JUST that. One can then characterize cipher design as the art of ensuring that it is as hard as possible (WITHOUT THE KEY) to perform *that* class of cryptanalysis!
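A toy illustration of getting information out of non-orthogonal vectors (my example, not from the report): the squared overlap of two non-orthogonal unit vectors sits strictly between 0 and 1 — a measurable bias, rather than either certainty or pure noise.

```python
# Non-orthogonal vectors leak partial information via their overlap.
import numpy as np

a = np.array([1.0, 0.0])
theta = np.pi / 6
b = np.array([np.cos(theta), np.sin(theta)])  # 30 degrees from a

overlap = float(abs(a @ b) ** 2)
print(overlap)  # ~0.75: orthogonal would give 0, identical would give 1
```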

As showcased in the attack on Tunny using their “non-commutative algebra of proportional bulges”, one can discover particular orderings of experiments (i.e. counts on Colossus) that drive the probabilities down. In the case of a binary OR expression, of the four cases in the truth table, one can order the counts so that the mutual dependency means the false/false case (the one case of the four that, in an OR world, gives false) has a probability of only 1 in 4 (that is, 25%). Applying more causally-connected experimental designs drives 3-ary, 4-ary, n-ary conditional expressions to exponentially diminishing proportions with each additional arity — which helps isolate potential crypto solutions from non-solutions. That is, we have a generalized search based on the “statistical method”.
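Here is a toy version of that counting argument (my framing, not the report’s): for an n-ary OR over independent balanced bits, only the all-false row of the truth table yields false, so its proportion falls off as 2^-n.

```python
# The all-false row of an n-ary OR truth table occurs in
# proportion 2**-n of the cases.
from itertools import product

for n in range(2, 6):
    rows = list(product([False, True], repeat=n))
    all_false = sum(1 for row in rows if not any(row))
    print(n, all_false / len(rows))  # 0.25, 0.125, 0.0625, 0.03125
```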

In Tunny, it is the co-dependency of the expressions underlying the various streams, which combine to make certain elements of the de-chi stream (those visible in the clear at certain motor positions), that gives the required non-commutativity.
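For reference, here is a compact summary of that stream structure in the standard Tunny notation (plaintext P, chi stream χ, extended-psi stream ψ′, cipher Z, de-chi D, with Δ the first-difference operator):

```latex
Z = P \oplus \chi \oplus \psi'
\qquad
D = Z \oplus \chi = P \oplus \psi'

\Delta D = \Delta P \oplus \Delta \psi'
\qquad
\Delta \psi' = 0 \ \text{wherever the motor does not step}
```

Wherever the motor does not step, Δψ′ vanishes and ΔD exposes ΔP — exactly the “visible in the clear” leak mentioned above.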


http://www.alanturing.net/turing_archive/archive/t/t04/TR04-003.html

It is hard to read the Tunny report and see its underlying logic, since it is written using archetype terminology. But if you place it all in the timeline of quantum mechanics — while also recalling just how well versed in the “new” math required for quantum mechanics the 1930s-trained intellectuals doing 1940s cryptanalysis were — their choice of archetype wording starts to make sense.


http://www.alanturing.net/turing_archive/archive/t/t04/TR04-004.html

Above, we can now see how Turing and co. had indeed formed an early fuzzy logic, designed to give a maximum-likelihood decoder.
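To make “maximum-likelihood decoder” concrete, here is a hedged sketch (hypothetical data and a deliberately simplified method, not the report’s actual runs): score every candidate wheel setting by a count of agreements, and pick the setting whose count bulges furthest above the chance level N/2.

```python
# Hedged sketch of maximum-likelihood setting detection on simulated
# data: the correct setting agrees with the cipher-derived stream
# slightly more than 50% of the time; wrong settings are at chance.
import random

N = 2000
random.seed(1)

def score(stream_a, stream_b):
    """Count positions where the two bit streams agree."""
    return sum(a == b for a, b in zip(stream_a, stream_b))

cipher = [random.randint(0, 1) for _ in range(N)]
counts = {}
for setting in range(41):           # hypothetical: 41 candidate settings
    if setting == 17:                # the "true" setting: ~55% agreement
        stream = [c if random.random() < 0.55 else 1 - c for c in cipher]
    else:                            # wrong settings: ~50% agreement
        stream = [random.randint(0, 1) for _ in range(N)]
    counts[setting] = score(stream, cipher)

best = max(counts, key=counts.get)
print(best, counts[best])  # the bulge over N/2 singles out setting 17
```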

 


http://www.alanturing.net/turing_archive/archive/t/t04/TR04-018.html

 

In general, the averaging process is summarized almost in passing.

