S-box nibble-sized graph evolution (with quantum coin from support between pairs of plaintext chars).

Distinguished from e(), which de-localizes flows, simulating quantum coherence.

Posted in coding theory

So here we are again, explaining the principles upon which DES is designed. Yes, it's the nth time, and the same caveat as before applies: go anywhere else to learn someone's description of the algorithm. Here we motivate the design.

So imagine you are studying the infamous quantum-mechanics two-hole screen – separating the electron gun on the left from the intensity screen. Or the one-hole version. Or the two holes shifted up (or down) a bit on the grate.

Oh and don’t forget that we can flip the grate, should we want the gun on the right (and the screen on the left).

After all like des, quantum mechanics is inevitably reversible, since “information” is conserved.

Now imagine that the e function of des is the grate.

And two of the six input bits of the current S-box are the slits in the grate.

The purpose of the way the e function shifts inputs left (and right) is to simulate shifting the slits in the grate up and down.

Why?

So that the qm normal density is shifted.
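As a concrete anchor for the slit-shifting analogy: the standard DES E expansion can be generated programmatically, and it literally duplicates each 4-bit block's edge bits into the neighboring S-box's 6-bit input. A minimal sketch (the table values are the published DES E table; the helper names are mine, for illustration):

```python
from collections import Counter

# DES E expansion: 32 input bits -> 48 output bits, by sliding a 6-bit
# window in steps of 4, so that each 4-bit block's edge bits are
# duplicated into the neighboring S-box's input ("overlapping slits").
E_TABLE = [((4 * i - 1 + j) % 32) + 1 for i in range(8) for j in range(6)]

def expand(bits32):
    """Apply E to a list of 32 bits (positions numbered 1..32)."""
    return [bits32[pos - 1] for pos in E_TABLE]

# First row of the standard table reads 32, 1, 2, 3, 4, 5.
assert E_TABLE[:6] == [32, 1, 2, 3, 4, 5]
# Every input bit appears at least once; exactly 16 appear twice.
counts = Counter(E_TABLE)
assert sorted(counts) == list(range(1, 33))
assert sum(1 for v in counts.values() if v == 2) == 16
```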

Our goal is to uniformly fill an intensity space on the screen. Or better, average the left and right intensity screens (per the classical functional form).

Now, even though we shift our normal curve up and down – as the number of supports in the pairwise plaintext chars induces a key particle to move 1, 2, 3… lambda from the center, within the normal curve (constructing/destructing as we go) – we still need the addition of intensity curves to uniformly fill the output space.

And here is where the particular key schedule comes in. Its particular sequencing moves the grate around not only so that Feistel's multiplexor covers all subspaces, but does so in such a manner as to guarantee that the concentration of intensity at any point (in 2d intensity space) is never more than the second eigenvalue gap.

Now recall that des does not have reversible sboxes. But we don't need them! After all, we have interleaved diffraction grates, since left-to-right we have half a des round and (right-to-left) we have the other half.
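That invertibility-without-invertible-S-boxes is just the Feistel structure; a toy sketch, where f is an arbitrary non-invertible stand-in rather than DES's actual round function:

```python
def f(half, subkey):
    # Deliberately non-invertible round function (a stand-in, not DES's f).
    return (half * 31 + subkey) % 256 & 0xAA

def feistel_encrypt(left, right, subkeys):
    # Each round: swap halves, mixing f of one half into the other.
    for k in subkeys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(left, right, subkeys):
    # Run the rounds backwards; f is only ever XORed in, so it need
    # not be invertible for the whole cipher to be invertible.
    for k in reversed(subkeys):
        left, right = right ^ f(left, k), left
    return left, right

keys = [7, 42, 99]
l, r = feistel_encrypt(3, 200, keys)
assert feistel_decrypt(l, r, keys) == (3, 200)
```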

Now view the des subkey generation function's own (highly programmed) bit duplication as a means of subtly (at huge granularity) measuring the quantum effect, thus influencing just how the left and right particle motions occur – giving a characteristic.

And ensuring that there exists no matrix representation of the same graph.

Posted in coding theory

Let C be a conjugacy class of the symmetric group Sn . Recall that C consists of all of the permutations of a given cycle type.

Hence we may define the support of C, denoted by supp(C), to be the number of nonfixed digits under the action of a permutation in C.

If C has "large" support, then each element of C "moves" a lot of letters.

If the permutations in C are all even, then we say that C is an even conjugacy class. If the permutations in C are all odd, we say that C is an odd conjugacy class.

For n≥5, the subgroup generated by C, denoted by ⟨C⟩, is Sn if C is odd, and An if C is even.
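These definitions are easy to sanity-check in code (cycle type given as a partition of n; the parity rule uses the fact that a k-cycle is a product of k−1 transpositions):

```python
def support(cycle_type, n):
    """Number of non-fixed letters for a permutation of the given cycle type.

    cycle_type is a partition of n, e.g. (3, 2, 1, 1) in S7;
    the 1s are the fixed points.
    """
    assert sum(cycle_type) == n
    return n - sum(1 for part in cycle_type if part == 1)

def is_even_class(cycle_type):
    # A k-cycle is a product of k - 1 transpositions, so the class
    # parity is (-1) ** sum(k - 1).
    return sum(k - 1 for k in cycle_type) % 2 == 0

assert support((3, 2, 1, 1), 7) == 5     # two fixed points in S7
assert is_even_class((3,))               # 3-cycles are even
assert not is_even_class((2, 1))         # transpositions in S3 are odd
```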

Now we understand how Turing thought about avalanche – and the significance of 4. He used GCCS terms, like beetles…

Posted in coding theory

The proper (fractional) base used when cryptanalyzing Tunny messages was not such a strange idea. After all, Napier's original log tables were conceived in terms of using a base that was a small offset from 1, e.g. 0.9999999.

If you were a 1920s-trained math person, you'd be trained in such "strange" log tables – and in mechanical devices with vertical wooden "rods" bearing abacus-like beads. The latter could compute, using the log base in question.
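A sketch of such a near-unity-base log, using the 0.9999999 base quoted above and Napier's 10^7 scaling; the function name and checks are mine, for illustration:

```python
import math

BASE = 0.9999999      # 1 - 1e-7, the near-unity base mentioned above
SCALE = 10 ** 7       # Napier's radius/scaling

def naplog(x):
    """Napier-style logarithm: the L with x = SCALE * BASE**L."""
    return math.log(x / SCALE) / math.log(BASE)

# Round-trip: recover x from its Napier-style log.
x = 1234567.0
L = naplog(x)
assert abs(SCALE * BASE ** L - x) < 1e-3

# Logs of products still add, up to the scaling convention:
# naplog(a * b / SCALE) == naplog(a) + naplog(b)
a, b = 2.0e6, 3.0e6
assert abs(naplog(a * b / SCALE) - (naplog(a) + naplog(b))) < 1e-4
```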

So think! Tunny back to rods.

And from rods we get back to Enigma/Hebern cryptanalysis, and isomorph searching.

Of course, rods in enigma algebra are less about logs and more about relative automorphisms (as one computes conjugates).

But you see the eureka transitions.

Once Turing and co. figured that a discriminator could exist for a rotor set, one could leap to its log (and an algebra of bulges).

What unclassified docs don't say is the parallel analysis going on with quantum mechanics calculations, given the parallel effort going on with the (mostly compartmentalized) atomic bomb making efforts (circa 1944).

Must have been fun to be thinking about the "potential" of the electron cloud in a Colossus tube/valve, and that similar controlled electron (well, neutron) flow used to accelerate a uranium chain reaction.

Posted in coding theory

Let’s imagine it’s 1980 again and you have a field full of custom des processors (just doing exhaustive search).

The kind of trapdoor I'd want is to probably abandon a given key before having to complete all rounds, putting that key at the back of the queue. Given the cost of IO in a cluster, and the accounting, I want every key derived from that candidate by the key schedule to similarly get filtered out of the early trials.

So now imagine your field had a special precomputation area with electronics dedicated to the subkey scheduling. So what would that look like?

Look at today’s general purpose CPU pipelining for a good clue on how such custom vlsi was built then.

Also? Since IO is the main impeder, think of clustering that is centered on the accounting and the queue. One thinks of a giant RAM cache… so how do multi-cores today share gigs of RAM? Probably the same back then…

Posted in coding theory

Each row of a Hadamard matrix is perpendicular to every other row.

And hadamard gates implement conditional qbit operations.

Even in 1940 thinking, multiplying u by r meant figuring what the ciphertext would have been had the rotor rotated u.

What we want after n rounds of des is that if you consider each round output to be ur, then ur and ur2 differ as do rows of the Hadamard matrix.

I.e., half the bits differ.
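Both facts – perpendicular rows, and rows differing in exactly half their positions – can be checked on a Sylvester-constructed Hadamard matrix; a minimal sketch:

```python
import itertools

def hadamard(n):
    """Sylvester construction: a 2^n x 2^n matrix with +-1 entries."""
    H = [[1]]
    for _ in range(n):
        # H_{k+1} = [[H, H], [H, -H]]
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

H = hadamard(4)   # 16 x 16
for r1, r2 in itertools.combinations(H, 2):
    # distinct rows are orthogonal ...
    assert sum(a * b for a, b in zip(r1, r2)) == 0
    # ... which, for +-1 vectors, means they differ in exactly half
    # the positions (dot product = #agreements - #disagreements = 0)
    assert sum(a != b for a, b in zip(r1, r2)) == len(r1) // 2
```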

Which is more tangible than "avalanche", since now we have a limiting condition – when the sequence of conditionals has induced the ciphertext itself to become an action matrix (that only produces outputs indistinguishable from uniformity).

In q terms, the data under a long sequence of qubit manipulations has itself evolved to become a Hadamard action…

Ok, so that means we have a replicating group. We have an expander whose action is to output a (key/plaintext parameterized) expander code that is itself an action…

Posted in coding theory

The whole point about the Laplacian is that you are talking about edges (rather than the nodes) and gradient averaging. That is, the edge is the difference between two node values (perhaps). Eventually we want any residual bias that ties one plaintext bit to its successor to diminish. And we want trigrams, and quintograms, to similarly show uniformity.
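That edge-difference reading is exactly the identity xᵀLx = Σ over edges (u,v) of (x_u − x_v)²; a small pure-Python check on a 4-cycle (the graph and node values are arbitrary illustrations):

```python
def laplacian(n, edges):
    """Graph Laplacian L = D - A, as a list of lists."""
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1
        L[v][v] += 1
        L[u][v] -= 1
        L[v][u] -= 1
    return L

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 4-cycle
L = laplacian(4, edges)
x = [3.0, 1.0, 4.0, 1.5]                   # arbitrary node values

# quadratic form x^T L x ...
quad = sum(x[i] * L[i][j] * x[j] for i in range(4) for j in range(4))
# ... equals the sum of squared differences across the edges
edge_sum = sum((x[u] - x[v]) ** 2 for u, v in edges)
assert abs(quad - edge_sum) < 1e-9   # the Laplacian "sees" edge differences
```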

An expander is the kind of graph that assures that maximal use of edges will occur (diffusing Bayesian factors). And the profusion of edges for almost all cliques will quickly move one out of a local cycle to wider cycles more globally afield. Moreover, as the edges cross the boundary between the clique and all the other potential nodes, one wants the transitioning action to replace codependency on plaintext bits with dependency on key bits.

In a crypto avalanche, one wants an average result whereby a change of one unit of distance in key or plaintext chooses half the edges that flip a ciphertext bit on the next round. That is, key and plaintext become isomorphs, with even one bit flip acting as an initial condition that causes an increase in uniformity.
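That half-the-bits average is easy to observe empirically; a sketch using SHA-256 as a stand-in primitive (SHA-256 is my substitution for illustration, not anything from the text):

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Hamming distance between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche(msg: bytes) -> float:
    """Average fraction of output bits flipped per single-bit input flip."""
    base = hashlib.sha256(msg).digest()
    total = 0
    nbits = len(msg) * 8
    for i in range(nbits):
        flipped = bytearray(msg)
        flipped[i // 8] ^= 1 << (i % 8)        # flip one input bit
        total += bit_diff(base, hashlib.sha256(bytes(flipped)).digest())
    return total / (nbits * 256)               # 256 output bits per digest

frac = avalanche(b"key and plaintext become isomorphs")
assert 0.45 < frac < 0.55   # close to one half, as avalanche predicts
```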

So while the rotor wirings may be inducing long sequences of quantum conditional operations that preserve the dependency of each plaintext bit on key bits, and preserve the randomness of the key bits throughout, it's the function of the expander to be guiding the walk through the qubit space.


Posted in coding theory

The Hamiltonians in des are there to deliver a computation graph based on an infinite set of quaternion groups, which not only generates a uniform sample but preserves quantum randomness as each bit of plaintext is combined with a key bit, thus ensuring that the ciphertext retains the same spectral randomness as the key itself.

The side effect of the expander code is to deliver a diffusion engine that enables a sparse matrix to redistribute the Bayesian factors of each nibble over the entire vector space, in random directions and distances.

You can see the Hamiltonian graphs as the wiring of rotor cores in the rotor versions of des.

Posted in coding theory

Now consider (per dream think) that, contrary to disruptive academic teaching, des sub and per are about non-linear subs that need to be distributed (by the per).

The sub is compressing, but the per distributes (randomly), so eventually you have covered the entire data space.
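A toy substitution-permutation round, per that reading – the 4-bit S-box and the bit permutation below are arbitrary illustrations, not DES's tables; the permutation scatters each nibble's output across all four output nibbles:

```python
# Toy SP round on a 16-bit word: nonlinear 4-bit S-box, then a bit
# permutation that spreads each nibble over the whole word.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]   # an arbitrary 4-bit S-box
# bit i of the substituted word moves to position PERM[i]; bits 0-3
# (one nibble) land in positions 0, 4, 8, 12 -- one per output nibble.
PERM = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]

def sp_round(x):
    # substitute each nibble (the non-linear "sub")
    y = 0
    for i in range(4):
        y |= SBOX[(x >> (4 * i)) & 0xF] << (4 * i)
    # permute the 16 bits (the distributing "per")
    z = 0
    for i, p in enumerate(PERM):
        z |= ((y >> i) & 1) << p
    return z

out = sp_round(0x1234)
assert 0 <= out < 1 << 16
```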

Posted in coding theory

Every few months I have what I call my NSA/des dream. Either I work for, half work for, or in some not-quite-complete manner associate with NSA… to do (guess what) crypto. Or I'm enacting a life process that is symbolic of the inner story behind des. Always des!!

Last night it was both: a partial NSA affiliate hiking, where the steps of the intricate Inca path represented the inner story on how and why the des key schedule provides strength.

Go figure. It was more tangible than the previous one, where I was looking at a meta-magical waterfall (on the mythical NSA campus) that expressed des (while trying not to get the hall cleaner into trouble…).

Tis true that (10 years later) today I found out just why Msft added mscep to their cert server (circa 2006). Hopefully I’m not dreaming pkcs7 or xml, tonight!

Posted in coding theory

Little has changed.

Download some microcode, and even your current intel x now offers "more instructions" (that now spy).

Funny how old comp sci is the most guarded secret of all.

Posted in coding theory

Early Turing computer-science models were founded in crypto rotor machines. That is, each rotor moved independently, each according to its own state machine (defined as a Cayley graph).

Viewing each graph as a coroutine (expressed in code as switch and goto between cases, of the same or other switches), one gets to a continuous space (of graph points).

- If things were subroutines, he would have said: discrete space (or "finite dimension").
- Interesting to see comp sci in the rarefied language of pure intellectualism (circa 1930).

Posted in coding theory


To make the SPA sample in Visual Studio 2015 work with AAD, do something like the following to configure the client. Note the Passive setting in the calling code, and note the redirect URI I configured in the AAD record.

app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions()
{
    ClientId = "328410290065.apps.googleusercontent.com",
    ClientSecret = "alhNXE31GCwU5BgMDzSB4r0n"
});

app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
{
    AuthenticationMode = Microsoft.Owin.Security.AuthenticationMode.Passive,
    Authority = "https://login.microsoftonline.com/cincyrapmlsqa.onmicrosoft.com",
    ClientId = "00b85995-e861-40e3-9094-f264a0b58d16",
    ClientSecret = "fkgOn8ven2OqvZMX61KAEjiKSDZyJzEI+w1hxl/mku8="
});

Posted in coding theory

Re Turing and cryptanalyzing bigrams in key covers used in later naval Enigma.

From web:-

http://stackoverflow.com/questions/352670/weighted-random-selection-with-and-without-replacement

One of the fastest ways to make many with-replacement samples from an unchanging list is the alias method. The core intuition is that we can create a set of equal-sized bins for the weighted list that can be indexed very efficiently through bit operations, to avoid a binary search. It will turn out that, done correctly, we will need to store only two items from the original list per bin, and thus can represent the split with a single percentage.

Let us take the example of five equally weighted choices, (a:1, b:1, c:1, d:1, e:1).

To create the alias lookup:

Normalize the weights such that they sum to 1.0. (a:0.2 b:0.2 c:0.2 d:0.2 e:0.2) This is the probability of choosing each weight.

Find the smallest power of 2 greater than or equal to the number of variables, and create this number of partitions, |p|. Each partition represents a probability mass of 1/|p|. In this case, we create 8 partitions, each able to contain 0.125.

Take the variable with the least remaining weight, and place as much of its mass as possible in an empty partition. In this example, we see that a fills the first partition. (p1{a|null,1.0},p2,p3,p4,p5,p6,p7,p8) with (a:0.075, b:0.2 c:0.2 d:0.2 e:0.2)

If the partition is not filled, take the variable with the most weight, and fill the partition with that variable.

Repeat steps 3 and 4 until none of the weight from the original list remains to be assigned.

For example, if we run another iteration of 3 and 4, we see

(p1{a|null,1.0},p2{a|b,0.6},p3,p4,p5,p6,p7,p8) with (a:0, b:0.15 c:0.2 d:0.2 e:0.2) left to be assigned

At runtime:

Get a U(0,1) random number, say binary 0.001100000

Bitshift it by lg2(|p|), finding the index partition. Thus, we shift it by 3, yielding 001.1, or position 1, and thus partition 2.

If the partition is split, use the decimal portion of the shifted random number to decide the split. In this case, the value is 0.5, and 0.5 < 0.6, so return a.

Here is some code and another explanation, but unfortunately it doesn't use the bitshifting technique, nor have I actually verified it.
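For comparison, here is a runnable sketch of the standard n-bin (Vose) variant of the alias method – not the power-of-two bitshift variant the quote describes, but the same two-items-per-bin, one-comparison idea:

```python
import random

def build_alias(weights):
    """Vose-style alias table: O(n) build, O(1) sample.

    Returns (prob, alias); each bin i holds its own item with probability
    prob[i], and the alias item alias[i] otherwise.
    """
    n = len(weights)
    total = sum(weights)
    scaled = [w * n / total for w in weights]      # mean-1 scaling
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    prob, alias = [0.0] * n, list(range(n))
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l           # top up bin s with item l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                        # leftovers fill whole bins
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    i = rng.randrange(len(prob))                   # pick a bin uniformly
    return i if rng.random() < prob[i] else alias[i]

# rough frequency check on (a:1, b:1, c:1, d:1, e:1) from the text
prob, alias = build_alias([1, 1, 1, 1, 1])
counts = [0] * 5
rng = random.Random(0)
for _ in range(50_000):
    counts[sample(prob, alias, rng)] += 1
assert all(abs(c / 50_000 - 0.2) < 0.02 for c in counts)
```

With equal weights every bin ends up whole (prob[i] == 1.0), so sampling degenerates to a uniform bin pick, matching step 1 of the quoted construction.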

Posted in coding theory

sample code

https://github.com/AzureADQuickStarts/B2C-WebApp-OpenIdConnect-DotNet

code

signup

claims having created local session

if we add google as an IDP, we see during signUP

Posted in AAD

Let's play with B2C of Azure AD.

The App

B2C_1_pwsignup:

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pwsignup", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pwsignup", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pwsignup", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pwsignup", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "emails", "name", "sub", "idp" ] }

pwsignin

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pswignin", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pswignin", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pswignin", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pswignin", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "emails", "name", "sub", "idp" ] }

pwprofile

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pwprofile", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pwprofile", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pwprofile", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pwprofile", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "sub", "idp" ] }

Posted in AAD

Been rereading math papers on saving random bits (using expander networks).

So if you spook a fly, does it fly off in a random direction?

Posted in coding theory

here are the changes I made to the github provider … to make it talk instead to Azure’s AAD (in non-managed IDP mode).

Yes… it uses the code from /auth twice: once to get a non-standard access token (that accesses the "standard" openid userinfo resource), and twice to get a JWT (the id token) suited for accessing the JWT-powered graphAPI of AAD.

Go figure this American mess.

Posted in AAD

Posted in coding theory

With the javascript universal windows store app that comes with visual studio community, I could follow the instructions as given

See http://microsoftedge.github.io/WebAppsDocs/en-US/win10/CreateHWA.htm

Posted in Computers and Internet

Hey John, remember when that comedian reduced Snowden to a dick pic – and the public got that we spy on everyone, now? Well, we need you to do the same for the Chinese spying on us. We need you to go out and say: they are reading your emails. The public will get the message then. Especially the classified ones.

It's all synced up with Hillary and her email "manufactured scandal". It's wrong for her to receive emails from others who have mentioned classified info. But it's fine for you to know that the Chinese read your emails, with classified material.

We need a quick win to help out what's-his-name in the UK, as they ramp up their snooper's charter (in secret, this time). Tony in Koala-land is all on board, too.

Posted in dunno

We learn that depths provided sufficient evidence to figure the chi and psi wheel patterns – which were changing only quarterly or monthly. And we know that, before the machine age of GCCS really got going, this change pattern was amenable to hand methods of cryptanalysis.

http://www.alanturing.net/turing_archive/archive/t/t16/TR16-024.html

We also learn that the machine wheel was viewed, in the original design concept, as the daily wheel (to be changed much as in the enigma world). What is interesting is the reference to cribs, as a valid means to assist with wheel breaking (assuming that countermeasures didn’t make crib matching too hard).

Fascinating to see the related chain of assumptions and deductions – that would reveal a crypto tell.

http://www.alanturing.net/turing_archive/archive/t/t16/TR16-025.html

It was fun re-reading the General Report on Tunny – a couple of years after I first encountered its strange language, and before I learned the core math it leverages – as taught today using our language game.

We are used, today, when decoding or error-correcting, to using iterative message-passing algorithms. That is, given a parity matrix that specifies a state machine (with edges and nodes), pass beliefs (about code-breaking-related "propositions") along the edges, to act as a custom computing machine. In the case of the Tunny break, log-likelihoods were passed (much as today), with a particular computation of the inner product between the evolving wheel bits with their evidence valuations and each row of the "parity matrix" – which in Colossus days was of course the sample of depths at the 1271 spacing of the cipher tape.
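The log-likelihood bookkeeping miniaturizes nicely: independent pieces of evidence about a binary proposition (a wheel bit, say) combine by simply adding log-likelihood ratios. A toy illustration with made-up likelihoods:

```python
import math

def llr(p_given_true, p_given_false):
    """Log-likelihood ratio contributed by one piece of evidence."""
    return math.log(p_given_true / p_given_false)

# three independent, individually weak observations about one wheel bit:
# (probability of the observation if the bit is set, and if it is not)
evidence = [(0.55, 0.45), (0.60, 0.40), (0.52, 0.48)]
total_llr = sum(llr(pt, pf) for pt, pf in evidence)

# convert the pooled LLR back to a posterior (uniform prior on the bit)
posterior = 1.0 / (1.0 + math.exp(-total_llr))
assert posterior > 0.5   # the evidence, pooled, favors the bit being set
```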

What is interesting next is the architecture not only of the Manchester computer – which followed colossus – but also machines as recent as NSA’s cray computers (with custom CPUs). They of course have “secret” instructions – that compute scalar products (i.e. geometric vector angular distances, applying such as the tunny-era masks for doubting bits).

So, given that “secretly-specialized” but otherwise (fast and) general purpose CPUs have gone out of fashion when designing cryptanalytical machines, we have to really go look at the graphics processors of the 80s to see how, back then, cryptanalytical hardware was proceeding. One has to look at how the hardware pipelines supported conformal projections and calculations of complex function vector spaces – to glimpse at what the cryptanalytical capability really was – back then. With that done, one can project forward to today, knowing how raw hardware capability has evolved since the first generation of GPUs.

Posted in Computers and Internet

Today's a research cloud day, so we get to look at the official support for blob encryption in Azure (along with key storage).

The interesting thing is to see just how much UK policy is built into the architecture (nicely arranging that the useless UK-brand HSM will be effecting the wrapping operation, with UK snooper-charter access built in as an "underlying capability", courtesy of Microsoft Azure no doubt).

Posted in azure

The Unity project build server (in their cloud service) does not appear to use OAuth to talk to the API of the github repository. It seems to require that one manually install SSH keys instead, as generated by Unity (which does the pulling of the repo code).

I’m just guessing, but we install the SSH key from the unity site as a “deploy” key in the github repository.

This is rather different to how the dockerhub consumer bound to the github API.

Posted in build