Folks doing cryptography in the 1950s didn’t need to be mathematicians. They simply needed to understand a particular set of algorithms (that happen to be derived from math theorems).

The tutorial at https://www.safaribooksonline.com/oriole/probabilistic-programming-from-scratch-1-a-b-testing-with-approximate-bayesian-computation teaches you what a cryptanalyst learned in cryptography 101, circa 1950. Little had changed since 1943.

If I had said this a few years ago, I’d have been shot. Now it’s all laid out…

The teacher is good. He’s a total natural. When he says that the hardness of the problem is basically how wide the distribution of guesses is, for the county fair’s jar of coins, he is exactly right. Now one gleans just why, back in 1944 on Colossus, one drives the uncertainty out to the very extremes of the deviations… so that one has maximally learned from the simulated data, from guesses, how to minimize the variance over all the trials. That’s a fancy way of saying… the data now supports the conclusion!

In 1943, with ships sinking (at obvious cost) and time being of the essence, one needed a means to decide which path to take (when attacking cryptograms). One sees how such entirely qualitative factors could be added to an otherwise quantitative search, to help direct the machine search.

The following addresses Tunny (rather than Enigma); but the general principle is the same!

The NSA doublespeak.

That’s because “targets” of spying are no longer “customers”.

It’s lying by definition.

Now one sees Tunny rectangling as de

Post-war, Turing, working on CPU instruction set design, knew how to leverage the convolutional computation graphs documented in the Tunny report to exploit conditional branching for cryptanalysis.

Even I was taught, in first-year computer science computer architecture courses, how the microcode behind each type of conditional branch instruction leverages a more basic computing model: handling of the instruction pipeline itself.

What folks need to know is that the fixes indicate the wider problem: microcode download (for good or evil, or good again tomorrow when GCHQ releases its (well-known) targeted spying patch).

So what surprised me? That the precise edits didn’t first require another zero-day exploit compromising the formal kernel security boundary.

GCHQ going for the user-mode exploit is just brazen.

In the dream I am explaining that there is little difference in spirit between George Washington’s plantation (a place where you are worked to death, raped for breeding, and generally oppressed by terroristic white overseers and their Nazi bosses making money from race slavery) and Belsen. The essence is the same: mass murder (by forced labor). There is no out given to Washington (oh, he was nice and freed his house slaves after his death).

A group of students then go on a field trip, in a giant trailer that flies through the sky (in England). They fly over and into where the next group of reprobates are to be gassed, since society has returned to gassing folks. They are quite nonchalant about the whole experience, as they expect folks to be gassed (since it’s normal… and they are not in the gassing class).

You know it’s a dream because as the trailer full of older students flies – like a 747 – two of the students are outside, like a couple of elves on Santa’s sleigh. But in the dream it’s normal to be outside at 30k feet!

Well, it’s normal until, as the “plane” is flying over its route and banking at the Isle of Wight (where some military area is seen, according to the announcer), the sleigh riders start to zombify. They are being gassed (and it happens just before the rest of the class are gassed in the flying RV). Yes, it’s an accident. The flying protocol was mistakenly set as a gassing-class ride.

I’m flying on the next ride, and as we get on we see body parts being handled as the ride is being prepared – evidence of the last gassing run. On the trip, I and others remark on how unfortunate it is that the wrong group got gassed – especially as they were all well-heeled private school students from wealthy families. But… here we are on the same trip – in a Nazi society that has accepted such things as normal.

Ok. So I know why these themes come up in the dream – given certain recent reality events that are still present in recently-augmented memory. But it’s still fascinating to see how the brain concocts a mis-reality, given the limited ability of memory and cognition to cooperate as the chemistry of the brain areas each return from the sleep state.

Not me!

I fixed the first bit of bit rot by replacing the no-longer-available Cardboard SDK imports with an import of the relevant assets from the (older) Cardboard Unity project, also mentioned in the article. I say “import”, but in reality I used Windows Explorer to drag/drop the cardboard folder from its source to the Assets directory of the Unity project.

With that you can at least wander around the scene in the PC Unity play mode.

Which is terrible.

And that is every deportee (who returned).

So the rumor is true. ICE will use this as a funnel (for the felon-to-be class). And that’s a lot of folk.

At a policy level I cannot disagree. At a practice level, ICE have to do better in leveling. The storm shelter program and Red Cross assistance is just another tool (to keep roundup costs low and ICE staff safety high).

What did you do!!

See http://catlikecoding.com/unity/tutorials/procedural-grid/

Just sad. As in Trump sad. But very, very American.

It’s like me saying there are no tapes of my phone and I made none, knowing full well NSA tapes everything on my phone and did the taping.

But “I” don’t do it.

Classic American disinformation.

View the outer bits of the nibble-width box as a third input to the medium that is subject to phase inversion, as a mathematical Kerr effect doubles and triples the balancedness… which drives the number of times the inner bits are inverted. That is data-driven computation of the symmetric and asymmetric spin similarity and difference, applied to mixing the round function.

In Feistel-analysis terms, we are amplifying the rate of confusion per bit in proportion to the information content.

Tempting to see this as a repelling process. But once you add the wider Hamiltonian cycles of the p&s with the balancedness generator, you get more of a chaotic attractor for those (theoretical) bent functions.

Now, Turing already taught us to generate custom operator norms that evolve with the plaintext information content. Using the similarity/dissimilarity basis of the quaternion algebra to tease out the information, and using it to generate a data-aware averaging norm that masks both linear and differential comparators, one gets self-evolving computation of an operator norm that generates a bent function.

With the Feistel network and the DES-era SPN, the complexity goes beyond the limit available with a simple sequence of rotors with quaternion-group wiring. One gets now to tangent spaces of much finer granularity.

NSA head – Henry’s spymaster (domestic and foreign)

CIA head – Henry’s assassin and torturer

FBI – Henry’s secret police (for nobles)

IRS – Henry’s tax collector

Given Washington’s intentional creation of a slaving state, one should see North America in medieval terms (Canada and modern Mexico aside).

Speaking as a vassal.

Sad as in Scientology sad.

Don’t be duped on the corporate practices in (non-insurance) billing.

It was interesting how for six weeks folks lied (badly, for the most part). Finally a mature sailor fessed up, a bit like the Scientologist finally admitting to the theory of aliens taking over the planet (in order to work a Family First insurance non-billing scam).

Did I say that I love Family First’s chiro methods? I can even live with the hard sell (it’s not worse than orthopedics’ hard sell). I cannot live with the intent to deceive in the off-plan insurance cases, though.

Amazing that a profitable brand has not found it the right time to ban the internal practice, though.

“Thus, the Fisher information may be seen as the curvature of the support curve (the graph of the log-likelihood).

Near the maximum likelihood estimate, low Fisher information therefore indicates that the maximum appears “blunt”, that is, the maximum is shallow and there are many nearby values with a similar log-likelihood. Conversely, high Fisher information indicates that the maximum is sharp.”

So what does that all mean, I say?

That support is just log-likelihood is the first takeaway.

That the maximum has a couple of variant cases is the next. Read carefully.
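The quoted curvature claim is easy to check numerically. The sketch below is my own illustration (not from the quoted article): for a Bernoulli coin, the observed Fisher information is the negative second derivative of the log-likelihood at the MLE, and it grows with sample size, i.e. the maximum gets sharper as the data accumulates.

```python
import numpy as np

def log_likelihood(p, heads, n):
    # log-likelihood of Bernoulli data: heads successes in n trials
    return heads * np.log(p) + (n - heads) * np.log(1 - p)

def curvature_at_mle(heads, n, eps=1e-4):
    # numerical second derivative of the log-likelihood at p_hat = heads/n
    p = heads / n
    return (log_likelihood(p + eps, heads, n)
            - 2 * log_likelihood(p, heads, n)
            + log_likelihood(p - eps, heads, n)) / eps**2

# observed Fisher information = -curvature; for Bernoulli it is
# n / (p(1-p)), so ten times the data gives a ten-times-sharper maximum
small = -curvature_at_mle(6, 10)       # ≈ 10 / (0.6 * 0.4) ≈ 41.7
large = -curvature_at_mle(600, 1000)   # ≈ 4166.7
```

Same MLE (p̂ = 0.6) in both cases; only the sharpness of the peak differs.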

We already saw how in affine combination spaces one can calculate likelihood functions in terms of log addition. That is, calculate in the algebra of likelihoods (rather than in the algebra of frequency probabilities).

Any gambler should know that in a run of similar results there is, despite the run, no memory property in the frequencies observed. Thus likelihood is clearly distinguished from frequency.

But what is information and how did Turing apply it to cryptanalysis – knowing it to be yet further distinguished from probabilities and likelihoods both?

The answer lies in the Monte Carlo example. While an unbiased but unbalanced roulette wheel may offer exploitable information about the frequencies observed (the outcomes) it still does not change the likelihood of any particular event.

so what is the exploit value of information?

In the roulette wheel case, it tells you how long you must play the runs before the unbalancedness will show up in the statistics… and improve your return.
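A quick simulation (my own sketch, with an assumed 5% mechanical bias) makes the point: the per-spin likelihood is fixed by the wheel, but only a long enough run of frequencies reveals the exploitable bias.

```python
import random

# a 37-pocket wheel where pocket 0 is mechanically over-weighted by 5%
weights = [1.05] + [1.0] * 36
true_p = weights[0] / sum(weights)   # ≈ 0.02834 vs the fair 1/37 ≈ 0.02703

def freq_of_zero(n_spins, seed=0):
    rng = random.Random(seed)
    spins = rng.choices(range(37), weights=weights, k=n_spins)
    return spins.count(0) / n_spins

# in a short run the bias is buried in sampling noise; in a long run the
# observed frequency converges to true_p and the edge becomes exploitable
short = freq_of_zero(1_000)
long_run = freq_of_zero(400_000)
```

The likelihood of any single spin landing on 0 never changes; only your statistical ability to detect it does.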

The property tells one something about the underlying mechanisms generating the likelihoods. It tells one about the underlying group and its symmetries (or lack thereof) used to generate a run, rather than run properties. In roulette wheel terms, it tells about the unbalancedness of its mechanics.

In modern cipher design one learns about those boolean functions which are both balanced and nonlinear (being at a distance as far as possible from specifically affine combination functions).

Or in pure information science terms, there must be no information from the run about the generators of the likelihoods.

And just as there must be no information flow from the product about the design, so there must be no further meta-information from the design about the designers (and their intellectual background). The secret of the cipher is not in the algorithm but in how the design impacts complexity.

The easiest way to think about information is to think about the complexity of the machine needed to compute the probability space. Knowing this in the 1930s, folks designed ciphers to need more compute power than was available – as measured by the information properties.

Now apply that to 1970s DES, where engineers designed resistance to only a certain quantity of mechanical

The last point to note is that the tractability of nonlinear mappings of spaces (that appear to resist differential cryptanalysis) can depend on whether they are generated by linear-like vector spaces (keyed by cryptographic parameters). Just as the exponential map links probability spaces to likelihood spaces, so Lie groups are related to Lie algebras through the exponentiation of (Lie) matrices.

If you look at the four 1-dimensional representations of the quaternion group, one has all four combinations of 2 bits.

When wheel-breaking a Tunny wheel, with its constraints on patterns that influence which bit value in the sequence follows its prior, it’s crucial to cast such bit pairings and their biases in abstract terms, for which the quaternion group’s 1-dimensional representations fit perfectly.
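As a concrete aside (my own tabulation, not the post’s): the four 1-dimensional irreps of Q8 all factor through the quotient Q8/{±1} (a Klein four-group), so each is fixed by a choice of sign on the generators i and j, and the (i, j) sign pairs realize exactly the four combinations of 2 bits.

```python
# each 1-dimensional irrep of Q8 sends the generators i and j to ±1;
# since k = i*j, the value on k is forced to be the product of the signs
def character(si, sj):
    return {"1": 1, "i": si, "j": sj, "k": si * sj}

table = [character(si, sj) for si in (1, -1) for sj in (1, -1)]

# the (i, j) sign pairs are all four combinations of 2 bits
bit_pairs = {(t["i"], t["j"]) for t in table}
```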

One has to recall how differential cryptanalysis was conceived then (vs now).

Deltaing allowed convolutions of the “Hermitian (quadratic) form” to amplify the bias, allowing reinforcement of biases when (delta’ed) streams are added.

They thought of it all as a Faltung (vs a generalized quadratic form), for which 2 binary digits are a special case.

Now, if one was charged with finding machinery to assist in wheel breaking (then being done manually, at high time cost), noting that a system of suitably wired wheels (wired according to the quaternion group algebra) could represent an ideal set of likelihoods would give rise to it: a detector, when suitably biased, of correct wheel patterns.

That it all applied to Enigma key/wheel breaking (as well as Tunny) is still a secret!

Now one can see the algebra of bulges as an arithmetic of semisimple irreps, expressed in the quaternion group basis.

That the group’s geometry happens to align slightly with an affine simplex, allowing limiting convergence of sequences of good guesses, is also another secret.

It’s common to use Q as the letter for positive-definite forms. And one then uses it to average over the group.

There are many Q, since the domain can be over 1 letter back in the ciphertext, or 2, or 3… as his lemma was pointing out (very obliquely). They are all the same, since it’s the invariance one is relying on.

Also see now the basis for his argument for why some generators are 1 while the rest are 0.

Nice easy thinking about relating vector spaces to the probability simplex.

Same as McCullough but easier!

Even simpler presentation.

At the same time, it’s fascinating to see how folks in 1940 arrived at the same theory.

The rest of the series looks just as interesting.

Because it opens their field to a hundred million open minded computer science students, eager to unify.

Put Clifford algebra into the heart of year-two CS math, and it’s game over.

Sunday reading. So well written.

How I’ve been thinking of the Fano plane.

Turing’s fbar is 1.

Now we know why!

Contrasts nicely with the stats motivation that is the background to the (Colossus-aided) counting attack on the Tunny cipher (and certain aspects of Enigma too).

So does a US president have the power and/or authority to “arrange” for the “incidental collection” of every collectible communication of US person #99?

Yes.

And yes, GCHQ assist is that “arrangement”.

We have yet to see a Newman/Turing analysis (or topology applied to either Enigma or Tunny (or the Italian Hebern)).

I see why it would still be seen as sensitive.

It’s clear that the truisms of all American (cum British) lying abound, with definitional lying at the fore.

In my era, pre-9/11, via 2 layers of Anglo-American (unofficially well-coordinated) small contractors, each agency would have the other’s (small contractor) do metadata collection (on each other’s citizens).

This all went away post-9/11, when each agency could do it officially (whereas before, metadata was legally grey).

What was retained, post-9/11, was the apparatus of greyness.

Beware Trump, with his CIA and NSA versions of his praetorian guards. Now beholden to the new emperor, the old guard may be lynched; such is the nature of raw, naked (super-secret) power.

As I recall, when deniability is required, that’s when NSA induced GCHQ to do the spying on Americans (contact with foreign targets, of course).

Trump must be up by now on how and what NSA do!?

He won’t be as adept as Obama. But give him time. The absolute power of secret spying will either tame his dictatorial impulses or get him rapidly impeached.

I’d figured a while ago that he was interested in the shuffle around 1.

But now we get the eigenvector explanation (which is better than the shuffle!).

Any distribution is the sum of 1/n of the 1-eigenvector plus a superposition of weighted others.
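That decomposition is easy to check numerically. A sketch of my own, using a lazy walk on a 4-cycle as the doubly stochastic “shuffle”: the uniform vector is the eigenvalue-1 eigenvector, and everything else in the starting distribution is a residual that decays under mixing.

```python
import numpy as np

# transition matrix of a lazy random walk on a 4-cycle (doubly stochastic)
P = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

n = P.shape[0]
uniform = np.ones(n) / n            # the eigenvector with eigenvalue 1

p0 = np.array([1.0, 0.0, 0.0, 0.0])  # any starting distribution
residual = p0 - uniform              # superposition of the other eigenvectors

# the uniform part is invariant; the residual decays under repeated mixing
p_t = p0 @ np.linalg.matrix_power(P, 50)
```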

Professional version of my similar drivel at https://yorkporc.wordpress.com/2013/10/10/turings-quantum-of-informationa-typex-wheel-wiring-plan-predating-shannons-information-theory/

This is the second paper with a modern presentation of two of the argumentation devices Turing used in On Permutations. First, we finally understand the why of wanting to establish that the math power of the d generator is always positive 1. And second, we see why his Lemma A is concerned with a power of four.

See paper for other examples of both argumentation devices.

see 3.4, log signatures

He really says that G has subgroup H, which in turn has a centralizer subgroup H1.

see http://www.turingarchive.org/viewer/?id=133&title=29

see also K:

This distribution can be generalized to more complicated sets than intervals. If *S* is a Borel set of positive, finite measure, the uniform probability distribution on *S* can be specified by defining the pdf to be zero outside *S* and constantly equal to 1/*K* on *S*, where *K* is the Lebesgue measure of *S*.

from https://en.m.wikipedia.org/wiki/Uniform_distribution_(continuous)
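A small sketch of my own for that construction, taking S to be a union of two intervals: the pdf is the constant 1/K on S (K being the total length) and zero elsewhere, and sampling just lands uniformly on the total length.

```python
import random

# S = [0,1] ∪ [2,4] is a Borel set with Lebesgue measure K = 3,
# so the uniform pdf on S is 1/3 on S and 0 elsewhere
intervals = [(0.0, 1.0), (2.0, 4.0)]
K = sum(b - a for a, b in intervals)

def pdf(x):
    return 1.0 / K if any(a <= x <= b for a, b in intervals) else 0.0

def sample(rng):
    # land uniformly on the total length, then map into the right interval
    r = rng.random() * K
    for a, b in intervals:
        if r < b - a:
            return a + r
        r -= b - a

rng = random.Random(1)
xs = [sample(rng) for _ in range(50_000)]
```

The sample mean should approach the distribution’s mean, (0.5·1 + 3·2)/3 = 13/6.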

Let me see two Turing arguments.

https://case.edu/artsci/math/esmeckes/Haar_notes.pdf

I like this writer. She doesn’t lose the point with endless symbols.

This is the sense of Turing: the point at which the measure of diffusion (of collinear differentials in crypto) has become chaotic.

Recall Turing used a doubly transitive operator to mix the plaintext differentials within the output space.

Center and zero

More great Turing

View as CPU design: an instruction set for rationals, i.e. probability densities seen in cryptanalysis

View in terms of early comp sci: searching for a computable group (suited to improbability calculus)

Relate center to quotient, as Klein.

We saw this in tunny

Contrast with the Fano plane, which holds for octonions.

http://math.stackexchange.com/questions/866026/quaternion-group-as-permutation-group

His system is still constrained to be looking at lambda2 (max difference from constants, etc.).

Interesting also that the inner product norm is to him just an (easy sum-of-squares) measure of the squiggle path one takes, with respect to another vector (including his eigenket vectors, k).

Excellent UK teaching.

As Turing was taught it.

Back to the future

Gives a nice notion of distance for measurable probability densities, such as those found in Markov ciphers resisting 1943-era differential cryptanalysis.

Simple groups

Now we see why Turing left unstated the relationship to differential cryptanalysis.

Link needed.

Because NSA objected and, just as with the DHS attempt to usurp NSA’s role earlier, was able to state why things must remain with NSA (and that’s not even discounting the alternative’s lack of deployed capability).

The emotional reason doesn’t take a lot of understanding (and even Trump – t for torture and p for pussy, recall – understands that without control over the metadata, you cannot spy. And that means spying as much on Congress as on Vlad the Impaler).

At this point NSA will be pitching Trump on how they validate and protect him (oh king, imperator) as the praetorians did. At his level of paranoia it will be doubly effective.

i.e. the screen is the torus surface

We want the circle packing to measure the embedding into spectral uniformity.

We want the gap minimized when the numbers of tangents on both tori are equal.

What will be interesting is to relate light cones and p-adic labeling to the n-particle case of the DES E function’s “slits”, as subkey expansion evolves the positions of the slits and thereby imposes a discriminator (attacking DES).

What is cute is the very simple way the evolution shifts (the distances between the transformed states).

S-box nibble-sized graph evolution (with quantum coin from support between pairs of plaintext chars).

Distinguished from E(), which de-localizes flows, simulating quantum coherence.

We even get (after 3 years) why the Tunny alphabet was ordered the way it was, when applied to cryptanalysis. See spectral partitioning and the way in which nodes were ordered.

As we will see in a later lecture, there is a nearly linear time algorithm that finds a vector {x} for which the expression {\delta} in the lemma is very close to {1-\lambda_2}, so, overall, for any graph {G} we can find a cut of expansion {O(\sqrt {h(G)})} in nearly linear time.

…for any tunny “run” known to exhibit a stats measurable bias, through just counting (on colossus).

Think of each standard deviation’s worth of amplification as refining the cutset and tuning it to the ciphertext, to best decide between improbable candidates (wheels that are in x or still in the edge-expansion set).
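As an illustration of the spectral machinery quoted above (my own sketch, on a toy dumbbell graph rather than anything from Tunny): the second eigenvalue of the normalized Laplacian is small exactly when the graph has a sparse cut, and the sign pattern of the corresponding eigenvector locates it.

```python
import numpy as np

# two 4-cliques joined by a single bridge edge: an obvious sparse cut
A = np.zeros((8, 8))
for block in (range(4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[3, 4] = A[4, 3] = 1.0  # the bridge

d = A.sum(axis=1)
L = np.diag(d) - A                         # combinatorial Laplacian
Dinv = np.diag(1 / np.sqrt(d))
eigvals, eigvecs = np.linalg.eigh(Dinv @ L @ Dinv)   # normalized Laplacian

lambda2 = eigvals[1]              # a small spectral gap reveals the bottleneck
fiedler = Dinv @ eigvecs[:, 1]    # sign of entries proposes the cut
cut = fiedler < 0
```

By Cheeger’s inequality the cut conductance h satisfies λ₂/2 ≤ h ≤ √(2λ₂), so a small λ₂ certifies a sparse cut exists.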

You can also look at the area of each as probability (of the event space), noting obviously the circle is less than the triangle.

This is highly pertinent to quantum walks.

Now, in terms of resistance to differential cryptanalysis, one has to think of the Tunny attack: which wanted to choose just that cutset (between circle and triangle) that allowed a decision between two hypotheses tied to the improbability of each.

To resist D.C., the weight difference between the two h must be within the spectral gap.

So I get the two sides of the DES design argument. We are requiring the quantum random walk, but also needing its amplifications to evolve the density to within the spectral gap, so that h cannot be weighted in an attack.

Ok, so I was right in my intuition the other day. Now we have historical and pure math support.

The principles upon which DES is designed. Yes, it’s the nth time; and the same caveat as before: go anywhere else to learn someone’s description of the algorithm. Here we motivate the design.

So imagine you are studying the infamous quantum mechanics two-hole screen, separating the electron gun on the left from the intensity screen. Or the one-hole screen. Or the two holes shifted up (or down) a bit on the grate.

Oh and don’t forget that we can flip the grate, should we want the gun on the right (and the screen on the left).

After all, like DES, quantum mechanics is inevitably reversible, since “information” is conserved.

Now imagine that the e function of des is the grate.

And 2 of the six input bits of the current S-box are the slits in the grates.

The purpose of the way the E function shifts inputs left (and right) is to simulate shifting the slits in the grate up and down.

Why?

So that the QM normal density is shifted.

Our goal is to uniformly fill an intensity space on the screen. Or better, average the left and right intensity screens (per the classical functional form).

Now, even though we shift our normal curve up and down (as the number of supports in the pairwise plaintext chars induces a key particle to move 1, 2, 3… lambda from the center, within the normal curve, constructing/destructing as we go), we still need the addition of intensity curves to uniformly fill the output space.

And here is where the particular key schedule comes in. Its particular sequencing moves the grate around, not only so Feistel’s multiplexor covers all subspaces, but in such a manner as to guarantee that the concentration of intensity at any point (in 2D intensity space) is never more than the second-eigenvalue gap.

Now recall that DES does not have reversible S-boxes. But we don’t need them! After all, we have interleaved diffraction grates, since left-to-right we have half a DES round and (right-to-left) we have the other half.

Now view the DES subkey generation function’s own (highly programmed) bit duplication as a means of subtly (at huge granularity) measuring the quantum effect, thus influencing just how the left and right particle motions occur – giving a characteristic.

And ensuring that there exists no matrix representation of the same graph.
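For concreteness, the bit duplication in the “grate” is visible in the published E bit-selection table (FIPS 46-3): E expands 32 bits to 48 by copying the outer bits of each 4-bit block into its neighbours, so adjacent S-boxes share exactly two input bits — the “slits” of the passage above.

```python
# DES E bit-selection table (FIPS 46-3): expands 32 bits to 48
E = [32,  1,  2,  3,  4,  5,    4,  5,  6,  7,  8,  9,
      8,  9, 10, 11, 12, 13,   12, 13, 14, 15, 16, 17,
     16, 17, 18, 19, 20, 21,   20, 21, 22, 23, 24, 25,
     24, 25, 26, 27, 28, 29,   28, 29, 30, 31, 32,  1]

groups = [E[i:i + 6] for i in range(0, 48, 6)]   # six input bits per S-box

# each adjacent pair of S-boxes (cyclically) shares exactly two input bits
shared = [len(set(groups[i]) & set(groups[(i + 1) % 8])) for i in range(8)]

duplicated = len(E) - len(set(E))   # 16 of the 48 positions are repeats
```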

Hence we may define the support of C, denoted by supp(C), to be the number of nonfixed digits under the action of a permutation in C.

If C has “large” support, then each element of C “moves” a lot of letters.

If the permutations in C are all even, then we say that C is an even conjugacy class. If the permutations in C are all odd, we say that C is an odd conjugacy class.

For n≥5, the subgroup generated by C, denoted by ⟨C⟩, is Sn if C is odd, and An if C is even.

Now we understand how Turing thought about avalanche, and the significance of 4. He used GC&CS terms, like beetles….
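The quoted notions of support and parity are easy to compute directly (a sketch of my own; permutations are given in one-line image form, mapping position index to image):

```python
def support(perm):
    # number of non-fixed points of the permutation
    return sum(1 for i, p in enumerate(perm) if p != i)

def is_even(perm):
    # parity via cycle structure: even iff (n - number_of_cycles) is even
    seen, cycles = set(), 0
    for i in range(len(perm)):
        if i not in seen:
            cycles += 1
            j = i
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return (len(perm) - cycles) % 2 == 0

transposition = (1, 0, 2, 3, 4)   # swaps 0 and 1: support 2, odd
three_cycle = (1, 2, 0, 3, 4)     # 3-cycle (0 1 2): support 3, even
```

A conjugacy class is even or odd accordingly, since conjugation preserves cycle type.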

If you were a 1920s-trained math person, you’d be trained in such “strange” log tables, and in mechanical devices with vertical wooden “rods” bearing abacus-like beads. The latter could compute, using the log basis in question.

So think! Tunny back to rods.

And from rods we get back to Enigma/Hebern cryptanalysis, and isomorph searching.

Of course, rods in Enigma algebra are less about logs and more about relative automorphisms (as one computes conjugates).

But you see the eureka transitions.

Once Turing and co. figured that a discriminator could exist for a rotor set, one can leap to its log (and an algebra of bulges).

What unclassified docs don’t say is the parallel analysis going on with quantum mechanics calculations, given the parallel effort going on with the (mostly compartmentalized) atomic bomb making efforts (circa 1944).

Must have been fun to be thinking about the “potential” of the electron cloud in a Colossus tube/valve, and that similar controlled electron (well, neutron) flow used to accelerate a uranium chain reaction.

The kind of trapdoor I’d want is one that lets me abandon a given key before having to complete all rounds, putting that key at the back of the queue. Given the cost of IO in a cluster and the accounting, I want every key derived from that candidate by key schedule to similarly get filtered out of the early trials.

So now imagine your field had a special precomputation area with electronics dedicated to the subkey scheduling. So what would that look like?

Look at today’s general purpose CPU pipelining for a good clue on how such custom vlsi was built then.

Also, since IO is the main impeder, think of clustering that is centered on the accounting and the queue. One thinks of a giant RAM cache… so how do multi-core machines today share gigs of RAM? Probably the same back then…

And Hadamard gates implement conditional qbit operations.

Even in 1940 thinking, multiplying U by R meant figuring what the ciphertext would have been should the rotor have rotated the U.

What we want after n rounds of DES is that if you consider each round output to be UR, then UR and UR² differ as do rows of the Hadamard matrix.

I.e., half the bits differ.

Which is more tangible than “avalanche”, since now we have a limiting condition: when the sequence of conditionals has induced the ciphertext itself to become an action matrix (that only produces outputs indistinguishable from uniformity).

In Q terms, the data under a long sequence of qbit manipulations has itself evolved to become a Hadamard action…

Ok, so that means we have a replicating group. We have an expander whose action is to output a (key/plaintext-parameterized) expander code that is itself an action…
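The “half the bits differ” condition above is exactly the row structure of a Hadamard matrix, which is easy to check numerically (a sketch of my own, using the Sylvester construction):

```python
import numpy as np

def hadamard(k):
    # Sylvester construction: H_{2n} = [[H, H], [H, -H]], entries ±1
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(4)   # 16 x 16
n = H.shape[0]

# distinct rows are orthogonal, so any two agree in exactly n/2 positions
# and differ in the other n/2: the "half the bits differ" condition
diffs = {int(np.sum(H[i] != H[j])) for i in range(n) for j in range(n) if i != j}
```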

An expander is that set of graphs which assures that maximal use of edges will occur (diffusing Bayesian factors), and whose profusion of edges for almost all cliques will quickly move one out of a local cycle to wider cycles more globally afield. Moreover, as the edges cross the boundary between the clique and all the other potential rods, one wants the transitioning action to replace codependency on plaintext bits with dependency on key bits.

In a crypto avalanche one wants, as an average result, that a change of one unit of distance in key or plaintext chooses half the edges that flip the ciphertext bits on the next round. That is, key and plaintext become isomorphs, with even one bit flip acting as an initial condition that causes an increase in uniformity.

So while the rotor wirings may be inducing long sequences of quantum conditional operations that preserve the dependency of each plaintext bit on key bits, and preserve the randomness of the key bits throughout, it’s the function of the expander to be guiding the walk through the qbit space.


The side effect of the expander code is to deliver a diffusion engine that enables a sparse matrix to redistribute the Bayesian factors of each nibble over the entire vector space, in random directions and distances.

You can see the Hamiltonian graphs as the wiring of rotor cores in the rotor versions of DES.

The sub is compressing, but the perm distributes (randomly), so eventually you have covered the entire data space.

Last night it was both: a partial NSA-affiliate hike, where the steps of the intricate Inca path represented the inner story of how and why the DES key schedule provides strength.

Go figure. It was more tangible than the previous one, where I was looking at a meta-magical waterfall (on the mythical NSA campus) that expressed DES (while trying not to get the hall cleaner into trouble…).

’Tis true that (10 years later) today I found out just why Msft added MSCEP to their cert server (circa 2006). Hopefully I’m not dreaming PKCS7 or XML tonight!

Download some microcode, and even your current Intel x now offers “more instructions” (that now spy).

Funny how old comp sci is the most guarded secret of all.

Viewing each graph as a coroutine (expressed in code as switch and goto between cases, of the same or other switches), one gets to a continuous space (of graph points).

- If things were subroutines, he would have said: discrete space (or “finite dimension”)
- Interesting to see comp sci in the rarefied language of pure intellectualism (circa 1930).


app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions()
{
    ClientId = "328410290065.apps.googleusercontent.com",
    ClientSecret = "alhNXE31GCwU5BgMDzSB4r0n"
});

app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
{
    AuthenticationMode = Microsoft.Owin.Security.AuthenticationMode.Passive,
    Authority = "https://login.microsoftonline.com/cincyrapmlsqa.onmicrosoft.com",
    ClientId = "00b85995-e861-40e3-9094-f264a0b58d16",
    ClientSecret = "fkgOn8ven2OqvZMX61KAEjiKSDZyJzEI+w1hxl/mku8="
});


From the web:

http://stackoverflow.com/questions/352670/weighted-random-selection-with-and-without-replacement

One of the fastest ways to make many with replacement samples from an unchanging list is the alias method. The core intuition is that we can create a set of equal-sized bins for the weighted list that can be indexed very efficiently through bit operations, to avoid a binary search. It will turn out that, done correctly, we will need to only store two items from the original list per bin, and thus can represent the split with a single percentage.

Let us take the example of five equally weighted choices, (a:1, b:1, c:1, d:1, e:1).

To create the alias lookup:

Normalize the weights such that they sum to 1.0. (a:0.2 b:0.2 c:0.2 d:0.2 e:0.2) This is the probability of choosing each weight.

Find the smallest power of 2 greater than or equal to the number of variables, and create this number of partitions, |p|. Each partition represents a probability mass of 1/|p|. In this case, we create 8 partitions, each able to contain 0.125.

Take the variable with the least remaining weight, and place as much of its mass as possible in an empty partition. In this example, we see that a fills the first partition. (p1{a|null,1.0},p2,p3,p4,p5,p6,p7,p8) with (a:0.075, b:0.2 c:0.2 d:0.2 e:0.2)

If the partition is not filled, take the variable with the most weight, and fill the partition with that variable.

Repeat steps 3 and 4, until none of the weight from the original partition need be assigned to the list.

For example, if we run another iteration of 3 and 4, we see

(p1{a|null,1.0},p2{a|b,0.6},p3,p4,p5,p6,p7,p8) with (a:0, b:0.15 c:0.2 d:0.2 e:0.2) left to be assigned

At runtime:

Get a U(0,1) random number, say binary 0.001100000

bitshift it lg2(p), finding the index partition. Thus, we shift it by 3, yielding 001.1, or position 1, and thus partition 2.

If the partition is split, use the decimal portion of the shifted random number to decide the split. In this case, the value is 0.5, and 0.5 < 0.6, so return a.

Here is some code and another explanation, but unfortunately it doesn't use the bitshifting technique, nor have I actually verified it.
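Since the quoted answer stops short of working code, here is a runnable sketch of the alias method in Python — Vose’s two-worklist variant, not the bit-shifting indexing described above; all names are mine:

```python
import random

def build_alias(weights):
    # Vose's alias method: O(n) preprocessing, O(1) sampling.
    # Each bin keeps its own probability plus one "alias" it defers to.
    n = len(weights)
    total = sum(weights)
    prob = [w * n / total for w in weights]   # scaled so the mean is 1.0
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                  # s's deficit is covered by l
        prob[l] -= 1.0 - prob[s]      # l gives up that much of its surplus
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def sample(prob, alias, rng):
    i = rng.randrange(len(prob))      # pick a bin uniformly
    return i if rng.random() < prob[i] else alias[i]

# weights a:1 b:1 c:2 d:4 -> probabilities 1/8, 1/8, 2/8, 4/8
prob, alias = build_alias([1, 1, 2, 4])
rng = random.Random(0)
counts = [0, 0, 0, 0]
for _ in range(80_000):
    counts[sample(prob, alias, rng)] += 1
```

The production version described in the answer replaces `rng.randrange` with a bit-shift of one U(0,1) draw, reusing its fractional part for the split decision; the table construction is the same idea.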

]]>sample code

https://github.com/AzureADQuickStarts/B2C-WebApp-OpenIdConnect-DotNet

code

signup

claims, having created a local session

If we add google as an IDP, we see during signup:

]]>

Let’s play with B2C of Azure AD

The App

B2C_1_pwsignup:

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pwsignup", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pwsignup", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pwsignup", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pwsignup", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "emails", "name", "sub", "idp" ] }

pwsignin

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pswignin", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pswignin", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pswignin", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pswignin", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "emails", "name", "sub", "idp" ] }

pwprofile

{ "issuer": "https://login.microsoftonline.com/8acee302-9d63-4634-800f-73f31f5ef745/v2.0/", "authorization_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/authorize?p=b2c_1_pwprofile", "token_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/token?p=b2c_1_pwprofile", "end_session_endpoint": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/oauth2/v2.0/logout?p=b2c_1_pwprofile", "jwks_uri": "https://login.microsoftonline.com/b2ctrialpw.onmicrosoft.com/discovery/v2.0/keys?p=b2c_1_pwprofile", "response_modes_supported": [ "query", "fragment", "form_post" ], "response_types_supported": [ "code", "id_token", "code id_token" ], "scopes_supported": [ "openid" ], "subject_types_supported": [ "pairwise" ], "id_token_signing_alg_values_supported": [ "RS256" ], "token_endpoint_auth_methods_supported": [ "client_secret_post" ], "claims_supported": [ "sub", "idp" ] }]]>

So if you spook a fly, does it fly off in a random direction?

]]>yes .. it uses the code from /auth twice: once to get a non-standard access token (which accesses the “standard” openid userinfo resource) and once more to get a JWT (the id token) suited for accessing the JWT-powered graphAPI of AAD.

Go figure this American mess.

]]>


See http://microsoftedge.github.io/WebAppsDocs/en-US/win10/CreateHWA.htm

]]>It’s all synced up with Hillary, and her email “manufactured scandal”. It’s wrong for her to receive emails from others who have mentioned classified info. But it’s fine for you to know that the Chinese read your emails, with classified material.

We need a quick win to help out what’s-his-name in the UK, as they ramp up their snooper’s charter (in secret, this time). Tony in Koala-land is all on board, too.

]]>http://www.alanturing.net/turing_archive/archive/t/t16/TR16-024.html

We also learn that the machine wheel was viewed, in the original design concept, as the daily wheel (to be changed much as in the enigma world). What is interesting is the reference to cribs, as a valid means to assist with wheel breaking (assuming that countermeasures didn’t make crib matching too hard).

Fascinating to see laid out the chain of assumptions and deductions that would reveal a crypto tell.

]]>http://www.alanturing.net/turing_archive/archive/t/t16/TR16-025.html

We are used, today, when decoding or error-correcting, to using iterative message-passing algorithms. That is, given a parity matrix that specifies a state machine (with edges and nodes), beliefs (about code-breaking-related “propositions”) are passed along the edges, so the graph acts as a custom computing machine. In the case of the Tunny break, log-likelihoods were passed (much as today), with a particular computation of the inner product between the evolving wheel bits, with their evidence valuations, and each row of the “parity matrix”, which in Colossus days was of course the sample of depths at the 1271 spacing of the cipher tape.
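A toy rendering of that idea, with invented names and none of the Colossus-era detail: treat each depth row as a noisy vote on each wheel bit, convert votes to log-likelihood ratios, and sum, which is the inner product the passage describes.

```python
import math

def accumulate_llrs(rows, p_correct=0.75):
    """Each evidence row votes 1 or 0 per bit position; assuming each vote is
    right with probability p_correct, a vote adds +/- log(p/(1-p)) to that
    bit's running log-likelihood ratio. (A toy model, not the Tunny procedure.)"""
    step = math.log(p_correct / (1 - p_correct))
    llrs = [0.0] * len(rows[0])
    for row in rows:
        for i, obs in enumerate(row):
            llrs[i] += step if obs == 1 else -step
    return llrs

def hard_decision(llrs):
    """Read off the current best guess for each wheel bit."""
    return [1 if l > 0 else 0 for l in llrs]
```

With three rows of evidence over two bit positions, `hard_decision(accumulate_llrs([[1, 0], [1, 1], [1, 0]]))` settles on `[1, 0]`: unanimous evidence drives the first LLR far from zero (high confidence), while the split vote on the second bit leaves it closer to zero.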

What is interesting next is the architecture not only of the Manchester computer (which followed Colossus) but also of machines as recent as NSA’s Cray computers (with custom CPUs). They of course have “secret” instructions that compute scalar products (i.e. geometric vector angular distances, applying such things as the Tunny-era masks for doubting bits).

So, given that “secretly-specialized” but otherwise (fast and) general purpose CPUs have gone out of fashion when designing cryptanalytical machines, we have to really go look at the graphics processors of the 80s to see how, back then, cryptanalytical hardware was proceeding. One has to look at how the hardware pipelines supported conformal projections and calculations of complex function vector spaces – to glimpse at what the cryptanalytical capability really was – back then. With that done, one can project forward to today, knowing how raw hardware capability has evolved since the first generation of GPUs.

]]>

looks good.

]]>The interesting thing is to see just how much UK policy is built into the architecture (nicely arranging that a useless UK-brand HSM will be performing the wrapping operation, with UK snooper’s charter access built in as an “underlying capability”, courtesy of Microsoft Azure no doubt).

]]>I’m just guessing, but we install the SSH key from the unity site as a “deploy” key in the github repository.

This is rather different from how the dockerhub consumer bound to the github API.

]]>

giving

Eventually we sign away our life to Google (and NSA) and we get

]]>

UA-65697436-1

Now let’s take this a step further and ask how well services such as GitHub support code builds (vs assembly builds).

For no particular reason, we install Unity – a source code and project maintenance system that has little or nothing to do with our normal work. But it does have its own cloud build service and integration with visual studio. So how do the cloud, source repo, and IDE tools all cooperate?

ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC/5xBob4orOMyNkFIftY5SCoG86bUOkN99pnGVI80tEZW5jwU82VBUvMLjiLXLMpfNqLyPE2Mou/U3tayQxSXqY55aHeVwYs9Z0+l2O9dMcIjz9OJCuO0MXgtYCXoIxQDojfO2MZNqCtPZ0Tuce+YtsgrHar1qcVFQoank/EZlJODlCRxMLYFeE9NBUvKd4BrODnBWFasFuRcEkDz88oBJZ0UL9E74FdMOsihD2oom1oHlmucWtn0XBf842DO79cN3saPh0Lq62CYQywpQZOUX6CSLK/ZJ1OpT3VAD5vWU+dBAnp4A16qsJYRIj4dWyLH2+wQnsH56suFsSYjWfJ+F

not having created a real project, our build fails:

]]>

using amazon SP connection (which self-issues a SAML request to the microsoftonline gateway):

we try to fix it by assigning a cert to the SP connection:

and update the idp metadata in the SP:

major problems assigning the per-RP signing cert. Doesn’t seem to work (or at least the wizard doesn’t seem to work..)

]]>

once the container is partly built (from baseline images), of course the asp.net image itself builds (the project), which involves using the docker/linux version of nuget within the asp.net image to go get all the project’s own dependencies.

We can now access our dockerized asp.net app, running on kestrel on linux, executing as a container on the docker host itself operating on ubuntu, in a VM running in azure VM fabric.

]]>

Provider ARN

arn:aws:iam::385727861301:saml-provider/PPE

Provider Type

SAML

Creation Time

2015-07-20 15:13 PDT

Then we configure the SP/SP adaptor (in Ping parlance) that is a part of AWS “role” setup, per the instructions at https://msdn.microsoft.com/en-us/library/azure/dn706228.aspx

Attribute

SAML:aud

Value

Looking at the SP connection (to AWS) in the AAD setup manager, we see for connection attributes:

Per the tutorial we add role and mail attribute name/values:

https://aws.amazon.com/SAML/Attributes/Role : arn:aws:iam::385727861301:saml-provider/PPE

https://aws.amazon.com/SAML/Attributes/RoleSessionName: mail

Turning now to the main parameters of the AAD-side SP connection:

using the signup wizard

When we try the initiating URI

GET https://myapps.microsoft.com/signin/Amazon%20Web%20Services%20(AWS)/8b1025e41dd2430ba1502ef79cd700f5 HTTP/1.1

we see a redirect to

followed by an openid handshake (somewhat surprisingly):

I’ll guess we are seeing some kind of openid connect to SAML gateway in operation.

we see the live.com IDP assert a code-grant to the gateway:

<html><head><title>Working…</title></head><body><form method="POST" name="hiddenform" action="https://account.activedirectory.windowsazure.com/"><input type="hidden" name="code" value="…" /><input type="hidden" name="id_token" value="…" /><input type="hidden" name="state" value="OpenIdConnect.AuthenticationProperties=5AXRYHTy6jr_jeBhxOx1ci8tTEwmvkIqz_Z_qgR0Wd0kDa8J_Ah_1ghY2E3o3B72cF9Hx97h1pZgthNE_ouDhLKv2X1tG5rT8iQ3oGbNuEPayPfZGuBN2BjkRdk5K8VmvG1p27kiaewyGXk3-5K7zM99ZYltTUnOWq1pIBOAzbhMRfyhBryA2hn2v1Eho-enuvYr_npWUPY6F8uyf7-biS0UGqdWeA7LzNwar2ZPPyI6JKbbq9FyHRZ447KbTiJ0JUTAT7mdRl5nFMd1Xuo2p4L2MiDpu8bNk3ldJwOe37D0WmUsIYVw7fT0qnIleFIXwGXaPXSym776o8Hku4tzb1CnNS3upTE8XRjKzxvmuqiNRvBQjElwPgEeX97xeH-L8IPVEE50U-_zhF-ZCJylUg" /><input type="hidden" name="session_state" value="e2c316b5-e8fe-442c-a7c0-72053483d2f6" /><noscript><p>Script is disabled. Click Submit to continue.</p><input type="submit" value="Submit" /></noscript></form><script language="javascript">window.setTimeout('document.forms[0].submit()', 0);</script></body></html>

We then see the second phase of the gateway:

When using a live.com account we hit various problems. We got a little further when swapping to admin@netmagic.onmicrosoft.com, having declared this user authorized to assert through the SP connection. We see that our gatewaying process (which now has a user session) can pretend to be the SP and issue a SAML request:

]]>http://azure.microsoft.com/en-us/marketplace/partners/docker/docker-subscription-for-azure/

Configuring, we try out the ssh public key method:

ssh-rsa AAAAB3NzaC1yc2EAAAABJQAAAQEAmOEBYv1j8rU93h1Oj7hG2JfRKXsux503YTZLoq3ZlvkSMjFqzNrifsItldEJlC38nrJeUCrmG4/RQQfYOf+DknPdGia6m55F2CAL7xYbBGz66CB0bJloCmfTELrh7+bCuPVJAbIZ3Q0ovEDIcWVh4RvGqi9s2TUCw4ZTwy2qQoke2L1nVqlaeC5HlzT8rr4Jf8poEdhmn9vRYwSVmh0vuXhVjcnVO8pqk6UwhB2rTAkaJBzzojQTfNd1/BGqUvxfp5gMPCxk/MdTd8SVBuvPNN7csgZBYzJZochXwCFkCMzG1UKObrH8uLlHAixlEiUqsSDmByA2T9JxyqKBvHMjsw== rsa-key-20150720

Due to purchasing problems, this is as far as I could get today.

]]>azure cli in linux-docker-host container:

]]>

we built an apache-enhanced ubuntu image and launched it on that docker host.

###########################################
# Dockerfile to build an apache2 image
###########################################

# Base image is Ubuntu
FROM ubuntu:14.04

# Author: Dr. Peter
MAINTAINER Dr. Peter <peterindia@gmail.com>

# Install apache2 package
RUN apt-get update && \
    apt-get install -y apache2 && \
    apt-get clean

# Set the log directory PATH
ENV APACHE_LOG_DIR /var/log/apache2

# Launch apache2 server in the foreground
ENTRYPOINT ["/usr/sbin/apache2ctl", "-D", "FOREGROUND"]

To build the image, we use

docker build -t apache2 .

to run the image in a container with exposed endpoints, we use

docker run -d -p 80:80 apache2

To ensure that the container host’s own port 80 is itself exposed as an Azure endpoint, we configure Azure (to expose the docker host’s port 80),

allowing us to interact with http://pwdocker3.cloudapp.net/ from the public internet:

End.

]]>First we create a load test runner – in the visual studio online environment. This means creating the environment, tied to home_pw@msn.com, and then configuring visual studio to bind to it.

Now we follow instructions:

]]>https://code.msdn.microsoft.com/Getting-started-with-17a52e95

webhosting – link hosting site to source repo

We have basically imported the repo we made earlier in the day on github to bitbucket (once we oauth-authorized bitbucket to make API calls to the github service endpoint)

]]>

We go clean up our github test repositories and create a new one – ready to be integrated with dockerhub’s automated build process.

https://github.com/homepw/dockerautomationbuild

To push the dockerfile, we make windows github client be a client of the github repository:

to this we add our Dockerfile (so that once this repo is pulled by dockerhub, dockerhub’s automated build process can build its image).

Lots of the above needed to be done again, cleanly. But the principles are correct.

We end up with github having pushed the build instructions to the task runner

finally, we can manually pull and run the image in our linux host, dockerized in azure

]]>

pwilliams@pwdocker3:~$ cat >Dockerfile

FROM busybox:latest

CMD echo Hello World!!

pwilliams@pwdocker3:~$ cat Dockerfile

FROM busybox:latest

CMD echo Hello World!!

pwilliams@pwdocker3:~$ docker build .

Sending build context to Docker daemon 20.99 kB

Sending build context to Docker daemon

Step 0 : FROM busybox:latest

latest: Pulling from busybox

cf2616975b4a: Pull complete

6ce2e90b0bc7: Pull complete

8c2e06607696: Already exists

busybox:latest: The image you are pulling has been verified. Important: image verification is a tech preview feature and should not be relied on to provide security.

Digest: sha256:38a203e1986cf79639cfb9b2e1d6e773de84002feea2d4eb006b52004ee8502d

Status: Downloaded newer image for busybox:latest

—> 8c2e06607696

Step 1 : CMD echo Hello World!!

—> Running in 4141dfc3a0ce

—> 8e51f94add31

Removing intermediate container 4141dfc3a0ce

Successfully built 8e51f94add31

pwilliams@pwdocker3:~$ ^C

pwilliams@pwdocker3:~$

pwilliams@pwdocker3:~$ docker run 8e51f94add31

Hello World!!

So, first we “login” to docker hub from the docker client command line tool (which is obviously a different login session from the web session on the dockerhub webapp). We can guess the command-line login is a resource-owner grant whereas the webapp login is an authorization-code grant, in oauth2 terms.

pwilliams@pwdocker3:~$ docker login

Username: homepw

Password:

Email: home_pw@msn.com

WARNING: login credentials saved in /home/pwilliams/.docker/config.json

Login Succeeded
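A sketch of that guess, in oauth2 terms (RFC 6749). The endpoint and client names below are placeholders, not Docker Hub’s actual ones; the point is only the shape of the two grants.

```python
from urllib.parse import urlencode

def resource_owner_grant_body(username, password, client_id):
    """Resource Owner Password Credentials grant (RFC 6749, section 4.3):
    a CLI like `docker login` could POST the user's credentials directly
    to a token endpoint. Field names are the standard oauth2 ones."""
    return urlencode({
        "grant_type": "password",
        "username": username,
        "password": password,
        "client_id": client_id,
    })

def authorization_code_redirect(authorize_endpoint, client_id, redirect_uri, state):
    """Authorization Code grant (RFC 6749, section 4.1): a webapp instead
    redirects the browser to the IdP and never handles the password itself."""
    return authorize_endpoint + "?" + urlencode({
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
    })
```

The practical difference: the resource-owner flow hands the password to the client tool, while the code flow keeps it at the IdP, which is why the CLI can warn (below) that credentials end up saved locally.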

Some kind of token store retains the credentials themselves (rather than mere login session values):

{
  "auths": {
    "https://index.docker.io/v1/": {
      "auth": "aG9tZXB3OkZvemllYmlzYWIhMjM0NQ==",
      "email": "home_pw@msn.com"
    }
  }
}
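Consistent with that warning: the `auth` entry is (as far as I know) just base64 of `username:password`, i.e. stored credentials rather than a revocable token. A sketch, with placeholder credentials rather than the real ones above:

```python
import base64

def docker_auth_entry(username, password):
    """Reproduce the shape of the config.json 'auth' field:
    base64('user:pass'). Placeholder values only."""
    return base64.b64encode(f"{username}:{password}".encode()).decode()
```

`docker_auth_entry('someuser', 'somepass')` round-trips: decoding it with `base64.b64decode` gives back `someuser:somepass`, which is exactly why the CLI warns that login credentials are saved in the file.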

since we have a locally committed homepw/ub and have learned we need a docker hub repo of the same name (so we can push the locally committed image), we create another (Deleting ub1)

We see, at the docker command line (running on a docker host, on a linux machine hosted in azure, to which the azure docker extensions were added, recall):

pwilliams@pwdocker3:~/.docker$ docker push homepw/ub

The push refers to a repository [homepw/ub] (len: 1)

b1241102e845: Image already exists

d2a0ecffe6fa: Image successfully pushed

29460ac93442: Image successfully pushed

b670fb0c7ecd: Image successfully pushed

83e4dde6b9cf: Image successfully pushed

Digest: sha256:48b17e16f16001df36b1a60ef29ec5dd4b067a060d14a8a62789859f2a3b4071

We now delete all locally running processes and the images, so we might rebuild things having pulled from the docker hub repo.

]]>pwilliams@pwdocker3:~/.docker$ docker run -i -t homepw/ub /bin/bash

Unable to find image ‘homepw/ub:latest’ locally

latest: Pulling from homepw/ub

83e4dde6b9cf: Pull complete

b670fb0c7ecd: Pull complete

29460ac93442: Pull complete

d2a0ecffe6fa: Pull complete

b1241102e845: Already exists

Digest: sha256:48b17e16f16001df36b1a60ef29ec5dd4b067a060d14a8a62789859f2a3b4071

Status: Downloaded newer image for homepw/ub:latest

root@c376a47b4f85:/# cd /home

root@c376a47b4f85:/home# ls

abc cde fgh

root@c376a47b4f85:/home#

https://registry.hub.docker.com/u/homepw/ubuntu_wget/settings/webhooks/

]]>

With this post we learned how to enroll with dockerhub. That is, we can now login using username/password (home_Pw@msn.com/pwd)

At the registry, we see images such as the asp.net image.

Let’s read a book

and let’s destroy our old dockerized ubuntu host in azure and replace it with an all-new one:

putty, to pwdocker3.cloudapp.net (port 53658, not default of 22, as learned from the endpoints tab of the azure settings for the vm)

on our windows 10 tablet, lets install the docker client – even though we will be using the local docker client on that new ubuntu OS instance.

The docker info says

pwilliams@pwdocker3:~$ docker info

Containers: 0

Images: 0

Storage Driver: aufs

Root Dir: /var/lib/docker/aufs

Backing Filesystem: extfs

Dirs: 0

Dirperm1 Supported: true

Execution Driver: native-0.2

Logging Driver: json-file

Kernel Version: 3.16.0-43-generic

Operating System: Ubuntu 14.10

CPUs: 1

Total Memory: 1.639 GiB

Name: pwdocker3

ID: EA3B:4KQV:ZIRR:2Z34:DKA2:2GPJ:FGVW:MAN5:KHDR:426N:R3O2:HUUM

WARNING: No swap limit support

Trying to do something half useful with all this, we follow this page:

pwilliams@pwdocker3:~$ sudo docker pull busybox

latest: Pulling from busybox

cf2616975b4a: Pull complete

6ce2e90b0bc7: Pull complete

8c2e06607696: Already exists

busybox:latest: The image you are pulling has been verified. Important: image verification is a tech preview feature and should not be relied on to provide security.

Digest: sha256:38a203e1986cf79639cfb9b2e1d6e773de84002feea2d4eb006b52004ee8502d

Status: Downloaded newer image for busybox:latest

and we can print hello world!

]]>pwilliams@pwdocker3:~$ docker images

REPOSITORY TAG IMAGE ID CREATED VIRTUAL SIZE

busybox latest 8c2e06607696 12 weeks ago 2.433 MB

pwilliams@pwdocker3:~$ sudo docker run busybox echo “Hello World!”

Hello World

]]>


Let’s recall that one of the main drivers for even augmenting such as quickbooks with a membership system (in which 500 realtors are listed) is so it’s easy and effective for the realtor to pay dues to that local realty association (and possibly also fund NAR’s PAC).

in windows, we are directed to a particular wallet provider, which on windows 10 (preview) installation gives:

which is apparently really a chrome browser plugin (for the version of chrome running on windows)

chrome plugin

Since this is just an app, we need a server-side account:

after creating various crypto objects (including rubbish like PGP), we see it even requires specifically google authentication :

This we did, on a windows phone, using the Authenticator app (based on google’s own library code, apparently). We took a photo of the pattern, which populated the app’s userid and secret. We cite the code shown on the phone to the web page, as you would expect, and get to the next phase of enrollment:

There appear to be some interesting 2-factor ideas, based on moving authentication code producers between devices. To be honest, I didn’t quite understand this, first time through! But, we will.

On our windows phone, we have been running a bitcoin miner, which created an address: 1J8RN59iB48upe6HPRywTE9w2Y3FFBYAsA, which I think I can add to the account, so I can send it bitcoins!

Who knows!

Anyways, we see just how tightly integrated it is with the phone’s two-factor coding system, to send some coinage from the wallet to the account:


]]>we see that one logs into the gravatar site itself using a wordpress IDP account:

What we are doing here is asking the question: so what if you do not have a Rapattoni Magic (style) membership system? What if your membership system is merely a “membership plugin” to something any of two thousand small realty boards may well already have – such as a deployment of quickbooks?

If one wanted to build a lightweight membership plugin that could compete with a quickbooks membership module (which is little more than an address book of registered accounts-payable accounts (the users!), with some schema fields specific to real estate), could one use the “web membership system”?

We can see what a modern web-centric membership system entails when looking at features beyond the gravatar and the profile page. We can see the connections to “verified services”, for example, for which we try out the linkedin integration:

Here we see that the oauth integration not only requires a login to the linkedin account, but the gravatar carries with it the act of “connecting” across sites – something that goes beyond the oauth model. Note how the oauth integration ALSO allows for signup to the linkedIn service (should that be necessary).

Of course, the kinds of older membership systems seen in US realty have no such feature set – despite realty being an entirely “networking” business, albeit one that has resisted adopting web technologies (for networking and membership) to date.

We also connected to facebook –which gave us a facebook login session obviously – similarly.

Then we happened to opt for a goodreads connection, too, which took us to its login page – which happens to be facebook-enabled (and an automatic goodreads session was thus created). This automatically completed most of the oauth handshake, which we see back on the goodreads OP:

Since we have, since writing the post, got a bitcoin address, we can try adding it to the gravatar’s profile area for cyber currencies.

]]>

It’s pretty clear that both Newman and Turing withheld from the crypto types who wrote up the Tunny report most of the intellectual theory concerning how the attack on Tunny really worked.

Posted from WordPress for Windows Phone

]]>Posted from WordPress for Windows Phone

]]>from the portal.azure.com

gives

]]>