metrolistmlsqa server-side app authentication flow

Azure mobile provides a nice, easy way to create an Android app using Android Studio. It also hosts a backend (javascript) website that supports that app, particularly when doing “server-side login”.

image

image

image

 

This flow is one in which the code provided by the AS, as a result of the app-invoked embedded browser, is delivered to a server-side endpoint – via an HTTP redirect. This last step is observed by the app, which shuts down the browser and interacts with the website to get itself a local session, app to site. The site’s own webapp process has, meanwhile, converted the code to a token, to support the maintenance of the local session. This interaction is configured by setting identity values in the mobile app configuration to align with those of an associated AAD application (and vice versa):

we configure the mobile backend site to talk to the “homespotter webapp” AAD application registration in our netmagic.onmicrosoft.com AAD tenant.

image

image
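The code-for-token step described above can be sketched minimally. The endpoint, client id, and secret below are placeholders, not the real mobile backend or AAD registration values:

```python
from urllib.parse import urlencode

def code_exchange_body(code, redirect_uri, client_id, client_secret):
    """Form body the site's webapp POSTs to the AS token endpoint
    (standard authorization-code grant, RFC 6749 section 4.1.3)."""
    return urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
        "client_secret": client_secret,
    })

# The webapp keeps the resulting token server-side, tied to the local
# session it hands back to the app once the app spots the redirect.
body = code_exchange_body("AAAB-sample-code",
                          "https://example.azure-mobile.net/signin",
                          "placeholder-client-id", "placeholder-secret")
```

The point is that the secret never reaches the device: only the site’s webapp process sees it.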

Posted in AAD, azuremobile

metrolist custom domain of netmagic.onmicrosoft.com with app

We added a second custom domain to our AAD tenant and bug-fixed our IDP for the metrolist (IDP) tenant’s particular UX.

We then built an oauth-based application using the metrolist oauth configuration. The screenshots show the app’s login screen, in the windows UI, and a form showing records, after login, for the authenticated user. This app emulates what a metrolist vendor will do, similarly, when building an iOS or Android application.

image

image

At the server API supporting the client, the server code shown below indicates that the client provided an authorization token on the api call to GET records – having obtained that token previously from metrolist’s infrastructure using the oauth/openidconnect handshake.
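In outline (a Python sketch, not the actual server code in the screenshot), the API-side check amounts to pulling the bearer token off the request before serving records:

```python
def extract_bearer_token(headers):
    """Return the access token from an Authorization header, or None.
    A real API would then validate the token (signature, audience,
    expiry) against the issuer before returning records."""
    auth = headers.get("Authorization", "")
    scheme, _, token = auth.partition(" ")
    if scheme.lower() != "bearer" or not token:
        return None  # caller should answer 401 Unauthorized
    return token
```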

image

 

A trace of the embedded browser talking to the AS is shown below:

image

Posted in AAD

visual studio CTP 6 and cordova for woodgrove claims sample

The visual studio 2015 CTP6 edition of Windows Server 2012 hosted in Azure comes with a set of code samples – for the cordova tool chain. One of these is woodgrove claims – a project we learned about back here: https://yorkporc.wordpress.com/2014/05/29/office365-adal-with-backbone-mvvc-and-cordova-build/

We got the project to load in the latest CTP, once we changed the project file to add the term “Apache” in front of CordovaTools in a couple of path names.

We did make it talk to the authorization endpoint of AAD, but the ripple emulator could not successfully invoke the token swap, at the token endpoint of AAD. This could be a project or a CTP6 issue – I don’t know.

Posted in cordova

getting api manager to use AAD STS, finally

image

image

The key was NOT to have the proxy web app use an authorization code grant.

It must use only a client credentials grant:

image
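For contrast with the authorization code grant, a client credentials request carries no code, no redirect, and no user context. A sketch with placeholder values (`resource` being the AAD-style parameter naming the target API):

```python
from urllib.parse import urlencode

def client_credentials_body(client_id, client_secret, resource):
    """Token request for the app's own identity -- no authorization
    code, no redirect_uri, no signed-in user involved at all."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    })

body = client_credentials_body("placeholder-id", "placeholder-secret",
                               "https://example-backend/")
```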

The awful azure api manager documentation didn’t help, sending me the wrong way for a week.

image

Posted in api manager

debugging azure api manager’s use of the AAD token endpoint

 

To see what parameters are sent to the AAD STS, for token issuing as part of the oauth2 handshake, we configured API manager to use an azure website – rather than AAD – as the token issuer.

image

 

The website is just a hosted ASP.NET web forms project, running in the remote debugger, to which we added a BeginRequest handler – so we can inspect the request.

image

 

image

The posted form has the following parameters:
       {grant_type=authorization_code&code=AAABAAAAvPM1KaPlrEqdFSBzjqfTGKuDarHWcVaE_gAW-T8bBEhryxGinvUkA66Jt-uxbmRl8J5rjUc0aFJi94eDUoO9Bnt-6NR2sXVQYRXhygLUhjdLYrV9UmlBKZu2U_WZFSXO1_6oeIr-1Phz7VoooKJm0Vmh-N4lfUYdPTsbpgbWMhqA60jkFdiGbAwL0ocUrPw-4V8-8PwddLb1mcFOcGERx1jKa62ffZ9L22tJwkAgHhQPvk4K4TDAq60YY1JMWMgUeL9zc3oT_C6AXv6BkiK-cDm6mE9vx3ZTqz6oHP6LdUqE4QO6hukp7ptcr2Tl15WpJus-Ro4hM4gmdXer7hlwBVM22RLdPBKKZOsm649q12SokmOTdhgHcUX0y2aDxqNPhcTwy0z1QNj6pdZ4PiEVJ9i-qxvZrdB2MUSUNrJ7Lw5bEvzD1rM_eSOPjx-rKwu6gSWqYTNFbXcaBgEoQA6m8PULBdItUNwVwjcyeXTHvEhqrYJLBGdhjpucFGTDYqiteM5zyhFj-GiRkS–9x0kv4vg9TbYl0fLFv8bJwjkG19yZIwVKCVelzZ3TVvsQfyT9srcFCCv6BGu2QnLgA-la0Vksu9NnXHh1hpnO1drt7QLXj6p2FTHhCIDEKv1EobQJwFol8yrsTSdi4wJnYa-dvObvFmXn_8nBw57qKFRp-ogAA&redirect_uri=https%3a%2f%2frapmlsqa.portal.azure-api.net%2fdocs%2fservices%2f54e4f45e73c60f106453dac3%2fconsole%2foauth2%2fauthorizationcode%2fcallback&client_id=0bc904ae-3f2c-4ec7-8b71-40f7207112f0&client_secret=fV1OJsfRFOTDdIqTzs%2fdZCRJkHvcPr9fZGJhWo1dQNg%3d}  
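Decoding such a body is straightforward. A Python stand-in for what the BeginRequest handler exposes (shortened sample values, not the real code or secret):

```python
from urllib.parse import parse_qs

# A shortened stand-in for the captured POST body above.
raw = ("grant_type=authorization_code&code=AAAB-sample"
       "&redirect_uri=https%3a%2f%2fexample%2fcallback"
       "&client_id=sample-client&client_secret=sample%2fsecret%3d")
params = {k: v[0] for k, v in parse_qs(raw).items()}
# Note that the client secret travels in the POST body -- which is
# exactly why a stand-in token issuer gets to see every parameter.
```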

Posted in api manager

TOTAL INFORMATION UNAWARENESS–Gemalto style

image 

http://www.cnet.com/news/sim-card-maker-gemalto-says-its-cards-are-secure-despite-hack/

Folks in the know may recall TIA: total information awareness. This was the codeword for metadata collection and scanning (the process by which one becomes aware of information, described by the metadata).

 

It’s general practice for the CISO to be officially unaware (of what his NSA-paid staff do). That is, deniability is a built-in protocol. I can be unaware, and totally unaware (where the ‘totally’ is a codeword for “I have ‘legal cover’ for saying I’m unaware – or have ‘no reason’ to believe” – because it’s not FORMALLY part of my job description to make such utterances that perpetuate deniability, truthfully or otherwise). What matters is that I do not lie (not that the statement is truthful). This is DARPA at its best (indoctrinating folks in the distinction between telling a lie and stating something [useful to a policy position] that you don’t know to be truthful – or otherwise).

“to the best of his knowledge”, quotes the journalist – without even commenting on the PR protocol being invoked.

Our SIMs are secure! says Gemalto (the supplier of NSA/DoD’s 67464C CAC cards). That the purpose of the card (making private calls) doesn’t work has NOTHING to do with the claim of the “security” of the technical countermeasure (note the doublespeak).

Posted in Computers and Internet

hogwash in the self-signed cert arena for OEMs

bad journalism and self-centered experts. That’s all I see.

image

http://www.cnet.com/news/superfish-torments-lenovo-owners-with-more-than-adware/

It was always intended that OEMs and browser distributors should insert their own trust points into the windows trust stores.

If Sprint licensed the original netscape browser and tuned things up so it was easy to dial up Sprint ISP internet access points, it was absolutely intended that Sprint would also tune up the root list for SSL – adding and removing entries as they saw fit, for their value-added network. If they wished to offer an SSL CONNECT proxy service (to only those consumers who dialed up internet via Sprint ISP) they were absolutely entitled and expected to offer a CONNECT service, with their own trusted roots.

This foofoo is all about nothing, with an OEM doing what it’s supposed to be doing – selecting trusted drivers, vetting them, and this includes “CAs”.

If you “buy” a corporate laptop, it comes with the enterprise root too, all set up for MITM and spying on you (the employee) by the evil Enterprise. Whether it’s OEM or Enterprise, the model is the same. You trust the source of the hardware to vet things, and they will probably vet themselves highly – mostly so they can spy or otherwise add value (like shove ads in your face).

Posted in spying, ssl

nsa version of XNS

was actually quite interesting, since it mixed type I waveform protections with type II ciphering.

not as interesting but very nostalgic is the picture that shows a workgroup supported by a “300 MByte” file server.

image

2015-0133.pdf (via cryptome’s site)

Which is less than the store on your phone, full of your photos probably!

ok ok, neither the XT nor the fileserver was ever used for cryptanalytical jobs – even though the first Manchester computer (built 1946), with drum memory for storing colossus tape, had less IO than the XT + 4Mbps ethernet.

Posted in Computers and Internet

solving ki with tls

“The only effective way for individuals to protect themselves from Ki theft-enabled surveillance is to use secure communications software, rather than relying on SIM card-based security. Secure software includes email and other apps that use Transport Layer Security (TLS), the mechanism underlying the secure HTTPS web protocol. The email clients included with Android phones and iPhones support TLS, as do large email providers like Yahoo and Google.”

https://firstlook.org/theintercept/2015/02/19/great-sim-heist/

 

hah hah hah.

 

““We need to stop assuming that the phone companies will provide us with a secure method of making calls or exchanging text messages,” says Soghoian.”

well durr!

And don’t assume – unless you are wholly stupid – that the “new” Telco “TLS suppliers” are any different to the old Telcos. Read Google, think compromised by design. Read Microsoft, think compromised by praxis.

Posted in dunno

microsoft azure ws-fedp login, and FISMA/FEDRAMP

The login button on the azure portal screen invokes oauth (and its openid connect profile) in order to land on the “old” management portal site:

image

image

We see that this process uses an authorization server, with authorization grant endpoint, at https://login.microsoftonline.com/common/oauth2/authorize?response_type=code+id_token&redirect_uri=https%3a%2f%2fmanage.windowsazure.com%2f&client_id=00000013-0000-0000-c000-000000000000&resource=https%3a%2f%2fmanagement.core.windows.net%2f&scope=user_impersonation+openid&nonce=172c6ab7-7231-4759-ad39-35d9650894c6&domain_hint=&site_id=500879&response_mode=query which takes an id_token parameter for the response type.
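Rebuilding that URL from its parts makes the hybrid response type easier to see. A sketch with placeholder values:

```python
from urllib.parse import urlencode

def authorize_url(client_id, redirect_uri, resource, nonce):
    """Authorize request in the shape the portal uses: asking for BOTH
    an authorization code and an id_token (the 'hybrid' response type)."""
    base = "https://login.microsoftonline.com/common/oauth2/authorize"
    return base + "?" + urlencode({
        "response_type": "code id_token",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "resource": resource,
        "scope": "user_impersonation openid",
        "nonce": nonce,  # replayed inside the id_token to bind it here
    })

url = authorize_url("placeholder-client", "https://example/",
                    "https://management.core.windows.net/", "nonce-1")
```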

image

The server leverages user discovery

image

which generates the ws-fedp request (with its mix of proprietary and standard query string parameters).

As per the office 365 login experience, the site with web resources uses microsoftonline AS (that invokes ws-fedp websso on our IDP).

 

We can contrast this site login experience to the api management “manager site” (having just logged into the azure management portal, note).

 

image

here we see that the site uses an authorization server at  https://login.windows.net/common/oauth2/authorize?client_id=aad2a9dc-2d78-4a34-a6ea-8b535b177cd9&response_mode=form_post&response_type=code+id_token&scope=openid+profile&state=OpenIdConnect.AuthenticationProperties%3dAQAAANCMnd8BFdERjHoAwE_Cl-sBAAAAAZIdGDtugkSxUpRQIN9TaQAAAAACAAAAAAAQZgAAAAEAACAAAAC6kFh2Ew2gnA_lt7qgl9cQgkatiwwZLr3h4jftLw8WPQAAAAAOgAAAAAIAACAAAADmfY1WdHg4ve5Q4UJR2kB9eBK9hsEwPmJqG7MLskx-bkAAAAAWdW7-gkC9APajNuDG5yD0RTf14rcJ2M1ajbPkrTs0quWG_heb0jKaGyvp6vD_c2QJvHC-BZ_owkNcS1NSQNgmQAAAALe55wqpF8jbWMpJBAmnUozok7XZ_1Ftm-eSPzEKROF_UgDTt4lViOKkWLuwRa7l4x_GzvuON3jmRCj8t4U92gg&nonce=635594561610091501.NzA5MTQxZmEtM2ZjNC00MThmLWE2YzItZDRiMjU3YWY0NzQwZmZmZjg4NzctNjQ2NS00MWFiLTlmOTMtY2MyYzZjMjZhOWQy

image

by means unknown, this server knows it has a session (associated with microsoftonline.com’s AS, presumably).

If we delete cookies in the browser, we see the microsoft.net AS that invokes microsoftonline.com’s ws-fedp responder (note well, this is not the AS used earlier), which invokes our IDP in an IDP proxying model.

image

image

This duly lands the user (who now visits the IDP) on the api management site:

image

Posted in Azure AD

VRSN, Azure logging, cybersecurity exec order, NSA data center

So the US did nothing today other than what it promised a while ago: start to build a next-generation comsec economy. This follows the API economy (that failed). The comsec economy is a cold war gamble – and likely to succeed.

image

http://www.nasdaq.com/symbol/vrsn/real-time

The chart of VeriSign’s stock price is one indicator of just how well the plan is working, since DNS authority is at the heart of metadata scanning in the new world of the comsec economy.

We can point to Azure machine learning tools, too – the raw infrastructure for processing huge numbers of (rather mundane) log file events. Of course, events that log activity are better known as metadata.

image

http://azure.microsoft.com/en-us/services/machine-learning/

The focus by Microsoft on this area is similar to the focus a few years ago on office 365 infrastructure – assuming that government agencies would eventually outsource their (legacy) mail and intranet servers (and voice switches, ideally). This time, the assumption is that each such agency will be outsourcing its log processing.

The executive order binding stooge companies (Bank of America, my bank, amongst them) to the federal initiative to collect log statistics, apply machine learning, and specifically predict cyber attacks aimed at American private firms… is now a reality. This has been in the works for years.

image

http://www.whitehouse.gov/the-press-office/2015/02/12/fact-sheet-executive-order-promoting-private-sector-cybersecurity-inform

A word of warning on that initiative, since it sounds almost identical to the post clinton exec order, after FBI were denied mandatory key escrow: the establishment of a MITRE-led initiative to create the metadata scanning program (instead).

In the audit arena, folks have been gearing up for this cybersecurity focus for a while, led by the revitalized FISMA process:

image

http://csrc.nist.gov/groups/SMA/fisma/Risk-Management-Framework/index.html

note how the changes of orientation towards cybersecurity-motivated logging are now downplayed, in light of increased public understanding of metadata scanning threats. It’s quite hard now to link up fisma (revised) with fedramp:

image

http://www.gsa.gov/portal/category/102375

one notes the outcome of the NSA vs DHS war, on who is to control metadata scanning!

image

http://www.dhs.gov/news/2010/11/18/dhs-highlights-two-cybersecurity-initiatives-enhance-coordination-state-and-local

of course, this is all dominated by the investment in Utah

image

https://nsa.gov1.info/utah-data-center/

All of which goes back years, to the decision to use subversion (of internet security programs) rather than mandatory key escrow.

image

https://www.schneier.com/crypto-gram/archives/1999/1015.html

Posted in Computers and Internet

using AAD and oauth2 for Azure API Management’s Developer console need for access tokens

In the last post, we noted how one configures the api manager in azure to communicate with the AAD authorization server for the purpose of enabling developers to logon to the api developer site using their federated credentials. Of course, behind the scenes one is simply configuring the site to talk to AAD using oauth2 – as augmented for multi-tenant access by Microsoft.

In this post, we configure another aspect of api management: oauth2 authorization servers, for the purpose of supporting authenticated API requests. That is, the api client and authorization server will communicate to get access tokens that are then attached to the request. The additional authorization header in the request is processed by the server as an access token, as normal.
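Attaching the token to the request is the trivial part of that description; a sketch (hypothetical URL and token):

```python
import urllib.request

def authed_request(url, access_token):
    """Build an API call carrying the access token obtained from the
    authorization server, as a standard Bearer credential."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", "Bearer " + access_token)
    return req  # urllib.request.urlopen(req) would then send it
```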

image

image

 

we guess at the right parameters and then move on to configure the AAD application so that we obtain the missing clientid/secretkey values.

image

image

 

moving to AAD application configuration…

image

image

https://rapmlsqa.portal.azure-api.net/docs/services/54dd05ad73c60f1110d802b1/console/oauth2/authorizationcode/callback

image

we enabled all app and delegation permissions (not knowing any better)

clientid: 0bc904ae-3f2c-4ec7-8b71-40f7207112f0

key: p4BAp2OLoQf4PR7AAMIR6TY+j7/033DgrVhb4A80XFc=

image

 

giving

image

Now we arm the api endpoints to demand the access token and interact with an authorization server, such as that configured above. Using the API screen, in the management console:

image

image

To test the setup, we follow the instructions and use the developer portal, to invoke a client of the my echo API:

image

selecting authorization code induces display of the websso experience at the authorization server (which is good!)

image

image

 

image

once we fix a bug or two, we can continue this to completion.

image

note the audience fields (doesn’t feel right!)

note how the wtrealm field (provided by azure’s initiator) becomes confused in the IDP:

image

image

Posted in Azure AD

Azure AD and API enforcement after an openid connect (oauth2) handshake

Azure offers a simple webapi proxying service that consumes authenticated requests and relays responses to clients.

We created the “rapmlsqa” portal for api management via the Azure cloud, where we see a default api – provided as an element of the ‘starter product’.

image

image

We learned to invoke this api using the developer console. First, though, we had to obtain the subscription key for the “starter product” from the management console’s “users” panel.

image

image

image

we see that the invocation requires a subscription key, even though no (proxy) security requirement is set.

The first use we make of AAD – and its oauth2/openid-connect handshake is for the purpose of user login to the management/developer portal site:

image

http://azure.microsoft.com/en-us/documentation/articles/api-management-howto-aad/

image

https://rapmlsqa.portal.azure-api.net/signin-aad

image

image

image

image

image

RpJnlhSWfRkFCg9h04rmk5N3INmCpNzmua/Uh7UWQjg= (key)

aad2a9dc-2d78-4a34-a6ea-8b535b177cd9 (clientid)

image

This gives us a signin experience setup for local account and AAD (rapmlsqa federated) login:

image

image

https://rapmlsqa.portal.azure-api.net/signin

After the usual azure cloud login experience, we  land on the site which has auto-created a (federated-local) account

image

image

back on the management site, we can exploit our admin privileges to modify the subscriptions to which the new user has access

image

image

we see that this new (federated user) – with its distinct subscription ids now – can invoke the API:

image

Of course, this is not an oauth2 handshake (to access the API). In realty terms, we have a portal that enables us to assign vendorids (to MLS federated users) and grant said users subscription power to certain APIs (e.g. RETS).

In the next effort, we will apply oauth2 handshakes between API client and API server/proxy.

Posted in azure, Azure AD

Iiw and crypto subversion

Is a reference to a document whose content describes how lots of americans (mostly) were cooperating to define a path for identity management using, so-called, community consensus processes.

Under the assumption that the “would if we could” sector of America representing natsec interests will have subverted such processes to suborn them (probably covertly), we might ask: who in the iiw process is (secretly) an nsa/fbi/dod/cia/etc subversive?

Not all standards related groups need subversion. Iesg has no problem being infiltrated by parties whose goals are at odds with stated positions. Ansi security committees are well known for being openly stuffed with stooges, who openly flaunt their loyalty to US natsec interests.

Posted in coding theory

Ps and Qs (and an unexceptional understanding of KEA)

http://blog.cryptographyengineering.com/2015/01/hopefully-last-post-ill-ever-write-on.html is interesting for its discussion of Ps and Qs.

Long ago, I got a briefing (from NSA folk) on the security architecture and crypto algorithms used in the security mechanisms for the type II version of MSP – the layer 7 forerunner of S/MIME (in DoD). One might recall how DoD certs, in the fortezza era, contained signing and KEA keys. KEA has a DH-like computation, and cooperates with a particular block formation process to compute a master session key. It’s really quite cute, and shows that NSA has a lot of class. At the time, I was quite in awe of NSA’s ability to subtly tune up algorithms, mechanisms and services (and still am, truth be told).

Now, in the KEA process based in discrete log problems (vs curve based problems), the so-called random numbers from the certs’ keying blocks were augmented by values supplied by the protocol… from which master keys were derived (after certain blocking processes were used, and signing processes verified, note well). The keys from the UKM list (a list of secondary “randoms” that “changed” the master key for the crypto periods desired) were the source of the protocol’s contribution of (partial) keying material, generating the master secret.

When used in interactive protocols (such as TLS these days), both parties would contribute the KEA key and a nominated value from their UKM. In the case of email protocols, only the sender would contribute such material, with values from the device’s RNG satisfying the formal requirements.
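A toy numerical sketch of that shape – a static DH agreement whose master secret is varied by a per-crypto-period UKM value. Illustrative only: tiny parameters, and SHA-256 standing in for the blocking process; this is not KEA itself.

```python
import hashlib

# Toy group: a 64-bit prime (2**64 - 59) and a small base. Real KEA
# used proper discrete-log parameters; these are for illustration only.
P, G = 2**64 - 59, 5

def master_secret(my_priv, peer_pub, ukm):
    """Static DH shared value, then mixed with the UKM so the master
    key 'changes' per crypto period without issuing new certs."""
    shared = pow(peer_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(8, "big") + ukm).hexdigest()

a_priv, b_priv = 0x1234, 0x5678
a_pub, b_pub = pow(G, a_priv, P), pow(G, b_priv, P)
# Both parties derive the same master secret for a nominated UKM value;
# a different UKM yields a different master key from the same certs.
k1 = master_secret(a_priv, b_pub, b"ukm-period-1")
k2 = master_secret(b_priv, a_pub, b"ukm-period-1")
```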

It was made quite clear that the KEA values in the certs were public keys, which could be treated as random numbers at one level (and public keys at another).

While the security mechanisms were identical, the security services that result from these different procedures for leveraging keying material were different. And, there was nothing held back about why such differences were necessary – based on pretty obvious (but also subtle) requirements analysis. This was “security engineering” – that I learned from the agency (and loved to learn, from folks evidently well skilled). That the fortezza card could be applied to the different security services was, furthermore, absolutely intended – since folks wanted different communication security semantics, to suit the different operational theatres the devices were intended to serve.

I kind of miss professional NSA security engineering, given the drivel I read from the academic sector.

Posted in crypto

HSM, EAL4 and worthless assurances from Microsoft Azure on crypto

image

http://azure.microsoft.com/blog/2015/01/08/azure-is-now-bigger-faster-more-open-and-more-secure/

Trust is hard to obtain and maintain – especially in a world now well tuned into the cynicism that expects large corporations to be working with governments in the execution of systemic, trust-based deception plans. If you must, call it a national identity program, or national cybersecurity plan. Anything (so long as not PKI – joke).

If I put 15 more deadbolts on my front door, I am truly “more protected”. Yes, no one lied. But it’s still worthless (despite being “better” than merely 1 deadbolt).

Why? Because one smashes the weak door jamb down (not the door) when one “disables” all 15 of the $20 deadbolts with the $10 sledgehammer wielded by an ape with an exceptionalist attitude. That is: the deadbolts make no difference to security. They just sell worthless American assurances.

A FIPS level-2 HSM is a nice to have; but about as useful to crypto-security and confidentiality as the windows security manager in your desktop when faced with an FBI sledgehammer. Level 2 means it relies on a password scheme (and anyone knows that a “little bit of legal duress” can induce you to hand over your password).

It’s true that having HSMs protect certs and signing keys in the cloud is nicer than on-premise HSMs. And one sees how it supports encryption of tuples by a hosted SQL server. But, as for “security and safety”, the HSM feature is as worthless as any other American-sourced crypto/security product – since the vendor (Microsoft) is under the thumb of the crypto regulator.

Sorry Microsoft. Right idea, wrong marketing. You must admit that – like a front door  and its deadbolt – the security is illusory.

Your own trustworthiness depends on saying what some don’t want you to say – including the above clear statement of limits. Each and every attempt to use marketing terms and clever parsing to hide the reality leads one to put microsoft in the american bucket (it’s just another deception program by a few exceptionals seeking to mislead the rest of humanity – the non-exceptionals – on crypto).

Posted in crypto

windows technical preview tablet mode: missing toolbars

I like it – in general.

Well compared to the mode switch that came with windows 8!

also interesting to see bugs/issues, wrt older win32 apps – even ones “properly engineered (back then)” by Microsoft itself to work properly with future OS changes.

image

Posted in dunno

steve crocker, darpa, x500 and DNS and trust

Steve Crocker was head of IETF internet security activities at the time when I was involved in standards work. He was also a founder and thought leader of the internet movement itself – that civilian form of the military internet he helped foment. As a DARPA program manager, he fit the bill perfectly: strong willed, academic, and able to think 25 years ahead.

We have learned what DARPA was aiming for (25 years ago), with the internet; and, in particular, what it was aiming for in how the name servers (DNS) were to evolve, so that the civilian world (of 25 years hence) would aid and enable the military posture of the USA (and nominally its allies). Steve was one who despised the ISO/CCITT form of name serving (the X500 system), wanting “anything US and internet” to win (at all costs) in the name serving arena. Stuff from international standards bodies was essentially evil, per se (not being American).

TO BE FAIR, in the technical arena of name serving I think he really believed in the wisdom of using packet technologies to scale services (like DNS) – and it’s 100% true that the DNS we have today is very much an internet-era, packet-switched technology that leverages the internet architecture to deliver its feature set. What we have learned since, however, is that the benefit of this gunboat approach to design has a sharp kickback when fired – especially when the gunboat is all about American exceptionalism. The gunboat is really a spying ship.

I’m prompted to write this missive having just overheard a starbucks barista castigate a customer wanting to pay $1.50 for a paper newspaper, rather than use the “free” internet to obtain news. This is the world of 25 years later; the world that Steve wanted to foment and indeed did foment – acting as that special class of exceptional: the DARPA program manager.

DARPA indoctrinated its folks in deception – the use of intelligence and foresight to aim “to win” at all costs by leveraging deep, programmatic deception to win hearts and minds, through use of technology seeding to spread american ideas themselves. We see now that Steve, as with many of the DARPA paid folks of the era, did their job well – inducing civilian infrastructure to be co-joined with american military infrastructure. Knowing that open communications for the masses would interfere with military spying, it was critical to ensure that internet (vs milnet) technologies in name serving would aid and abet a military advantage to the USA. Since international standards provided no such advantage, they were to be skewered, denied, and undermined – particularly by spying-related agencies such as NSA.

It’s not my place to discuss how (what was known as) milnet and DNS work – to protect name serving via packet technologies in a way that the DNS service is not protected in the internet form of packet switching. After all, if there is a real war – I will be ON NSA’s side, doing whatever it takes to exploit an intelligence advantage. As a civilian today, however, with a technical political opinion, I can properly ask questions, and point out the issues.

It comes down to trust. And America is not doing a good job of being seen to be trustworthy. The likes of DARPA are now perceived to be the antithesis of trust – being seen to have existed for the very purpose of executing technology-based deception plans, at a systemic level. The internet, and the thesis of open technology dependency, is the evidence.

Posted in dunno, rant

someone attacks my phone and pc, overnight

And, they are good.

While attacking a US sourced phone is not hard, since all the firmware and motherboard vendors work with NSA/CIA to enable remote disablement of the core chips (let alone the OS running on the CPU), disabling the PC was quite a feat. It required remote start from the wifi.

This was a military capability, leveraging preparedness to attack civilian infrastructure. This was beyond the FBI’s (own) capabilities.

Wonder why!

Using such against me is really not on. First off, I’m not a target on which it’s worth wasting such knowhow. This leaves only the desire to collect dirt, plant something, etc – which is a CIA function (not NSA).

Posted in dunno

1940s crypto secrets

Things that were considered worthy of being classified (as 1930s-era secrets in the field of cryptanalysis) included what we would now call iterative methods (in algorithm design).

One example of an iterative method is the highly crypto-centric method of banburismus – the updating of Bayesian estimates concerning the likelihood of potential solutions in a search set, to help direct search directions. One thinks also of counter-intelligence examples in use in the 1940s era, when planes were sent to search, intended to be observed by the enemy as they conducted searches using known search methods; this hid more advanced search processes used only in well-protected crypto facilities.
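The Bayesian-update core of that idea fits in a few lines (illustrative weights, not Banburismus’s actual deciban bookkeeping):

```python
def bayes_update(priors, likelihoods):
    """One evidence step: reweight each candidate solution by how well
    it explains the observation, renormalise, and let the best scores
    direct where the search looks next."""
    weighted = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(weighted)
    return [w / total for w in weighted]

# Three candidate settings, flat prior; the evidence favours the second.
posterior = bayes_update([1/3, 1/3, 1/3], [0.1, 0.6, 0.3])
```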

Iterative methods used for calculating eigenvalues/vectors from approximations, for quantum mechanics related matrices/operators, were also “guarded secrets”. The secrets concerned the raw “methods” themselves along with the “means” used to complete the processes in a cost-effective time. One thinks of processes based on Rayleigh quotients and minimization to find the minimum eigenvalue (and the second least eigenvalue) – given the special role this distance has in certain groups when allowing convergence to only occur after almost the maximum number of iterations.

The special space-relations that exist between solution-spaces and search spaces were known to characterize complexity and difficulty metrics – allowing development of key-wrapping ciphers that would measure the security of a key in terms of how long it WOULD take to reverse the mathematics (assuming the ciphertext data had been compromised by capture). This obviously gives a maximum operational window for use of the keying material, in a worst case analysis world.

Posted in crypto

Rayleigh, turing, Nash. Cryptanalysis heart.

I was watching a lecture on the power method. Formally, it delivers approximate solutions for the eigenvalues of expander graphs – those found in the cryptanalysis solutions for wiring plans (think des, think enigma) of keyed rotor machines.

Whether it’s nash math, for constraint satisfaction in economics or crypto, or solution approximation based on rational numbers and Rayleigh quotients, it’s just fascinating to see how the core “super secret” world of cryptanalytic technique has been sitting open for a long time. It’s just that our eyes were *forced* closed, so we could not see.
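The power method itself is a handful of lines; a sketch on a small symmetric matrix (plain lists, no libraries), with the Rayleigh quotient supplying the eigenvalue estimate:

```python
def power_iteration(A, steps=100):
    """Dominant eigenpair by repeated multiply-and-renormalise."""
    n = len(A)
    v = [1.0] * n
    for _ in range(steps):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh quotient (v . Av) / (v . v) estimates the eigenvalue.
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] * Av[i] for i in range(n)) / sum(x * x for x in v)
    return lam, v

lam, vec = power_iteration([[2.0, 1.0], [1.0, 2.0]])
# eigenvalues of [[2,1],[1,2]] are 3 and 1; the iteration converges on 3
```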

Fascinating also to consider how Turing – the cryptanalyst turned cipher designer – knew how to defend against guessing attacks, assuming complexity and capabilities of attack machinery based in the industrial capacity of an economy, etc.

Posted in coding theory

“testing” for trojaned American silicon

Managed to obtain an older magicgate chip.

Let’s see how exceptional and unique folks thought they were when seeding the civilian consumer space with military purposed spyware (breaking the rules of war). No, you can’t just invent a pseudo-military (natsec).

Would have no real problem if folks want to “pay a visit” to seize this. Entirely understandable.

Posted in assange

pedego leds interpretation

If you, like me, own a pedego (classic cruiser) electric bicycle, you might ask: so how do I interpret the traffic light leds on the bike’s throttle control?

So the answer is: I don’t know. But, I do have an intuitive mental model (based on observation and use analysis). This aligns with an article I read, in which the anonymous author opined that: 1) while three lights show, you have a nominal 100% performance still available; 2) if two lights show, you have 30%; and 3) if only one light shows, you have 10% (and disaster is imminent).

That is, they act like traffic lights – a set of signals about pending issues.

When I ride my bike, there is a hill at the end of the trail (heading for home). You do NOT want to be pushing a 50 lb bike up that hill, if you have a choice. So, you want to be husbanding energy so that – somehow or other – the bike gets you home.

I learned that SHOULD I demand less speed from the battery pack, I can keep the bike on all three traffic leds – once the faster speed I WAS travelling at (for the hill in question) produces a 2-led warning.

So, the number of leds showing on a Pedego means: slow down – so that the power delivery available from the pack can meet the demand you place upon it. You always want to travel in the 3-leds situation, so you do not deplete the power at a rate that induces failure conditions in the equations. If you are travelling on 3 leds and it turns to 2 leds, slow down… until three leds show once again.

If two leds show – and 1 led shows when you travel faster than warned – worry. The battery pack is about to die on you, until you do a recharge.

So how long will a battery pack last while showing 3 leds, while being susceptible to showing 2 leds (once power delivery cannot support the higher level of demand)? Dunno. But it has not failed on me yet (for the final hill up to my house). When I stressed things at 2 lights, I ended up pushing 50 lbs up a miserable hill.
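The whole mental model fits in a few lines. A hedged sketch (the function name and the advice strings are mine; the percentages are the anonymous article’s):

```php
<?php
// Informal mental model of the throttle-control leds, per the post above:
// led count -> rough remaining performance headroom and the action to take.
// This is interpretation, not anything from a Pedego manual.
function pedegoLedAdvice(int $leds): array
{
    switch ($leds) {
        case 3:
            return ['headroom' => 100, 'advice' => 'hold current speed'];
        case 2:
            return ['headroom' => 30, 'advice' => 'slow down until 3 leds return'];
        case 1:
            return ['headroom' => 10, 'advice' => 'slow right down; recharge soon'];
        default:
            return ['headroom' => 0, 'advice' => 'pack exhausted; start pushing'];
    }
}
```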

Posted in pedego

sony, drm, NSA hacking, me.

Sony, the US media company, is back in the news. I can tell some stories here – though I’ve changed some names and storylines to protect folks.

Back when I was at VeriSign, I got to play chief architect – at least for a while. During that period, I worked with Microsoft’s Windows group, and the newly formed Authenticode team. WinTrust was the result – the means of validating signatures on code/files, including drivers or streams of data subject to policies enforced by a (note a, not the) TCB. Recall that the NT architecture is a micro-kernel design, and there are multiple subsystems, each with their own TCB.

First clue given above. And no, it’s not “Microsoft”.

Sony was known at the time, in Microsoft circles, for being a right royal pain in the ass when it came to being a corporate stooge for the NSA (the NSA of back then, not now). The US side of Sony was entirely indoctrinated and penetrated – and saw it as its mission to use media distribution as a way of doing the job formerly done by the CIA. Quite properly, they wanted to make a civilian business out of former government “business lines” – and DRM and “trojaned silicon” were to be their edge. One of the first converts was to be Windows (and its DRM’ed DVD player); and Sony wanted the means to remotely project policy control over a PC, ostensibly to enforce media rights. Of course, the whole program was a foil for the CIA, in the sense that it was the CIA’s job to support the NSA of the day when performing the implanting process.

Perhaps folks recall Sony’s MagicGate chip line. One might want to get hold of some and, NOWADAYS, go analyze the silicon gates. Go look for the trojans built into the silicon itself, in the certain batches intended “for export”.

Anyway, Sony wanted Microsoft to alter Authenticode – at the architecture level – purportedly for reasons of ensuring that hardware (HP Bristol/GCHQ’s TPM) would control the playback drivers. The goal was to ensure that only certain (signed/certified) drivers could drive certain layers of the screen (which facilitated a covert channel, susceptible to the kind of antenna mounted in the typical US embassy).

So what was my job? To produce a consultancy report that independently advised an alternative course. No, it was NOT to subvert the program noted above. It was, however, to do what the program claimed to do (rather than have the program used as systemic dupeware… to do something, ahem, else).

One of the problems the current natsec folks will be having with the current issue is that it is very likely to open up to technical scrutiny what USED to be happening, as national policy, in the memory-chip area. And that will open up questions about how much better similar processes – at the silicon level, recall – might be today.

Peter.

Posted in coding theory

joomla 3 with mod adfs plugin, debugging in visual studio 2013

It took three days to build a debuggable PHP/Joomla site in Visual Studio, having created the site in Azure Websites, synced it to WebMatrix, and then launched Visual Studio (with the PHP plugin).

Various fiddles were needed to make the synced website – which typically runs in IIS Express – fire up in the debugger’s own web server, on the same port as IIS Express, with the debug monitor enabled.

image

Since we are now targeting version 3.3 of Joomla (since that is what comes with Azure), we have to make some changes to our older code. First off, to make the login button appear (somewhere), we had to change how the module helper class is loaded:

image
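The code behind the screenshot is not reproduced here; as a hedged sketch, in Joomla 3.x a module’s helper class is typically registered with the autoloader rather than pulled in with a bare require_once (the module name mod_adfs and class name ModAdfsHelper are illustrative, not necessarily our actual names):

```php
<?php
// Joomla 3.x style: register the module's helper class with JLoader so it
// is resolved on first use, instead of require_once'ing helper.php directly.
// mod_adfs / ModAdfsHelper are illustrative names for this sketch.
defined('_JEXEC') or die;

JLoader::register('ModAdfsHelper', __DIR__ . '/helper.php');

// From here, the first static call on ModAdfsHelper triggers the load.
```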

More generally, we configured the modules and plugins, as shown next:

image

authentication

 

image

user

 

image

system

 

image

image

module

Second, the getToken method in the JUtility class has gone – as we learn from:

image

http://stackoverflow.com/questions/18416247/joomla-3-0-call-to-undefined-method-jutilitygettoken

We fix this, in a version-specific manner, as shown:

image

(note how we were unable even to include the older version’s class reference… except in a comment hint… as linking errors otherwise prevent loading)
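Per the Stack Overflow thread linked above, the replacement in Joomla 3.x is JSession::getFormToken. A simplified sketch of the version-specific fix (as noted, our actual code can only carry the old class reference as a comment hint, so the 2.5 branch below is illustrative):

```php
<?php
// Version-specific CSRF token retrieval: JUtility::getToken() was removed
// in Joomla 3.x; JSession::getFormToken() is its replacement.
$version = new JVersion();
if (version_compare($version->RELEASE, '3.0', '>=')) {
    $token = JSession::getFormToken();
} else {
    // Joomla 2.5 path (in our real plugin this reference survives only as
    // a comment hint, since linking errors otherwise prevent loading).
    $token = JUtility::getToken();
}
```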

To run all this, we emulate a live Joomla site with known-good test parameters. Fiddler is configured to redirect the live site’s URIs to the localhost process under debugger control, thus:

image

This assumes that our localhost Joomla, with our SSO plugin, has been configured with a copy of the IDP metadata (see http://1drv.ms/1DQxhUF) for the DEMO/9 IDP, and expects to title itself as the SP whose audience is mgs.rapams.com.

image

image

https://ssoportallax.rapmls.com/spinitiatedssohandler.aspx/DEMO/9

 

On Windows Server, we have to add each domain to the trusted-sites list:

image

Now that we have a basic setup, we refine the testing parameters to use the so-called federation mode flow (with DEMO/9 being the FP, and DEMO/8 being the IDP)

image

This allows us to get to a complete flow, for the first time:

image

To get to auto-registration of a user, there are a couple more porting bugs to fix

image

We need to adopt the latest status return-value conventions, for the positive

$version = new JVersion();
if ($version->RELEASE == '3.3') {
    $response->status = 1;
} else {
    $response->status = JAUTHENTICATE_STATUS_SUCCESS;
}

and negative return paths

$version = new JVersion();
if ($version->RELEASE == '3.3') {
    // Joomla 3.3: leave the response status untouched here
} else {
    $response->status = JAUTHENTICATE_STATUS_FAILURE;
}

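The two version checks could be consolidated into one helper. A hypothetical sketch (the function name is invented; JAuthentication::STATUS_SUCCESS has value 1 in Joomla 3.x, matching the literal in our positive path – and note this sketch sets the failure status explicitly on 3.3, where our screenshot code leaves the default in place):

```php
<?php
// Hypothetical consolidation of the version-specific status handling.
// JAuthentication::STATUS_SUCCESS / STATUS_FAILURE are the Joomla 3.x
// constants; the JAUTHENTICATE_* constants are the Joomla 2.5 equivalents.
function adfs_set_status($response, $succeeded)
{
    $version = new JVersion();
    if (version_compare($version->RELEASE, '3.0', '>=')) {
        $response->status = $succeeded
            ? JAuthentication::STATUS_SUCCESS
            : JAuthentication::STATUS_FAILURE;
    } else {
        $response->status = $succeeded
            ? JAUTHENTICATE_STATUS_SUCCESS
            : JAUTHENTICATE_STATUS_FAILURE;
    }
}
```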
 

Since the assertion apparently carries a null-string email attribute, this upsets some of the Joomla 3 login logic (in a way that presumably did not upset the 2.5 logic). So we amend the condition:

//////////// EMAIL
if (isset($userDetails->attributes["email"]) &&
    $userDetails->attributes["email"][0] !== "") {
    $response->email = $userDetails->attributes["email"][0];
}

Once we manually gave the rapstaff user a Registered/Super User group assignment, the login screen showed what one would expect:

image

To make the debugging process easier (upon receipt of the assertion), we amended the inbound message handler thus:

image

Posted in fiddler

my ISIS connection to George Washington

image

Thanks to Google and its general corporate stance of facilitating spying (by the exceptionals) on me, I’m sure I’m now being spied on – because of the “association” with ISIS.

Ridiculous. But entirely good enough an association to go and harangue me over.

George Washington would be proud (and no, that’s not sarcasm). He WOULD be (since I’m not one of the elite class of human beings). He set the tone, as a human slaver, and it’s still with us: say one thing, sell it to the immigrants, and work them to death so long as they are good consumers making the landowning elite “super rich”.

Some poor US contractor now has to go make a note in my “intelligence dossier” on the risk of subversion, given this memo. Poor thing.

Posted in dunno

org id and webmatrix

image

windows server 2012, azure/MSDN image

This means that the desktop’s proxy is confused (probably because Fiddler was running when the PC crashed).

Posted in Azure AD

making webapp talk to rets webapi

image

Image | Posted on by

google, moffett, me: Google and the USA’s historical spying triumvirate

image

http://www.cnet.com/news/google-to-lease-historic-bay-area-airfield-from-nasa/

The article prompts some nostalgia – since I worked at the site in question. A fixture of Silicon Valley for sure, the hangar is worth a visit – particularly when one understands what it was used for after the zeppelins long since made their departure. What the rest of the NASA side of the Navy site was doing is also relevant to internet history – particularly in the area of spying, and Silicon Valley’s long love affair with spin: saying its crypto works while doing all in its power to pay back its sponsor, for lots of lucrative procurements, in the manner desired – ENSURE CRYPTO DOESN’T WORK, when called upon.

So Google engineers will be able to trace the same halls I walked, and visit the many rooms I could not – these being classified areas, with super-secret meeting rooms, specially clad, etc. So, just what was so secret (apart from the culture of secrecy, which once called for classifying the unix ascii man page!)?

One has to understand that the site in question once hosted a joint CIA/NASA mission to exploit space for the spying role that the NSA did not do itself (back then). No, I’m not talking about the obvious (spy satellites with cameras and radars, beaming “telemetric” data back to NSA receivers at the far end of the site). Even back then, it was critical to judge whether foreign counter-surveillance penetration had succeeded in gaining entrance into the “then-critical infrastructure” (the systemic foreign-surveillance infrastructure that guided emergency response protocols). I’m talking about the process of IMPLANTING or ACTIVATING the bug that takes/took a router or switch and turns/turned it into a spying outpost (for the NSA).

Remember, it was and still is NASA’s and the CIA’s job to enable the NSA (not to be the NSA). It’s NASA’s job to use space-based or space-related terrestrial communication platforms to enable the CIA (to deliver the implanting or activation, for the NSA). It’s the CIA’s job to work with the vendors and the supply chain to provide for implantation – with covers and techniques to suit the risk of being discovered or (worse) being fed counter-intelligence material.

It’s kind of cute that it is specifically Google – with its all-American attitudes concerning unique exceptionalism – that will take charge of one of the founding sites of both pre-internet (aka early nuclear-test monitoring) and internet-era (i.e. general spying) secret surveillance – with its massive infrastructure outlay and an entire culture of layered people management (to provide suitable cover stories). One assumes that a few ultra-nationalist and ultra-indoctrinated Googlers with advanced CIA covers will have a few “Special Assignments” to perform, too – without the CEO’s knowledge (or care, typically, once a “Government Contract” or two is placed). After all, the secret protocols to the leasing contract will not be publicly known, and will help induce that most American property of exceptionalism of all: plausible deniability.

Posted in spying

Nevada motorcycle licensing madness (in one case)

Assuming I converted my newly-minted and presumably-valid motorcycle safety-training credential into an “endorsement” on my driver’s license (an American/Nevada affair, these days), I am 007 – officially licensed to kill (myself).

That was the summary and debriefing from the rather excellent (but second) group of trainers, as they gave me the passing grade. In summary: err… don’t actually ride on the road… (even though you passed). They both sighed with relief when I revealed that the ride I had just taken (to pass the exam) was probably the last ride I’ll ever take – unless perhaps I go on holiday to some exotic island and want a little fun ride.

Now comes some material that I’m tempted not to write or disclose, since its publication – like an advert for scotch – will surely increase the killing rate. Since I’m talking to and about those who think they are uniquely exceptional, based on their various exceptional national processes (which include full disclosure of facts, along with endless prattling and mouthiness), as a non-exceptional I can hardly be blamed, however, for inappropriateness, should I emulate my betters by opining…

…that on this and the previous course, everyone was a liar (with a small number of exceptions). Shush! They had all ridden countless hours before (illegally, for the most part). Perhaps, much as illegal immigrants to the USA might be denied public hospital care given their status, folks who evidently can ride might now be denied access to a novice course by the instructors, upon initial evaluation (aiming to be fair to all – remembering that this is American fairness: somewhat elitist and inequitable, by European standards). A course for actual novices – defined as folk who have NEVER RIDDEN a motorcycle – it was not. A course to get a DMV license, having met minimums, it was. Fortunately, the folks on the second course had a level of teaching skill that does NOT place them in that class of teachers, typically found in private schools, who are expert at teaching “examination technique” along with the curriculum.

My first teacher was clearly a superior classroom teacher, and was good at getting folks through both the written and skills exams. With bad luck, I’d have passed the skills test under him (despite being entirely unprepared for riding a parking lot safely, let alone a road with other vehicles). Only a total screw-up on one test, under his techniques, caused me to have too many demerits (to become licensed). A split-second decision on my part let me FAIL! It could easily have gone the other way (and I would have been licensed by Nevada, while incompetent).

It was ego that induced me to try again at a second run of the course (since obviously I could pass, should luck flow the other way). So I signed up again, and went through 40 more hours of American-style drill training – which mostly focused on drill-based learning of skills (rather than examination technique). And so I passed, with a wide margin (though with some personal caveats, as mentioned in the introduction). Played the second time, the course was what I expected: thorough, professional, skilled at skill-teaching, and one that used formats and (rider-coach) teamwork to real advantage!

So here is the part you may NOT want to read.

There are examination things you MUST NOT DO, to avoid large numbers of demerits. Put another way, here is EXAMINATION TECHNIQUE that minimizes getting demerits:

1. The test for swerving always has you swerve to the right. So AIM for the right cone, and don’t worry too much if you swerve a foot too early. The demerits for this are FAR BETTER to incur than hitting the (virtual) obstacle – particularly when super-nervous in test mode. Thanks to the policeman, on the same test, for the advice (which I ignored the first time around). As I say, America really is FULL OF EXCEPTIONALS (playing the rules, for killing with guns or motorbiking alike).

2. The test for the emergency stop has some nasty demerits for something almost irrelevant to the apparent testing criterion (stopping before hitting the kid). You get 5 demerits for failing to stop in first gear (the kid is alive… but who cares – first gear is more important…). So, no matter what you do, sacrifice stopping distance for getting into first gear.

Yes, I know, I know. Folks who had “properly internalized” the skills would be doing each of the above (without needing to cheat). But that’s not my point. Learning examination technique COULD be putting you more at risk, as you show off your nice new shiny motorcycle license (which is a formal license to kill yourself, in more than one case, from what I observed).

Ok, I’m whining about American processes, and not giving much positive. As always, it’s the usual caveat, which applies to lots and lots of Americanisms: what I see is the best of an inherently bad lot. Which is actually praise (from an Englishman). You are the best (and that includes lying, cheating, over-tutoring, and making licensing money from licenses to kill). It’s also the best in a more positive vein: the American drill process was excellent (assuming you are used to it, which I’m not).

so what would I improve?

Well, the instructors I had were not internet-savvy. They failed to use many props, and failed to use videos or simulators – to convert words into skill. It took me 40 hours to figure out that “press” (in American) means “push” (in English). A simple prop, with a bicycle, would never have allowed that! In my first run through the course, I kept trying to press (down) – rather than give a quick pushing shove on the handle, which tips the bike over a bit to help one lean. Sigh!

Washington State’s evidently similar program had an excellent DMV-based video series that trained some – perhaps, like me, academically enabled and thus NOT particularly well adapted any longer to drill training – to see HOW one measures skill (and why the testing is designed as it is, given lots of testing tradeoffs). And their videos did it in a manner that was not about “playing the examination game,” either. It was simply an academic presentation of the skill levels being expected, talking to an audience in a way that would resonate.

In Nevada schools for biking, however, biker culture reigns (perhaps because of the excellent biking geography). Thus, it is folklore (as enacted through word/phrase correctness and drill-and-repetition-style training) that dominates, since that’s the operative culture. If your brain, as with puberty, was irreversibly altered by getting to the higher academic levels, you, like me, might not respond well to drill-based learning (in motorcycling or any other area).

Should an academic, from a good private school that trained me up to respect examination technique, be saying this to folks at risk?

You decide! I’m VERY happy on my powered bicycle, maxed out at 20 mph, doing the same manoeuvres as a real motorbike (using a machine that is REALLY easy to LEARN TO handle).

All in all, in honor of my excellent teachers today, I may now wave a little at the bikers – as a fellow road user at horrendous risk, due to the nature of the “shared road”. Wonder if they will wave back or just scowl at the impudence!?

Posted in coding theory