Despite hours of effort measured in the hundreds, I have not really managed to get to the source of NAR/RESO's OAuth strategy. I'm left interpreting the technical documents and comparing them with similar documents produced by other profiling groups. In summary, I'm confused. From what I can tell (IMHO, LOL and other teenage terms now used by managers in their fifties and sixties), it seems to start out "out of date". Quite why it is so backward in its thinking seems to come down to the commercial motives of those doing the profiling: keep real estate IT backward, and a nice little earner for a select group. Oh, and ensure it pretends to satisfy anti-trust rules while in reality blocking the entrance of cloud vendors and their millions of "tenants" into the IT market.
Hmm, the humor element is missing. So let’s add some, English style.
Not being a member of the exceptional class of human being (not seen since the end of the Nazi era of "the exceptional race"), I looked at Microsoft Azure's Mobile Services for architectural guidance. How might OAuth be applied – TODAY – if one were to adopt cloud-centric thinking? For the purposes of learning by thinking through issues from all perspectives – and not only one's own – one can now compare the different generations of OAuth systems and their profiles. One can start the analysis by looking at the pre-cloud OAuth architectures of five years ago – which is where RESO seems to be starting out.
Azure Mobile is a post-RESO model – of cloud thinking. It focuses on webapi-enabled mobile apps on several devices held by a community of users, supported by OAuth security technology that is in turn supported by classical websso (using SAML and WS-Fed). It comes with both a server-side model (where tokens are delivered from the OAuth AS directly to webapi endpoints in return for session tokens) and a client-side model (in which trusted apps maintain caches of tokens, which they choose to renew and/or supply to one or more webapis).
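The client-side model above can be sketched roughly as a token cache held by a trusted app, which renews tokens as they near expiry and attaches them to webapi calls. This is a minimal illustration only – the class and function names are hypothetical, and the refresh callback stands in for a real round trip to the OAuth AS; it is not Azure Mobile's actual API.

```python
import time

class TokenCache:
    """Hypothetical sketch: a trusted app's cache of OAuth access tokens,
    one per webapi audience, renewed shortly before they expire."""

    def __init__(self, refresh_fn, skew_seconds=30):
        self._refresh_fn = refresh_fn   # stands in for a call to the OAuth AS
        self._skew = skew_seconds       # renew a little before real expiry
        self._tokens = {}               # audience -> (access_token, expires_at)

    def token_for(self, audience):
        entry = self._tokens.get(audience)
        if entry is None or time.time() >= entry[1] - self._skew:
            access_token, lifetime = self._refresh_fn(audience)
            entry = (access_token, time.time() + lifetime)
            self._tokens[audience] = entry
        return entry[0]

    def auth_header(self, audience):
        # What the app would attach to a request to that webapi.
        return {"Authorization": "Bearer " + self.token_for(audience)}

# Stub refresh function standing in for the OAuth AS refresh exchange.
def fake_refresh(audience):
    return ("token-for-" + audience, 3600)  # (access token, lifetime in seconds)

cache = TokenCache(fake_refresh)
print(cache.auth_header("listings-api"))
```

The point of the sketch is the division of labour: in the client-side model the app, not the server, decides when to renew and which webapi gets which token – exactly the responsibility the server-side model keeps away from the client.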
If we characterize RESO's design in terms of Azure Mobile, RESO is a clumsy server-side scheme, not seen since the early days of OAuth, when folks invested perhaps at most a couple of days of development effort as they bootstrapped their ideas, linking up some existing login page to the early webapis. But why would NSA be interested in this class of effort?
It's just preposterous to think of NSA vs. RESO in terms of NSA vs. the Web (and the perhaps-humorous model of plantations of enslaved web users, with NSA as the model "master" keeping the eight billion non-exceptional slaves, like me, in line with a systemic culture of threat and surveillance, all wrapped up in a flag). It is nonetheless more proper to see the security planning of small profiling groups in terms of more well-intentioned national programs, such as those in identity management or cybersecurity preparation. What a contrast (NSA the spy, NSA the protector)!
If NSA was “subverting” RESO, in the sense that it compromised many a cipher device in the 70s through cozy relations with vendors and standards groups, why would it be attempting to keep things so backwards – in such an important economic sector as real estate? Why would the security technology be out of date, before the spec is even voted on? Why would the management dynamics of the standards group be so contrived? Why would “they” want it so “small-minded”?
I've heard several proposed answers to such questions. One answer, the vendor rationale, holds that the program is a last-gasp attempt to prevent the cloud revolution from replacing the traditional set of vendors. Another last-gasp theory, the NAR rationale, contends that it is all founded in trying to deter dis-intermediation: protecting against the entrance of new economic players with novel approaches to "facilitating property conveyance", and making it hard for them to replace the core assurances provided by the human Realtor. One slightly more realistic answer, the NSA paranoia rationale, proposes that NSA is simply doing what it knows how to do well as an institution: motivate adoption of the modern cloud in areas of real critical infrastructure while keeping things nicely server-side for the rest of the general economy on which it spies, engendering adoption of older standards for which the subversion techniques are well known, are known to work well, and can be assured to work through financial bribes (or golf trips) to the right people, in the right places. As they contrive.
Well the humor never ceases – as the game of crypto and cryptanalysis plays out.