## pedego leds interpretation

If you, like me, own a pedego (classic cruiser) electric bicycle, you might ask: how do I interpret the traffic light leds on the bike’s throttle control?

So the answer is: I don’t know. But I do have an intuitive mental model (based on observation and usage analysis). It aligns with an article I read, in which the anonymous author opined that: 1) while three lights show, you have a nominal 100% of performance still available; 2) if two lights show, you have 30%; and 3) if only one light shows, you have 10% (and disaster is imminent).

That is, they act like traffic lights – a set of signals about pending issues.

When I ride my bike, there is a hill at the end of the trail (heading for home). You do NOT want to be pushing a 50 lb bike up that hill, if you have a choice. So you want to be husbanding energy so that – somehow or other – the bike gets you home.

I learned that, SHOULD I demand less speed from the battery pack, I can keep the bike on all three traffic leds – once the faster speed I WAS travelling at (for the hill in question) produces a 2-led warning.

So the number of leds showing on a pedego means: slow down, so that the power delivery available from the pack can meet the demand you place upon it. You always want to travel in the 3-led situation, so you do not deplete the power at a rate that induces failure conditions. If you are travelling on 3 leds and it turns to 2 leds, slow down… till three leds show once again.
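This riding rule can be sketched as a tiny decision function. To be clear, this is just my mental model coded up – the led thresholds and percentages are the anonymous article’s guesses, not anything from pedego documentation:

```python
# Sketch of the "traffic light" riding rule described above.
# Led counts and percentage figures are the cited article's guesses,
# not official Pedego specifications.

def advice(leds_showing: int) -> str:
    """Map the throttle's led count to a riding decision."""
    if leds_showing >= 3:
        return "hold speed"      # nominal 100% performance still available
    elif leds_showing == 2:
        return "slow down"       # ~30% available; back off till 3 leds return
    else:
        return "crawl or walk"   # ~10% available; disaster imminent

print(advice(3))  # hold speed
print(advice(2))  # slow down
print(advice(1))  # crawl or walk
```

The point of the function’s shape: the led count is a demand signal, not a fuel gauge, so the output is always an action (change your demand), never a range estimate.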

If two leds show, and then 1 led shows when you travel faster than warned, worry. The battery pack is about to die on you, till you do a recharge.

So how long will a battery pack last, while showing 3 leds but being susceptible to showing 2 (once power delivery can no longer support the higher level of demand)? Dunno. But it has not failed on me yet (for the final hill up to my house). When I stressed things at 2 lights, I ended up pushing 50 lbs up a miserable hill.

Posted in pedego

## sony, drm, NSA hacking, me.

SONY, the US media company, is back in the news. I can tell some stories here – though I’ve changed some names and storylines to protect folks.

Back when I was at VeriSign, I got to play chief architect – at least for a while. During that period, I worked with Microsoft’s Windows group, and the newly formed Authenticode team. WinTrust was the result – the means of validating signatures on code/files, including drivers or streams of data subject to policies enforced by a (note a, not the) TCB. Recall that the NT architecture is a micro-kernel design and there are multiple subsystems, each with their own TCB.

First clue given above. And no, it’s not “Microsoft”.

Sony was known at the time, in Microsoft circles, for being a right royal pain in the ass when it came to being a corporate stooge for the NSA (the NSA of back then, not now). The US side of Sony was entirely indoctrinated and penetrated – and saw it as its mission to use media distribution as a way of doing the job formerly done by the CIA. Quite properly, they wanted to make a civilian business out of former government “business-lines” – and DRM and “trojaned silicon” were to be their edge. One of the first converts was to be Windows (and its DRM’ed DVD player); and SONY wanted the means to remotely project policy control over a PC, ostensibly to enforce media rights. Of course, the whole program was a foil for the CIA, in the sense that it was the CIA’s job to support the NSA of the day when performing the implanting process.

Perhaps folks recall Sony’s magicGate chip line. One might want to get hold of some and, NOWADAYS, go analyze the silicon gates. Go look for the trojans built into the silicon itself, in the certain batches intended “for export”.

Anyways, Sony wanted Microsoft to alter authenticode – at the architecture level – purportedly for reasons of ensuring that hardware (HP Bristol/GCHQ’s TPM) would control the playback drivers. The goal was to ensure that only certain (signed/certified) drivers could drive certain layers of the screen (which facilitated a covert channel, susceptible to the kind of antenna mounted in the typical US embassy).

So what was my job? To produce a consultancy report that independently advised an alternative course. No, it was NOT to subvert the program noted above. It was, however, to do what the program claimed to do (rather than have the program used as systemic dupeware… to do something, ahem, else).

One of the problems the current natsec folks will be having, over the current issue, is that it is very likely only to open up to technical scrutiny what USED to be happening, as national policy, in the memory chip area. And that will open up questions about how much better similar processes might be, at the silicon level, today.

Peter.

Posted in coding theory

## joomla 3 with mod adfs plugin, debugging in visual studio 2013

It took three days to build a debuggable php/joomla site in visual studio, having created the site in azure websites, synced it to webmatrix, and then launched visual studio (with the php plugin).

Various fiddles were needed to make the synced website – which typically runs in IIS Express – fire up in the debugger’s own web server, on the same port as IIS Express, with the debug monitor enabled.

Since we are now targeting version 3.3 of joomla (since that is what comes with azure), we have to make some changes to our older code. First off, to make the login button appear (somewhere), we had to change how the module helper class is loaded:

More generally, we configured the modules and plugins, as shown next:

- authentication
- user
- system
- module

Second, the getToken method in the JUtility class has gone, as we learn from

http://stackoverflow.com/questions/18416247/joomla-3-0-call-to-undefined-method-jutilitygettoken

We fix this, in a version-specific manner, as shown:

(note how we were unable even to include the older version’s class reference… except in a comment hint… as linking errors otherwise prevent loading)

To run all this, we emulate a live joomla site, with known-good test parameters. Fiddler is configured to redirect the live site’s URIs to the localhost process under debugger control, thus:

This assumes that our localhost joomla, with our sso plugin, has been configured with a copy of the IDP metadata (see http://1drv.ms/1DQxhUF) for DEMO/9 IDP and expects to title itself as the SP whose audience is mgs.rapams.com.

https://ssoportallax.rapmls.com/spinitiatedssohandler.aspx/DEMO/9

On windows server, we have to add each domain to the trusted site lists:

Now that we have a basic setup, we refine the testing parameters to use the so-called federation mode flow (with DEMO/9 being the FP, and DEMO/8 being the IDP).

This allows us to get to a complete flow, for the first time:

To get to auto-registration of a user, there are a couple more porting bugs to fix

We need to adopt the latest status return-value conventions, for the positive

$version = new JVersion();
if ($version->RELEASE == '3.3') {
    $response->status = 1;
} else {
    $response->status = JAUTHENTICATE_STATUS_SUCCESS;
}

and negative return paths

$version = new JVersion();
if ($version->RELEASE == '3.3') {
    //
} else {
    $response->status = JAUTHENTICATE_STATUS_FAILURE;
}

Since the assertion apparently indicates a null-string email attribute, and this upsets some of the joomla 3 login logic (in a way that presumably it did not upset the 2.5 logic), we amend the condition:

//////////// EMAIL
if (isset($userDetails->attributes["email"])) {
    if ($userDetails->attributes["email"][0] != "") {
        $response->email = $userDetails->attributes["email"][0];
    }
}

Once we manually gave the rapstaff user a registered/super-user group assignment, the login screen showed what one would expect.

To make the debugging process easier (upon receipt of an assertion), we amended the inbound message handler thus:

Posted in fiddler

## my ISIS connection to George Washington

Thanks to Google and its general corporate stance of facilitating the spying (by the exceptionals) on me, I’m sure I’m now being spied on – because of the “association” with ISIS.

Ridiculous. But, entirely good enough association to go and harangue me.

George Washington would be proud (and no, that’s not sarcasm). He WOULD be (since I’m not one of the elite class of human beings). He set the tone, as a human slaver, and it’s still with us: say one thing, sell it to the immigrants, and work them to death so long as they are good consumers making the landowning elite “super rich”.

Some poor US contractor now has to go make a note in my “intelligence dossier” on the risk of subversion, given this memo. Poor thing.

Posted in dunno

## org id and webmatrix

windows server 2012, azure/MSDN image

This means that the proxy for the desktop is confused (probably because fiddler was running when the PC crashed).

## making webapp talk to rets webapi

The article prompts nostalgia in me – since I worked at the site in question. A fixture of silicon valley for sure, the hangar is worth a visit – particularly once one understands what it was used for after the zeppelins long since made their departure. What the rest of the NASA side of the Navy site was doing is also relevant to internet history – particularly in the area of spying, and silicon valley’s long love affair with spin: saying its crypto works, while doing all in its power to pay back its sponsor (for lots of lucrative procurements) in the manner desired – ENSURE CRYPTO DOESN’T WORK, when called upon.

So Google engineers will be able to trace the same halls I walked, and visit the many rooms I could not – being classified areas, with super secret meeting rooms, specially clad, etc. So, just what was so secret (apart from the culture of secrecy, that called for classifying the unix ascii man page… once!)?

One has to understand that the site in question once hosted a joint CIA/NASA mission to exploit space for the spying role that NSA did not do itself (back then). No, I’m not talking about the obvious (spy satellites with cameras and radars, beaming “telemetric” data back to NSA receivers at the far end of the site). Even back then, it was critical to judge whether foreign counter-surveillance penetration had succeeded in gaining entrance into the “then-critical infrastructure” (the systemic foreign-surveillance infrastructure that guided emergency response protocols). I’m talking about the process of IMPLANTING or ACTIVATING the bug that takes/took a router or switch and turns/turned it into a spying outpost (for NSA).

Remember, it was and still is NASA/CIA’s job to enable NSA (not be NSA). It’s NASA’s job to use space-based or space-related terrestrial communication platforms to enable the CIA (to deliver the implanting or activation, for NSA). It’s the CIA’s job to work with the vendors and the supply chain to provide for implantation – with covers and techniques to suit the risk of being discovered or (worse) being fed counter-intelligence material.

It’s kind of cute that it is specifically Google – with its all-American attitudes concerning unique exceptionalism – that will take charge of one of the founding sites of both pre-internet (aka early nuclear test monitoring) and internet-era (i.e. general spying) secret surveillance – with its massive infrastructure outlay and an entire culture of layered people management (to provide suitable cover stories). One assumes that a few ultra-nationalist and ultra-indoctrinated Googlers with advanced CIA covers will have a few “Special Assignments” to perform, too – without the CEO’s knowledge (or care, typically, once a “Government Contract” or two is placed). After all, the secret protocols to the leasing contract will not be known publicly, and will help induce that most american property of exceptionalism of all: plausible deniability.

Posted in spying

Assuming I converted my newly-minted and presumably-valid motorcycle safety training credential into an “endorsement” on my driver’s license (an american/Nevada affair, these days), I am 007 – officially licensed to kill (myself).

That was the summary and debriefing from the rather excellent (but second) group of trainers, as they gave me the passing grade. In summary: Err… don’t actually ride on the road… (even though you passed). They both sighed with relief when I revealed that the ride I had just taken (to pass the exam) was probably the last ride I’ll ever take – unless perhaps I go on holiday to some exotic island and want a little fun ride.

Now comes some material that I’m tempted not to write or disclose, since its publication – like an advert for scotch – will surely increase the killing rate. Since I’m talking to and about those who think they are uniquely exceptional, based on their various exceptional national processes (which include full disclosure of facts, along with endless prattling and mouthiness), as a non-exceptional I can hardly be blamed for inappropriateness, should I emulate my betters by opining…

…that on this and the previous course, everyone was a liar (with a small number of exceptions). Shush! They had all ridden countless hours before (illegally, for the most part). Perhaps, much as illegal immigrants to the USA might be denied public hospital care given their status, folks who evidently can ride might now be denied access to a novice course by the instructors upon initial evaluation (aiming to be fair to all, remembering that this is American fairness – somewhat elitist and inequitable, by European standards). A course for actual novices – defined as folk who have NEVER RIDDEN a motorcycle – it was not. A course to get a DMV license, having met minimums, it was. Fortunately, the folks on the second course had a level of teaching skill that does NOT place them in that class of teachers, typically found in private schools, who are expert at teaching “examination technique” along with the curriculum.

My first teacher was clearly a superior classroom teacher, and was good at getting folks through both the written and skills exams. With bad luck, I’d have passed the skills test under him (despite being entirely unprepared for riding a parking lot safely, let alone a road with other vehicles). Only a total screwup on one test, under his techniques, caused me to have too many demerits (to become licensed). A split-second decision on my part made me FAIL! It could easily have gone the other way (and I’d have been licensed by Nevada, while incompetent).

It was ego that induced me to try again at a second run of the course (since obviously I could pass, should luck flow the other way). So I signed up again, and went through 40 more hours of american-style drill training – one that mostly focused on drill-based learning of skills (rather than examination technique). And so I passed, with a wide margin (though with some personal caveats, as mentioned in the introduction). Played the second time, the course was what I expected: thorough, professional, skilled at skill teaching, and one that used formats and (rider coach) teamwork to real advantage!

So here is the part you may NOT want to read.

There are examination things you MUST NOT DO, to avoid large numbers of demerits. Put another way, here is EXAMINATION TECHNIQUE that minimizes demerits:

1. The test, for swerving, always has you swerve to the right. So AIM for the right cone, and don’t worry too much if you swerve a foot too early. The demerits for this are FAR BETTER to incur than hitting the (virtual) obstacle – particularly when super nervous in test mode. Thanks to the policeman, on the same test, for the advice (that I ignored the first time around). As I say, America really is FULL OF EXCEPTIONALS (playing the rules, for killing with guns or motorbiking alike).

2. The test for the emergency stop has some nasty demerits for something almost irrelevant to the apparent testing criteria (stopping, before hitting the kid). You get 5 demerits for failing to stop in first gear (the kid is alive… but who cares; first gear is more important…). So, no matter what you do, sacrifice stopping distance for getting into first gear.

Yes, I know, I know. Folks who had “properly internalized” the skills would be doing each of the above (without needing to cheat). But that’s not my point. Learning examination technique COULD be putting you more at risk, as you show off your nice new shiny motorcycle license (which is a formal license to kill yourself, in more than one case, from what I observed).

Ok, I’m whining about american processes, and not giving much positive. As always, it’s the usual caveat, which applies to lots and lots of american’isms: what I see is the best of an inherently bad lot. Which is actually praise (from an Englishman). You are the best (and that includes lying, cheating, over-tutoring, and making licensing money from licenses to kill). It’s also the best in a more positive vein: the american drill process was excellent (assuming you are used to it, which I’m not).

so what would I improve?

Well, the instructors I had were not internet savvy. They failed to use props, and failed to use videos or simulators – to convert words into skill. It took me 40 hours to figure out that “press” (in american) means “push” (in English). A simple prop, with a bicycle, would never have allowed that! In my first run through the course, I kept trying to press (down) – rather than give a quick pushing shove on the handle, which tips the bike over a bit to help one lean. Sigh!

Washington State’s evidently similar program had an excellent DMV-based video series that trained some – perhaps, like me, academically enabled and thus NOT particularly well adapted any longer to drill training – to see HOW one measures skill (and why the testing is designed as it is, given lots of testing tradeoffs). And their videos did it in a manner that was not about “playing the examination game,” either. It was simply an academic presentation of the skill levels being expected, talking to an audience in a way that would resonate.

In Nevada schools for biking, however, biker culture reigns (perhaps because of the excellent biking geography). Thus it is folklore (as enacted through word/phrase correctness, drill and repetition-style training) that dominates; since that’s the operative culture. If your brain, as with puberty, was irreversibly altered by getting to the higher academic levels, you, like me, might not respond well to drill-based learning (in motorcycling or any other area).

Should an academic, from a good private school that trained me up to respect examination technique, be saying this to folks at risk?

You decide! I’m VERY happy on my powered bi-cycle, maxed out at 20 mph, doing the same manoeuvres as a real motorbike (using a machine that is REALLY easy to LEARN TO handle).

All in all, in honor of my excellent teachers, today, I may now wave a little at the bikers – as a fellow road user at horrendous risk, due to the nature of “shared road”. Wonder if they will wave back or just scowl at the impudence!?

Posted in coding theory

## web forms proxy, from Azure Active Directory to Amazon Cloud

We perform the steps outlined below to create a classical ASP.NET web forms application.

Now we will take the webapp/webforms work from the sample proxy and invoke the Amazon web service’s STS, to translate the openid-connect id-token into a “security credential” usable at the amazon web-based management console (and its APIs).

## RESO/NAR’s client proxy for odata & oauth–application (vs user) access

In the world of real estate, the big dog is called NAR – the National Association of Realtors. At the national level, it has lots of money to spend on its favorite vendors, leads standards, and does politics. It’s a big-money trade association! …with folks doing what they are supposed to be doing!

On one topic, there is technical strangeness afoot. It concerns the leveraging of cloud-scale security infrastructure to indirectly govern the connectivity of thousands of clients and servers in the states and localities where actual real estate transactions are generated; AND the activities of newer “value-adding opportunity”, “beyond client/server” vendor intermediaries. One wants, as a governance body, to decide (without seeming to violate anti-trust) WHO gets to “add value” (and under what governance rules).

One well known intermediary, not seen to be dis-intermediating NAR’s Realtor brand, is Trulia – which has essentially come up with a new model (of doing old things). Good! There is a lot of money in advertising to folk visiting realtor listings (not that the realtor or the broker will see much of the cash!).

On technical security topics, NAR is not strong and seems to pick lower-skill security vendors for advice. For example, one might want to build a modernized webapp for searching listings that now talks to an odata API (where the webapp presents the required security tokens to the api provider, using the oauth scheme).

If one understood the openid connect scheme properly, one would KNOW how, properly and professionally, to use audiences, scopes and permissions to allow only certain app deployments (and only those authorized web apps, at those certain sites) to offer public access. And one would not need to hide a username/password in the code of the webapp, or adopt other amateurish low-assurance practices. Note, one might store an application id/secret (in what is a trusted app); but not a USER credential set. Note the subtle difference (one allows the app to behave legally AS IF it were the user; the other to behave merely “with authority”).
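The contrast can be sketched with the two standard OAuth2 grant shapes. This is a hedged illustration only – the client ids, secrets, and scope names are made up – but it shows the structural point: a client-credentials request carries an app identity alone, while the (discouraged) resource-owner password grant embeds a user credential inside the app:

```python
# Sketch: two OAuth2 token-request payloads, contrasting an app credential
# with an embedded user credential. All id/secret/scope values are
# illustrative, not from any real deployment.

def client_credentials_request(client_id: str, client_secret: str, scope: str) -> dict:
    # App identity only: acceptable to store in a trusted (confidential) app.
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }

def password_grant_request(client_id: str, username: str, password: str) -> dict:
    # A USER credential baked into the app: the amateurish pattern criticized above.
    return {
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
    }

app_req = client_credentials_request("listing-webapp", "app-secret", "listings.read")
user_req = password_grant_request("proxy-agent", "service-user", "hidden-password")

print(app_req["grant_type"])   # client_credentials
print("username" in app_req)   # False - no user credential anywhere
print("password" in user_req)  # True - a user secret the app must now hide
```

Either payload would be POSTed to the authorization server’s token endpoint; the difference is what leaks if the app’s code or config is ever exposed.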

If one does NOT really understand the way the full security model works when enacting governance regimes for hugely distributed systems (even though the requirements are actually well understood in web-app land, for such things as the public’s searching of listings), one does such things as invent special “high security” web-agent proxies with “special powers” (that store a special “username/password”, so as to obtain some default “USER” credential necessary to access the guarded webapi).

This is an amateurish way of doing things (in the security world).

How would a professional security cloud vendor approach the problem?

Take a look at this sample code (NAR/RESO):

Note how user identities and app identities are handled, when architected properly in multi-tier oauth environments. Compare it with other projects in the sample set – which showcase different uses of the security protocols, for different systemic objectives, where user controls and application controls have different objectives.

Posted in RETS

## Amazon cloud’s Identity fabric–interworking with AAD

We started reading a book on amazon’s cloud, in the course of which we started playing with the cloud console (and some of its identity management features). By chance, our eye was drawn to the security credentials menu option, off of which hangs the ability to specify identity providers.

Amazon’s help  system describes how to use identity providers, thus:

http://docs.aws.amazon.com/IAM/latest/UserGuide/idp-managing-identityproviders.html

We first configured the SAML provider, learning to do some special edits – as noted.

getting us to some success

Since AAD, as IDP, requires an SP record, we note the amazon console’s audience name: https://signin.aws.amazon.com/saml.

The resulting “trust policy” is a set of conditions on what claims must be asserted by the IDP to be “trusted” as a source of identity. These are not access control rules!

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRoleWithSAML",
      "Principal": {
        "Federated": "arn:aws:iam::385727861301:saml-provider/faa"
      },
      "Condition": {
        "StringEquals": {
          "SAML:aud": "https://signin.aws.amazon.com/saml"
        }
      }
    }
  ]
}
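For completeness, the call that this trust policy guards looks roughly like the following sketch, assuming boto3. The role and provider ARNs are the ones from this walkthrough; the assertion value is a placeholder (in a real flow it would be the base64-encoded SAML response posted by the IDP), and no live call is made here – only the parameter shape is shown:

```python
# Sketch of the sts:AssumeRoleWithSAML call permitted by the trust policy.
# ARNs come from this walkthrough; the assertion is a dummy placeholder.

def build_assume_role_with_saml_params(saml_assertion_b64: str) -> dict:
    return {
        "RoleArn": "arn:aws:iam::385727861301:role/saml",
        "PrincipalArn": "arn:aws:iam::385727861301:saml-provider/faa",
        "SAMLAssertion": saml_assertion_b64,
    }

params = build_assume_role_with_saml_params("PHNhbWxwOlJlc3BvbnNlPi4uLg==")

# With boto3 installed and a real assertion in hand, one would then call:
#   boto3.client("sts").assume_role_with_saml(**params)
# which returns temporary credentials for the federated user.
print(sorted(params.keys()))  # ['PrincipalArn', 'RoleArn', 'SAMLAssertion']
```

Note the trust policy’s Condition is evaluated against the assertion itself: the SAML:aud value inside it must equal https://signin.aws.amazon.com/saml or the call is refused.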

Back on the IDP – our netmagic.onmicrosoft.com tenant, of course – we configure the SP record:

Back on the SP, we set the access control policy for this now-trusted source of identities:

where we note our final review information:

Role Name: saml

Role ARN: arn:aws:iam::385727861301:role/saml

Trusted Entities: the identity provider arn:aws:iam::385727861301:saml-provider/faa

Permissions

To test, we guess a little and learn from https://console.aws.amazon.com/iam/#home

that our LOCAL signin url is https://385727861301.signin.aws.amazon.com/console

which we customize to https://rapmlsqa.signin.aws.amazon.com/console.

It seems that the amazon flow only supports idp-initiated websso – which stumps us.

Turning our attention to the openid connect integration alternative, we see that

To configure the SP side’s “roles”, we do as follows:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Principal": {
      },
      "Condition": {
        "StringEquals": {
        }
      }
    }
  ]
}

For the now-federated user’s permissions on the SP, we configure

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1414516823000",
      "Effect": "Allow",
      "Action": [
        "aws-portal:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}

Giving

With this we end our exploration – unable so far to invoke the process of landing on the amazon console as a federated user!

But we learned a lot – about amazon’s policy-based RP-STS service, for hosted webapps/mobile-apps.

Posted in openid connect

## American SSL crypto is just crap (even if its really Taiwanese electronics)

After 25 years of doing SSL, it’s worse than it was at the outset. But then, it’s US government policy to ensure it DOESN’T really work. The vendors duly facilitate, participating in the systemic deception of the public.

The issue is not only dlink’s firmware producing certificates with bad dates. It’s also the Microsoft platform that is the problem. It allowed the ActiveX control to load, for an invalid certificate. Because the cert is on a “trusted” chain, this policy overrides the invalidity.

But then, much of the Microsoft program in the crypto/trust area is essentially NSA policy – to ensure that their folk can invade a foreigner’s PC (with impunity, since we have little more than slave rights to the mighty, exceptional Americans); and set such settings (when it facilitates an easy crypto win).

Posted in dunno

## RESO/NAR oauth strategy – a humor in the style of a Greek tragedy

Despite hours of effort measured in the 100s, I have not really managed to get to the source of NAR/RESO’s oauth strategy. I’m left with interpreting the technical documents, comparing them with similar documents produced by other profiling groups. In summary, I’m confused. From what I can tell (IMHO, LOL, and other teenage terms now used by managers in their fifties and sixties), it starts out “out of date”. Quite why it is so backward in its thinking seems to come down to the commercial motives of those doing the profiling: keep real estate IT backward, and a nice little earner for a select group. Oh, and ensure it pretends to satisfy anti-trust rules, while in reality blocking the entrance of cloud vendors and their millions of “tenants” into the IT market.

Hmm, the humor element is missing. So let’s add some, English style.

Not being a member of the exceptional class of human being (not seen since the end of the nazi era of “the exceptional race”), I looked at Microsoft Azure’s Mobile service for architectural guidance. How might OAuth be applied – TODAY – if one were to adopt cloud-centric thinking? For the purposes of learning by thinking through issues from all perspectives – and not only one’s own – one can now compare the different generations of oauth systems and their profiles. One can start the analysis by looking at the pre-cloud oauth architectures of 5 years ago – which is where RESO seems to be starting out.

Azure mobile is a post-RESO model – of cloud thinking. It focuses on webapi-enabled mobile apps, on several devices, held by a community of users – supported by oauth security technology that is in turn supported by classical websso (using SAML and ws-fed). It comes both with a server-side model (where tokens are delivered from the OAuth AS directly to webapi endpoints, in return for session tokens) and a client-side model (in which trusted apps maintain caches of tokens, which they choose to renew and/or supply to one or more webapis).

If we characterize RESO’s design in terms of Azure Mobile, RESO is a clumsy server-side scheme – not seen since the early days of oauth, when folks invested perhaps at most a couple of days’ development effort as they bootstrapped their ideas, linking up some existing login page to the early webapis. But why would NSA be interested in this class of effort?

It’s just preposterous to think of NSA vs. RESO in terms of NSA vs. the Web (and the “perhaps” humorous model of plantations of enslaved web users, with NSA as the model “master” keeping the 8 billion non-exceptional slaves – like me – in line with a systemic threat and surveillance culture, all wrapped up in a flag). It is, none-the-less, more proper to see the security planning of small profiling groups in terms of the more well-intentioned national programs, such as those in identity management or cybersecurity preparation. What a contrast (NSA the spy, NSA the protector)!

If NSA were “subverting” RESO, in the sense that it compromised many a cipher device in the 70s through cozy relations with vendors and standards groups, why would it be attempting to keep things so backward – in such an important economic sector as real estate? Why would the security technology be out of date before the spec is even voted on? Why would the management dynamics of the standards group be so contrived? Why would “they” want it so “small-minded”?

I’ve heard several proposed answers to such questions. One answer – the vendor rationale – is founded on the program being a last-gasp attempt to prevent the cloud revolution replacing the traditional set of vendors. Another last-gasp theory – the NAR rationale – contends that it is all founded in trying to deter dis-intermediation: protecting against the entrance of new economic players with novel approaches to “facilitating property conveyance”, and making it hard for them to replace the core assurances provided by the human Realtor. One slightly-more-realistic answer – the NSA paranoia rationale – proposes that NSA is simply doing what it knows how to do well as an institution: motivate adoption of modern cloud in those areas of real critical infrastructure, while keeping things nicely server-side for the rest of the general economy on which it spies – engendering adoption of older standards for which the subversion techniques are well known, are known to work well, and can be assured to work through financial bribes (or golf trips) to the right people, in the right places. As they contrive.

Well the humor never ceases – as the game of crypto and cryptanalysis plays out.

Posted in coding theory


## visualizing odata source model

We followed the instructions noted below:

We had to install visual studio 2012, specifically, and the noted plug-in. Since the netflix feed no longer exists, we used an alternative:

http://odata.msteched.com/sessions.svc/

The result was a model diagram:

Posted in coding theory

## Prep for RESO plugfest

From

To use this, we will need to (a) not use default credentials, (b) change the URL, and (c) amend the XML parsing to match the entities being returned by a RESO webapi server.

#Construct the WebClient object
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true

#Query OData service with url (single quotes stop PowerShell expanding $top)
$queryUri = 'http://localhost:42203/NorthwindOData.svc/Categories(1)/Products?$top=5'

#Download the query result and parse it as XML
[xml]$responseXML = $wc.DownloadString($queryUri)

$entities = $responseXML.SelectNodes("//*[local-name() = 'properties']")

ForEach ($entity in $entities) {
    $prodID = $entity.SelectSingleNode("./*[local-name() = 'ProductID']").InnerText
    $prodName = $entity.SelectSingleNode("./*[local-name() = 'ProductName']").InnerText
    $prodUnitPrice = $entity.SelectSingleNode("./*[local-name() = 'UnitPrice']").InnerText
    $prodLine = [string]::Format("ID:{0}, Name:{1}, UnitPrice:{2}", $prodID, $prodName, $prodUnitPrice)
    Write-Host $prodLine -foregroundcolor Green
}

Posted in coding theory

## commodity crypto entrapment by the likes of Google, Apple et al

I’m being really unfair in only including Google and Apple in the title, above. It really ought to include almost any large American firm. And, what’s more, any large French, British or German firm too. And Russian and Chinese firms… (etc).

But one difference between the Russian and American firms is the manner of the deception. One claims to be honest (while having enormous capability for spying on all of us); of the other, “everyone knows” it is spying (while having self-imposed limits). One markets the appearance of strong, “unbreakable” commodity crypto and computer security to those who are “trusted” (today); the other openly admits its crypto and security are by NO means unbreakable – by official trust policy.

In the American model, citizenship means that one is supposed to buy into the social need to accept the social lie – that crypto works ALL THE TIME, ONCE PURCHASED – in order that foreigners can be duped into the same belief system. The citizen gives up actual individual security in favor of security provided by the state, in order that the state can spy on all foreigners – who “unwittingly” happen to use American crypto – which doesn’t work, once triggered.

But therein lies the lie. Of course American crypto “works” – it’s just that the implementation you use doesn’t, once a number of things happen. And how those triggers get applied is a secret MUCH bigger than the crypto secrets. They are that which turns off even the appearance of the crypto working as ONE EXPECTS!

The typical American doesn’t realize the following things, done in the name of trusted citizenship (and doesn’t want to know, being trusted):

1. Video cameras track every vehicle license plate

2. Traffic signals induction loops count bodies in cars

3. devices based on software are typically trivially enabled for a remote software switch (so that the vendor’s marketed “assurances” no longer apply, since the software security changes…)

4. GSM enables certain spy satellites, once launched, to track the act of RECEPTION of GSM signals

5. other electronics can be induced, on one channel, to act as a radio receiver – and thereby behave largely as in 4, on another channel

All in all, folks should know that the very “electronics” infrastructure is carefully wired for tracking the source of signals – which is of course the art of first breaking the cryptoNET (before breaking the crypto). Folks MUST know that the Googles and Apples actively participate in the systemic deception intended to persuade the typical foreigner that the crypto works – until the cryptonet surveillance and tracking built into civilian infrastructure – in the name of national security (not military infrastructure subject to arms treaties) – makes it otherwise.

Posted in coding theory

## learning from reso odata and oauth – and onto documentdb’s security model

It was fun to see how NAR politics has manipulated the RESO process, in both the odata and oauth spaces. But we can move on, since there is no more R&D to be done in this area for now.

Now, what we did learn was all about node.js, mongodb and the apparatus that goes with them. We can start to look into the windows world’s equivalent (documentdb) and see what a “pure” json-centric webapi server looks like – when constructed using azure cloud principles and microsoft libraries.

In particular, we need to see just how folks have orchestrated an azure mobile site working with a documentdb collection and the db account, in terms of identity, identity pass-through, 3-tier systems, and the like.

In the NAR node.js prototype, we saw how the webapi server expected to work with mongodb accounts. In azure mobile with sql and odata/table stores, we saw how the schema or the master partition key was used to segment the spaces by either app or by user (within the app). With documentdb, let’s hope we have a much better thought-through model – in the sense that it all cues off tokens, token handoffs, claims, etc.

Before we go there, let’s go have a looksie at modern joomla hosting in azure websites and at Amazon EC2 web services (and its security model). It will be interesting to compare azure and amazon, and see how much commonality there is around upper-layer security.

Posted in coding theory

## A 3-tier odata solution for NAR/RESO, with server-side oauth

We show how to create a 3-element system, composed of a windows store client (armed with the login capabilities of the azure mobile packages), an azure mobile middleware site (with which the app collaborates to perform what is technically server-side login), and an odata webapi instance (hosted in an azure storage table service).

Though this client gives us one viewer, it is very tuned to login and end-user interaction. We can also run the more basic odata-centric viewer client, allowing us to talk directly to the odata/webapi repository of entities underlying the above. We see:

We see, from a fiddler trace, the interaction between this odata client and the odata server (for a JSON-encoded response):

Alternatively, we can remove the accept header and allow the default output format to emerge:

End.

Posted in coding theory

## using azure table service for NAR RESO odata

Some notes on what we did to call the azure table services entity endpoints, and \$metadata.

First we built a code app for table storage management:

The tool needs a webapi server – i.e. the cloud table service. So we created ourselves an “instance” of the webapi:

Server and data repository created, we used the tool to post and update an entity.

Since this tool does not actually bother to read (or update) metadata for the entities it posts, we found ourselves using other samples to make a simple command-line tool that simply issues canned queries (with suitable security headers attached). This at least reads the metadata (EDMX):

```
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Net;
using System.Security.Cryptography;
using System.Text;
using System.Xml.Linq;

namespace ConsoleApplication1
{
    class Program
    {
        // Atom/OData namespaces used by the table service payloads
        const string nsMetadata = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";
        const string nsSchema = "http://schemas.microsoft.com/ado/2007/08/dataservices";

        static HttpWebRequest GenerateODataWebRequestForAzureStorage(string url, string accountName, string accountKey)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "GET";
            // SharedKeyLite signs the x-ms-date header plus the canonicalized resource,
            // so the date header must be set before building the string-to-sign
            request.Headers["x-ms-date"] = DateTime.UtcNow.ToString("R", CultureInfo.InvariantCulture);
            var resource = request.RequestUri.AbsolutePath;
            string stringToSign = string.Format("{0}\n/{1}{2}", request.Headers["x-ms-date"], accountName, resource);
            var hasher = new HMACSHA256(Convert.FromBase64String(accountKey));
            string signedSignature = Convert.ToBase64String(hasher.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            // the earlier draft computed this header but never attached it
            request.Headers["Authorization"] = string.Format("{0} {1}:{2}", "SharedKeyLite", accountName, signedSignature);
            return request;
        }

        static string ReadBody(HttpWebRequest request)
        {
            using (var response = request.GetResponse())
            using (var sr = new StreamReader(response.GetResponseStream()))
            {
                return sr.ReadToEnd();
            }
        }

        static void QueryTableItemsWithRawODataHttp()
        {
            var accountName = "reso";
            var accountKey = "ai8cqXvEv6hE17qsgnFw2TK0rEkMA84EU8xQ/dIrH4KAHfTaB27HFFOowM9LVQroClukysG223KcUPb6md2PRA==";
            var queryUrl = string.Format("http://{0}.table.core.windows.net/Property?$top=5", accountName);
            var queryUrl2 = string.Format("http://{0}.table.core.windows.net/Tables()", accountName);

            // dump the raw entity feed, then the table list
            Console.WriteLine(ReadBody(GenerateODataWebRequestForAzureStorage(queryUrl, accountName, accountKey)));
            Console.WriteLine();
            Console.WriteLine(ReadBody(GenerateODataWebRequestForAzureStorage(queryUrl2, accountName, accountKey)));
            Console.WriteLine();

            // re-issue the entity query and pick fields out of the atom payload
            var doc = XDocument.Parse(ReadBody(GenerateODataWebRequestForAzureStorage(queryUrl, accountName, accountKey)));
            foreach (var elmItem in doc.Descendants(XName.Get("properties", nsMetadata)))
            {
                var name = elmItem.Descendants(XName.Get("ListingRID", nsSchema)).First().Value;
                var partitionKey = elmItem.Descendants(XName.Get("PartitionKey", nsSchema)).First().Value;
                Console.WriteLine("ListingRID:{0}, PartitionKey:{1}", name, partitionKey);
            }
        }

        static void Main(string[] args)
        {
            QueryTableItemsWithRawODataHttp();
        }
    }
}
```

This allows us to see in fiddler what is passing on the wire (in XML format):

Posted in coding theory

## p-adics – the hidden cryptanalysis model

One thing Turing was known for, in the first computer period to follow WWII, was his showcasing of how to add large numbers having written the “numbers” out, unusually, in reverse digit order. The carry rule would still propagate from the low-order digits upward, yet enable correct arithmetic. Of course, any first-year computer science student studies one’s- and two’s-complement “representations” – the computer representation of signed integers that makes them amenable to adding by digital electronics. The study of all this has a much deeper background, however, coming from the historical interplay between the art of machine-based cryptanalytical methods and the study of the limits of machine computability – a historical topic still largely un-discussed, and probably formally treated even today as if the theory were still a “secret weapon.”

The presenter of the video at https://www.youtube.com/watch?v=vdjYiU6skgE does a good job of stating what folks knew in the 1930s about the mathematical toolkit of “techniques” underlying cryptanalytical computing. The cryptanalytical “math secrets” of the WWII period also concerned this “theory” of computing/cryptanalysis – founded, as we see from the video presentation, in projective geometry, p-adic distance metrics, special relativity (for timebases), conformal field ideas, cauchy sequences, and notions of causality between regions of certain geometric spaces. The use of such apparatus to represent large numbers had a simple initial objective: express numbers and fractions at high accuracy so as to finesse colossus-style processing of the correlations to be found in noisy signaling channels. This would have enabled cryptanalysts, even in 1947, to be holding, say, 200-digit-accuracy decimals in the flexible 1946-era drum memory used by the first manchester computers. The resulting adders could then do p-adic arithmetic – or twos-complement (when p=2). And “this was the secret”.
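To make the p=2 remark concrete (our own illustration, not taken from the video): an n-bit machine works modulo 2^n, and a negative number’s bit pattern is just the first n digits of its infinite 2-adic expansion, so the ordinary carry rule gives correct signed arithmetic for free.

```python
# Two's complement as truncated 2-adic arithmetic: an n-bit adder works
# mod 2**n, and a negative number's pattern is the first n digits of its
# 2-adic expansion (e.g. -1 is the truncation of ...11111).
def twos_complement(x, bits=8):
    """Bit pattern of x in n-bit two's complement (x reduced mod 2**bits)."""
    return x % (2 ** bits)

def from_twos_complement(pattern, bits=8):
    """Interpret an n-bit pattern as a signed integer."""
    return pattern - 2 ** bits if pattern >= 2 ** (bits - 1) else pattern

# -1 is 0b11111111: the truncated 2-adic expansion of -1.
assert twos_complement(-1) == 0b11111111

# (-3) + 5 computed purely on unsigned bit patterns, mod 2**8:
s = (twos_complement(-3) + twos_complement(5)) % 256
print(from_twos_complement(s))  # 2
```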

But p-adics are more fundamental to cryptanalysis than mere twos-complement – being the “raw theory” of branching. When the Yanquis claim that only their own first research machines did the modern (i.e. Turing) “stored program concept” – based on a tape’s stored instruction indicating a branching instruction – we might ask whether that is only one form of the underlying branching theory. Not that you will hear such a question posed by the NSA-crypto (and therefore american) historians.

P-adics should be thought of as Turing was taught them in Newman’s “special” topology classes – the nesting of certain geometric relations, and then the mapping of the nested spaces onto 1-dimensional lattices and the tree structures underlying space-filling curves – where the branching of such trees is rather intuitive. Branching in this sense – through implied spaces – is rather more “algorithmic” than merely having some stored processor recognize the ‘conditional jump’ bit pattern on a tape!

The art of cryptanalysis is now, and was in Turing’s day, all about having computing models that no one thinks you even might have… let alone thinks you have reduced to practice (and certainly not made into cost-effective technologies). Iterative algorithms for updating colossus-style calculations of conditional probabilities – teasing out the most likely candidates for solving some crypto puzzle – all have their roots in discrete schrodinger equation calculations, hyperbolic surfaces and conformal coordinates, and leverage symmetric matrices to update the convergence-based newtonian “root-finding algorithms” – as more and more depths of information were fed into the graph search.

Posted in coding theory

## NAR oauth and odata servers

Let’s take a solution and add to it our 2 working projects – an odata feed and endpoint, and an authorization server (with grant management UI, too).

Now we need a client, that can orchestrate the handshake.

Posted in oauth, odata

## using NAR OAUTH server management plane

We were able to register a first client, entitled to seek an authorization code grant:

The values, distinct from the display name we entered as “clientid”, are all generated values:

Name:
clientid(Subscribers will use this)

client_id:
7ClcPY1GXEephAZZ

client_secret:
gBtUn2I2o5gx38cVearV4WrKoVOcz1Jfr7r3OdlT

redirect_uri:
http://localhost/

we used the

Looking at the code, we see that each form posts back to a route registered in the setup:

This step is performed by the authorization server manager, of course – the custodian of the data (acting on behalf of the resource owner). The next step is performed by the administrative user of the client, who must authenticate to the AS in order to get the one-time code – that seeds obtaining the first access token (and any refresh token):
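The two legs of that handshake can be sketched in code. This is a hedged illustration using the generic OAuth2 authorization-code parameter names from RFC 6749: the endpoint paths (`/oauth/authorize`, `/oauth/token`) are assumptions about this particular server, while the client values are the generated ones shown above (and the port is the one from the server config).

```python
# Sketch of the authorization-code grant: leg 1 builds the URL where the
# admin user authenticates and receives the one-time code; leg 2 builds the
# form body that trades that code for the first access/refresh tokens.
# Endpoint paths are assumptions; parameter names follow RFC 6749.
from urllib.parse import urlencode

CLIENT_ID = "7ClcPY1GXEephAZZ"
CLIENT_SECRET = "gBtUn2I2o5gx38cVearV4WrKoVOcz1Jfr7r3OdlT"
REDIRECT_URI = "http://localhost/"

def authorize_url(base="http://localhost:1340"):
    """Leg 1: URL at which the admin user obtains the one-time code."""
    q = urlencode({
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
    })
    return "%s/oauth/authorize?%s" % (base, q)

def token_request_body(code):
    """Leg 2: form fields POSTed to the token endpoint."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "redirect_uri": REDIRECT_URI,
    }

print(authorize_url())
print(token_request_body("ONE-TIME-CODE")["grant_type"])
```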

For some reason, NAR refers to this process as one of “enablement”.

In another experiment, we can start to hook up the webapi server and authorization server.

Posted in oauth

## NAR/CRT OAUTH2 server for webAPI

We managed to get the NAR/CRT OAUTH2 server booted and able to serve some pages.

On Windows these were the steps we took:

1) we created a node web application (microsoft model, for easy azure hosting)

2) on the npm line, we right-clicked to add a node package, using the package manager, which we primed to search on reso-

3) per instructions, we copied config files and directory from the samples directory to the root

4) we amended the default server.js file

and started the mongodb server

we see

the config file was changed, from the original, to turn off ssl/encrypted-traffic:

{
"port": 1340,
"security": {
"tokenLife": 3600
},
"mongoose": {
"uri": "mongodb://localhost/oauth"
},
"domain": "localhost",
"entry_point": "index.html",
"encrypted_traffic": false,
"certificate": "./ssl/server.crt",
"key": "./ssl/server.key"
}
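As a quick sanity check before booting node, the config above can be parsed and the fields the bootstrap reads can be confirmed (the check itself is our own addition; the field names are the ones in the config shown):

```python
# Parse the server config above and confirm the fields we changed:
# encrypted traffic off, mongo URI pointing at the local oauth database.
import json

config_text = """{
  "port": 1340,
  "security": { "tokenLife": 3600 },
  "mongoose": { "uri": "mongodb://localhost/oauth" },
  "domain": "localhost",
  "entry_point": "index.html",
  "encrypted_traffic": false,
  "certificate": "./ssl/server.crt",
  "key": "./ssl/server.key"
}"""

cfg = json.loads(config_text)
assert cfg["encrypted_traffic"] is False          # ssl turned off, as intended
assert cfg["mongoose"]["uri"].startswith("mongodb://")
print("port:", cfg["port"], "token life:", cfg["security"]["tokenLife"])
```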

we see that various collections of objects are now present in the mongodb.

Posted in odata

## Excel Power Query talking to NAR odata/atom feed; a first experiment

Now that we can issue basic http requests to the NAR webapi server (a set of node.js packages and setup), we configure excel power query to talk to the same server, modified slightly to allow a proxy to intercept the communications between excel and the webapi server.

First, we host the server on the “me” domain (rather than localhost), amending the hosts file. Then we alter excel so it uses the reverse proxy, listening on a particular port.

The latter step is achieved by adding a Microsoft.Mashup.Container.exe.config file in C:\Program Files (x86)\Microsoft Power Query for Excel\bin

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<system.net>
<defaultProxy>
<proxy autoDetect="false" bypassonlocal="false" proxyaddress="http://127.0.0.1:8888" usesystemdefault="false" />
</defaultProxy>
</system.net>
</configuration>

Of course, the windows system hosts file exposes “me” on the loopback adaptor:

::1             localhost me

the webapi server itself is instructed next to no longer listen on localhost but on me:

listener, changed to host = me

The webapi server does not seem to have the logic to issue authorization challenges, should the authorization headers be missing. So, we pre-configure excel with basic credentials:

When we have excel talk to = OData.Feed("http://me:42999/DataSystem.svc/?DataSystem"), we get

The fiddler proxy, spying on the communications now, has two sessions:

excel issues two http requests, with different accept headers:


We list the messages here:

GET http://me:42999/DataSystem.svc/?DataSystem HTTP/1.1
MaxDataServiceVersion: 3.0
Authorization: Basic YWRtaW46YWRtaW4=
Accept-Encoding: gzip, deflate
Host: me:42999
Connection: Keep-Alive

WITH RESPONSE

HTTP/1.1 200 OK
content-type: application/xml;charset=UTF-8
Vary: Accept-Encoding
Date: Wed, 03 Sep 2014 00:45:49 GMT
Connection: keep-alive
Content-Length: 7970

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<id>http://me:42999/DataSystem.svc</id>
<title type=”text”>DataSystem</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<entry>
<id>http://me:42999/DataSystem.svc/DataSystem(‘RESO API Server’)</id>
<link rel=”edit” title=”DataSystem” href=”DataSystem(‘RESO API Server’)” />
<title>Data Services for RESO API Server</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<author>
<name>Center for REALTOR Technology</name>
</author>
<content type=”application/xml”>
<m:properties>
<d:Name>RESO API Server</d:Name>
<d:ServiceURI>http://me:42999/DataSystem.svc</d:ServiceURI>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:45:47.726Z</d:DateTimeStamp>
<d:TransportVersion>0.9</d:TransportVersion>
<d:Resources m:type=”Collection(RESO.OData.Transport.Resource)”>
<d:element>
<d:Name>Property</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Property Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Member</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Member Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Office</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Office Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Contact</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Contact Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Media</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Media Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>History</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard History Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>SavedSearch</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SavedSearch Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OpenHouse</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OpenHouse Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Green</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Green Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Room</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Room Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>UnitType</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UnitType Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OtherPhone</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OtherPhone Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>SocialMedia</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SocialMedia Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OfficeUrlType</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OfficeUrlType Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>UserDefinedField</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UserDefinedField Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
</d:Resources>
<d:ID>RESO API Server</d:ID>
</m:properties>
</content>
</entry>
</feed>

Accept: application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8 REQUEST

GET http://me:42999/DataSystem.svc/?DataSystem HTTP/1.1
MaxDataServiceVersion: 3.0
Accept: application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8
Authorization: Basic YWRtaW46YWRtaW4=
Accept-Encoding: gzip, deflate
Host: me:42999

WITH RESPONSE

HTTP/1.1 200 OK
content-type: application/xml;charset=UTF-8
Vary: Accept-Encoding
Date: Wed, 03 Sep 2014 00:45:49 GMT
Connection: keep-alive
Content-Length: 7970

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<id>http://me:42999/DataSystem.svc</id>
<title type=”text”>DataSystem</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<entry>
<id>http://me:42999/DataSystem.svc/DataSystem(‘RESO API Server’)</id>
<link rel=”edit” title=”DataSystem” href=”DataSystem(‘RESO API Server’)” />
<title>Data Services for RESO API Server</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<author>
<name>Center for REALTOR Technology</name>
</author>
<content type=”application/xml”>
<m:properties>
<d:Name>RESO API Server</d:Name>
<d:ServiceURI>http://me:42999/DataSystem.svc</d:ServiceURI>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:45:52.836Z</d:DateTimeStamp>
<d:TransportVersion>0.9</d:TransportVersion>
<d:Resources m:type=”Collection(RESO.OData.Transport.Resource)”>
<d:element>
<d:Name>Property</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Property Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Member</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Member Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Office</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Office Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Contact</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Contact Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Media</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Media Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>History</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard History Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>SavedSearch</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SavedSearch Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OpenHouse</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OpenHouse Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Green</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Green Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>Room</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Room Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>UnitType</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UnitType Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OtherPhone</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OtherPhone Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>SocialMedia</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SocialMedia Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>OfficeUrlType</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OfficeUrlType Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
<d:element>
<d:Name>UserDefinedField</d:Name>
<d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UserDefinedField Resource</d:Description>
<d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
<d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
</d:element>
</d:Resources>
<d:ID>RESO API Server</d:ID>
</m:properties>
</content>
</entry>
</feed>

If we talk to the listing.svc, we get:

fiddler spying on the communications shows that excel expects the listing.svc to be supported by a \$metadata endpoint:

Posted in odata

## datasystem metadata from NAR webAPI server

The NAR webAPI server – a node.js package and bootstrap fileset – can be made to do something using the following first test harness:

1. run the node.js server and mongodb instances, per previous research notes.

2. create a fiddler request, in the compose tool, that will have fiddler talk to the http://localhost:42999/DataSystem.svc endpoint

3. use basic authentication settings (only), where the username and password are both “admin”, i.e.  ‘Authorization: Basic YWRtaW46YWRtaW4=’

4. add a querystring parameter, to denote a query that seeks metadata: http://localhost:42999/DataSystem.svc/?DataSystem
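The Authorization header in step 3 is nothing more than base64 of "admin:admin". Recomputing it is handy if the test credentials change:

```python
# Recompute the Basic auth header used in the fiddler compose tool:
# base64("admin:admin") gives the YWRtaW46YWRtaW4= value shown above.
import base64

def basic_auth_header(user, password):
    token = base64.b64encode(("%s:%s" % (user, password)).encode("ascii"))
    return "Basic " + token.decode("ascii")

print(basic_auth_header("admin", "admin"))  # Basic YWRtaW46YWRtaW4=
```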

the result is a first stream that shows we have something working!

For the request

GET http://localhost:42999/DataSystem.svc/?DataSystem HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-US,en;q=0.5
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; Touch; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: localhost:42999
DNT: 1
Authorization: Basic YWRtaW46YWRtaW4=
Connection: Keep-Alive

we get a response body of

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<id>http://localhost:42999/DataSystem.svc</id>
<title type="text">DataSystem</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<entry>
<id>http://localhost:42999/DataSystem.svc/DataSystem('RESO API Server')</id>
<link rel="edit" title="DataSystem" href="DataSystem('RESO API Server')" />
<title>Data Services for RESO API Server</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<author>
<name>Center for REALTOR Technology</name>
</author>
<content type="application/xml">
<m:properties>
<d:Name>RESO API Server</d:Name>
<d:ServiceURI>http://localhost:42999/DataSystem.svc</d:ServiceURI>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:23:47.006Z</d:DateTimeStamp>
<d:TransportVersion>0.9</d:TransportVersion>
<d:Resources m:type="Collection(RESO.OData.Transport.Resource)">
<d:element>
<d:Name>Property</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Property Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Member</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Member Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Office</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Office Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Contact</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Contact Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Media</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Media Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>History</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard History Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>SavedSearch</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SavedSearch Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>OpenHouse</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OpenHouse Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Green</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Green Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>Room</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard Room Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>UnitType</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UnitType Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>OtherPhone</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OtherPhone Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>SocialMedia</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard SocialMedia Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>OfficeUrlType</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard OfficeUrlType Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
<d:element>
<d:Name>UserDefinedField</d:Name>
<d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
<d:Description>RESO Standard UserDefinedField Resource</d:Description>
<d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
<d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
<d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
</d:element>
</d:Resources>
<d:ID>RESO API Server</d:ID>
</m:properties>
</content>
</entry>
</feed>

A quick look at the javascript code generating this response body shows that the xml is essentially canned, and the resource classes are programmed in RAM (and are not obtained from a db).

From the data dictionary package, we see the basic model being set up, with 15 entities and the associated entitycontext:

from this, evidently, an enumerator runs over the \$data declarations and creates a resource definition set – for use in configuring the core server's resourceList and index list (inferred from the metadata and entityset declarations):

So, from the metadata jaydata declarations, we get resourcelists and indexes, that are passed to the callback classes that are attached to the listing.svc and dataservice.svc routes:
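In outline, that enumeration amounts to something like the following sketch (the names metadata and resourceList here are illustrative, not the NAR code's own):

```javascript
// Hypothetical sketch of the pattern described above: enumerate jaydata-style
// entity declarations and derive the resource list the server is configured
// with. Entity/index names below are made up for illustration.
const metadata = {
  Property: { key: 'ListingKey', indexes: ['ListPrice'] },
  Member:   { key: 'MemberKey',  indexes: ['MemberLastName'] }
};

const resourceList = Object.keys(metadata).map((name) => ({
  name: name,
  serviceURI: 'http://localhost:42999/listing.svc',
  indexes: metadata[name].indexes
}));

console.log(resourceList.map((r) => r.name)); // [ 'Property', 'Member' ]
```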

End.

Posted in odata

## mongodb in azure

Let’s use the add on service to our Microsoft Azure cloud tenant, augmenting the fabric with mongodb:

http://msopentech.com/blog/2014/04/22/mongolab-offers-new-options-mongodb-developers-azure/

so when we apply the following:

we get a working mongodb instance

mongo ds052837.mongolab.com:52837/mongodb -u pwilliams -p …

https://mongolab.com/databases/mongodb

now we work a tutorial from node.js to our mongodb db/collection (before even attempting the NAR node.js integration with this instance):

our connection string, from azure, is mongodb://mongodb:yilxWKhgqv17AHdd3P4ur3PkYJxwkwVpdjCE6.CctrA-@ds052837.mongolab.com:52837/mongodb

we learn to install express (the scaffolding tool for the express web app framework)

http://expressjs.com/guide.html#executable

and upon using the tool, we get output similar to that suggested in the tutorial:

Per instructions we ‘npm install’

And install the “mongoose” db driver

Using visual studio and a nominal project, we then modify the application, per instructions (largely, with exceptions in app.js).

we assign the environment connection string, per instructions:

Clearly, we have a basic website up and running:

now, the site doesn’t work (but then we didn’t feel the instructions quite matched what we were seeing from the expected express scaffolding).

so we start again, this time using the microsoft express project type (to which we then add the controllers, routers, and app hookups, per the instructions).

Now we have success, once we publish to the same website as used earlier, replacing code:

With that done, we see clearly that the webapp’s postback page works, since we can see records in the db – using its management console:

Now that we have seen one express project talking to our mongodb instance via the mongoose driver, we can go have a more careful look at the NAR code. Perhaps we can see how to make it talk to our mongodb instance, and then host it in the azure cloud too.

Posted in nodejs

## how to host NAR node.js webapi demo in azure

With the node.js tool chain for visual studio, making a first node project and publishing to the azure cloud (as a website) could not have been any easier.

We then followed the alternative, more basic instructions based on git publishing:

http://azure.microsoft.com/en-us/documentation/articles/web-sites-nodejs-develop-deploy-mac/

This was not much harder than the visual studio centric method:

With this we are probably set to try to publish the node.js portion of the NAR webapi.

But, first we need an azure hosted mongodb!

Posted in nodejs

## Running NAR’s open source webAPI server

We followed some counsel to augment our Visual Studio 2013 installation with “node.js” tools, with a view to running NAR’s open source webapi server.

http://www.hanselman.com/blog/IntroducingNodejsToolsForVisualStudio.aspx

We learned about NAR’s open source material from a member’s only resource:

http://members.reso.org/display/TRANSPORTDF/NAR+Open+Source+Recap

which referred to

https://github.com/NationalAssociationOfRealtors

Using these sources and the knowhow taught by bloggers, we launched visual studio with a view to getting to the point where we can run and debug the NAR webAPI server. This required using git, in visual studio, to clone the repository for the reso-api-server and then make a new node.js project that imported the (existing) repository files.

Using the npm tools, we download the referenced packages. Next we followed the instructions to get the main package:

we had to change the dependency type from the default (and chose GLOBAL) to avoid the error shown in the console, left.

This all teaches us that perhaps we are going down the wrong path – not knowing what GLOBAL even is, etc.

So we start again, simply creating a new node.js application, to which we install the reso-api-server as a standard package. Then we follow the metadata and configuration setup instructions:

we take a guess at some reasonable configuration parameters:

#
# Configuration file for a RESO API Server
#
#
# API Service
#
COMPRESSION:        false
SERVER_DOMAIN:        localhost
SERVER_NAME:        RESO API Server
SERVER_PATH:        listing.svc
SERVER_PORT:        42999
SERVER_PROTOCOL:    http
#
# Data Processing
#
EXTERNAL_INDEX:        true
LOG_ENTRY:        false
PROCESS_WAIT:        15
RESPONSE_LIMIT:        200
#
# HTTPS Certificates
#
#CA_CERTIFICATE:        ./ssl/ca.crt
#SERVER_CERTIFICATE:    ./ssl/server.crt
#SERVER_KEY:        ./ssl/server.key
#
# Authentication
#
#AUTH_REALM:        RESO API SERVER
#AUTH_TYPE:        Basic
#AUTH_TYPE:        Digest
AUTH_TYPE:        None
#
# OAuth2 Services
#
#OAUTH2_URI:        https://localhost:1340/authorize

Running the sample startup file, we get

we attempt to fix by taking the “alternative” configuration path, using the data dictionary package.

Augmenting the instructions, we consult

This fixes one startup issue, but then we hit the next one – obviously related to SSL setup:

http://nodejs.org/api/tls.html

we install openssl (that NSA-friendly package full of strangely broken crypto and security protocols, supported by “american foundations”);

http://slproweb.com/products/Win32OpenSSL.html

Several things seem to get fixed merely by installing openssl (and adding the bin directory to the system path).

Without creating any ssl/ directory (or keys), and having adjusted the test.js and service.conf to be in the root project directory (and out of the metadata/ directory), we now get

which we guess means we are missing a mongodb server (listening on the port mentioned).

Let’s guess, given the code below, that we followed bad instructions initially – and somehow were not reading the correct files, in the right directories, etc.

as it actually stands, after all this confusion:

a) we have a mongodb instance

and

b) a visual studio debugging instance of the node.js process (that evidently has a connection to a mongodb with zero entities).

at which point we seem able to boot a component hosting a couple of services.

Posted in odata