web forms proxy, from Azure Active Directory to Amazon Cloud

We perform the steps outlined below to create a classical ASP.NET web forms application:

image

Now we will take the web forms work from the sample proxy and use it to invoke the Amazon web service’s STS, translating the openid-connect id-token into a “security credential” usable at the amazon web-based management console (and its APIs).
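
In outline, that translation is a single call to the STS action AssumeRoleWithWebIdentity: the proxy passes the role ARN, a session name and the provider-issued id-token, and gets back temporary AccessKeyId/SecretAccessKey/SessionToken values. A minimal C# sketch of the call, assuming the role configured later in these notes and an id-token already in hand (the session name is our own choice):

using System;
using System.Net;

class StsExchangeSketch
{
    // Exchanges an openid-connect id_token for temporary AWS credentials.
    // AssumeRoleWithWebIdentity can be called without request signing,
    // since the web-identity token itself is the proof of identity.
    static string AssumeRoleWithWebIdentity(string roleArn, string idToken)
    {
        var url = "https://sts.amazonaws.com/?Action=AssumeRoleWithWebIdentity" +
                  "&Version=2011-06-15" +
                  "&RoleSessionName=webforms-proxy" +
                  "&RoleArn=" + Uri.EscapeDataString(roleArn) +
                  "&WebIdentityToken=" + Uri.EscapeDataString(idToken);
        using (var wc = new WebClient())
        {
            // The XML response carries AccessKeyId, SecretAccessKey and SessionToken.
            return wc.DownloadString(url);
        }
    }
}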

Posted in Azure AD

RESO/NAR’s client proxy for odata & oauth–application (vs user) access

In the world of real estate, the big dog is NAR – the National Association of Realtors. At the national level it has lots of money to spend on its favorite vendors, it leads standards work, and it does politics. It’s a big-money trade association – with folks doing what they are supposed to be doing!

On one topic, there is technical strangeness afoot. It concerns leveraging cloud-scale security infrastructure to indirectly govern the connectivity of thousands of clients and servers in the states and localities where actual real estate transactions are generated, AND the activities of the newer “beyond client/server” vendor intermediaries looking for a “value-adding opportunity”. As a governance body, one wants to decide (without seeming to violate anti-trust) WHO gets to “add value” (and under what governance rules).

One well known intermediary, not seen to be dis-intermediating NAR’s Realtor brand, is Trulia – which has essentially come up with a new model (of doing old things). Good! There is a lot of money in advertising to folk visiting realtor listings (not that the realtor or the broker will see much of the cash!).

On technical security topics, NAR is not strong and seems to pick lower-skill security vendors for advice. For example, one might want to build a modernized webapp for searching listings that now talks to an odata API (where the webapp presents the required security tokens to the api provider, using the oauth scheme).

If one understood the openid connect scheme properly, one would KNOW how to use audiences, scopes and permissions professionally to allow only certain app deployments (and only those authorized web apps, at those certain sites) to offer public access. And one would not need to hide a username/password in the code of the webapp, or resort to other amateurish low-assurance practices. Note that one might store an application id/secret (in what is a trusted app), but not a USER credential set. Note the subtle difference: one allows the app to behave legally AS-IF it were the user; the other allows it to behave merely “with authority”.

If one does NOT really understand how the full security model works when enacting governance regimes for hugely distributed systems (even though the requirements are well understood in web app land for such cases as the public’s searching of listings), one does such things as inventing special “high security” web agent proxies with “special powers” – proxies that store a special “username/password” so as to obtain some default “USER” credential with which to access the guarded webapi.

This is an amateurish way of doing things (in the security world).

How would a professional security cloud vendor approach the problem?

take a look at this sample code (NAR/RESO):

https://github.com/AzureADSamples/WebApp-WebAPI-OAuth2-AppIdentity-DotNet

Note how user identities and app identities are handled when architected properly in multi-tier oauth environments. Compare it with the other projects in the sample set, which showcase different uses of the security protocols for different systemic objectives – cases where user controls and application controls serve different ends.
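
The core of the app-identity pattern in that sample is the client-credentials grant: the middle tier presents its application id and secret (never a user’s password) to the AAD token endpoint and receives an access token for the downstream webapi. A raw-HTTP sketch of that request, with tenant, client and resource values left as placeholders of ours:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class AppIdentityTokenSketch
{
    // Requests an app-only access token from the AAD token endpoint using the
    // OAuth2 client_credentials grant; no user credential is involved anywhere.
    static string GetAppOnlyToken(string tenant, string clientId, string clientSecret, string resource)
    {
        var tokenEndpoint = "https://login.windows.net/" + tenant + "/oauth2/token";
        var form = new NameValueCollection
        {
            { "grant_type", "client_credentials" },
            { "client_id", clientId },
            { "client_secret", clientSecret },
            { "resource", resource }   // app-id URI of the guarded webapi
        };
        using (var wc = new WebClient())
        {
            // JSON response: access_token, token_type, expires_on, ...
            var bytes = wc.UploadValues(tokenEndpoint, "POST", form);
            return Encoding.UTF8.GetString(bytes);
        }
    }
}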

Posted in RETS

Amazon cloud’s Identity fabric–interworking with AAD

We started reading a book on amazon’s cloud, in the course of which we started playing with the cloud console (and some of its identity management features). By chance, our eye was drawn to the security credentials menu option, off which hangs the ability to specify identity providers.

image

https://login.windows.net/bcbf53cf-af9a-4584-b4c9-6d8b01b3781d/federationmetadata/2007-06/federationmetadata.xml

Amazon’s help system describes how to use identity providers, thus:

image

http://docs.aws.amazon.com/IAM/latest/UserGuide/idp-managing-identityproviders.html

 

We first configured the SAML provider, learning to do some special edits – as noted.

Since our SAML IdP is AAD, whose federation metadata is at https://login.windows.net/bcbf53cf-af9a-4584-b4c9-6d8b01b3781d/federationmetadata/2007-06/federationmetadata.xml, we saved the stream to a file and edited away the xml declaration header.

image

getting us to some success

image

image

Since AAD, as IDP, will require an SP record, we note the amazon console’s audience name: https://signin.aws.amazon.com/saml.

The resulting “trust policy” is a set of conditions on what claims must be asserted by the IDP to be “trusted” as a source of identity. These are not access control rules!

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRoleWithSAML",
      "Principal": {
        "Federated": "arn:aws:iam::385727861301:saml-provider/faa"
      },
      "Condition": {
        "StringEquals": {
          "SAML:aud": "https://signin.aws.amazon.com/saml"
        }
      }
    }
  ]
}

back on the IDP, our netmagic.onmicrosoft.com tenant of course, we configure the SP record:

image

image

image

back on the SP, we set the access control policy for this now-trusted source of identities:

image

image

where we note our final review information:

Role Name: saml

Role ARN: arn:aws:iam::385727861301:role/saml

Trusted Entities: the identity provider arn:aws:iam::385727861301:saml-provider/faa

Permissions: Administrator Access

To test, we guess a little and learn from https://console.aws.amazon.com/iam/#home

image

that our LOCAL signin url is https://385727861301.signin.aws.amazon.com/console

image

which we customize to https://rapmlsqa.signin.aws.amazon.com/console.

we then add a user (similarly to how one adds pseudo-users to AAD):

image

image

It seems that the amazon flow only supports idp-initiated websso. Which stumps us.

Turning our attention to the openid connect integration alternative, we see that

image

image

image

which seems to read the openid connect metadata

To configure the SP side’s “roles”, we do as follows:

image

 

image

image

 

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Principal": {
        "Federated": "arn:aws:iam::385727861301:oidc-provider/login.windows.net/bcbf53cf-af9a-4584-b4c9-6d8b01b3781d/"
      },
      "Condition": {
        "StringEquals": {
          "login.windows.net/bcbf53cf-af9a-4584-b4c9-6d8b01b3781d/:aud": "3865f1d6-4e21-4468-a300-6093f8256c80"
        }
      }
    }
  ]
}

For the now-federated user’s permissions on the SP, we configure

image

 

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1414516823000",
      "Effect": "Allow",
      "Action": [
        "aws-portal:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}

Giving

image

 

With this we end our exploration – unable, so far, to complete the process of landing on the amazon console using a federated user!

 

But we learned a lot about amazon’s policy-based RP-STS service for hosted webapps/mobile-apps.

Posted in openid connect

American SSL crypto is just crap (even if it’s really Taiwanese electronics)

 

image

 

After 25 years of doing SSL, it’s worse than it was at the outset. But then, it’s US government policy to ensure it DOESN’T really work. The vendors duly facilitate, participating in the systemic deception of the public.

The issue is not only dlink’s firmware producing certificates with bad dates. The Microsoft platform is also part of the problem: it allowed the ActiveX control to load despite an invalid certificate. Because the cert is on a “trusted” chain, that policy overrides the invalidity.

But then, the Microsoft program in the crypto/trust area is essentially NSA policy – to ensure that their folk can invade a foreigner’s PC (with impunity, since we have little more than slave rights to the mighty, exceptional Americans) and can set such settings whenever that facilitates an easy crypto win.

Posted in dunno

Protected: public comment on RESO position “statement” on security work (for webapi)–password protected until publication

This content is password protected. To view it please enter your password below:

Posted in RETS

RESO/NAR oauth strategy – a humor in the style of a Greek tragedy

Despite hours of effort measured in the 100s, I have not really managed to get to the source of NAR/RESO’s oauth strategy. I’m left with interpreting the technical documents and comparing them with similar documents produced by other profiling groups. In summary, I’m confused. From what I can tell – IMHO, LOL and other teenage terms now used by managers in their fifties and sixties – it starts out “out of date”. Quite why it is so backward in its thinking seems to come down to the commercial motives of those doing the profiling: keep real estate IT backward, and a nice little earner for a select group. Oh, and ensure it pretends to satisfy anti-trust rules, while in reality blocking the entrance of cloud vendors and their millions of “tenants” into the IT market.

Hmm, the humor element is missing. So let’s add some, English style.

Not being a member of the exceptional class of human being (a class not seen since the end of the nazi era of “the exceptional race”), I looked at Microsoft Azure’s Mobile service for architectural guidance. How might OAuth be applied – TODAY – if one were to adopt cloud-centric thinking? For the purposes of learning by thinking through issues from all perspectives – and not only one’s own – one can now compare the different generations of oauth systems and their profiles. One can start the analysis by looking at the pre-cloud oauth architectures of 5 years ago – which is where RESO seems to be starting out.

Azure mobile is a post-RESO model of cloud thinking. It focuses on webapi-enabled mobile apps on several devices held by a community of users, supported by oauth security technology that is in turn supported by classical websso (using SAML and ws-fed). It comes both with a server-side model (where tokens are delivered from the OAuth AS directly to webapi endpoints in return for session tokens) and a client-side model (in which trusted apps maintain caches of tokens that they choose to renew and/or supply to one or more webapis).
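
The server-side model is the one a windows store client of the azure mobile packages uses: a one-line login call against the mobile service, which runs the websso handshake itself and hands back its own session token. A sketch, assuming the WindowsAzure.MobileServices client package in a windows store app and a mobile service URL/key of our own invention:

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

class ServerSideLoginSketch
{
    static async Task RunAsync()
    {
        // Hypothetical mobile service endpoint and application key.
        var client = new MobileServiceClient(
            "https://contoso-reso.azure-mobile.net/", "APPLICATION_KEY");

        // Server-side flow: the mobile service drives the AAD websso handshake
        // and returns a user object carrying the service's own session token.
        MobileServiceUser user = await client.LoginAsync(
            MobileServiceAuthenticationProvider.WindowsAzureActiveDirectory);

        Console.WriteLine("Logged in as {0}", user.UserId);
    }
}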

If we characterize RESO’s design in terms of Azure Mobile, RESO is a clumsy server-side scheme – of a kind not seen since the early days of oauth, when folks invested perhaps at most a couple of days of development effort as they bootstrapped their ideas, linking up some existing login page to the early webapis. But why would NSA be interested in this class of effort?

It’s just preposterous to think of NSA vs. RESO in terms of NSA vs Web (and the “perhaps” humorous model of plantations of enslaved web users, with NSA as the model “master” keeping the 8 billion non-exceptional slaves – like me – in line with systemic threat and surveillance culture, all wrapped up in a flag). It is, nonetheless, more proper to see the security planning of small profiling groups in terms of more well-intentioned national programs, such as those in identity management or cybersecurity preparation. What a contrast (NSA the spy, NSA the protector)!

If NSA was “subverting” RESO, in the sense that it compromised many a cipher device in the 70s through cozy relations with vendors and standards groups, why would it be attempting to keep things so backwards – in such an important economic sector as real estate? Why would the security technology be out of date, before the spec is even voted on? Why would the management dynamics of the standards group be so contrived? Why would “they” want it so “small-minded”?

I’ve heard several proposed answers to such questions. One answer – the vendor rationale – holds that the program is a last-gasp attempt to prevent the cloud revolution from replacing the traditional set of vendors. Another last-gasp theory – the NAR rationale – contends that it is all about deterring dis-intermediation: protecting against the entrance of new economic players with novel approaches to “facilitating property conveyance”, and making it hard for them to replace the core assurances provided by the human Realtor. One slightly more realistic answer – the NSA paranoia rationale – proposes that NSA is simply doing what it knows how to do well as an institution: motivate adoption of the modern cloud in areas of real critical infrastructure, while keeping things nicely server-side for the rest of the general economy on which it spies – engendering adoption of older standards for which the subversion techniques are well known, are known to work well, and can be assured to work through deception and financial bribes (or golf trips) to the right people, in the right places. As they contrive.

Well the humor never ceases – as the game of crypto and cryptanalysis plays out.

Posted in coding theory

dlink camera notification/storage to live.com / outlook.com /hotmail.com

Configuration parameters.

image

Posted in coding theory

visualizing odata source model

 

we followed the instructions noted below:

image

We had to install visual studio 2012, specifically, and the noted plug-in. Since the netflix feed no longer exists, we used an alternative:

image

http://odata.msteched.com/sessions.svc/

The result was a model diagram:

image

Posted in coding theory

Prep for RESO plugfest

From

image

image

To use this, we will need to (a) not use default credentials, (b) change the URL, and (c) amend the xml parsing to match the entities being returned by a RESO webapi server (a C# sketch of that adaptation follows the script below).

 

# Construct the WebClient object
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true

# Query the OData service by url (single quotes so that $top is not expanded as a PowerShell variable)
$queryUri = 'http://localhost:42203/NorthwindOData.svc/Categories(1)/Products?$top=5'
[xml] $responseXML = $wc.DownloadString($queryUri)

# Parse the query result, returned in Atom/XML format
$entities = $responseXML.SelectNodes("//*[local-name() = 'properties']")
ForEach ($entity in $entities) {
    $prodID = $entity.SelectSingleNode("./*[local-name() = 'ProductID']").InnerText
    $prodName = $entity.SelectSingleNode("./*[local-name() = 'ProductName']").InnerText
    $prodUnitPrice = $entity.SelectSingleNode("./*[local-name() = 'UnitPrice']").InnerText
    $prodLine = [string]::Format("ID:{0}, Name:{1}, UnitPrice:{2}", $prodID, $prodName, $prodUnitPrice)
    Write-Host $prodLine -ForegroundColor Green
}
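
As a sketch of the adaptation in (a)–(c), here is a C# equivalent pointed at the NAR webapi server described later in these notes (the me:42999 host, the admin/admin Basic credentials and the ListingRID field come from those notes; the /listing.svc/Property?$top=5 query path is our guess):

using System;
using System.Net;
using System.Text;
using System.Xml.Linq;

class PlugfestQuerySketch
{
    static void Main()
    {
        using (var wc = new WebClient())
        {
            // (a) send an explicit Basic Authorization header - the server issues no challenges
            wc.Headers[HttpRequestHeader.Authorization] =
                "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes("admin:admin"));

            // (b) the RESO webapi server rather than the Northwind sample
            var queryUri = "http://me:42999/listing.svc/Property?$top=5";
            var doc = XElement.Parse(wc.DownloadString(queryUri));

            // (c) parse the RESO entity properties instead of the Northwind ones
            XNamespace m = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";
            XNamespace d = "http://schemas.microsoft.com/ado/2007/08/dataservices";
            foreach (var props in doc.Descendants(m + "properties"))
            {
                Console.WriteLine("ListingRID: {0}", (string)props.Element(d + "ListingRID"));
            }
        }
    }
}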

Posted in coding theory

commodity crypto entrapment by the likes of Google, Apple et al

I’m being really unfair in only including Google and Apple in the title, above. It really ought to include… almost any large American firm. And, what’s more, any large French, British or German firm too. And any Russian or Chinese firm… (etc).

But one difference between the Russian and American firms is the manner of the deception. One claims to be honest (while having enormous capability for spying on all of us); the other, “everyone knows”, is spying (while having self-imposed limits). One markets the appearance of strong “unbreakable” commodity crypto and computer security to those who are “trusted” (today); the other openly admits its crypto and security is by NO means unbreakable – by official trust policy.

In the American model, citizenship means that one is supposed to buy into the social need to accept the social lie – that crypto works ALL THE TIME, ONCE PURCHASED – in order that foreigners can be duped into the same belief system. The citizen gives up actual individual security in favor of security provided by the state, in order that the state can spy on all foreigners – who “unwittingly” happen to use American crypto, which doesn’t work once triggered.

But therein lies the lie. Of course American crypto “works” – it’s just that the implementation you use doesn’t, once a number of things happen. And how those triggers get applied is a secret MUCH bigger than the crypto secrets. The triggers are what turn off even the appearance of crypto actually working as ONE EXPECTS!

The typical American doesn’t realize the following things, done in the name of trusted citizenship (and doesn’t want to know, being trusted):

1. Video cameras track every vehicle license plate

2. Traffic signal induction loops count bodies in cars

3. Software-based devices are typically, and trivially, enabled for a remote software switch (so that the vendor’s marketing “assurances” no longer apply, since the software security changes…)

4. GSM enables certain spy satellites, once launched, to track the act of RECEPTION of GSM signals

5. Other electronics can be induced, on one channel, to act as a radio receiver – and thereby behave largely as in 4 on another channel

All in all, folks should know that the very “electronics” infrastructure is carefully wired for tracking the source of signals – which is of course the art of first breaking the cryptoNET (before breaking the crypto). Folks MUST know that the Googles and Apples actively participate in the systemic deception intended to persuade the typical foreigner that the crypto works – until the crypto-net surveillance and tracking built into civilian infrastructure (in the name of national security, not military infrastructure subject to arms treaties) makes it otherwise.

Posted in coding theory

learning from reso odata and oauth – and onto documentdb’s security model

it was fun to see how NAR politics has manipulated the RESO process, in both the odata and oauth spaces. But we can move on, since there is no more R&D to be done in the area for now.

Now, what we did learn was all about node.js, mongodb and the apparatus that goes with it. We can start to look into the windows world’s equivalent (documentdb) and see what a “pure” json-centric webapi server looks like – when constructed using azure cloud principles and microsoft libraries.

in particular, we need to see just how folks have orchestrated how an azure mobile site really works with a documentdb collection and the db account, in terms of identity, identity pass-through, 3-tier systems, and the like.

In the NAR node.js prototype, we saw how the webapi server expected to work with mongodb accounts. In the azure mobile work with sql and odata/table stores, we saw how the schema or the master partition key was used to segment the spaces by either app or by user (within the app). With documentdb, let’s hope we have a much better thought-through model – in the sense that it all cues off tokens, token handoffs, claims, etc.
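
As a reminder of how that table-store segmentation looks in code, a per-user partition query is just a filter on PartitionKey (a sketch using the Microsoft.WindowsAzure.Storage client; the table name and the use of the authenticated user id as PartitionKey are our illustration):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class PartitionSegmentationSketch
{
    // Returns only rows whose PartitionKey equals the authenticated user's id -
    // the "segment the space by user (within the app)" pattern.
    static TableQuery<DynamicTableEntity> QueryForUser(string userId)
    {
        return new TableQuery<DynamicTableEntity>().Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, userId));
    }

    static void Main()
    {
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var table = account.CreateCloudTableClient().GetTableReference("TodoItem");

        foreach (var entity in table.ExecuteQuery(QueryForUser("sid:12345")))
            Console.WriteLine(entity.RowKey);
    }
}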

Before we go there, let’s go have a looksie at modern joomla hosting in azure websites and Amazon EC2 web services (and its security model). It will be interesting to compare azure and amazon, and see how much commonality there is around upper-layer security.

Posted in coding theory

A 3-tier odata solution for NAR/RESO, with server-side oauth

We followed up the blog post at http://blogs.msdn.com/b/paolos/archive/2013/04/09/how-to-use-windows-azure-table-storage-in-windows-azure-mobile-services.aspx

image

which shows how to create a 3-element system, composed of a windows store client (armed with the ability to use the login capabilities of the azure mobile packages), an azure mobile middleware site (with which the app collaborates to perform what is technically a server-side login), and an odata webapi instance (hosted in the azure table storage service).

image

 

image

Though this client gives us one viewer, it is very much tuned to login and end-user interaction. We can also run the more basic odata-centric viewer client, allowing us to talk directly to the odata/webapi repository of entities underlying the above. We see:

image

We see, from a fiddler trace, the interaction between this odata client and the odata server (for a JSON-encoded response):

image

alternatively, we can remove the accept header and allow the default output format to emerge:

image

End.

Posted in coding theory

using azure table service for NAR RESO odata

 

Some notes on what we did to call the azure table service’s entity endpoints, and $metadata.

First we built a code app for table storage management:

image

image

The tool needs a webapi server – i.e. the cloud table service. So we created ourselves an “instance” of the webapi:

image

Server and data repository created, we used the tool to post and update an entity.

Since this tool does not actually bother to read (or update) metadata for the entities it posts, we found ourselves using other samples to make a simple command line tool that simply issues canned queries (with suitable security headers attached). This at least now reads the metadata (EDMX):

using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Net;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;
using System.Xml.Linq;

namespace ConsoleApplication1
{
    class Program
    {
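        // Builds a GET request whose Authorization header uses the table service's
        // SharedKeyLite scheme: an HMAC-SHA256 over "<x-ms-date>\n/<account><resource-path>",
        // keyed with the storage account key.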
        static HttpWebRequest GenerateODataWebRequestForAzureStorage(string url, string accountName, string accountKey)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "GET";
            request.Headers.Add("x-ms-date", DateTime.UtcNow.ToString("R", CultureInfo.InvariantCulture));
            var resource = request.RequestUri.AbsolutePath;
            string stringToSign = string.Format("{0}\n/{1}{2}", request.Headers["x-ms-date"], accountName,
            resource
            );
            var hasher = new HMACSHA256(Convert.FromBase64String(accountKey));
            string signedSignature = Convert.ToBase64String(hasher.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            string authorizationHeader = string.Format("{0} {1}:{2}", "SharedKeyLite", accountName, signedSignature);
            request.Headers.Add("Authorization", authorizationHeader);
            return request;
        }

        static void QueryTableItemsWithRawODataHttp()
        {
            var accountName = "reso";
            var accountKey = "ai8cqXvEv6hE17qsgnFw2TK0rEkMA84EU8xQ/dIrH4KAHfTaB27HFFOowM9LVQroClukysG223KcUPb6md2PRA==";
            var queryUrl = string.Format("http://{0}.table.core.windows.net/Property?$top=5", accountName);
            var queryUrl1 = string.Format("http://{0}.table.core.windows.net/$metadata", accountName);
            var queryUrl2 = string.Format("http://{0}.table.core.windows.net/Tables()", accountName);

            var request1 = GenerateODataWebRequestForAzureStorage(queryUrl1, accountName, accountKey);
            var response1 = request1.GetResponse();
            using (var sr = new StreamReader(response1.GetResponseStream()))
            {
                var doc = XElement.Load(sr);
                Console.WriteLine(doc);
            }

            Console.WriteLine();
            Console.WriteLine();
            Console.WriteLine();

            var request2 = GenerateODataWebRequestForAzureStorage(queryUrl2, accountName, accountKey);
            var response2 = request2.GetResponse();
            using (var sr = new StreamReader(response2.GetResponseStream()))
            {
                var doc = XElement.Load(sr);
                Console.WriteLine(doc);
            }

            Console.WriteLine();
            Console.WriteLine();
            Console.WriteLine();

            var request = GenerateODataWebRequestForAzureStorage(queryUrl, accountName, accountKey);
            var response = request.GetResponse();
            using (var sr = new StreamReader(response.GetResponseStream()))
            {
                var doc = XElement.Load(sr);
                var nsMetadata = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";
                var nsSchema = "http://schemas.microsoft.com/ado/2007/08/dataservices";
                foreach (var elmItem in doc.Descendants(XName.Get("properties", nsMetadata)))
                {
                    var name = elmItem.Descendants(XName.Get("ListingRID", nsSchema)).First().Value;
                    var partitionKey = elmItem.Descendants(XName.Get("PartitionKey", nsSchema)).First().Value;
                    Console.WriteLine("ListingRID:{0}, RowKey:{1}", name, partitionKey);
                }
            }
        }


        static void Main(string[] args)
        {

            QueryTableItemsWithRawODataHttp();
        }
    }
}

This allows us to see in fiddler what is passing on the wire (in XML format):

image

Posted in coding theory

p-adics – the hidden cryptanalysis model

One thing Turing was known for, in the first computer period to follow WWII, was his showcasing of how to add large numbers, having written the “numbers” out, unusually, from right to left. The carry rule would still move right to left, yet enable correct arithmetic. Of course, any first-year computer science student studies one’s and two’s complement “representations” – the computer representation of signed integers that makes them amenable to digital electronics doing adding. The study of all this has a much deeper background, however, coming from the historical interplay between the art of machine-based cryptanalytical methods and the study of the limits of machine computability – a historical topic still largely un-discussed, and probably even today treated as if the theory were still a “secret weapon”.

The presenter of the video at https://www.youtube.com/watch?v=vdjYiU6skgE does a good job of stating what folks knew in the 1930s about the mathematical toolkit of “techniques” underlying cryptanalytical computing. The cryptanalytical “math secrets” of the WWII period also concerned this “theory” of computing/cryptanalysis – founded, as we see from the video presentation, in projective geometry, p-adic distance metrics, special relativity (for timebases), conformal field ideas, Cauchy sequences, and notions of causality between regions of certain geometric spaces. The use of such apparatus elements to represent large numbers had a simple initial objective: express numbers and fractions at high accuracy so as to finesse colossus-style processing of correlations to be found in noisy signaling channels. This would have enabled cryptanalysts even in 1947 to hold, say, 200-digit-accuracy decimals in the flexible 1946-era drum memory used by the first manchester computers. The resulting adders could then do the p-adic arithmetic – or two’s complement (when p=2). And “this was the secret”.
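
A small worked illustration of the p=2 case (ours, not the video’s): in the 2-adic metric the geometric series of powers of two converges, and its sum behaves exactly like the all-ones two’s-complement word,

\sum_{k=0}^{\infty} 2^k = \frac{1}{1-2} = -1 \quad\text{(2-adically)}, \qquad \text{so} \quad \cdots 1111_2 \equiv -1 .

Truncating to an n-bit word gives the familiar rule -x \equiv 2^n - x \pmod{2^n}; that is, two’s complement is just 2-adic arithmetic carried out modulo 2^n.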

But p-adics are more fundamental to cryptanalysis than mere two’s complement – being the “raw theory” of branching. When the Yanquis claim that only their own first research machines realized the modern (i.e. Turing) “stored program concept”, based on a tape’s stored instruction indicating a branch, we might ask whether that is only one form of the underlying branching theory – not that you will hear such a question posed by the NSA-crypto (and therefore American) historians.

P-adics should be thought of as Turing was taught them in Newman’s “special” Topology classes – being the nesting of certain geometric relations and then the mapping of the nested spaces onto 1-dimensional lattices and tree structures underlying space-filling curves – where the branching of such trees is rather intuitive. Branching in this sense – through implied spaces – is rather more “algorithmic” than merely having some stored processor recognize the ‘conditional jump’ bit pattern on a tape!

The art of cryptanalysis is now, and was in Turing’s day, all about having computing models that no one thinks you even might have… let alone thinks you have reduced to practice (and certainly not made into cost-effective technologies). Iterative algorithms for updating colossus-style calculations of conditional probabilities – teasing out the most likely candidates for solving some crypto puzzle – all have their roots in discrete Schrödinger equation calculations, hyperbolic surfaces and conformal coordinates, and leverage symmetric matrices to update the convergence-based newtonian “root-finding algorithms” as more and more depths of information were fed into the graph-search.

Posted in coding theory

NAR oauth and odata servers

Let’s take a solution and add to it our 2 working projects – an odata feed and endpoint, and an authorization server (with a grant management UI, too).

 

image

 

Now we need a client that can orchestrate the handshake.

Posted in oauth, odata

using NAR OAUTH server management plane

 

we were able to register a first client, entitled to seek an authorization code grant:

 

image

 

The values, distinct from the display name we entered as “clientid”, are all generated:

Name: clientid (Subscribers will use this)
client_id: 7ClcPY1GXEephAZZ
client_secret: gBtUn2I2o5gx38cVearV4WrKoVOcz1Jfr7r3OdlT
redirect_uri: http://localhost/

we used the

image

image

 

 

Looking at the code, we see that each form posts back to a route registered in the setup:

 

image

image

 

This step is performed by the authorization server manager, of course – the custodian of the data (acting on behalf of the resource owner). The next step is performed by the administrative user of the client, who must authenticate to the AS in order to get the one-time code that seeds obtaining the first access token (and any refresh token):

image

image

For some reason, NAR refer to this process as one of “enablement”.
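
For reference, the exchange that the one-time code seeds is the standard authorization-code token request. A sketch against the client registered above, using the port 1340 from the server config noted later; the /oauth2/token path and form field names are conventional assumptions, not routes confirmed from the NAR code:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class EnablementExchangeSketch
{
    // Redeems the one-time authorization code for the first access token
    // (and any refresh token) from the NAR authorization server.
    static string RedeemCode(string code)
    {
        var form = new NameValueCollection
        {
            { "grant_type", "authorization_code" },
            { "code", code },
            { "client_id", "7ClcPY1GXEephAZZ" },
            { "client_secret", "gBtUn2I2o5gx38cVearV4WrKoVOcz1Jfr7r3OdlT" },
            { "redirect_uri", "http://localhost/" }
        };
        using (var wc = new WebClient())
        {
            // Assumed token route; substitute whatever the server's setup actually registers.
            var bytes = wc.UploadValues("http://localhost:1340/oauth2/token", "POST", form);
            return Encoding.UTF8.GetString(bytes);   // JSON with access_token / refresh_token
        }
    }
}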

 

In another experiment, we can start to hook up the webapi server and authorization server.

Posted in oauth

NAR/CRT OAUTH2 server for webAPI

We managed to get the NAR/CRT OAUTH2 server booted and able to serve some pages.

image

On Windows these were the steps we took:

1) we created a node web application (microsoft model, for easy azure hosting)

image

2) on the npm line, we right-clicked to add a node package, using the package manager, which we primed to search on reso-

image

3) per instructions, we copied config files and directory from the samples directory to the root

image

4) we amended the default server.js file

image

and started the mongodb server

 

we see

image

 

image

the config file was changed, from the original, to turn off ssl/encrypted-traffic:

 

{
  "port" : 1340,
  "security": {
    "tokenLife" : 3600
  },
  "mongoose": {
    "uri": "mongodb://localhost/oauth"
  },
  "domain": "localhost",
  "entry_point": "index.html",
  "encrypted_traffic": false,
  "certificate": "./ssl/server.crt",
  "key": "./ssl/server.key"
}

we see that various collections of objects are now present in the mongodb.

image

Posted in odata

Excel Power Query talking to NAR odata/atom feed; a first experiment

Now that we can issue basic http requests to the NAR webapi server (a set of node.js packages and setup), we configure excel power query to talk to the same server, modified slightly to allow a proxy to intercept the communications between excel and the webapi server.

First, we host the server on the “me” domain (rather than localhost), amending the hosts file. Then we alter excel so it uses the reverse proxy, listening on a particular port.

image

 

The latter step is achieved by adding a Microsoft.Mashup.Container.exe.config file in C:\Program Files (x86)\Microsoft Power Query for Excel\bin:

 

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.net>
    <defaultProxy>
      <proxy autoDetect="false" bypassonlocal="false" proxyaddress="http://127.0.0.1:8888" usesystemdefault="false" />
    </defaultProxy>
  </system.net>
</configuration>

 

Of course, the windows system hosts file maps “me” onto the loopback adapter:

 

::1             localhost me

 

the webapi server itself is instructed next to no longer listen on localhost but on me:

 

image

listener, changed to host = me

 

The webapi server does not seem to have the logic to issue authorization challenges should the authorization header be missing. So, we pre-configure excel with basic credentials:

(admin/admin).

image
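
Those credentials are what appear in the traces below as the Authorization header; the header value is simply the base64 of “user:password”. A quick check (ours):

using System;
using System.Text;

class BasicHeaderCheck
{
    static void Main()
    {
        // base64("admin:admin") == "YWRtaW46YWRtaW4=", the value seen in the fiddler captures
        var header = "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes("admin:admin"));
        Console.WriteLine(header);   // prints: Basic YWRtaW46YWRtaW4=
    }
}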

 

When we have excel talk to = OData.Feed("http://me:42999/DataSystem.svc/?DataSystem"), we get

image

The fiddler proxy, spying on the communications now, has two sessions:

image

excel issues two http requests, with different accept headers:

Accept: application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8 REQUEST
Accept: application/json;odata=minimalmetadata;q=1.0,application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8 REQUEST

 

We list the messages here:

#1 : Accept: application/json;odata=minimalmetadata;q=1.0,application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8 REQUEST

GET http://me:42999/DataSystem.svc/?DataSystem HTTP/1.1
User-Agent: Microsoft.Data.Mashup (http://go.microsoft.com/fwlink/?LinkID=304225)
MaxDataServiceVersion: 3.0
Accept: application/json;odata=minimalmetadata;q=1.0,application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8
Authorization: Basic YWRtaW46YWRtaW4=
Accept-Encoding: gzip, deflate
Host: me:42999
Connection: Keep-Alive

WITH RESPONSE

HTTP/1.1 200 OK
content-type: application/xml;charset=UTF-8
Vary: Accept-Encoding
Date: Wed, 03 Sep 2014 00:45:49 GMT
Connection: keep-alive
Content-Length: 7970

<?xml version=”1.0″ encoding=”utf-8″?>
<feed xml:base=”http://me:42999/DataSystem.svc/” xmlns=”http://www.w3.org/2005/Atom” xmlns:d=”http://schemas.microsoft.com/ado/2007/08/dataservices” xmlns:m=”http://schemas.microsoft.com/ado/2007/08/dataservices/metadata” xmlns:georss=”http://www.georss.org/georss” xmlns:gml=”http://www.opengis.net/gml”>
<id>http://me:42999/DataSystem.svc</id>
<title type=”text”>DataSystem</title>
<updated>2014-09-01T02:08:32.000Z</updated>
<link rel=”self” title=”DataSystem” href=”DataSystem” />
<entry>
  <id>http://me:42999/DataSystem.svc/DataSystem(‘RESO API Server’)</id>
  <category term=”RESO.OData.Transport.DataSystem” scheme=”http://schemas.microsoft.com/ado/2007/08/dataservices/scheme” />
  <link rel=”edit” title=”DataSystem” href=”DataSystem(‘RESO API Server’)” />
  <title>Data Services for RESO API Server</title>
  <updated>2014-09-01T02:08:32.000Z</updated>
  <author>
   <name>Center for REALTOR Technology</name>
  </author>
  <content type=”application/xml”>
   <m:properties>
    <d:Name>RESO API Server</d:Name>
    <d:ServiceURI>http://me:42999/DataSystem.svc</d:ServiceURI>
    <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:45:47.726Z</d:DateTimeStamp>
    <d:TransportVersion>0.9</d:TransportVersion>
    <d:Resources m:type=”Collection(RESO.OData.Transport.Resource)”>
    <d:element>
     <d:Name>Property</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Property Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Member</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Member Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Office</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Office Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Contact</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Contact Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Media</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Media Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>History</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard History Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>SavedSearch</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard SavedSearch Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>OpenHouse</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard OpenHouse Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Green</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Green Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>Room</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard Room Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>UnitType</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard UnitType Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>OtherPhone</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard OtherPhone Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>SocialMedia</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard SocialMedia Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>OfficeUrlType</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard OfficeUrlType Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    <d:element>
     <d:Name>UserDefinedField</d:Name>
     <d:ServiceURI>http://me:42999/listing.svc</d:ServiceURI>
     <d:Description>RESO Standard UserDefinedField Resource</d:Description>
     <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-03T00:39:36.467Z</d:DateTimeStamp>
     <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
     <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
    </d:element>
    </d:Resources>
    <d:ID>RESO API Server</d:ID>
   </m:properties>
  </content>
</entry>
</feed>

 

#2 : Accept: application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8 REQUEST

 

GET http://me:42999/DataSystem.svc/?DataSystem HTTP/1.1
User-Agent: Microsoft.Data.Mashup (http://go.microsoft.com/fwlink/?LinkID=304225)
MaxDataServiceVersion: 3.0
Accept: application/atomsvc+xml;q=0.8,application/atom+xml;q=0.8,application/xml;q=0.8,text/plain;q=0.8
Authorization: Basic YWRtaW46YWRtaW4=
Accept-Encoding: gzip, deflate
Host: me:42999

 

WITH RESPONSE

 

HTTP/1.1 200 OK
content-type: application/xml;charset=UTF-8
Vary: Accept-Encoding
Date: Wed, 03 Sep 2014 00:45:49 GMT
Connection: keep-alive
Content-Length: 7970

(The body is the same Atom feed as returned for request #1 above, byte for byte apart from the top-level DateTimeStamp value; despite the different Accept preferences, the server returns application/xml both times.)

If we talk to the listing.svc, we get:

image

fiddler spying on the communications shows that excel expects the listing.svc to be supported by a $metadata endpoint:

image

Posted in odata

datasystem metadata from NAR webAPI server

image

 

The NAR webAPI server, a node.js package and bootstrap fileset, can be made to do something using the following first test harness:

 

1. run the node.js server and mongodb instances, per previous research notes.

2. create a fiddler request, in the compose tool, that will have fiddler talk to the http://localhost:42999/DataSystem.svc endpoint

3. use basic authentication settings (only), where the username and password are both “admin”, i.e.  ‘Authorization: Basic YWRtaW46YWRtaW4=’

image

4. add a querystring parameter, to denote a query that seeks metadata: http://localhost:42999/DataSystem.svc/?DataSystem

 

the result is a first stream that shows we have something working!

For the request

GET http://localhost:42999/DataSystem.svc/?DataSystem HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-US,en;q=0.5
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; Touch; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Host: localhost:42999
DNT: 1
Authorization: Basic YWRtaW46YWRtaW4=
Connection: Keep-Alive

we get a response body of

<?xml version=”1.0″ encoding=”utf-8″?>

<feed xml:base=”http://localhost:42999/DataSystem.svc/” xmlns=”http://www.w3.org/2005/Atom” xmlns:d=”http://schemas.microsoft.com/ado/2007/08/dataservices” xmlns:m=”http://schemas.microsoft.com/ado/2007/08/dataservices/metadata” xmlns:georss=”http://www.georss.org/georss” xmlns:gml=”http://www.opengis.net/gml”>
  <id>http://localhost:42999/DataSystem.svc</id>
  <title type=”text”>DataSystem</title>
  <updated>2014-09-01T02:08:32.000Z</updated>
  <link rel=”self” title=”DataSystem” href=”DataSystem” />
  <entry>
    <id>http://localhost:42999/DataSystem.svc/DataSystem(‘RESO API Server’)</id>
    <category term=”RESO.OData.Transport.DataSystem” scheme=”http://schemas.microsoft.com/ado/2007/08/dataservices/scheme” />
    <link rel=”edit” title=”DataSystem” href=”DataSystem(‘RESO API Server’)” />
    <title>Data Services for RESO API Server</title>
    <updated>2014-09-01T02:08:32.000Z</updated>
    <author>
      <name>Center for REALTOR Technology</name>
    </author>
    <content type=”application/xml”>
      <m:properties>
        <d:Name>RESO API Server</d:Name>
        <d:ServiceURI>http://localhost:42999/DataSystem.svc</d:ServiceURI>
        <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-02T18:23:47.006Z</d:DateTimeStamp>
        <d:TransportVersion>0.9</d:TransportVersion>
        <d:Resources m:type=”Collection(RESO.OData.Transport.Resource)”>
          <d:element>
            <d:Name>Property</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Property Resource</d:Description>
            <d:DateTimeStamp m:type=”Edm.DateTime”>2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type=”Edm.Int32″>-7</d:TimeZoneOffset>
            <d:Localizations m:type=”Collection(RESO.OData.Transport.Localization)” />
          </d:element>
          <d:element>
            <d:Name>Member</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Member Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>Office</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Office Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>Contact</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Contact Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>Media</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Media Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>History</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard History Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>SavedSearch</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard SavedSearch Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>OpenHouse</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard OpenHouse Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>Green</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Green Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>Room</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard Room Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>UnitType</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard UnitType Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>OtherPhone</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard OtherPhone Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>SocialMedia</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard SocialMedia Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>OfficeUrlType</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard OfficeUrlType Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
          <d:element>
            <d:Name>UserDefinedField</d:Name>
            <d:ServiceURI>http://localhost:42999/listing.svc</d:ServiceURI>
            <d:Description>RESO Standard UserDefinedField Resource</d:Description>
            <d:DateTimeStamp m:type="Edm.DateTime">2014-09-02T18:16:34.949Z</d:DateTimeStamp>
            <d:TimeZoneOffset m:type="Edm.Int32">-7</d:TimeZoneOffset>
            <d:Localizations m:type="Collection(RESO.OData.Transport.Localization)" />
          </d:element>
        </d:Resources>
        <d:ID>RESO API Server</d:ID>
      </m:properties>
    </content>
  </entry>
</feed>

 

A quick look at the javascript code generating this response body shows

image

that the xml is essentially canned, and the resource classes are programmed in ram (and are not obtained from a db).

 

From the data dictionary package, we see the basic model being set up, with 15 entities and the associated EntityContext:

 

image

 

From this, evidently, an enumerator runs over the $data declarations and creates a resource definition set – used to configure the core server’s resourceList and index list (inferred from the metadata and entityset declarations):

 

image

 

So, from the JayData metadata declarations, we get resource lists and indexes that are passed to the callback classes attached to the listing.svc and dataservice.svc routes (a rough sketch follows the screenshot):

image
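Since the screenshot doesn’t carry the code here, a minimal sketch – not the NAR source itself – of what such an enumerator amounts to. The helper name buildResourceList is ours, and the resource names are simply those visible in the service document response above (the entries preceding the excerpt are omitted):

// minimal sketch (not the NAR code): turn a list of entity-type names,
// as declared in the metadata module, into the <d:Resources> collection shown above
var serviceUri = 'http://localhost:42999/listing.svc';

function buildResourceList(entityNames) {
  var stamp = new Date().toISOString();          // the server stamps each resource identically
  return entityNames.map(function (name) {
    return {
      Name: name,
      ServiceURI: serviceUri,
      Description: 'RESO Standard ' + name + ' Resource',
      DateTimeStamp: stamp,
      TimeZoneOffset: -7,
      Localizations: []
    };
  });
}

// names taken from the canned response above
var resourceList = buildResourceList(['Office', 'Contact', 'Media', 'History',
  'SavedSearch', 'OpenHouse', 'Green', 'Room', 'UnitType', 'OtherPhone',
  'SocialMedia', 'OfficeUrlType', 'UserDefinedField']);

The real server presumably derives the names from the JayData metadata module rather than a literal list; the point is only that the service document is assembled in memory, which matches the canned XML above.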

End.

Posted in odata

mongodb in azure

Let’s use the add-on service for our Microsoft Azure cloud tenant, augmenting the fabric with mongodb:

image

http://msopentech.com/blog/2014/04/22/mongolab-offers-new-options-mongodb-developers-azure/

so when we apply the following:

image

image

we get a working mongodb instance

image

mongo ds052837.mongolab.com:52837/mongodb -u pwilliams -p …

https://mongolab.com/databases/mongodb

 

now we work a tutorial from node.js to our mongodb db/collection (before even attempting the NAR node.js integration with this instance):

our connection string, from azure,  is mongodb://mongodb:yilxWKhgqv17AHdd3P4ur3PkYJxwkwVpdjCE6.CctrA-@ds052837.mongolab.com:52837/mongodb

we learn to install express (the scaffolding tool for the express web app framework)

image

http://expressjs.com/guide.html#executable

 

and upon using the tool, we get output similar to that suggested in the tutorial:

image

Per instructions we ‘npm install’

image

And install the “mongoose” db driver

image

Using visual studio and a nominal project, we then modify the application, per instructions (largely, with exceptions in app.js).

image

we assign the environment connection string, per instructions:

image
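For the record (the screenshots don’t paste well), a minimal sketch of the mongoose hookup in app.js – the environment-variable name and the Task model are our own illustrative choices, not the tutorial’s exact code:

// minimal sketch: open the MongoLab connection string, assumed to be surfaced to the
// azure website as an environment variable (the variable name here is an assumption)
var mongoose = require('mongoose');

var connectionString = process.env.CUSTOMCONNSTR_MONGOLAB_URI ||
    'mongodb://localhost:27017/mongodb';   // local fallback while developing

mongoose.connect(connectionString);

mongoose.connection.on('connected', function () {
  console.log('connected to ' + connectionString.split('@').pop());   // host part only, no credentials
});
mongoose.connection.on('error', function (err) {
  console.error('mongodb connection failed:', err.message);
});

// the tutorial then defines a schema/model and has the express routes read and write it
var Task = mongoose.model('Task', new mongoose.Schema({ itemName: String, completed: Boolean }));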

Clearly, we have a basic website up and running:

image

now, the site doesn’t work (but then we didn’t feel the instructions quite matched what we were seeing from the expected express scaffolding).

 

so we start again, this time using the microsoft express project type (to which we then add the controllers, routers, and app hookups, as per the instructions).

 

Now we have success, once we publish to the same website as used earlier, replacing code:

image

With that done, we see clearly that the webapp’s post-back page works, since we can see records in the db – using its management console:

 

image

Now that we have seen one express project talking to our mongodb instance via the mongoose driver, we can go and have a more careful look at the NAR code. Perhaps we can see how to make it talk to our mongodb instance, and then host it too in the azure cloud.

Posted in nodejs

how to host NAR node.js webapi demo in azure

With the node.js tool chain for visual studio, making a first node project and publishing to the azure cloud (as a website) could not have been any easier.

 

image

We then followed the alternative, more basic instructions based on git publishing:

image

http://azure.microsoft.com/en-us/documentation/articles/web-sites-nodejs-develop-deploy-mac/

This was not much harder than the visual studio centric method:

image

With this we are probably set to try to publish the node.js portion of the NAR webapi.

But, first we need an azure hosted mongodb!

Posted in nodejs

Running NAR’s open source webAPI server

We followed some counsel to augment our Visual Studio 2013 installation with “node.js” tools, with a view to running NAR’s open source webapi server.

image

http://www.hanselman.com/blog/IntroducingNodejsToolsForVisualStudio.aspx

 

We learned about NAR’s open source material from a members-only resource:

 

image

http://members.reso.org/display/TRANSPORTDF/NAR+Open+Source+Recap

which referred to

image

https://github.com/NationalAssociationOfRealtors

Using these sources and the knowhow taught by bloggers, we launched visual studio with a view to getting to the point where we can run and debug the NAR webAPI server. This required using git, in visual studio, to clone the repository for the reso-api-server and then make a new node.js project that imported the (existing) repository files.

 

image

 

Using the npm tools, we download the referenced packages. Next we followed the instructions to get the main package:

image

we had to change the dependency type from the default (and chose GLOBAL) to avoid the error shown in the console, left.

This all teaches us that perhaps we are going down the wrong path – not knowing what Global even is, etc.

So we start again, simply creating a new node.js application, into which we install the reso-api-server as a standard package. Then we follow the metadata and configuration setup instructions:

 

image

we take a guess at some reasonable configuration parameters:

#
# Configuration file for a RESO API Server
#
#
# API Service
#
COMPRESSION:        false   
SERVER_DOMAIN:        localhost
SERVER_NAME:        RESO API Server   
SERVER_PATH:        listing.svc   
SERVER_PORT:        42999
SERVER_PROTOCOL:    http   
#
# Data Processing 
#
EXTERNAL_INDEX:        true   
LOG_ENTRY:        false   
METADATA_DEFINITION:    ./metadata/reso_1_2.js
PROCESS_WAIT:        15   
RESPONSE_LIMIT:        200   
#
# HTTPS Certificates
#
#CA_CERTIFICATE:        ./ssl/ca.crt
#SERVER_CERTIFICATE:    ./ssl/server.crt
#SERVER_KEY:        ./ssl/server.key
#
# Authentication
#
#AUTH_REALM:        RESO API SERVER   
#AUTH_TYPE:        Basic   
#AUTH_TYPE:        Digest   
AUTH_TYPE:        None   
#
# OAuth2 Services
#
#OAUTH2_URI:        https://localhost:1340/authorize   

 

Running the sample startup file, we get

image

we attempt to fix by taking the “alternative” configuration path, using the data dictionary package.

image

image

Augmenting the instructions, we consult

 

This fixes one startup issue, but then we hit the next one – obviously related to SSL setup:

image

image

http://nodejs.org/api/tls.html

we install openssl (that NSA-friendly package full of strangely broken crypto and security protocols, supported by “american foundations”);

image

http://slproweb.com/products/Win32OpenSSL.html

Several things seem to get fixed merely by installing openssl (and adding the bin directory to the system path).

Without creating any ssl/ directory (or keys), and having adjusted the test.js and service.conf to be in the root project directory (and out of the metadata/ directory), we now get

image

which we guess means we are missing a mongodb server (listening on the port mentioned).
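Before touching the NAR code, a quick way to confirm that guess is to probe the port from node; 27017 below is an assumption (the actual port is the one named in the error shown):

// throw-away probe: is anything listening where the RESO server expects mongodb?
var net = require('net');

var socket = net.connect({ host: 'localhost', port: 27017 }, function () {
  console.log('something is answering on 27017 - probably mongod');
  socket.end();
});

socket.on('error', function (err) {
  console.log('nothing listening on 27017:', err.code);   // ECONNREFUSED => no local mongodb
});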

image

image

Let’s guess, given the code below, that we initially followed bad instructions – and somehow were not reading the correct files, in the right directories, etc.

image

as it actually stands, after all this confusion:

a) we have a mongodb instance

image

and

b) a visual studio debugging instance of the node.js process (THAT evidently has a connection to a mongodb with zero entities).

image

at which point we seem able to boot a component hosting a couple of services

image

Posted in odata

Sharing queries over an odata feed, using Rapattoni netmagic.

In the last memo, netmagic users shared queries within the netmagic community. The queries were designed to run over tables in worksheets that the user – both the sharing user and the shared-with user – already had, having downloaded the excel file with the “source” snapshot of listing data. This model works fine for teams of realtors, perhaps. It doesn’t work so well with realtors and homeowners.

So in the next trial, we make a workbook that references a public source – of the snapshot data. That is, we reference the odata feed (of the same data snapshot).

So, as a non-realtor to whom a realtor has granted access to the odata feed (of yesterday’s snapshot of listing data, from BARS), we get our own local copy of the source data:

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/Barstow.xlsx/Odata/Table_Query_from_Barstow

This process of course avoids having to download an excel file full of data sourced from an ODBC connection (which typical users will NOT have). Immediately, we see the worksheet fill with the preview rows, and then the odata source is consulted:

image

we can now share our data, once we have netmagic/office use odata to sort and filter – looking at properties by lot square footage.

image

one then sees the result (and note what happens when less-than-perfect data entry is treated this way)

image

we then share our BI query (over an odata source that is tied to a netmagic/sharepoint-hosted excel file with a table)

 

image

if we sign out (of support170@rapmlsqa.com) and sign into the query tool as niki@rapmlsqa.com

image

we see that niki has access to support170’s work, filtering and sorting suitably for their professional conversation.

image

Niki is a power excel user, and loads the data into power pivot (the full cube-oriented data analytics tool that used to come only with high end servers).

image

Posted in odata, RETS

Sharing business intelligence queries on RETS sources using netmagic (and odata)

We loaded our sharepoint-hosted “barstow” table into a fresh excel worksheet, having re-enabled the data source and content (from now untrustworthy sources, for some reason).

Our next task is to move away from the traditional excel world used so far and onto the world of power query, where we get all the power of websso, access control, and sharing.

So, we used the signin feature of the power query addon itself – logging into our office365 “netmagic” environment using websso and oauth2. We used user “andy@rapmlsqa.com”. These credentials, sessions, and tokens are stored on our computer, recall (and not in the workbook).

Next, we used the power query featureset and loaded-up a query “from table” in the current workbook – this being the resultset and now “table” of RETS data obtained from the ODBC connection.

 image_thumb[10]

We loaded the result of executing the ‘power query’ into a second worksheet. And then we shared the query, using the office 365 power BI catalog:

image_thumb[15]

 

image_thumb[17]

if we sign out of power query (as andy@rapmlsqa.com) and sign back in as another netmagic user (support170@rapmlsqa.com), we see

image

image

image

 

We see that in a new excel workbook (logically operating on a different computer to that used by andy@rapmlsqa.com), we can use the search tool, for ‘organizational” queries, shared within the netmagic community.

image

we can load the preview data into the query itself (vs a workbook, that will attempt to talk to the true data source with all the data)

 

image

 

Of course, if one executes this query from a workbook that does not already contain a sheet with the (local) table referenced by the query, one gets no data:

image

This all nicely shows that the access control is on the queries and their sharing, not on the data (which is assumed to be available). One is a realtor, say, sharing one’s  intelligence and data analysis skills and results (with others).

Posted in odata, RETS

exposing (stored) RETS 1.0 data as odata, with authentication

We added the 32-bit power query addin to our Excel 2013 tool. We also saved our excel workbook to our office 365 sharepoint “bi” subsite (of sharepoint online); within it we had loaded the results of a RETS 1.0 query issued by the Microsoft Query tool – a feature of traditional Excel’s “data”-centric tools. This uses the “licensing” and websso session held in the office 365 cloud for “netmagic” by the (licensed) excel tool itself. This is actually the user known as admin@netmagic.onmicrosoft.com.

image

One sees that a basic odata feed emerges straightaway:

image

The data (preloaded, and static) is then available from the “table” based odata feed – but only to those on the access control list set by the owner of the document, in sharepoint, and those who have authenticated (to sharepoint/odata) using the Netmagic websso process:

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/Barstow.xlsx/Odata/Table_Query_from_Barstow

Posted in odata, RETS

32 bit excel 2013 with 32-bit odbc driver

On Windows 8.1, let’s install the 32-bit ODBC driver, intending to use the traditional excel data connection:

 

image

http://nationalassociationofrealtors.github.io/ezRETS/

 

Using the 32-bit ODBC console, we take parameters from our SSO Portal and use them to configure the ODBC driver. (The SSO portal has a RETS client of its own).

 

image

image

image

image

image

image

 

Fiddler shows that the ezRETS driver self-tests fine.

 

image

 

Then we uninstall our 64-bit excel (office 365 licensed) and reinstall 32-bit versions of all the office tools, binding to the office 365 license. That done, we configure a classical data link, which we see works:

image

 

We also see, from fiddler, that after login the driver successfully enumerates various metadata sources.

image

Now, to make Excel fully cooperate, we learn (from an expert) to use Microsoft Query – and the ODBC source.

image

image

image

getting to

image

What we see under the hood are the following definitions for the DSN and query:

DSN=bars;LoginUrl=http://rets172lax.raprets.com:6103/Barstow/BARS/login.aspx;
UID=rapstaff;;StandardNames=false;UserAgent=Rapatton1Test/1.7.2;UseHttpLogging=true;
HttpLogFile=C:\Users\Peter\Desktop\httpbars;UseBulkMetadata=true;RetsVersion=1.7.2;
IgnoreMetadataType=true;EncodingType=UTF-8 Encoding;HttpLogEverything=true;

SELECT "data:Property:RESI".ListingRid, "data:Property:RESI".ListingOfficeMLSID
FROM "data:Property:RESI" "data:Property:RESI"
WHERE ("data:Property:RESI".ListingRid>0)

from

image

Posted in odata, RETS

basic auth vs marketplace credentials

 

We see from a (non-excel) consumer of the marketplace services that one can build a client class quite easily:

 

image

http://code.msdn.microsoft.com/Walkthrough-Translator-in-7e0be0f7

we see from the code that it wants us to configure the app key, for the azure marketplace account, to be applied to the particular feed:

image

tWXLMnqHpLuZNwFeecxbQGEaPVQOunDX5RpTGIwHe28=

https://api.datamarket.azure.com/Bing/MicrosoftTranslator/

We know that this marketplace access mechanism is nothing other than basic auth because we can use excel with basic auth credentials to read the feed.

 

image

 

image

Posted in odata

basic auth and odata feeds–Azure marketplace

Fiddler shows that excel presents basic auth credentials to enumerate datasets at the Azure marketplace “area” associated with the identity.

When we look at the management page for those credentials, we find it first requires one to log on to the marketplace site using websso (and a Microsoft account).

 

image

see https://datamarket.azure.com/account/keys

when we then use excel to access the marketplace data sources (having changed the default “account key”), we see the classical basic auth challenge back to excel’s power query and (see above) the resulting UI that seeks input of the key (i.e. the basic auth password)

 

image

 

Once we supply the password to excel, we see in fiddler a second attempt to enumerate the feeds:

image

 

We keyed the value BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=, and we note that excel then sends Authorization: Basic RmVlZEtleTpCSzF4SHk0ZzB0ZzAwSGQ1SEZQQWRzaWtRL21HMkU4ekJNaW9pNDJKRGxFPQ==, whose value block is of course, once unpacked: FeedKey:BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=. So now we understand that the user name is fixed, and the “basic auth” is just a password for the feed itself (not necessarily for a user).
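To make the header construction concrete, a tiny node sketch (using the same – now obviously burned – key):

// rebuild the Authorization header excel sends: fixed user name "FeedKey", account key as the password
var accountKey = 'BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=';

var header = 'Basic ' + Buffer.from('FeedKey:' + accountKey).toString('base64');

console.log(header);
// Basic RmVlZEtleTpCSzF4SHk0ZzB0ZzAwSGQ1SEZQQWRzaWtRL21HMkU4ekJNaW9pNDJKRGxFPQ==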

As we see from the azure marketplace’s management model, these passwords are proxies for the Microsoft account user.

Posted in odata

getting a RESO style atom service document from SharePoint Online

In an office 365-hosted sharepoint subsite, called bi, we store an excel file in the documents folder.

 

image

The pwdoc resource is simply an excel file, uploaded.

image

The sharepoint infrastructure has a nice feature – that any “table” within an uploaded excel file becomes an odata feed:

Our first table, with columns mlsid and payment, is known as PW and becomes the “PW” feed

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData/PW

The second table is known as NRDS:

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData/NRDS

we can use Excel’s power query tool chain to visit the service document itself:

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData

 

which produces

 

image

 

The comments on the RESO spec, at http://members.reso.org/display/API/2.3.2+URI+Stem, include an example

image

In the example, there is an optional convention for addressing service documents by URI; for our domain name that would be https://netmagic.sharepoint.com/RESO/OData/ .

 

In fiddler, the client’s proxy, we add a rule:

urlreplace /RESO/OData/ /bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData

 

image

note how the address bar no longer matches the xml base.

 

Another solution is to do a formal redirect, so that the atom document has a base that aligns with the browser:

image

using a redirect (in fiddler, rather than a true load balancer setup)

image
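Outside fiddler, the same redirect could be done by a tiny front end of our own. A minimal sketch only – in a real deployment the load balancer or reverse proxy would own this, and the port is arbitrary; the target is just the pwdoc.xlsx feed used above:

// minimal sketch: 302-redirect the RESO-style URI stem onto the sharepoint ExcelRest odata endpoint
var http = require('http');

var target = 'https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx' +
             '/Shared%20Documents/pwdoc.xlsx/OData/';

http.createServer(function (req, res) {
  if (req.url.indexOf('/RESO/OData/') === 0) {
    res.writeHead(302, { Location: target + req.url.slice('/RESO/OData/'.length) });
  } else {
    res.writeHead(404);
  }
  res.end();
}).listen(8080);   // arbitrary port for the sketch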

Posted in odata

steps to have excel talk to Graph API

Here are the steps to talk to the AAD graph API using excel 2013 and power query. The steps are similar to those used when talking to the supported odata feed of the CRM Online service. A workaround is given that makes excel talk to the graphAPI.

 

1. open an excel workbook and activate the power query ribbon. Select an odata feed:

image

 

2. Supply the Graph API URI

From http://msdn.microsoft.com/en-us/library/azure/jj126255.asp

in my case the value is

https://graph.windows.net/rapmlsqa.com/users?api-version=2013-04-05

image
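Roughly, what power query does under the covers is an authenticated GET against that URI. A minimal node sketch, with the AAD token acquisition omitted (ACCESS_TOKEN is assumed to hold a valid bearer token for the graph resource):

// sketch of the raw call excel/power query makes once signed in (token acquisition not shown)
var https = require('https');

https.get({
  host: 'graph.windows.net',
  path: '/rapmlsqa.com/users?api-version=2013-04-05',
  headers: {
    Authorization: 'Bearer ' + process.env.ACCESS_TOKEN   // assumed: a valid AAD access token
  }
}, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () { console.log(res.statusCode, body.slice(0, 200)); });
});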

 

3. Focus on users

you must select the user form of the URI (and the organizational id option)

image

 

4. Sign in and STAY SIGNED IN

image

image

 

5. Save the credential set

image

choosing the users option, and “save”.

 

One then sees what feels like a bug, whereupon one must save again. Note that a specific workaround is required: you MUST save again, now as anonymous.

image

 

the net result is a query resolution:

image

Posted in odata