mongodb in azure

Let’s use the add-on service in our Microsoft Azure cloud tenant, augmenting the fabric with mongodb:

image

http://msopentech.com/blog/2014/04/22/mongolab-offers-new-options-mongodb-developers-azure/

so when we apply the following:

image

image

we get a working mongodb instance

image

mongo ds052837.mongolab.com:52837/mongodb -u pwilliams -p …

https://mongolab.com/databases/mongodb

 

now we work a tutorial from node.js to our mongodb db/collection (before even attempting the NAR node.js integration with this instance):

our connection string, from azure,  is mongodb://mongodb:yilxWKhgqv17AHdd3P4ur3PkYJxwkwVpdjCE6.CctrA-@ds052837.mongolab.com:52837/mongodb
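Such a URI packs the user, password, host, port, and database into one string; pulling it apart is a one-regex job. A minimal sketch (the password below is a placeholder, not the real credential):

```javascript
// Split a mongodb://user:password@host:port/database URI into its parts,
// e.g. so the credential can be kept out of logs.
function parseMongoUri(uri) {
  const m = uri.match(/^mongodb:\/\/([^:]+):([^@]+)@([^:/]+):(\d+)\/(.+)$/);
  if (!m) throw new Error('not a mongodb:// URI');
  const [, user, password, host, port, db] = m;
  return { user, password, host, port: Number(port), db };
}

const parts = parseMongoUri(
  'mongodb://mongodb:placeholder@ds052837.mongolab.com:52837/mongodb');
console.log(parts.host, parts.port, parts.db);
// → ds052837.mongolab.com 52837 mongodb
```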

we learn to install express (the scaffolding tool for the express web app framework)

image

http://expressjs.com/guide.html#executable

 

and upon using the tool, we get output similar to that suggested in the tutorial:

image

Per instructions we ‘npm install’

image

And install “mongoose”, the MongoDB object-modeling library the tutorial uses as its db driver

image

Using visual studio and a nominal project, we then modify the application, per instructions (largely, with exceptions in app.js).

image

we assign the environment connection string, per instructions:

image

Clearly, we have a basic website up and running:

image

now, the site doesn’t work (but then we didn’t feel the instructions quite matched what we were seeing from the expected express scaffolding).

 

so we start again, this time using the microsoft express project type (to which we then add the controllers, routers, and app hookups, as per the instructions).

 

Now we have success, once we publish to the same website as used earlier, replacing code:

image

With that done, we see clearly that the webapp’s postback page works, since we can see records in the db – using its management console:

 

image

Now that we have seen one express project talking to our mongodb instance via the mongoose driver, we can go have a more careful look at the NAR code. Perhaps we can see how to make it talk to our mongodb instance too, and then host it in the azure cloud.

Posted in nodejs

how to host NAR node.js webapi demo in azure

With the node.js tool chain for visual studio, making a first node project and publishing to the azure cloud (as a website) could not have been any easier.

 

image

We then followed the alternative, more basic instructions based on git publishing:

image

http://azure.microsoft.com/en-us/documentation/articles/web-sites-nodejs-develop-deploy-mac/

This was not much harder than the visual studio centric method:

image

With this we are probably set to try to publish the node.js portion of the NAR webapi.

But, first we need an azure hosted mongodb!

Posted in nodejs

Running NAR’s open source webAPI server

We followed some counsel to augment our Visual Studio 2013 installation with “node.js” tools, with a view to running NAR’s open source webapi server.

image

http://www.hanselman.com/blog/IntroducingNodejsToolsForVisualStudio.aspx

 

We learned about NAR’s open source material from a member’s only resource:

 

image

http://members.reso.org/display/TRANSPORTDF/NAR+Open+Source+Recap

which referred to

image

https://github.com/NationalAssociationOfRealtors

Using these sources and the knowhow taught by bloggers, we launched visual studio with a view to getting to the point where we can run and debug the NAR webAPI server. This required using git, in visual studio, to clone the repository for the reso-api-server and then make a new node.js project that imported the (existing) repository files.

 

image

 

Using the npm tools, we download the referenced packages. Next we followed the instructions to get the main package:

image

we had to change the dependency type from the default (and chose GLOBAL) to avoid the error shown in the console, left.

This all teaches us that perhaps we are going down the wrong path – not knowing what GLOBAL even is, etc.

So we start again, simply creating a new node.js application, to which we install the reso-api-server as a standard package. Then we follow the metadata and configuration setup instructions:

 

image

we take a guess at some reasonable configuration parameters:

#
# Configuration file for a RESO API Server
#
#
# API Service
#
COMPRESSION:          false
SERVER_DOMAIN:        localhost
SERVER_NAME:          RESO API Server
SERVER_PATH:          listing.svc
SERVER_PORT:          42999
SERVER_PROTOCOL:      http
#
# Data Processing
#
EXTERNAL_INDEX:       true
LOG_ENTRY:            false
METADATA_DEFINITION:  ./metadata/reso_1_2.js
PROCESS_WAIT:         15
RESPONSE_LIMIT:       200
#
# HTTPS Certificates
#
#CA_CERTIFICATE:      ./ssl/ca.crt
#SERVER_CERTIFICATE:  ./ssl/server.crt
#SERVER_KEY:          ./ssl/server.key
#
# Authentication
#
#AUTH_REALM:          RESO API SERVER
#AUTH_TYPE:           Basic
#AUTH_TYPE:           Digest
AUTH_TYPE:            None
#
# OAuth2 Services
#
#OAUTH2_URI:          https://localhost:1340/authorize
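For reference, here is how such a file might be consumed. This is a sketch, not the reso-api-server’s actual loader: it just treats ‘#’ lines as comments and splits each remaining line at the first colon:

```javascript
// Read KEY: value pairs; '#' starts a comment; values keep their
// internal spaces (e.g. "RESO API Server").
function parseConfig(text) {
  const conf = {};
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    if (!line || line.startsWith('#')) continue;   // blank or comment
    const i = line.indexOf(':');
    if (i < 0) continue;                           // ignore malformed lines
    conf[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  }
  return conf;
}

const conf = parseConfig([
  '# API Service',
  'SERVER_PORT:    42999',
  'SERVER_NAME:    RESO API Server',
  'AUTH_TYPE:      None',
].join('\n'));
console.log(conf.SERVER_PORT, conf.AUTH_TYPE);
// → 42999 None
```

Splitting at the first colon (rather than every colon) matters because values such as the commented-out OAUTH2_URI contain colons of their own.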

 

Running the sample startup file, we get

image

we attempt to fix by taking the “alternative” configuration path, using the data dictionary package.

image

image

Augmenting the instructions, we consult

 

This fixes one startup issue, but then we hit the next one – obviously related to SSL setup:

image

image

http://nodejs.org/api/tls.html

we install openssl (that NSA-friendly package full of strangely broken crypto and security protocols, supported by “american foundations”);

image

http://slproweb.com/products/Win32OpenSSL.html

Several things seem to get fixed merely by installing openssl (and adding the bin directory to the system path).

Without creating any ssl/ directory (or keys), and having adjusted the test.js and service.conf to be in the root project directory (and out of the metadata/ directory), we now get

image

which we guess means we are missing a mongodb server (listening on the port mentioned).

image

image

Let’s guess, given the code below, that we followed bad instructions initially – and somehow were not reading the correct files, in the right directories, etc.

image

as it actually stands, after all this confusion:

a) we have a mongodb instance

image

and

b) a visual studio debugging instance of the node.js process (THAT evidently has a connection to a mongodb with zero entities).

image

at which point we seem able to boot a component hosting a couple of services

image

Posted in odata

Sharing queries over an odata feed, using Rapattoni netmagic.

In the last memo, netmagic users shared queries within the netmagic community. The queries were designed to run over tables in worksheets that the users, both sharing-user and shared-user, already had – having downloaded the excel file with the “source” snapshot of listing data. This model works fine for teams of realtors, perhaps. It doesn’t work so well with realtors and homeowners.

So in the next trial, we make a workbook that references a public source – of the snapshot data. That is, we reference the odata feed (of the same data snapshot).

So, as a non-realtor to whom a realtor has granted access to the odata feed (of yesterday’s snapshot of listing data, from BARS), we get our own local copy of the source data:

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/Barstow.xlsx/Odata/Table_Query_from_Barstow

This process of course avoids having to download an excel file full of data sourced to an ODBC connection (that the typical user will NOT have). Immediately, we see the worksheet fill with the preview rows, and then the odata source is consulted:

image

we can now share our data, once we have netmagic/office use odata to sort and filter, to look at properties measured in lot square footage.

image

one sees then (and note what happens when less-than-perfect data entry is treated this way)

image

we then share our BI query (over an odata source, tied to a netmagic/sharepoint hosted excel file with a table)

 

image

if we sign out (of support170@rapmlsqa.com) and sign into the query tool as niki@rapmlsqa.com

image

we see that niki has access to support170’s work, filtering and sorting suitably for their professional conversation.

image

Niki is a power excel user, and loads the data into power pivot (the full cube-oriented data analytics tool that used to come only with high end servers).

image

Posted in odata, RETS

Sharing business intelligence queries on RETS sources using netmagic (and odata)

We loaded our sharepoint-hosted “barstow” table into a fresh excel worksheet, having re-enabled the data source and content (from now untrustworthy sources, for some reason).

Our next task is to move away from the traditional excel world used so far and onto the world of power query, where we get all the power of websso, access control, and sharing.

So, we used the signin feature of the power query addon itself – logging into our office365 “netmagic” environment using websso and oauth2. We used user “andy@rapmlsqa.com”. These credentials, sessions, and tokens are stored on our computer, recall (and not in the workbook).

Next, we used the power query featureset and loaded-up a query “from table” in the current workbook – this being the resultset and now “table” of RETS data obtained from the ODBC connection.

image

We loaded the result of executing the ‘power query’ into a second worksheet. And then we shared the query, using the office 365 power BI catalog:

image

 

image

if we sign out of power query (as andy@rapmlsqa.com) and sign back in as another netmagic user (support170@rapmlsqa.com), we see

image

image

image

 

We see that in a new excel workbook (logically operating on a different computer to that used by andy@rapmlsqa.com), we can use the search tool, for ‘organizational” queries, shared within the netmagic community.

image

we can load the preview data into the query itself (vs a workbook, which will attempt to talk to the true data source for all the data)

 

image

 

Of course, if one executes this query from a workbook that does not already contain a sheet with the (local) table referenced by the query, one gets no data:

image

This all nicely shows that the access control is on the queries and their sharing, not on the data (which is assumed to be available). One is a realtor, say, sharing one’s intelligence and data-analysis skills and results (with others).

Posted in odata, RETS

exposing (stored) RETS 1.0 data as odata, with authentication

We added the 32-bit power query addin to our Excel 2013 tool. We also saved to our office 365 sharepoint “bi” subsite (of sharepoint online) our excel workbook, within which we had loaded the results of a RETS 1.0 query issued by the Microsoft Query tool – a feature of traditional Excel’s “data”-centric tools. This uses the “licensing” and websso session held in the office 365 cloud for “netmagic” by the (licensed) excel tool itself. This is actually the user known as admin@netmagic.onmicrosoft.com.

image

One sees that a basic odata feed emerges straightaway:

image

The data (preloaded, and static) is then available from the “table” based odata feed – but only to those on the access control list set by the owner of the document, in sharepoint, and those who have authenticated (to sharepoint/odata) using the Netmagic websso process:

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/Barstow.xlsx/Odata/Table_Query_from_Barstow
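Since the table feed is ordinary OData, the standard system query options ($filter, $orderby, $top) should apply to it. A sketch of building such a read – the property name ListPrice is illustrative; the real column names come from the worksheet table:

```javascript
// Append OData system query options to a feed URL.
function odataQuery(feedUrl, { filter, orderby, top } = {}) {
  const opts = [];
  if (filter)  opts.push('$filter='  + encodeURIComponent(filter));
  if (orderby) opts.push('$orderby=' + encodeURIComponent(orderby));
  if (top)     opts.push('$top='     + top);
  return opts.length ? feedUrl + '?' + opts.join('&') : feedUrl;
}

const url = odataQuery(
  'https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx' +
  '/Shared%20Documents/Barstow.xlsx/OData/Table_Query_from_Barstow',
  { filter: 'ListPrice gt 100000', orderby: 'ListPrice desc', top: 10 });
```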

Posted in odata, RETS

32 bit excel 2013 with 32-bit odbc driver

On Windows 8.1, let’s install the 32-bit ODBC driver, intending to use the traditional excel data connection:

 

image

http://nationalassociationofrealtors.github.io/ezRETS/

 

Using the 32-bit ODBC console, we take parameters from our SSO Portal and use them to configure the ODBC driver. (The SSO portal has a RETS client of its own).

 

image

image

image

image

image

image

 

Fiddler shows that the ezRETS driver self-tests fine.

 

image

 

Then we uninstall our 64-bit excel (office 365 licensed) and reinstall 32-bit versions of all the office tools, binding to the office 365 license. That done, we configure a classical data link, which we see works:

image

 

We also see, from fiddler, that after login the driver enumerates various metadata sources successfully.

image

Now, to make Excel fully cooperate, we learn (from an expert) to use Microsoft Query – and the ODBC source.

image

image

image

getting to

image

What we see under the hood are the following definitions for the DSN and query

DSN=bars;LoginUrl=http://rets172lax.raprets.com:6103/Barstow/BARS/login.aspx;
UID=rapstaff;;StandardNames=false;UserAgent=Rapatton1Test/1.7.2;UseHttpLogging=true;
HttpLogFile=C:\Users\Peter\Desktop\httpbars;UseBulkMetadata=true;RetsVersion=1.7.2;
IgnoreMetadataType=true;EncodingType=UTF-8 Encoding;HttpLogEverything=true;

SELECT "data:Property:RESI".ListingRid, "data:Property:RESI".ListingOfficeMLSID
FROM "data:Property:RESI" "data:Property:RESI"
WHERE ("data:Property:RESI".ListingRid>0)

from

image

Posted in odata, RETS

basic auth vs marketplace credentials

 

We see from a (non-excel) consumer of the marketplace services that one can build a client class quite easily:

 

image

http://code.msdn.microsoft.com/Walkthrough-Translator-in-7e0be0f7

we see from the code that it wants us to configure the app key, for the azure marketplace account, to be applied to the particular feed:

image

tWXLMnqHpLuZNwFeecxbQGEaPVQOunDX5RpTGIwHe28=

https://api.datamarket.azure.com/Bing/MicrosoftTranslator/

We know that this marketplace access mechanism is nothing other than basic auth because we can use excel with basic auth credentials to read the feed.

 

image

 

image

Posted in odata

basic auth and odata feeds–Azure marketplace

Fiddler shows that excel presents basic auth credentials to enumerate datasets at the Azure marketplace “area” associated with the identity.

When we look at the management page for those credentials, we find a page that first requires one to log on to the marketplace site using websso (and a Microsoft account).

 

image

see https://datamarket.azure.com/account/keys

when we then use excel to access the marketplace data sources (having changed the default “account key”), we see the classical basic auth challenge back to excel’s power query and (see above) the resulting UI that seeks input of the key (i.e. the basic auth password)

 

image

 

Once we supply the password to excel, we see in fiddler a second attempt to enumerate the feeds:

image

 

We keyed the value BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=, and we note that excel responds with Authorization: Basic RmVlZEtleTpCSzF4SHk0ZzB0ZzAwSGQ1SEZQQWRzaWtRL21HMkU4ekJNaW9pNDJKRGxFPQ==, whose value block is of course, once unpacked: FeedKey:BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=. So now we understand that the user name is fixed, and the “basic auth” is just some password for the feed itself (not necessarily a user).
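What Excel is doing can be reproduced in a couple of lines – a sketch of building (and unpacking) that header from the key quoted above:

```javascript
// The marketplace "account key" rides as the password of a Basic
// credential whose user name is always the literal string "FeedKey".
function marketplaceAuthHeader(accountKey) {
  return 'Basic ' + Buffer.from('FeedKey:' + accountKey).toString('base64');
}

const header = marketplaceAuthHeader(
  'BK1xHy4g0tg00Hd5HFPAdsikQ/mG2E8zBMioi42JDlE=');
// Unpacking the value block gives back the FeedKey:... string.
const unpacked = Buffer.from(header.slice('Basic '.length), 'base64').toString();
```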

As we see from the management concept of azure marketplace, the passwords are proxies for the Microsoft account user.

Posted in odata

getting a RESO style atom service document from SharePoint Online

In an office 365-hosted sharepoint subsite, called bi, we store an excel file in the documents folder.

 

image

The pwdoc resource is simply an excel file, uploaded.

image

The sharepoint infrastructure has a nice feature – that any “table” within an uploaded excel file becomes an odata feed:

Our first table, with columns mlsid and payment, is known as PW and becomes the “PW” feed

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData/PW

The second table is known as NRDS:

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData/NRDS

we can use Excel’s power query tool chain to visit the service document:

 

image

https://netmagic.sharepoint.com/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData

 

which produces

 

image

 

Comments on the RESO spec, at http://members.reso.org/display/API/2.3.2+URI+Stem, have an example

image

In the example, an optional convention for addressing service documents by URI is, for our domain name, https://netmagic.sharepoint.com/RESO/OData/ .

 

In fiddler, the client’s proxy, we add a rule:

urlreplace /RESO/OData/ /bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData
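That urlreplace is just substring substitution; the equivalent rewrite a reverse proxy or load balancer might apply can be sketched as follows (the sketch maps the path prefix, keeping whatever feed name follows):

```javascript
// Map the short RESO-style path onto the real ExcelRest OData path.
function rewrite(url) {
  return url.replace(
    '/RESO/OData',
    '/bi/_vti_bin/ExcelRest.aspx/Shared%20Documents/pwdoc.xlsx/OData');
}

console.log(rewrite('https://netmagic.sharepoint.com/RESO/OData/PW'));
```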

 

image

note how the address bar no longer matches the xml base.

 

Another solution is to do a formal redirect, so that the atom document has a base that aligns with the browser:

image

using a redirect (in fiddler, rather than a true load balancer setup)

image

Posted in odata

steps to have excel talk to Graph API

Here are the steps to talk to the AAD graph API using excel 2013 and power query. The steps are similar to those used when talking to the supported odata feed of the CRM Online service. A workaround is given that makes excel talk to the graph API.

 

1. open an excel workbook and activate the power query ribbon. Select an odata feed:

image

 

2. Supply the Graph API URI

From http://msdn.microsoft.com/en-us/library/azure/jj126255.aspx

in my case the value is

https://graph.windows.net/rapmlsqa.com/users?api-version=2013-04-05

image

 

3. Focus on users

you must select the user form of the URI (and the organizational id option)

image

 

4. Sign in and STAY SIGNED IN

image

image

 

5. save the credential set

image

choosing the users option, and “save”.

 

One then sees what feels like a bug – whereupon one must save again. Note that a specific workaround is required: you MUST save again, now as anonymous.

image

 

the net result is a query resolution:

image
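Under the covers, the query those steps configure is just a GET with a bearer token. A sketch (the token acquisition the sign-in steps perform is elided, and 'eyJ0eXAi...' is a stand-in for a real JWT, not an actual token):

```javascript
// Shape of the request power query ends up issuing against the Graph API.
function graphUsersRequest(tenant, accessToken) {
  return {
    method: 'GET',
    url: 'https://graph.windows.net/' + tenant +
         '/users?api-version=2013-04-05',
    headers: { Authorization: 'Bearer ' + accessToken },
  };
}

const req = graphUsersRequest('rapmlsqa.com', 'eyJ0eXAi...');  // fake token
```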

Posted in odata

using excel powerquery to talk to CRM odata service (with sso/oauth, etc)

 

 

image

image

one sees a sensible Fiddler trace.

image

image

 

one sees the final phases of the oauth2 handshake with an AS augmented with the ability to do websso with an IDP. One sees several calls to the services of the odata source, each of which has an authorization header in the HTTP request, populated with the access token obtained from the oauth2 handshakes.

 

I think I’ve also learned the right mental model for the credentials (and stored tokens) associated with any given data source. One can regenerate the access tokens, per source.

 

image

 

in our trial, we simply signed in as a different Netmagic user – i.e. different CRM netmagic-linked user.

image

Posted in odata

accessing CRM online, using AAD tenant discovery

Let’s build the modern/mobile odata application that talks to the CRM online service that we provisioned in the Microsoft Azure cloud for some Rapattoni netmagic users. Let’s see how it works when the client cooperates with the oauth2 endpoints of AAD/netmagic to obtain access and refresh tokens; and how the client populates the access tokens into odata requests:

 

image

From the CRM 2013 SDK

 

Once we register our new windows 8 “native application” in Azure AD, the windows 8 application acts conventionally:

 

image

 

We note the code that precedes this invocation of the oauth2 handshake. It learns the tenant name and address of the associated oauth2 server from a discovery request. The return of an enhanced WWW-Authenticate header (carrying authority_uri) occurs only when one adds a particular relevant query string (SDKClientVersion=X), with suitable values. That is, clients must “opt in” to the practice.
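A sketch of how a client might pull that hint out of the returned challenge header – the parameter name authority_uri follows the SDK description above, and the exact header shape here is an assumption:

```javascript
// Parse name="value" pairs out of a Bearer challenge header.
function parseChallenge(header) {
  const params = {};
  for (const m of header.matchAll(/(\w+)="([^"]*)"/g)) {
    params[m[1]] = m[2];
  }
  return params;
}

const hint = parseChallenge(
  'Bearer authority_uri="https://login.windows.net/common/oauth2/authorize"');
console.log(hint.authority_uri);
```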

 

image

 

private const string _clientID = "d00b06a7-4dbb-4eab-bb59-8d63a4783d36";
public const string CrmServiceUrl = "https://netmagic.crm.dynamics.com/";

 

We see the request for discovery and then for odata recovery of an account entity set.

image    

 

image

 

image

Posted in odata

scanning commodity crypto chips–using enhanced MRI

I’m not really sure WHY it’s viewed as SUCH a secret, but, from the stasi-sideeffect I experienced yesterday, the empire is inclined to hit back – over the MRI article.

 

So much for American freedoms of speech. SO much for the exceptionals (who act more like the typical nazi, once the defense of exceptionalism becomes paramount).

 

The typical crypto chip is rather amenable, by design, to MRI scanning – at a discrete level of analysis that may not quite seem plausible. But THAT it IS POSSIBLE is the secret (not that folks are scanning the electrons wandering across silicon gates, in order to subvert the vacuous, marketing-grade FIPS 140-1 boundary). We all know the americans have rigged the chips. The ONLY secret is how cheap it is!

The original motivation for my missive was the pain in my back – which rapidly became a pain in the ass.

Posted in rant

excel consuming Microsoft Graph API, with JWT

We were able to use the power query add-in to Excel 2013 to interact with the graph API of AAD:

 

http://msdn.microsoft.com/en-us/library/azure/jj126255.aspx

 

image

https://graph.windows.net/netmagic.onmicrosoft.com/Users?api-version=2013-04-05

We had several issues, however, with the organizational id flow. I’m not quite sure how I got around some bugs, but I evidently did:

 

image

image

Posted in odata

excel visualization of odata source, supported by AAD websso

Using our sharepoint license associated with an office 365 plan, we created a site (bi, for business intelligence) and on that site a list (a business intelligence list of 3 text records):

 

image

https://netmagic.sharepoint.com/bi/_layouts/15/start.aspx#/Lists/bilist/AllItems.aspx

https://netmagic.sharepoint.com/bi/

https://netmagic.sharepoint.com/bi/_vti_bin/ListData.svc

we want to now talk to this list of records using odata. So we launch excel and powerquery:

 

image

image

we note the opportunity to use the websso login:

 

image

image

Selecting the odata service URI, we get an enumeration of entity sets, including our bilist

image

image

 

We get back a record set, after querying:

image

image

 

let’s refresh the query, having activated fiddler to spy on the wire:

image

image

image

 

in terms of authentication, we see that the odata queries are supported by cookies. That is, one does signin using the powerquery tool (using oauth and ws-federation), in order to get a pretty classical WIF session cookie (fedauth):

image

image

Above, we see the sharepoint users (linked up to AAD managed users)

Posted in odata

NSA got into my MRI scanner…

The hardest part of looking at the MRI of my back was finding a PC with a cdrom reader!

Then we find that NSA got to it, first (the MRI machine, that is).

 

WIN_20140812_202655

 

 

we should actually thank NSA and the whole optical spying process for having created the computing power and algorithms that induced such (obviously useful) spinoffs. Remember what MRI does – enabling advanced slice images to be built back into a viable image suitable for human diagnosis.

Posted in dunno

diffusing concentrations of probability; oracles based on superposition encodings

We can compare the argument for introducing an ‘oracle’ – that hides the names of the edge/color labels used by one binary tree from the other…

 

image

image

image

 

image

with the intuition given for how the zig-zag product creates rotation algebras that quickly mix a vector into the uniform distribution:

 

image

 

image

http://www.math.ias.edu/~avi/PUBLICATIONS/ReingoldVaWi2000.pdf

 

 

The latter paper goes on to describe how to construct the large graph and why it works – to redistribute concentrations of probability:

image
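The quantitative form of that mixing claim is the paper’s main bound on the spectral expansion of the zig-zag product in terms of its factors (stated here from memory, so check the linked PDF; \circ is written for the product):

```latex
% G an (N, D, \lambda_G)-expander, H a (D, d, \lambda_H)-expander;
% the zig-zag product G \circ H is an (N D,\; d^2,\; \lambda)-expander with
\lambda(G \circ H) \;\le\; \lambda_G + \lambda_H + \lambda_H^{2}
```

So as long as each factor has a spectral gap, the product keeps one – which is exactly the redistribution of concentrated probability the text describes.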

Posted in crypto

Zigzag versus fano plane

On reading how the zigzag graph works, I found myself applying the ideas to turing’s use of the fano plane.

The point to observe is that the fano plane is a set of design blocks: with the 3 points of each line logically affiliated with an opposite point (on the unit circle). One should think of each as a cloud, having a particular conditional distribution with respect to the graph as a whole. It’s the zig (or the zag).

Now the point about zigzag is that the semidirect product generates an averaging process that diminishes certain vector lengths – those vectors that are, specifically, orthogonal to the constant functions in the space. The proportion of diminishing is a function of the spectral gap of the character function.

The nearest neighbors are clouds, that is. Or they’re the set of blocks, equivalently.

So what is the hyphen function of zigzag, in the fano plane?

It’s the unit circle, and its permutation centric automorphisms acting as a transducer of entropy induced by the zig, or an entropy generator should the zig have contributed none.

In geometric terms, folks are reducing the angle between the constant function and those vectors in that special space to which the averaging process uniquely applies. What the zag does is renormalize and recentralize that space (so it’s orthogonal again, in the new norm now) ready for the next round.

Thus you see how the conditional probabilities are transformed.

For the first time I think we understand the 1950s response to the attacks mounted against tunny and purple.

Posted in coding theory

Links between defcon and us military complex

Defcon used to be about naughty hackers (often in trouble). There was a pseudo political agenda.

Looking through speaker list, one sees how most of those doing the speaking and leading of opinion are very much part of the US military-influenced mainstream. That is, we are dealing with folks who are fully indoctrinated.

As a sideline hobby, they hack (for fun, as hobbyists). This seems little more than folks learning techie tools and then showing off technical knowhow.

What you don’t hear is any subversive tone (or any rebelliousness, even). There is very little dissent from the mainstream.

Posted in coding theory

defcon and animal farm–the morality play

Before Zimmerman delivered his business plan (and a rant recalling his warning about how compromise of a CA/PKI failed to prevent someone spying on everyone in Iran (and America)), some other defcon old-timer sold his book on UFO theory, and the giant conspiracy of NSA, CIA and James Bond.

 

Defcon has its major bash tonight, the party that it once WAS. Now it’s just a business, flogging tawdry security to folks whose livelihoods depend on keeping well IN STEP with corporate culture (which is a kowtow culture of subservience to NSA, etc).

While I won’t deny anyone their rights to their party week, it is sad to see defcon become a goon show.

In animal farm, the giant CIA movie plot apparently, the dogs get nasty. Which is what I’m seeing of defcon culture, as it self-protects (its nice little money-making venture).

 

So far there is no Snowden (other than using the theme to sell something). Which tells you what we all know – that the typical means BY WHICH NSA infiltrates the corporate world is, ahem, by using or abusing the very people attending defcon.

 

What’s worse, those folks are in abject denial (or perhaps it’s all a giant covert op, by ex-military types beholden to the exceptional nation status) that this is ALL their industry really is FOR.

Posted in rant

using AAD graph API to create federated user (or provision one, more formally)

Sample code for openid connect protocol and the graph API can be found, today, at https://github.com/AzureADSamples/WebApp-GraphAPI-DotNet.

image

Having configured this web application per the instructions, for our rapmlsqa.com tenant, on one screen we see the UI with which one creates a user in the directory.

image

http://reso-odatapoc.azurewebsites.net/Users/Create

Using the creation tool lets us see what passes on the wire:

image

we are unable, however, to find which parameters one must pass using this API when we want creation of a “federated” user. So far, we have created only “managed” users – who do not have federated status, by definition.

Spying on the powershell commandlets gives us a glimpse, however, of the semantic rules concerning federated user creation. Though the service uses a different and non-RESTful protocol, we can see the type of information to be passed during “provisioning”:

image

image

OK, so it turns out to be simple:

image

add the immutableID to the binding list, and amend the form to expose the immutableid label and field editor:

image
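A sketch of the JSON body such a create call might then carry. The property names follow the AAD Graph user entity as we understand it; the values are made up, and whether passwordProfile is still required in the federated case is an assumption worth testing:

```javascript
// Candidate payload for POST .../users: a managed-user create plus the
// immutableId that ties the account to the IDP's source anchor.
const newFederatedUser = {
  accountEnabled: true,
  displayName: 'Example User',                 // made-up values throughout
  mailNickname: 'exampleuser',
  userPrincipalName: 'example.user@rapmlsqa.com',
  immutableId: 'aW1tdXRhYmxlLWV4YW1wbGU=',     // made-up source anchor
  passwordProfile: {
    password: 'Temp-Passw0rd!',                // placeholder
    forceChangePasswordNextLogin: true,
  },
};

const body = JSON.stringify(newFederatedUser);
```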

Then we can create and list users created in certified domains:

image

image

This gets us to proving it works… with our IDP:

image

Posted in AAD

objecting to the club vs NSA analogy; using my CISSP knowhow in ape culture

Somebody found a reason to object to the “writing” of my last missive, since it portrayed NSA as a stone-age clubber defeating the electronic crypto lock on my front door – by battering down the casing. The objection was founded in the analogy’s inappropriateness – since NSA would not want to show itself to have subverted the lock, as would be rather evident should the analogy be valid.

You have to remember that my apes are American apes. And what I learned at the RSA Conference last year, as part of my CISSP re-education camp program, was that there are 3 types of monkeys (in Bali, or elsewhere). The first group are cuddly (like the guys who were first sent to work over the OpenID Foundation), working you over with affection so you happily hand over half of your bananas. The second group mug you for the remaining half, leaving you feeling somewhat used (like the “former government contractors” who somehow always turn up in Google/Microsoft standards forums, with a snarl and a growl). The third group are the intelligent apes, working with the human banana salesman (now that you have no bananas left, after the mugging). These guys use intelligence, scouts, and environment funneling, and have a certain ape psychology based in the “mutual understanding” between master human, ape and us: if they hound you (think Verizon, Google, BT) by perhaps stealing your wife’s scrunchie from her hair, you will buy some more bananas and negotiate for return of the “emotional attachment item” – to use an Americanism.

So where is NSA in that story?

NSA is the human signaling to the intelligent apes whom to target, how to hound, and when to return the object, given the purchase of a suitable number of additional bananas. The monkeys, typically IETF types, have long learned the signal craft, knowing that even if the target walks off with most of the bananas, the salesman will provide the ape’s real cut (later). It’s not the ape’s role to negotiate, only to hound and be the front man.

So getting back to my front door, how does the missive end, properly? Assuming it’s a standard piece of American writing?

Well I would have to agree that NSA, as a covert operator, would not seem to be properly represented as he who leaves a trail of smashed casings. But then with Verizon and google and others showing that the front doors to our internet portals ARE smashed, thanks to subversion of their clumsy “legal locks” that don’t work, perhaps when we regard NSA as the intelligent monkey minder, we see that in fact the casing analogy was fine all along.

Surely, that was worth an ISC2 CPE, for sheer invention and consideration – particularly since I got 2 of them for listening to some 70-year-old geezer show a video about his wonderful office staff, while drinking beer. Ah, but then the CISSP head honcho is the true NSA operator serving beer, in order to subvert a mass audience of opinion makers belonging to the real “club”.

Posted in rant

cryptographic games – China vs US – removing the UK’s game rules

image

http://www.cnet.com/news/china-lashes-out-at-google-apple-for-allegedly-stealing-state-secrets/

China would be well advised to consider the relevance of the “playing ball” phrasing.

Any good politician talks through both sides of her mouth – letting two audiences hear what they wish, from the same words. Google plays ball, with words.

To Google, as to any American firm, this “is” a game (to be won and lost, and competed over). It has to be kept as a “game”, since there IS no simple political solution. Being upbeat Americans, they get to choose between a mindless, senseless and failure-inducing world … or a game (in which perhaps politics keeps the ball at least rolling).

Of course, Google knows that the standards groups are rigged. They know that the IESG might as well be made up of US officials, beholden to the American dogma. Of course they know that policies, practices and standards are set to meet the commodity market (not the state-secrets market) – and that this means the security works ONLY in the areas it’s supposed to work (and not in areas “out of scope”).

According to the American game, that both Google and Microsoft play well, the game rules are set so as to keep anything of much importance in the “out of scope” space – where normal spying techniques work.

Where China vs US seems to be enabling a solid China win is in the area of vendors, e.g. Google, who look increasingly shrill as they attempt to deny the nature of the game – and their own political doublespeak. Yes, there is NO backdoor to my electronic house locks (and no agreement with the local police to allow covert entry, with special lock codes); but two decent sledgehammer blows to the *casing* of any common wooden house door make it “unnecessary to pick the lock”. The lock resists penetration by Harvard crypto scientists for 2h! (says the Google security marketing). It’s just “not Google’s problem” that the casing resists for only 20s against apes wielding stone-age clubs.

And so it is with Google and Microsoft Crypto.

Both firms know precisely how to sledgehammer areas of their product other than the crypto, making the crypto ineffective. And of course, they spend large amounts of money marketing UK-style deceive-and-deflect security standards, ensuring that “those areas” are “out of scope” in the “mind” of the public.

Posted in rant

math and cryptanalysis–some notes

three observations about math and crypto.

 

1. Quaternion “Algebras”

It’s fun to look at quaternions as a special kind of polynomial sum, with terms weighted as usual. Then, it’s interesting to see how to abstract H, the quaternions, into quaternion algebras – for different basis sets.

In light of the theory of abelianization of streams, one can feel, quite intuitively, just how crucial this is to even modern cryptanalysis.

 

image

image

image

http://www.maths.tcd.ie/pub/ims/bull57/S5701.pdf

 

The main point is that F can be anything (including huge number systems) good for cryptanalytical discriminant finding.
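To make the “quaternions as weighted polynomial sums” view concrete, here is a minimal sketch of multiplication over the standard basis (1, i, j, k), with the usual Hamilton relations. The function names and the norm check are my own illustration, not from any source cited here; in a general quaternion algebra over a field F, the basis squares i² = a, j² = b can be other elements of F, which is where the discriminant-style analysis comes in.

```python
# Sketch: quaternions as weighted sums over the basis (1, i, j, k),
# represented as (w, x, y, z) coefficient tuples. Multiplication uses
# the standard relations i^2 = j^2 = k^2 = ijk = -1.

def qmul(p, q):
    """Hamilton product of two quaternions given as coefficient tuples."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def norm(q):
    """Reduced norm w^2 + x^2 + y^2 + z^2 -- the multiplicative invariant."""
    return sum(c*c for c in q)

i, j = (0, 1, 0, 0), (0, 0, 1, 0)
assert qmul(i, j) == (0, 0, 0, 1)          # ij = k
assert qmul(i, i) == (-1, 0, 0, 0)         # i^2 = -1
assert norm(qmul(i, j)) == norm(i) * norm(j)   # norm is multiplicative
```

Over a different F one would replace the integer coefficients with field elements and adjust the two basis squares; the multiplicative norm is then the quantity that plays the discriminant-finding role.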

 

2. Wave “functions”

image

https://www.youtube.com/watch?v=8mi0PoPvLvs#t=1128

Also fun to review even the basics of quantum mechanics, so well reviewed by Susskind.

He was able to put succinctly how a wave function is just a function of x (just as are the functions we all learn about, aged 12). It’s just that the calculation formulation for that function is an inner product, where x varies (just as it does in the functions we all…).

What he doesn’t do well is just say what Turing said: a wave function is just an array of proportions!
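The “array of proportions” reading can be sketched in a few lines: a discretized wave function is just a complex vector indexed by x, the inner product with a position basis vector picks out the amplitude at that x, and the squared magnitudes are the proportions, summing to 1. The particular amplitudes below are an arbitrary illustration.

```python
import math

# Sketch: a discretized wave function as an array of complex amplitudes
# psi[x]. The squared magnitudes are the 'proportions' -- a probability
# distribution over x.

psi = [0.5, 0.5j, -0.5, 0.5j]   # arbitrary amplitudes at x = 0, 1, 2, 3

def inner(phi, psi):
    """Inner product <phi|psi> = sum over x of conj(phi[x]) * psi[x]."""
    return sum(a.conjugate() * b for a, b in zip(phi, psi))

# The proportions sum to 1 -- the wave function is normalized.
proportions = [abs(a) ** 2 for a in psi]
assert math.isclose(sum(proportions), 1.0)

# Evaluating the 'function' at x = 1 is the inner product with the
# position basis vector e_1, recovering the amplitude there.
e1 = [0, 1, 0, 0]
assert inner(e1, psi) == 0.5j
```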

 

3. hyperbolic geometries and generating algebras

 

It’s from hyperbolic geometry that we can get a glimpse of how Turing saw quantum modeling and simulation AT AN INTUITIVE level. In particular, in his On Permutations paper, we see how he leveraged just 2 and 3 (2 steps between 3 points) to create normed spaces (of 6 elements). He then argued how energy functions (i.e. Hamiltonians evolving the energy component of quantum systems over time) can be represented simply in terms of permutation groups – leveraging the projection of those functions, in a hyperbolic geometry, onto a projective plane that supports calculation in terms of long expressions of (ordered) swaps, using just Newman’s core topological knowledge of foundational groups and homotopic equivalence.

 

We got to see how conjugation in a hyperbolic geometry is the reflection operation (when the geometry is the interior, or inner space, of the unit circle). Similarly, we got to see how external points relate to the circle too, and how this quickly gets us not only to the notion of duality but to an external product space – where constraints between two intertwined systems create a Hilbert space in which quadratures and spreads are preserved, and in which one can do quantum calculations using only proportions.
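The “2 steps between 3 points” remark has a direct, elementary check: the two adjacent swaps on three points generate all 6 permutations (the symmetric group S3), so any long expression of ordered swaps reduces to one of just 6 elements. The closure loop below is my own illustration of that counting.

```python
from itertools import product

# The '2 steps between 3 points' idea: two adjacent swaps on {0, 1, 2}
# generate the full symmetric group S3, a space of 6 elements. Any long
# word of ordered swaps therefore reduces to one of these 6 permutations.

a = (1, 0, 2)   # swap points 0 and 1
b = (0, 2, 1)   # swap points 1 and 2

def compose(p, q):
    """Apply q first, then p; permutations are tuples mapping index -> image."""
    return tuple(p[q[i]] for i in range(3))

# Close the generators under composition.
group = {(0, 1, 2), a, b}
changed = True
while changed:
    changed = False
    for p, q in product(list(group), repeat=2):
        r = compose(p, q)
        if r not in group:
            group.add(r)
            changed = True

assert len(group) == 6   # all of S3 -- the 6-element normed space
```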

Posted in crypto

developing cryptographic intuitions for quantum era

we learned from the description of the US 1945 5205 cryptographic process how, despite all its engineering complexity, the machine counted how many times certain (high-scoring, distinguishing) characters appeared in a Tunny stream. for example: compute chi (5 bits) xor ‘e’ (5 bits), then count the hits in the cipher. ‘e’ will have less uncertainty than it would have were it to be found in a random cipher stream.
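the counting idea can be sketched with synthetic data (this is purely illustrative – the 5-bit value used for ‘e’ below is a stand-in, not the real Baudot code, and the 40% skew is an invented stand-in for language statistics): strip a candidate chi stream off the cipher and count hits on the distinguished character; the correct chi produces a bulge above the 1/32 expected of a random stream.

```python
import random

# Illustrative sketch of the counting distinguisher (synthetic data):
# cipher = plain XOR chi, so cipher XOR chi recovers plain. Counting how
# often the result equals the code for a frequent character scores a
# candidate chi; the right one bulges above the random 1/32 rate.

random.seed(1)
E = 0b10000   # stand-in 5-bit code for 'e' (assumed value, not real Baudot)

def count_hits(cipher, chi):
    """Count positions where cipher XOR chi equals the code for 'e'."""
    return sum(1 for c, x in zip(cipher, chi) if (c ^ x) == E)

n = 10000
chi = [random.randrange(32) for _ in range(n)]

# Plain stream skewed toward 'e' (40% of characters), as language is skewed.
plain = [E if random.random() < 0.4 else random.randrange(32) for _ in range(n)]
cipher = [p ^ x for p, x in zip(plain, chi)]

right = count_hits(cipher, chi)                               # correct chi
wrong = count_hits(cipher, [random.randrange(32) for _ in range(n)])

assert right > wrong   # the correct setting shows the bulge; 'e' is less uncertain
```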

 

we have seen that, in quantum mechanics, an inner product is, on the one hand, a measure of the distinguishability of two states, and ‘the value e’ on the other. should one score ‘e’, it’s a measure of the transition probability. perhaps e is just a representative of its weight class, and perhaps one goes about counting any distinction equivalent to a weight difference.

Posted in crypto

Realty ws-trust IDP interworking with AAD token issuer, in saml bearer grant

image

 

Using the Fiddler proxy, we were able to craft delivery of custom metadata from our IDP, whose endpoint addresses now meet the expectations of the Microsoft ADAL library’s saml-bearer grant flow.

 

The only code change we made to this service was to add a nameid format property to the subject field (of value unspecified). But I’m really not convinced that that has anything to do with the sudden interoperability. Making our active and passive STSes have configurable values for that property doesn’t seem particularly useful, though it is vaguely more correct.

Posted in AAD

cimba.co

 

signup using chrome (not IE) to get a (windows) cert

 

image

image

create microblog and channel, and first post.

 

in IE metro mode, note how to login to site

 

image

 

image

 

after a second prompt to select the cert, we do get a modern UI

 

image

 

not successful on Windows Phone 8.1, having loaded up certs/keys, etc; probably because the TLS handshake doesn’t validly induce the certificate selector. Too many assumptions being made in cert naming, no doubt.

Posted in webid

k-order propagation in hyperbolic reasoning calculation spaces

image

image

 

if we think in terms of Wildberger’s universal hyperbolic geometry, the case for defenses against linear cryptanalysis is one of ensuring that a wheel of points is assumed to have at each point a binary value, and one looks back from the current pointer sufficient pads to get the components of the codeword that has the requisite number of hamming bits. in the case of differential cryptanalysis, one looks back enough component terms so one has k supports.
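one possible reading of the “look back from the current pointer” idea, sketched under my own assumptions (the wheel, pointer, and threshold below are invented for illustration): walk backwards around a binary wheel until the collected bits reach the requisite hamming weight k.

```python
# Rough sketch (illustrative assumptions): walk backwards around a binary
# wheel from the current pointer, accumulating bits, until the collected
# window reaches the requisite Hamming weight k -- the k 'supports'.

def look_back(wheel, pointer, k):
    """Return how many positions back from `pointer` must be scanned before
    the collected bits contain k ones, or None if the wheel never supplies k."""
    weight = 0
    for steps in range(1, len(wheel) + 1):
        weight += wheel[(pointer - steps) % len(wheel)]
        if weight >= k:
            return steps
    return None

wheel = [1, 0, 0, 1, 1, 0, 1, 0]
assert look_back(wheel, pointer=5, k=3) == 5   # 5 pads back to gather 3 hamming bits
```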

Posted in crypto

hamming weights, correlation immunity, proportional bulge algebras

 

 

image

Correlation of Boolean Functions – MIT – Massachusetts Institute

 

 

the original conception of Golomb is far more intuitive than others, particularly when taking into consideration:

 

image

https://www.youtube.com/watch?v=PSFr6_EhchI#t=1222

this guy does a great job of reasoning much as Turing and co reasoned in 1943, using ratios in a hyperbolic computation graph space that reasons with correlations (contrasting null points on the distinguished circle – like Enigma rotor points – with points on the overlying triangle, which represent non-unitary correlations linking plaintext to ciphertext).
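the correlation quantity behind all this – and behind Golomb-style correlation immunity – can be sketched directly (my own minimal illustration, using the standard definition): the correlation of a Boolean function f with a linear mask w is the normalized count of agreements minus disagreements, and a function is k-th order correlation immune when it is uncorrelated with every mask of weight up to k.

```python
from itertools import product

# Sketch: correlation of a Boolean function f with the linear function
# l_w(x) = w.x (mod 2), computed over all 2^n inputs:
#   c = (#agreements - #disagreements) / 2^n

def correlation(f, w, n):
    total = 0
    for x in product((0, 1), repeat=n):
        lw = sum(wi * xi for wi, xi in zip(w, x)) % 2
        total += 1 if f(x) == lw else -1
    return total / 2 ** n

# f(x) = x0 XOR x1 XOR x2 is perfectly correlated with the full mask ...
f = lambda x: x[0] ^ x[1] ^ x[2]
assert correlation(f, (1, 1, 1), 3) == 1.0
# ... yet uncorrelated with every single-variable mask: 1st-order immunity.
assert correlation(f, (1, 0, 0), 3) == 0.0
```

these are exactly the non-unitary “bulges” the attack exploits: any mask with correlation bounded away from 0 links plaintext to ciphertext in proportion.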

 

now it becomes very obvious why Turing, in On Permutations, is so adamant to set the mean of the vector length to be 1. He is reasoning entirely in proportions.

It will be fun to see if Wildberger can get us all the way from here – as he says, the elementary stuff that is key to “thinking different” – to Fourier transforms computed in proportional algebras.

We know folks in the cryptanalytical attack on Tunny made exactly that leap.

Posted in crypto