Chris Speers

Azure Table Storage and PowerShell, The Hard Way

In my previous post I gave a quick overview of the Shared Key authentication scheme used by the Azure storage service and demonstrated how to authenticate and access the BLOB storage API through PowerShell.  The file and queue services follow an authentication scheme that aligns with the BLOB requirements; however, the table service is a bit different.  I felt it might help the more tortured souls out there (like myself) if I tried to describe the nuances.

Azure Storage REST API, Consistently Inconsistent

Like the REST of all things new Microsoft (read Azure), the mantra is consistency.  From a modern administrative perspective you should have a consistent experience across whatever environment and toolset you require.  If you are a traditional administrator/engineer of the Microsoft stack, the tooling takes the form of PowerShell cmdlets.  If you use Python, bash, etc., there is effectively equivalent tooling available.  My gripes notwithstanding, I think Microsoft has done a tremendous job in this regard.  I also make no claim that my preferences are necessarily the correct ones.  The 'inconsistencies' I will be discussing are not really issues for you if you use the mainline SDK(s).  As usual, I'll be focusing on how things work behind the scenes and my observations.

Shared Key Authentication, but Not All Are Equal

In exploring the shared key authentication to the BLOB REST API, we generated and encoded the HTTP request signature.  The string we needed to encode looked something like this:

GET                                        /*HTTP Verb*/
                                           /*Content-Encoding*/
                                           /*Content-Language*/
                                           /*Content-Length (include value when zero)*/
                                           /*Content-MD5*/
                                           /*Content-Type*/
                                           /*Date*/
                                           /*Range*/
x-ms-date:Sun, 11 Oct 2009 21:49:13 GMT
x-ms-version:2009-09-19                    /*CanonicalizedHeaders*/
/myaccount/mycontainer
comp:metadata
restype:container
timeout:20                                 /*CanonicalizedResource*/

The table service takes a much simpler, yet more arcane, format that is encoded in an identical fashion.

GET

application/json;odata=nometadata
Mon, 15 May 2017 17:29:11 GMT
/billing73d55f68/fabriclogae0bced538344887a4021ae5c3b61cd0GlobalTime(PartitionKey='407edc6d872271f853085a7a18387784',RowKey='02519075544040622622_407edc6d872271f853085a7a18387784_0_2952_2640')

In this case there are far fewer headers and query parameters to deal with; however, there are now fairly rigid requirements.  A Date header must be specified, as opposed to either Date or x-ms-date (or both) in the BLOB case.  A Content-Type header must also be specified as part of the signature, and no additional header details are required.  The canonical resource component is very different from the BLOB service.  The canonical resource still takes the format <storage account name>/<table name>/<query parameters>, but at the table service level only the comp query parameter is to be included.  As an example, to query the table service properties for the storage account, the request would look something like https://myaccount.table.core.windows.net?restype=service&comp=properties. The canonical resource would be /myaccount/?comp=properties.

Generating the Signature with PowerShell

We will reuse our encoding function from the previous post and include a new method for generating the signature.
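The new method, GetTableTokenStringToSign, assembles the string in the table-service format described above.  Here is a minimal sketch (my own reconstruction; the parameter names simply mirror how the function is splatted in the examples below):

```powershell
Function GetTableTokenStringToSign
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$Verb,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [System.Uri]$Resource,
        [Parameter(Mandatory=$false,ValueFromPipelineByPropertyName=$true)]
        [String]$ContentMD5=[String]::Empty,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$ContentType,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$Date
    )
    #Canonical resource: /<account name>/<table name and any entity selector>
    $AccountName=$Resource.Host.Split('.') | Select-Object -First 1
    $CanonicalResource="/$AccountName$($Resource.AbsolutePath)"
    #Only the comp query parameter participates in the table signature.
    if ($Resource.Query -match '(?i)comp=([^&]+)')
    {
        $CanonicalResource+="?comp=$($Matches[1])"
    }
    #VERB, Content-MD5, Content-Type, Date, CanonicalizedResource - newline delimited
    Write-Output ([String]::Join("`n",@($Verb,$ContentMD5,$ContentType,$Date,$CanonicalResource)))
}
```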


Function EncodeStorageRequest
{     
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
        [String[]]$StringToSign,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$SigningKey
    )     
    PROCESS
    {         
        foreach ($item in $StringToSign)
        {             
            $KeyBytes = [System.Convert]::FromBase64String($SigningKey)
            $HMAC = New-Object System.Security.Cryptography.HMACSHA256
            $HMAC.Key = $KeyBytes
            $UnsignedBytes = [System.Text.Encoding]::UTF8.GetBytes($item)
            $KeyHash = $HMAC.ComputeHash($UnsignedBytes)
            $SignedString=[System.Convert]::ToBase64String($KeyHash)
            Write-Output $SignedString
        }     
    } 
}

$AccountName='myaccount'
$AccessKey='vyAEEzbcnIAkLKti1leDbfrAOQBu5bx52zyCkW0fGIBCsS+DDGXpfidOeAWyg7do8ujft1mFhnz9kmliycmiXA=='
$Uri="https://$AccountName.table.core.windows.net/tables"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='GET';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31';
    'DataServiceVersion'='3.0;Netfx';
    'Accept-Charset'='UTF-8';
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Headers=$TableHeaders;
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams -Verbose
$Tables=$Response.Content | ConvertFrom-Json | Select-Object -ExpandProperty value


PS C:\WINDOWS\system32> $Tables|fl

odata.type     : acestack.Tables
odata.id       : https://acestack.table.core.windows.net/Tables('provisioninglog')
odata.editLink : Tables('provisioninglog')
TableName      : provisioninglog

The astute reader will notice we had to pass some different headers along.  All table requests require a DataServiceVersion and/or MaxDataServiceVersion header.  These values align with maximum versions of the REST API, which I won't bother belaboring.  We also retrieved JSON rather than XML, and a number of content types are available; the format is dictated by the Accept header.  In the example we retrieved it with full OData metadata; other valid values include minimalmetadata and nometadata (atom/xml is returned by earlier data service versions).  In another peculiarity, XML is the only format returned when retrieving service properties or stats.

Putting It to Greater Use With Your Old Friend OData

You likely want to actually read some data out of tables.  Now that authorizing the request is out of the way, it is a 'simple' matter of applying the appropriate OData query parameters.  We will start with retrieving a list of all entities within a table.  This will return a maximum of 1000 results (unless limited using the $top parameter), and a link to any subsequent pages of data will be returned in the response headers.  In the following example we will query all entities in the fabriclogaeGlobalTime table in the fabrixstuffz storage account.  In the interest of brevity I will limit this to 3 results, which should yield a result looking like this.


Cache-Control: no-cache
Transfer-Encoding: chunked
Content-Type: application/json;odata=nometadata;streaming=true;charset=utf-8
Server: Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: 56afccf3-0002-0104-0285-d382b4000000
x-ms-version: 2016-05-31
X-Content-Type-Options: nosniff
x-ms-continuation-NextPartitionKey: 1!44!NDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODQ-
x-ms-continuation-NextRowKey: 1!88!MDI1MTkwNjc4NDkwNDA1NzI1NjlfNDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODRfMF8yOTUyXzI2NDA-
Date: Tue, 23 May 2017 05:27:28 GMT
{
    "value":  [
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067840040580939_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:25:55.6307353Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:21:07.4129436Z",
                      "lowerBound":  "2017-05-23T05:19:56.173659Z",
                      "upperBound":  "2017-05-23T05:19:56.173659Z"
                  },
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067843040711216_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:20:53.9265804Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:16:07.3678218Z",
                      "lowerBound":  "2017-05-23T05:14:56.1606307Z",
                      "upperBound":  "2017-05-23T05:14:56.1606307Z"
                  },
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067846040653329_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:15:52.7217857Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:11:07.3406081Z",
                      "lowerBound":  "2017-05-23T05:09:56.1664211Z",
                      "upperBound":  "2017-05-23T05:09:56.1664211Z"
                  }
              ]
}

You should recognize a relatively standard OData response, with our desired values present within an array as the value property. There are two response headers to note here: x-ms-continuation-NextPartitionKey and x-ms-continuation-NextRowKey. These headers are the continuation token for retrieving the next available value(s). The service will return results in pages with a maximum length of 1000 results, unless limited using the $top query parameter like the previous example. If one were so inclined, one could continue to send GET requests, including the continuation token(s), until all results are enumerated.

Creating (or updating) table entities is a slightly different exercise, which can become somewhat convoluted (at least in PowerShell or other scripts).  Conceptually, all that is required to create an entity is a POST request to the table resource URI with a body containing the entity and the appropriate required headers.  The complexity is primarily a result of the metadata overhead associated with the server OData implementation. We'll examine this by inserting an entity into a fictional customers table.


$TableName='fakecustomers'
$Uri="https://$AccountName.table.core.windows.net/$TableName"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='POST';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31'
    'DataServiceVersion'='3.0;Netfx'
    'Accept-Charset'='UTF-8'
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$PartitionKey='mypartitionkey'
$RowKey='row771'
$TableEntity=New-Object PSObject -Property @{
    "Address"="Mountain View";
    "Name"="Buckaroo Banzai";
    "Age"=33;
    "AmountDue"=200.23;
    "FavoriteItem"="oscillation overthruster";
    "CustomerCode@odata.type"="Edm.Guid";
    "CustomerCode"="c9da6455-213d-42c9-9a79-3e9149a57833";
    "CustomerSince@odata.type"="Edm.DateTime";
    "CustomerSince"="2008-07-10T00:00:00";
    "IsActive"=$true;
    "NumberOfOrders@odata.type"="Edm.Int64";
    "NumberOfOrders"="255";
    "PartitionKey"=$PartitionKey;
    "RowKey"=$RowKey
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Headers=$TableHeaders;
    Body=$($TableEntity|ConvertTo-Json);
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams

You should end up receiving the inserted object as a response:


PS C:\Windows\system32> $Response.Content | ConvertFrom-Json
PartitionKey : mypartitionkey
RowKey : row772
Timestamp : 2017-05-23T06:17:53.7244968Z
CustomerCode : c9da6455-213d-42c9-9a79-3e9149a57833
FavoriteItem : oscillation overthruster
AmountDue : 200.23
IsActive : True
CustomerSince : 2008-07-10T00:00:00
Name : Buckaroo Banzai
NumberOfOrders : 255
Age : 33
Address : Mountain View 

You should notice that the object we submitted had some extra properties not present on the inserted entity. The API requires that for any entity property where the (.Net) data type cannot be automatically inferred, a type annotation must be specified. In this case, CustomerCode=c9da6455-213d-42c9-9a79-3e9149a57833 is a GUID (as opposed to a string), so it requires an accompanying property CustomerCode@odata.type=Edm.Guid.  If you would like a more complete explanation, the format is detailed here.
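As a quick illustration (a hypothetical fragment, not taken from the insert above): string, boolean, Int32, and double values are inferred automatically, while Guid, DateTime, Int64, and binary values need the annotation:

```powershell
$Fragment=@{
    #Edm.String is inferred; no annotation required
    "Name"="Buckaroo Banzai";
    #Edm.Guid cannot be inferred from JSON; annotate the property
    "CustomerCode@odata.type"="Edm.Guid";
    "CustomerCode"="c9da6455-213d-42c9-9a79-3e9149a57833"
}
$Fragment | ConvertTo-Json
```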

Three Ways to Do the Same Thing

You've got to give it to Microsoft; they certainly keep things interesting.  In the above example, I showed one of three ways that you can insert an entity into a table.  The service supports Insert, Insert or Merge (Upsert), and Insert or Replace operations (there are also individual Merge and Replace operations).  In the following example I will show the Upsert operation using the same table and entity as before.
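For reference, the three flavors differ only in the HTTP verb and target URI (my summary of the API, not part of the original examples):

```powershell
#Insert:            POST  https://<account>.table.core.windows.net/<table>
#Insert or Merge:   MERGE https://<account>.table.core.windows.net/<table>(PartitionKey='<pk>',RowKey='<rk>')
#Insert or Replace: PUT   https://<account>.table.core.windows.net/<table>(PartitionKey='<pk>',RowKey='<rk>')
#The standalone Merge and Replace operations use the same verbs and entity URIs,
#but also require an If-Match header (an ETag, or * to overwrite unconditionally).
```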


$Uri="https://$AccountName.table.core.windows.net/$TableName(PartitionKey='$PartitionKey',RowKey='$RowKey')"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='MERGE';
    ContentType='application/json;odata=nometadata';
} 
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableEntity | Add-Member -MemberType NoteProperty -Name 'NickName' -Value 'MrMan'
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31'
    'DataServiceVersion'='3.0;Netfx'
    'Accept-Charset'='UTF-8'
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams = @{
    Method= 'MERGE';
    Uri= $Uri;
    Body= $($TableEntity|ConvertTo-Json);
    Headers= $TableHeaders;
    ContentType= $SignatureParams.ContentType #must match the Content-Type that was signed
}
$Response=Invoke-WebRequest @RequestParams 

This should yield a response with the meaningful details of the operation in the headers.


PS C:\Windows\system32> $Response.Headers
Key                    Value
---                    -----  
x-ms-request-id        48489e3d-0002-005c-6515-d545b8000000
x-ms-version           2016-05-31 
X-Content-Type-Options nosniff
Content-Length         0
Cache-Control          no-cache
Date                   Thu, 25 May 2017 05:08:58 GMT
ETag                   W/"datetime'2017-05-25T05%3A08%3A59.5530222Z'"
Server                 Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0

Now What?

I'm sure I've bored most of you enough already, so I won't belabor any more of the operations, but I hope that I've given you a little more insight into the workings of another key element of the Azure Storage Service(s). As always, if you don't have a proclivity for doing things the hard way, feel free to check out a module supporting most of the Table (and BLOB) service functionality on the PowerShell Gallery or GitHub.

Azure BLOB Storage and PowerShell: The Hard Way

Shared Key Authentication Scheme

In a previous post I covered my general love/hate affair with PowerShell, particularly with respect to the Microsoft Cloud.  For the majority of you that cannot be bothered to read, I expressed a longstanding grudge against the Azure Cmdlets, rooted in the Switch-AzureMode fiasco.  As an aside, those of you enjoying the Azure Stack technical previews may notice a similar problem arising with the 'AzureRM Profile', but I digress. More importantly, there was a general theme of understanding the abstractions placed in front of you as an IT professional.  By now, most of you should be familiar with the OAuth Bearer tokens used throughout the Microsoft cloud.  They are nearly ubiquitous, with the exception of a few services, most importantly storage.  The storage service is authenticated with either Shared Key authentication or a Shared Access Signature. I will be focusing on the former.

Anatomy of the Signature

The Authorization header of the HTTP requests backing the Azure Storage Services takes the following form:

Authorization: SharedKey <Storage Account Name>:<AccessSignature>

The access signature is an HMAC-SHA256 encoded string (the signature), which is constructed mostly from the components of the backing HTTP request. The gritty details are (somewhat) clearly detailed on MSDN, but as an example, the string to be encoded for getting the list of blobs in a container looks something like this.


GET
x-ms-date:Mon, 08 May 2017 23:28:20 GMT
x-ms-version:2016-05-31
/nosaashere/certificates
comp:list
restype:container

Let's examine the properties of a request for creating a BLOB snapshot.

PUT https://nosaashere.blob.core.windows.net/managedvhds/Provisioned.vhdx?comp=snapshot

PUT                                          /*Verb*/
x-ms-date:Mon, 08 May 2017 23:28:21 GMT      /*Canonical Date Header*/
x-ms-version:2016-05-31                      /*Canonical Header*/
/nosaashere/managedvhds/Provisioned.vhdx     /*Canonical Resource*/
comp:snapshot                                /*Canonical Resource Query*/

A more advanced request (like this example for writing data to a page BLOB) shows how additional headers come into scope, as we include an MD5 hash to verify the content, a content length, and other required API headers.


PUT
4096000
32qczJv1wUlqnJPQRdBUzw==
x-ms-blob-type:PageBlob
x-ms-date:Mon, 08 May 2017 23:28:39 GMT
x-ms-page-write:Update
x-ms-range:bytes=12288000-16383999
x-ms-version:2016-05-31
/nosaashere/managedvhds/Provisioned.vhdx
comp:page

The general idea is that the verb, standard and custom request headers, canonical headers, and canonical resource and query are presented as a newline-delimited string.  This string is signed using the HMAC-SHA256 algorithm with the (base64-decoded) storage account key.  The resulting hash is base64 encoded and used for crafting the Authorization header.  The Authorization header is passed with the other headers used to sign the request.  If the server is able to match the signature, the request is authenticated.

Putting this in some PoSh

First things first, we need to generate the string to sign.  This function will take arguments for the desired HTTP request (URI, Verb, Query, Headers) parameters and create the previously described string.


Function GetTokenStringToSign
{
    [CmdletBinding()]     
    param
    (
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [ValidateSet('GET','PUT','DELETE')]
        [string]$Verb="GET",
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName = $true)]
        [System.Uri]$Resource,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [long]$ContentLength,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [String]$ContentLanguage,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [String]$ContentEncoding,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [String]$ContentType,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [String]$ContentMD5,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [long]$RangeStart,
        [Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
        [long]$RangeEnd,
        [Parameter(Mandatory = $true,ValueFromPipelineByPropertyName = $true)]
        [System.Collections.IDictionary]$Headers
        [System.Collections.IDictionary]$Headers
    )

    $ResourceBase=($Resource.Host.Split('.') | Select-Object -First 1).TrimEnd("`0")
    $ResourcePath=$Resource.LocalPath.TrimStart('/').TrimEnd("`0")
    $LengthString=[String]::Empty
    $Range=[String]::Empty
    if($ContentLength -gt 0){$LengthString="$ContentLength"}
    if($RangeEnd -gt 0){$Range="bytes=$($RangeStart)-$($RangeEnd-1)"}

    $SigningPieces = @($Verb, $ContentEncoding,$ContentLanguage, $LengthString,$ContentMD5, $ContentType, [String]::Empty, [String]::Empty, [String]::Empty, [String]::Empty, [String]::Empty, $Range)
    foreach ($item in $Headers.Keys)
    {
        $SigningPieces+="$($item):$($Headers[$item])"
    }
    $SigningPieces+="/$ResourceBase/$ResourcePath"

    if ([String]::IsNullOrEmpty($Resource.Query) -eq $false)
    {
        $QueryResources=@{}
        $QueryParams=$Resource.Query.Substring(1).Split('&')
        foreach ($QueryParam in $QueryParams)
        {
            $ItemPieces=$QueryParam.Split('=')
            $ItemKey = ($ItemPieces|Select-Object -First 1).TrimEnd("`0")
            $ItemValue = ($ItemPieces|Select-Object -Last 1).TrimEnd("`0")
            if($QueryResources.ContainsKey($ItemKey))
            { 
                $QueryResources[$ItemKey] = "$($QueryResources[$ItemKey]),$ItemValue"    
            }
            else
            {
                $QueryResources.Add($ItemKey, $ItemValue)
            }
        }
        $Sorted=$QueryResources.Keys|Sort-Object
        foreach ($QueryKey in $Sorted)
        {
            $SigningPieces += "$($QueryKey):$($QueryResources[$QueryKey])"
        }
    }

    $StringToSign = [String]::Join("`n",$SigningPieces)
    Write-Output $StringToSign 
}

Once we have the string to sign, it is a simple step to create the required HMAC-SHA256 hash using the storage account key. The following function takes the two arguments and returns the encoded signature.


Function EncodeStorageRequest
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
        [String[]]$StringToSign,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$SigningKey
    )
    PROCESS
    {         
        foreach ($item in $StringToSign)
        {
            $KeyBytes = [System.Convert]::FromBase64String($SigningKey)
            $HMAC = New-Object System.Security.Cryptography.HMACSHA256
            $HMAC.Key = $KeyBytes
            $UnsignedBytes = [System.Text.Encoding]::UTF8.GetBytes($item)
            $KeyHash = $HMAC.ComputeHash($UnsignedBytes)
            $SignedString=[System.Convert]::ToBase64String($KeyHash)
            Write-Output $SignedString 
        }     
    }
}

Now that we have a signature, it is time to pass it on to the storage service API. For the following examples we will focus on BLOB. Let's return to the first example: retrieving a list of the BLOBs in the certificates container of the nosaashere storage account. This only requires the date and version API headers. The request would take the format:

GET https://nosaashere.blob.core.windows.net/certificates?restype=container&comp=list
x-ms-date:Mon, 08 May 2017 23:28:20 GMT
x-ms-version:2016-05-31

To create the signature we can use the above function.


$StorageAccountName='nosaashere'
$ContainerName='certificates'
$AccessKey="WMTyrXNLHL+DF4Gwn1HgqMrpl3s8Zp7ttUevo0+KN2adpByHaYhX4OBY7fLNyzw5IItopGDAr8iQDxrhoHHiRg=="
$BlobContainerUri="https://$StorageAccountName.blob.core.windows.net/$ContainerName?restype=container&comp=list"
$BlobHeaders= @{
    "x-ms-date"=[DateTime]::UtcNow.ToString('R');
     "x-ms-version"='2016-05-31'; 
}
$UnsignedSignature=GetTokenStringToSign -Verb GET -Resource $BlobContainerUri -Headers $BlobHeaders
$StorageSignature=EncodeStorageRequest -StringToSign $UnsignedSignature -SigningKey $AccessKey
#Now we should have a 'token' for our actual request. 
$BlobHeaders.Add('Authorization',"SharedKey $($StorageAccountName):$($StorageSignature)") 
$Result=Invoke-RestMethod -Uri $BlobContainerUri -Headers $BlobHeaders -UseBasicParsing

If you make your call without using the -OutFile parameter, you will find a weird-looking string rather than the nice friendly XmlDocument you were expecting.

<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://nosaashere.blob.core.windows.net/" ContainerName="certificates">
    <Blobs>
        <Blob>
            <Name>azurestackroot.as01.cer</Name>
            <Properties>
                <Last-Modified>Fri, 05 May 2017 20:31:33 GMT</Last-Modified>
                <Etag>0x8D493F5B8410E96</Etag>
                <Content-Length>1001</Content-Length>
                <Content-Type>application/octet-stream</Content-Type>
                <Content-Encoding />
                <Content-Language />
                <Content-MD5>O2/fcFtzb9R6alGEgXDZKA==</Content-MD5>
                <Cache-Control />
                <Content-Disposition />
                <BlobType>BlockBlob</BlobType>
                <LeaseStatus>unlocked</LeaseStatus>
                <LeaseState>available</LeaseState>
                <ServerEncrypted>false</ServerEncrypted>
            </Properties>
        </Blob>
        <Blob>
            <Name>azurestackroot.as02.cer</Name>
            <Properties>
                <Last-Modified>Wed, 03 May 2017 22:54:49 GMT</Last-Modified>
                <Etag>0x8D4927767174A24</Etag>
                <Content-Length>1001</Content-Length>
                <Content-Type>application/octet-stream</Content-Type>
                <Content-Encoding />
                <Content-Language />
                <Content-MD5>arONICHXLfRUr61IH/XHbw==</Content-MD5>
                <Cache-Control />
                <Content-Disposition />
                <BlobType>BlockBlob</BlobType>
                <LeaseStatus>unlocked</LeaseStatus>
                <LeaseState>available</LeaseState>
                <ServerEncrypted>false</ServerEncrypted>
            </Properties>
        </Blob>
        <Blob>
            <Name>azurestackroot.as03.cer</Name>
            <Properties>
                <Last-Modified>Wed, 15 Mar 2017 19:43:50 GMT</Last-Modified>
                <Etag>0x8D46BDB9AB84CFD</Etag>
                <Content-Length>1001</Content-Length>
                <Content-Type>application/octet-stream</Content-Type>
                <Content-Encoding />
                <Content-Language />
                <Content-MD5>sZZ30o/oMO57VMfVR7ZBGg==</Content-MD5>
                <Cache-Control />
                <Content-Disposition />
                <BlobType>BlockBlob</BlobType>
                <LeaseStatus>unlocked</LeaseStatus>
                <LeaseState>available</LeaseState>
                <ServerEncrypted>false</ServerEncrypted>
            </Properties>
        </Blob>
        <Blob>
            <Name>azurestackroot.as04.cer</Name>
            <Properties>
                <Last-Modified>Wed, 26 Apr 2017 22:45:41 GMT</Last-Modified>
                <Etag>0x8D48CF5F7534F4B</Etag>
                <Content-Length>1001</Content-Length>
                <Content-Type>application/octet-stream</Content-Type>
                <Content-Encoding />
                <Content-Language />
                <Content-MD5>rnkI6VPz9i1pXOick4qDSw==</Content-MD5>
                <Cache-Control />
                <Content-Disposition />
                <BlobType>BlockBlob</BlobType>
                <LeaseStatus>unlocked</LeaseStatus>
                <LeaseState>available</LeaseState>
                <ServerEncrypted>false</ServerEncrypted>
            </Properties>
        </Blob>
    </Blobs>
    <NextMarker />
</EnumerationResults>

What, pray tell, is this? In a weird confluence of events, there is a long-standing 'issue' with the Invoke-RestMethod and Invoke-WebRequest Cmdlets and the UTF-8 BOM (byte-order mark). Luckily, .Net has lots of support for this stuff. Generally, most people just use the OutFile parameter and pipe it along to the Get-Content Cmdlet. If you are like me, we'll look for the UTF-8 preamble and strip it from the string.


$UTF8ByteOrderMark=[System.Text.Encoding]::Default.GetString([System.Text.Encoding]::UTF8.GetPreamble())
if($Result.StartsWith($UTF8ByteOrderMark,[System.StringComparison]::Ordinal))
{
    $Result=$Result.Remove(0,$UTF8ByteOrderMark.Length)
}
[Xml]$ResultXml=$Result

Now you'll see something you should be able to work with:


PS C:\Users\chris> $ResultXml.EnumerationResults
ServiceEndpoint                           ContainerName Blobs NextMarker
---------------                           ------------- ----- ----------
https://nosaashere.blob.core.windows.net/ certificates Blobs
PS C:\Users\chris> $ResultXml.EnumerationResults.Blobs.Blob
Name                    Properties
----                    ----------
azurestackroot.as01.cer Properties 
azurestackroot.as02.cer Properties 
azurestackroot.as03.cer Properties
azurestackroot.as04.cer Properties

All storage service requests return a good deal of information in the response headers.  Enumeration-style operations, like the previous example, return the relevant data in the response body.  Many operations, like retrieving container or BLOB metadata, return the relevant data only in the response headers.  Let's modify our previous request, noting the change in the query parameter.  You will also need to use the Invoke-WebRequest Cmdlet (or your other favorite method) so that you can access the response headers.


$BlobContainerUri="https://$StorageAccountName.blob.core.windows.net/$ContainerName?restype=container&comp=metadata"
$BlobHeaders= @{ "x-ms-date"=[DateTime]::UtcNow.ToString('R'); "x-ms-version"='2016-05-31'; }
$UnsignedSignature=GetTokenStringToSign -Verb GET -Resource $BlobContainerUri -Headers $BlobHeaders
$StorageSignature=EncodeStorageRequest -StringToSign $UnsignedSignature -SigningKey $AccessKey
$BlobHeaders.Add('Authorization',"SharedKey $($StorageAccountName):$($StorageSignature)")
$Response=Invoke-WebRequest -Uri $BlobContainerUri -Headers $BlobHeaders -UseBasicParsing
$ContainerMetadata=$Response.Headers

We should have the resulting metadata key-value pairs present in the form x-ms-meta-<Key Name>.


PS C:\Users\chris> $ContainerMetadata
Key                      Value
---                      ----- 
Transfer-Encoding        chunked
x-ms-request-id          5f15423e-0001-003d-066d-ca0167000000
x-ms-version             2016-05-31
x-ms-meta-dispo          12345
x-ms-meta-stuff          test
Date                     Thu, 11 May 2017 15:41:16 GMT
ETag                     "0x8D4954F4245F500"
Last-Modified            Sun, 07 May 2017 13:45:01 GMT
Server                   Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
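If you want just the metadata key-value pairs without the transport noise, a filter along these lines will do (illustrative; it reuses the $ContainerMetadata variable from above):

```powershell
$Metadata=@{}
$MetadataPrefix='x-ms-meta-'
foreach ($HeaderKey in $ContainerMetadata.Keys)
{
    if ($HeaderKey -like "$MetadataPrefix*")
    {
        #Strip the x-ms-meta- prefix to recover the original key name
        $Metadata[$HeaderKey.Substring($MetadataPrefix.Length)]=$ContainerMetadata[$HeaderKey]
    }
}
$Metadata
```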

Where to go from here?

With the authentication scheme in hand, you can now access all of the storage service APIs. This includes creating snapshots and uploading and downloading files. If you are not inclined to do things the hard way, feel free to check out a module supporting most of the BLOB service functionality on the PowerShell Gallery or GitHub.

Azure Stack TP2 Hacks: Custom Domain Names and Exposing to the Internet


In some previous posts, we covered some "hacks" to Azure Stack TP1, primarily enabling a customized domain name and exposing the installation to the internet.  If you have not noticed yet, the installation has changed greatly. The process is now driven by ECEngine and should be far more indicative of how the final product gets deployed. While the installer has greatly changed, fortunately, the process to expose the stack publicly has only changed in a few minor ways. Without getting too involved in how it works, the installation operates from a series of PowerShell modules and Pester tests tied to a configuration composed from a number of XML configuration files. The configuration files support the use of variables and parameters to drive most of the PowerShell action. As with TP1, the stack is wired so that the DNS domain name for Active Directory must match the public DNS domain name (think certificates and host headers). This is a much less involved change in TP2; it mostly requires replacing a couple of straggling hard-coded entries with variables in some OneNodeConfig.xml files and changing the installer bootstrapper to use it.  Once again, I will admonish you that this is wholly unsupported.

There are six files that need minor changes; we will start with the XML config files.

Config Files

C:\CloudDeployment\Configuration\Roles\Fabric\IdentityProvider\OneNodeRole.xml Line 11 From

[xml] <IdentityApplication Name="Deployment" ResourceId="https://deploy.azurestack.local/[Deployment_Guid]" DisplayName="Deployment Application" CertPath="{Infrastructure}\ASResourceProvider\Cert\Deployment.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Deployment.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

To

[xml] <IdentityApplication Name="Deployment" ResourceId="https://deploy.[DOMAINNAMEFQDN]/[Deployment_Guid]" DisplayName="Deployment Application" CertPath="{Infrastructure}\ASResourceProvider\Cert\Deployment.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Deployment.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

C:\CloudDeployment\Configuration\Roles\Fabric\KeyVault\OneNodeRole.xml Line 12 From

[xml] <IdentityApplication Name="KeyVault" ResourceId="https://vault.azurestack.local/[Deployment_Guid]" DisplayName="AzureStack KeyVault" CertPath="{Infrastructure}\ASResourceProvider\Cert\KeyVault.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\KeyVault.IdentityApplication.Configuration.json" > <AADPermissions> <ApplicationPermission Name="ReadDirectoryData" /> </AADPermissions> <OAuth2PermissionGrants> <FirstPartyApplication FriendlyName="PowerShell" /> <FirstPartyApplication FriendlyName="VisualStudio" /> <FirstPartyApplication FriendlyName="AzureCLI" /> </OAuth2PermissionGrants> </IdentityApplication> [/xml]

To

[xml] <IdentityApplication Name="KeyVault" ResourceId="https://vault.[DOMAINNAMEFQDN]/[Deployment_Guid]" DisplayName="AzureStack KeyVault" CertPath="{Infrastructure}\ASResourceProvider\Cert\KeyVault.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\KeyVault.IdentityApplication.Configuration.json" > <AADPermissions> <ApplicationPermission Name="ReadDirectoryData" /> </AADPermissions> <OAuth2PermissionGrants> <FirstPartyApplication FriendlyName="PowerShell" /> <FirstPartyApplication FriendlyName="VisualStudio" /> <FirstPartyApplication FriendlyName="AzureCLI" /> </OAuth2PermissionGrants> </IdentityApplication> [/xml]

Line 26 From

[xml] <AzureKeyVaultSuffix>vault.azurestack.local</AzureKeyVaultSuffix> [/xml]

To

[xml] <AzureKeyVaultSuffix>vault.[DOMAINNAMEFQDN]</AzureKeyVaultSuffix> [/xml]

C:\CloudDeployment\Configuration\Roles\Fabric\WAS\OneNodeRole.xml Line(s) 96-97 From

[xml] <IdentityApplication Name="ResourceManager" ResourceId="https://api.azurestack.local/[Deployment_Guid]" HomePage="https://api.azurestack.local/" DisplayName="AzureStack Resource Manager" CertPath="{Infrastructure}\ASResourceProvider\Cert\ResourceManager.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\ResourceManager.IdentityApplication.Configuration.json" Tags="MicrosoftAzureStack" > [/xml]

To

[xml] <IdentityApplication Name="ResourceManager" ResourceId="https://api.[DOMAINNAMEFQDN]/[Deployment_Guid]" HomePage="https://api.[DOMAINNAMEFQDN]/" DisplayName="AzureStack Resource Manager" CertPath="{Infrastructure}\ASResourceProvider\Cert\ResourceManager.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\ResourceManager.IdentityApplication.Configuration.json" Tags="MicrosoftAzureStack" > [/xml]

Line(s) 118-120 From

[xml] <IdentityApplication Name="Portal" ResourceId="https://portal.azurestack.local/[Deployment_Guid]" HomePage="https://portal.azurestack.local/" ReplyAddress="https://portal.azurestack.local/" DisplayName="AzureStack Portal" CertPath="{Infrastructure}\ASResourceProvider\Cert\Portal.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Portal.IdentityApplication.Configuration.json" > [/xml]

To

[xml] <IdentityApplication Name="Portal" ResourceId="https://portal.[DOMAINNAMEFQDN]/[Deployment_Guid]" HomePage="https://portal.[DOMAINNAMEFQDN]/" ReplyAddress="https://portal.[DOMAINNAMEFQDN]/" DisplayName="AzureStack Portal" CertPath="{Infrastructure}\ASResourceProvider\Cert\Portal.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Portal.IdentityApplication.Configuration.json" > [/xml]

Line 129 From

[xml] <ResourceAccessPermissions> <UserImpersonationPermission AppURI="https://api.azurestack.local/[Deployment_Guid]" /> </ResourceAccessPermissions> [/xml]

To

[xml] <ResourceAccessPermissions> <UserImpersonationPermission AppURI="https://api.[DOMAINNAMEFQDN]/[Deployment_Guid]" /> </ResourceAccessPermissions> [/xml]

Line 133 From

[xml] <IdentityApplication Name="Policy" ResourceId="https://policy.azurestack.local/[Deployment_Guid]" DisplayName="AzureStack Policy Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Policy.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Policy.IdentityApplication.Configuration.json" > [/xml]

To

[xml] <IdentityApplication Name="Policy" ResourceId="https://policy.[DOMAINNAMEFQDN]/[Deployment_Guid]" DisplayName="AzureStack Policy Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Policy.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Policy.IdentityApplication.Configuration.json" > [/xml]

Line 142 From

[xml] <IdentityApplication Name="Monitoring" ResourceId="https://monitoring.azurestack.local/[Deployment_Guid]" DisplayName="AzureStack Monitoring Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Monitoring.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Monitoring.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

To

[xml] <IdentityApplication Name="Monitoring" ResourceId="https://monitoring.[DOMAINNAMEFQDN]/[Deployment_Guid]" DisplayName="AzureStack Monitoring Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Monitoring.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Monitoring.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

C:\CloudDeployment\Configuration\Roles\Fabric\FabricRingServices\XRP\OneNodeRole.xml Line 114 From

[xml] <IdentityApplication Name="Monitoring" ResourceId="https://monitoring.azurestack.local/[Deployment_Guid]" DisplayName="AzureStack Monitoring Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Monitoring.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Monitoring.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

To

[xml] <IdentityApplication Name="Monitoring" ResourceId="https://monitoring.[DOMAINNAMEFQDN]/[Deployment_Guid]" DisplayName="AzureStack Monitoring Service" CertPath="{Infrastructure}\ASResourceProvider\Cert\Monitoring.IdentityApplication.ClientCertificate.pfx" ConfigPath="{Infrastructure}\ASResourceProvider\Config\Monitoring.IdentityApplication.Configuration.json" > </IdentityApplication> [/xml]

Scripts

Now we will edit the installation bootstrapping scripts.

We will start by adding two new parameters ($ADDomainName and $DomainNetbiosName) to C:\CloudDeployment\Configuration\New-OneNodeManifest.ps1 and have the manifest generation use them.

[powershell] param ( [Parameter(Mandatory=$true)] [Xml] $InputXml,

[Parameter(Mandatory=$true)] [String] $OutputFile,

[Parameter(Mandatory=$true)] [System.Guid] $DeploymentGuid,

[Parameter(Mandatory=$false)] [String] $Model,

[Parameter(Mandatory=$true)] [String] $HostIPv4Address,

[Parameter(Mandatory=$true)] [String] $HostIPv4DefaultGateway,

[Parameter(Mandatory=$true)] [String] $HostSubnet,

[Parameter(Mandatory=$true)] [bool] $HostUseDhcp,

[Parameter(Mandatory=$true)] [string] $PhysicalMachineMacAddress,

[Parameter(Mandatory=$true)] [String] $HostName,

[Parameter(Mandatory=$true)] [String] $NatIPv4Address,

[Parameter(Mandatory=$true)] [String] $NATIPv4Subnet,

[Parameter(Mandatory=$true)] [String] $NatIPv4DefaultGateway,

[Parameter(Mandatory=$false)] [Int] $PublicVlanId,

[Parameter(Mandatory=$true)] [String] $TimeServer,

[Parameter(Mandatory=$true)] [String] $TimeZone,

[Parameter(Mandatory=$true)] [String[]] $EnvironmentDNS,

[Parameter(Mandatory=$false)] [String] $ADDomainName='azurestack.local',

[Parameter(Mandatory=$false)] [String] $DomainNetbiosName='azurestack',

[Parameter(Mandatory=$true)] [string] $AADDirectoryTenantName,

[Parameter(Mandatory=$true)] [string] $AADDirectoryTenantID,

[Parameter(Mandatory=$true)] [string] $AADAdminSubscriptionOwner,

[Parameter(Mandatory=$true)] [string] $AADClaimsProvider )
$InputXml.InnerXml = $InputXml.InnerXml.Replace('[PREFIX]', 'MAS')
$InputXml.InnerXml = $InputXml.InnerXml.Replace('[DOMAINNAMEFQDN]', $ADDomainName)
$InputXml.InnerXml = $InputXml.InnerXml.Replace('[DOMAINNAME]', $DomainNetbiosName) [/powershell]

The final edit(s) we need to make are to C:\CloudDeployment\Configuration\InstallAzureStackPOC.ps1. We will start by adding the same parameters to this script.

[powershell] [CmdletBinding(DefaultParameterSetName="DefaultSet")] param ( [Parameter(Mandatory=$false, ParameterSetName="RerunSet")] [Parameter(Mandatory=$true, ParameterSetName="AADSetStaticNAT")] [Parameter(Mandatory=$true, ParameterSetName="DefaultSet")] [SecureString] $AdminPassword,

[Parameter(Mandatory=$false)] [PSCredential] $AADAdminCredential,

[Parameter(Mandatory=$false)] [String] $AdDomainName='azurestack.local',

[Parameter(Mandatory=$false)] [String] $DomainNetbiosName='AzureStack',

[Parameter(Mandatory=$false)] [String] $AADDirectoryTenantName,

[Parameter(Mandatory=$false)] [ValidateSet('Public Azure','Azure - China', 'Azure - US Government')] [String] $AzureEnvironment = 'Public Azure',

[Parameter(Mandatory=$false)] [String[]] $EnvironmentDNS,

[Parameter(Mandatory=$true, ParameterSetName="AADSetStaticNAT")] [String] $NATIPv4Subnet,

[Parameter(Mandatory=$true, ParameterSetName="AADSetStaticNAT")] [String] $NATIPv4Address,

[Parameter(Mandatory=$true, ParameterSetName="AADSetStaticNAT")] [String] $NATIPv4DefaultGateway,

[Parameter(Mandatory=$false)] [Int] $PublicVlanId,

[Parameter(Mandatory=$false)] [string] $TimeServer = 'time.windows.com',

[Parameter(Mandatory=$false, ParameterSetName="RerunSet")] [Switch] $Rerun ) [/powershell]

The next edit will occur at lines 114-115 From

[powershell] $FabricAdminUserName = 'AzureStack\FabricAdmin' $SqlAdminUserName = 'AzureStack\SqlSvc' [/powershell]

To

[powershell] $FabricAdminUserName = "$DomainNetbiosName\FabricAdmin" $SqlAdminUserName = "$DomainNetbiosName\SqlSvc" [/powershell]

Finally, we will modify the last statement of the script, at line 312, to pass the new parameters.

[powershell] & $PSScriptRoot\New-OneNodeManifest.ps1 -InputXml $xml ` -OutputFile $outputConfigPath ` -Model $model ` -DeploymentGuid $deploymentGuid ` -HostIPv4Address $hostIPv4Address ` -HostIPv4DefaultGateway $hostIPv4Gateway ` -HostSubnet $hostSubnet ` -HostUseDhcp $hostUseDhcp ` -PhysicalMachineMacAddress $physicalMachineMacAddress ` -HostName $hostName ` -NATIPv4Address $NATIPv4Address ` -NATIPv4Subnet $NATIPv4Subnet ` -NATIPv4DefaultGateway $NATIPv4DefaultGateway ` -PublicVlanId $PublicVlanId ` -TimeServer $TimeServer ` -TimeZone $timezone ` -EnvironmentDNS $EnvironmentDNS ` -AADDirectoryTenantName $AADDirectoryTenantName ` -AADDirectoryTenantID $AADDirectoryTenantID ` -AADAdminSubscriptionOwner $AADAdminSubscriptionOwner ` -AADClaimsProvider $AADClaimsProvider ` -ADDomainName $AdDomainName ` -DomainNetbiosName $DomainNetbiosName [/powershell]

NAT Configuration

So, you have now customized the domain for your one node Azure Stack install and want to get it on the internet. This process is almost identical to TP1, save for two changes. TP1 had separate BGPVM and NATVM machines, while there is now a single machine, MAS-BGPNAT01. The BGPNAT role only exists in the one node (HyperConverged) installation. The other change is the type of Remote Access installation. TP1 used the "legacy" RRAS for NAT, where all configuration was UI or netsh based. TP2 has transitioned to "modern" Remote Access that is only really manageable through PowerShell. To create the appropriate NAT mappings we will need three PowerShell cmdlets: Get-NetNat, Add-NetNatExternalAddress, and Add-NetNatStaticMapping. I use a script to create all the mappings, which takes a simple object that in our use case is deserialized from JSON. This file is a simple collection of the NAT entries and mappings to be created.

[javascript] { "Portal": { "External": "172.20.40.39", "Ports": [ 80,443,30042,13011,30011,30010,30016,30015,13001,13010,13021,30052,30054,13020,30040,13003,30022,12998,12646,12649,12647,12648,12650,53056,57532,58462,58571,58604,58606,58607,58608,58610,58613,58616,58618,58619,58620,58626,58627,58628,58629,58630,58631,58632,58633,58634,58635,58636,58637,58638,58639,58640,58641,58642,58643,58644,58646,58647,58648,58649,58650,58651,58652,58653,58654,58655,58656,58657,58658,58659,58660,58661,58662,58663,58664,58665,58666,58667,58668,58669,58670,58671,58672,58673,58674,58675,58676,58677,58678,58679,58680,58681,58682,58683,58684,58685,58686,58687,58688,58689,58690,58691,58692,58693,58694,58695,58696,58697,58698,58699,58701 ], "Internal": "192.168.102.5" }, "API": { "External": "172.20.40.38", "Ports": [ 80,443,30042,13011,30011,30010,30016,30015,13001,13010,13021,30052,30054,13020,30040,13003,30022,12998,12646,12649,12647,12648,12650,53056,57532,58462,58571,58604,58606,58607,58608,58610,58613,58616,58618,58619,58620,58626,58627,58628,58629,58630,58631,58632,58633,58634,58635,58636,58637,58638,58639,58640,58641,58642,58643,58644,58646,58647,58648,58649,58650,58651,58652,58653,58654,58655,58656,58657,58658,58659,58660,58661,58662,58663,58664,58665,58666,58667,58668,58669,58670,58671,58672,58673,58674,58675,58676,58677,58678,58679,58680,58681,58682,58683,58684,58685,58686,58687,58688,58689,58690,58691,58692,58693,58694,58695,58696,58697,58698,58699,58701 ], "Internal": "192.168.102.4" }, "DataVault": { "External": "172.20.40.43", "Ports": [80,443], "Internal": "192.168.102.3" }, "CoreDataVault": { "External": "172.20.40.44", "Ports": [80,443], "Internal": "192.168.102.3" }, "Graph": { "External": "172.20.40.40", "Ports": [80,443], "Internal": "192.168.102.8" }, "Extensions": { "External": "172.20.40.41", "Ports": [ 
80,443,30042,13011,30011,30010,30016,30015,13001,13010,13021,30052,30054,13020,30040,13003,30022,12998,12646,12649,12647,12648,12650,53056,57532,58462,58571,58604,58606,58607,58608,58610,58613,58616,58618,58619,58620,58626,58627,58628,58629,58630,58631,58632,58633,58634,58635,58636,58637,58638,58639,58640,58641,58642,58643,58644,58646,58647,58648,58649,58650,58651,58652,58653,58654,58655,58656,58657,58658,58659,58660,58661,58662,58663,58664,58665,58666,58667,58668,58669,58670,58671,58672,58673,58674,58675,58676,58677,58678,58679,58680,58681,58682,58683,58684,58685,58686,58687,58688,58689,58690,58691,58692,58693,58694,58695,58696,58697,58698,58699,58701 ], "Internal": "192.168.102.7" }, "Storage": { "External": "172.20.40.42", "Ports": [80,443], "Internal": "192.168.102.6" } } [/javascript]

In the one node TP2 deployment, 192.168.102.0 is the subnet for "Public" IP addresses, and as you may notice, all of the VIPs for the stack reside on that subnet. We have 1-to-1 NAT for all the "External" addresses we associate with a given Azure Stack instance.

[powershell]
[CmdletBinding()]
param (
    [Parameter(Mandatory=$true)]
    [psobject]
    $NatConfig
)

#There is only one NAT instance, so Get-NetNat suffices; one could also use -Name BGPNAT
$NatSetup=Get-NetNat

$NatConfigNodeNames=$NatConfig|Get-Member -MemberType NoteProperty|Select-Object -ExpandProperty Name

foreach ($NatConfigNodeName in $NatConfigNodeNames)
{
    Write-Verbose "Configuring NAT for Item $NatConfigNodeName"
    $ExIp=$NatConfig."$NatConfigNodeName".External
    $InternalIp=$NatConfig."$NatConfigNodeName".Internal
    $NatPorts=$NatConfig."$NatConfigNodeName".Ports
    Write-Verbose "Adding External Address $ExIp"
    Add-NetNatExternalAddress -NatName $NatSetup.Name -IPAddress $ExIp -PortStart 80 -PortEnd 63356
    Write-Verbose "Adding Static Mappings"
    foreach ($natport in $NatPorts)
    {
        #TCP
        Write-Verbose "Adding NAT Mapping $($ExIp):$($natport)->$($InternalIp):$($natport)"
        Add-NetNatStaticMapping -NatName $NatSetup.Name -Protocol TCP `
            -ExternalIPAddress $ExIp -InternalIPAddress $InternalIp `
            -ExternalPort $natport -InternalPort $natport
    }
}
[/powershell]
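A hypothetical invocation of the script above, assuming it has been saved as Add-StackNatMappings.ps1 and the JSON configuration as NatConfig.json (both file names are placeholders):

```powershell
#Deserialize the JSON configuration into the psobject the script expects
$NatConfig = Get-Content -Path .\NatConfig.json -Raw | ConvertFrom-Json
#Run with -Verbose to watch each external address and mapping get created
.\Add-StackNatMappings.ps1 -NatConfig $NatConfig -Verbose
```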

DNS Records

The final step will be adding the requisite DNS entries, which have changed slightly as well.  In the interest of simplicity, assume the final octet of the IP addresses on the 172.20.40.0 subnet has a 1-to-1 NAT mapping to 38.77.x.0 (e.g. 172.20.40.40 –> 38.77.x.40)

A Record IP Address
api 38.77.x.38
portal 38.77.x.39
*.blob 38.77.x.42
*.table 38.77.x.42
*.queue 38.77.x.42
*.vault 38.77.x.43
data.vaultcore 38.77.x.44
control.vaultcore 38.77.x.44
xrp.tenantextensions 38.77.x.44
compute.adminextensions 38.77.x.41
network.adminextensions 38.77.x.41
health.adminextensions 38.77.x.41
storage.adminextensions 38.77.x.41
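These records simply need to exist in whatever hosts your public DNS zone. If the zone happens to live on a Windows DNS server, a sketch like the following could create the core entries in bulk; the zone name, the value of the x octet, and the use of the DnsServer module are all assumptions for illustration:

```powershell
#Assumes the DnsServer module on a Windows DNS server hosting the public zone
$Zone = 'yourazurestack.com'      #placeholder zone name
$Records = @{
    'api'     = '38.77.1.38'; 'portal'  = '38.77.1.39'   #placeholder x octet of 1
    '*.blob'  = '38.77.1.42'; '*.table' = '38.77.1.42'; '*.queue' = '38.77.1.42'
    '*.vault' = '38.77.1.43'; 'data.vaultcore' = '38.77.1.44'
}
foreach ($Name in $Records.Keys) {
    #Creates a standard A record (wildcard names are permitted)
    Add-DnsServerResourceRecordA -ZoneName $Zone -Name $Name -IPv4Address $Records[$Name]
}
```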


Connecting to the Stack

You will need to export the root certificate from your installation's CA and import it on any clients that will access your deployment.   Exporting the root certificate is very simple, as the host system is joined to the domain which hosts the Azure Stack CA. To export the root certificate to your desktop, run this simple one-liner in the PowerShell console of your host system (the same command will work from the Console VM).

[powershell] Get-ChildItem -Path Cert:\LocalMachine\Root| ` Where-Object{$_.Subject -like "CN=AzureStackCertificationAuthority*"}| ` Export-Certificate -FilePath "$env:USERPROFILE\Desktop\$($env:USERDOMAIN)RootCA.cer" -Type CERT [/powershell]

The process for importing this certificate on your client will vary depending on the OS version; as such I will avoid giving a scripted method.

Right click the previously exported certificate.


Choose Current User for most use-cases.


Select Browse for the appropriate store.


Select Trusted Root Certificate Authorities


Confirm the Import Prompt

To connect with PowerShell or REST API you will need the deployment GUID. This can be obtained from the host with the following snippet.

[powershell] [xml]$deployinfo = Get-content "C:\CloudDeployment\Config.xml" $deploymentguid = $deployinfo.CustomerConfiguration.Role.Roles.Role.Roles.Role | % {$_.PublicInfo.DeploymentGuid} [/powershell]

This value can then be used to connect to your stack.

[powershell]
#Deployment GUID
$EnvironmentID='4bc6f444-ff15-4fd7-9bfa-5495891fe876'
#The DNS Domain used for the Install
$StackDomain="yourazurestack.com"
#The AAD Domain Name (e.g. bobsdomain.onmicrosoft.com)
$AADDomainName='youraadtenant.com'
#The AAD Tenant ID
$AADTenantID = "youraadtenant.com"
#The Username to be used
$AADUserName="username@$AADDomainName"
#The Password to be used
$AADPassword='P@ssword1'|ConvertTo-SecureString -Force -AsPlainText
#The Credential to be used. Alternatively could use Get-Credential
$AADCredential=New-Object PSCredential($AADUserName,$AADPassword)
#The AAD Application Resource URI
$ApiAADResourceID="https://api.$StackDomain/$EnvironmentID"
#The ARM Endpoint
$StackARMUri="https://api.$StackDomain/"
#The Gallery Endpoint
$StackGalleryUri="https://portal.$($StackDomain):30016/"
#The OAuth Redirect Uri
$AadAuthUri="https://login.windows.net/$AADTenantID/"
#The MS Graph API Endpoint
$GraphApiEndpoint="https://graph.windows.net/"

#Add the Azure Stack Environment Add-AzureRmEnvironment -Name "Azure Stack" ` -ActiveDirectoryEndpoint $AadAuthUri ` -ActiveDirectoryServiceEndpointResourceId $ApiAADResourceID ` -ResourceManagerEndpoint $StackARMUri ` -GalleryEndpoint $StackGalleryUri ` -GraphEndpoint $GraphApiEndpoint

#Add the environment to the context using the credential $env = Get-AzureRmEnvironment 'Azure Stack' Add-AzureRmAccount -Environment $env -Credential $AADCredential -Verbose [/powershell]

Note: You will need a TP2-specific version of the Azure PowerShell module for many operations.  Enjoy, and stay tuned for more.

Ruling Your Resource Groups with an Iron Fist


If it is not obvious by now, we deploy a lot of resources to Azure.  The rather expensive issue we encounter is that people rarely remember to clean up after themselves; it goes without saying we have encountered some staggeringly large bills.  To remedy this, we enforced a simple, yet draconian policy around persisting Resource Groups.  If your Resource Group does not possess values for our required tag set (Owner and Solution), it can be deleted at any time. At the time the tagging edict went out we had well over 1000 resource groups.  As our lab manager began to script the removal using the Azure Cmdlets we encountered a new issue, the synchronous operations of the Cmdlets just took too long.  We now use a little script we call "The Reaper" to "fire and forget".

If you read my previous post about Azure AD and PowerShell, you may have noticed my predilection for doing things via the REST API.  "The Reaper" simply uses the module from that post to obtain an authorization token, then uses the REST API to evaluate Resource Groups for tag compliance and delete the offenders asynchronously. There are a few caveats to note: it will only work with organizational accounts, and it will not delete any resources which have a lock.

It is published on the PowerShell Gallery, so you can obtain it like so:

[powershell] #Just download it Save-Script -Name thereaper -Path "C:\myscripts" #Install the script (with the module dependency) Install-Script -Name thereaper [/powershell]

The required parameters are a PSCredential and an array of strings for the tags to inspect. Notable Switch parameters are AllowEmptyTags (which only checks for the presence of the required tags) and DeleteEmpty (which removes Resource Groups with no Resources even if tagged properly). There is also a SubscriptionFilters parameter taking an array of Subscription ids to limit the scope; otherwise all Subscriptions your account has access to will be evaluated. If you would simply like to see what the results would be, use the WhatIf Switch. Usage is as follows:

[powershell] $Credential=New-Object PSCredential("username@tenant.com",("YourPassword"|ConvertTo-SecureString -AsPlainText -Force)) $results=.\thereaper.ps1 -Credential $Credential ` -RequiredTags "Owner","Solution" -DeleteEmpty ` -SubscriptionFilters "49f4ba3e-72ec-4621-8e9e-89d312eafd1f","554503f6-a3aa-4b7a-a5a9-641ac65bf746" [/powershell]

A standard liability waiver applies; I hold no responsibility for the Resource Groups you destroy.

Azure, Azure Active Directory, and PowerShell. The Hard Way


In my opinion, a fundamental shift for Windows IT professionals occurred with the release of Exchange 2007.  This established PowerShell as the tool for managing and configuring Microsoft enterprise products and systems going forward.  I seem to remember hearing a story at the time that a mandate was established for every enterprisey product going forward: each GUI action would have a corresponding PowerShell execution.  If anyone remembers the Exchange 2007 console, you could see this in action.  I won’t bother corroborating the story, because the end results are self-evident.  I can’t stress how important this was.  Engineers and administrators with development and advanced scripting skills were spared the further indignity of committing crimes against Win32 and COM+ across a hodgepodge of usually awful languages.  Windows administrators, for whom automation and scripting had only meant batch files, were presented a clear path forward.

PowerShell and Leaky Abstractions

For roughly two years now, the scope of my work has mostly comprised Azure integration and automation.  Azure proved to be no exception to the PowerShell new world order. I entered with wide-eyed optimism and quickly discovered a great deal of things, usually of a more advanced nature, that could not be done in the portal and purportedly only via PowerShell. As I continue to receive product briefings, I have developed a bit of a pedantic pet peeve.  PowerShell is always front and center in the presentations when referencing management, configuration, and automation.  However, I continue to see a general hand wave given to the underlying technologies (e.g. WMI/CIM, REST API) and requirements.  I absolutely understand the intent: PowerShell has always been meant to provide a truly powerful environment in a manner that is highly accessible and friendly to the IT professional.  It has been a resounding success in that regard.  A general concern of mine is that of too much abstraction.  There is a direct correlation between your frustration level and the gaps in your understanding of what is going on when an inevitable edge case is hit and the abstraction leaks.

Getting Back to the Point

All of that is a really long preface to the actual point of this post. I’ve never been a fan of the Azure Cmdlets for a number of reasons, though for most of them I don’t impugn the decisions made by Microsoft. To be honest, I think both Switch-AzureMode (for those that remember) and the rapid release cadence, which has introduced many understandably unavoidable breaking changes, have really prejudiced me; as a result I tend to use the REST API almost exclusively. The fact is, modern systems, and especially all of the micro-service architectures being touted, are powered by REST APIs. In the case of the Microsoft cloud, with only a few notable exceptions, authentication and authorization are handled via Azure Active Directory. It behooves the engineer or developer focused on Microsoft technologies to have at least a cursory understanding.  Azure Active Directory, Azure, and Office 365 are intrinsically linked; every Azure and/or Office 365 Subscription is linked with an Azure AD tenant as the primary identity provider. The modern web has adopted OAuth as an authorization standard, and Azure AD can greatly streamline the authorization of web applications and APIs. The management and other API surfaces of Azure (and Azure Stack) and Office 365 have always taken advantage of this. The term you’ve likely heard thrown around is Bearer Token, which is more accurately described as an authorization header on the HTTP request containing a JWT (JSON Web Token).  My largest issue with Azure and PowerShell automation has been the necessity of jumping through hoops to simply obtain that token via PowerShell.  In 2016, a somewhat disingenuously named Cmdlet, Get-AzureStackToken, in the AzureRM.AzureStackAdmin module finally appeared.  I’m certain a large portion of the potential reading audience has used a tool like Fiddler, Postman, or even more recently resources.azure.com to either inspect or interact with these services.
Those who have can feel free to skip straight to where this applies to PowerShell.

There are two types of applications you can create within Azure AD, each of which is identified by a unique Client Id and valid redirect URI(s); these are the most relevant properties we’ll focus on.

Web Applications

  • Web applications in Azure Active Directory are OAuth2 confidential clients and likely the most appropriate option for modern (web) use cases.

  • Tokens are obtained on behalf of a user using the OAuth2 authorization grant flow. An authorization code or id token will be supplied to the specified redirect URI.

  • If needed, client credentials (a rolling secret key) can be used to obtain tokens on behalf of the user or on its own from the web application itself.

Native Applications

  • Native applications in Azure Active Directory are OAuth2 public clients (e.g. an application on a desktop or mobile device).

  • These applications can obtain a token directly (with managed organizational accounts) or use the authorization grant flow, but application level permissions are not applicable.

Getting to the PowerShell

I will focus primarily on the Native application type, as it is most relevant to PowerShell. Most of the content will use Cmdlets from a module that will be available with this post.   The module is heavily derived/inspired by the ADAL libraries, has no external dependencies, and accepts a friendly PSCredential (with the appropriate rights) for any user authentication.  The Azure Cmdlets use a Native application with a Client Id of 1950a258-227b-4e31-a9cf-717495945fc2 and a redirect URI of urn:ietf:wg:oauth:2.0:oob (the prescribed default for native applications).   We’ll use this for our first attempt at obtaining a token for use against Azure Resource Manager or the legacy Service Management API.  A peculiar detail of Azure management is that this is one of the scenarios where a token is fungible across disparate endpoints. I always use https://management.core.windows.net as my audience regardless of whether I will be working with ARM or SM.  A token obtained from that audience will work the same as one from https://management.azure.com .

If all you would like is a snippet to obtain a token using the Azure Cmdlets, I’ll offer you a chance to bail out now:


$Resource='https://management.core.windows.net'
$PoshClientId="1950a258-227b-4e31-a9cf-717495945fc2"
$TenantId="yourdomain.com"
$UserName="username@$TenantId"
$Password="asecurepassword"|ConvertTo-SecureString -AsPlainText -Force
$Credential=New-Object pscredential($UserName,$Password)
Get-AzureStackToken -Resource $Resource -AadTenantId $TenantId -ClientId $PoshClientId -Credential $Credential -Authority "https://login.microsoftonline.com/$TenantId" 

A good deal of the functionality around provisioning applications and service principals has come to the Azure Cmdlets.  You can now create applications, service principals from the applications, and role assignments to the service principals. To create an application, in this case one that would own a subscription, you would write something like this:


$ApplicationSecret="ASuperSecretPassword!"
$TenantId='e05b8b95-8c85-49af-9867-f8ac0a257778'
$SubscriptionId='bc3661fe-08f5-4b87-8529-9190f94c163e'
$AppDisplayName='The Subscription Owning Web App'
$HomePage='https://azurefieldnotes.com'
$IdentifierUris=@('https://whereeveryouwant.com')
$NewWebApp=New-AzureRmADApplication -DisplayName $AppDisplayName -HomePage $HomePage `
    -IdentifierUris $IdentifierUris -StartDate (Get-Date) -EndDate (Get-Date).AddYears(1) `
    -Password $ApplicationSecret
$WebAppServicePrincipal=New-AzureRmADServicePrincipal -ApplicationId $NewWebApp.ApplicationId
$NewRoleAssignment=New-AzureRmRoleAssignment -ObjectId $WebAppServicePrincipal.Id -RoleDefinitionName 'owner' -Scope "/subscriptions/$SubscriptionId"
$ServicePrincipalCred=New-Object PScredential($NewWebApp.ApplicationId,($ApplicationSecret|ConvertTo-SecureString -AsPlainText -Force))
Add-AzureRmAccount -Credential $ServicePrincipalCred -TenantId $TenantId -ServicePrincipal 

For those that stuck around, let’s take a look at obtaining JWT(s), inspecting them, and putting them to use.

I added a method for decoding the tokens, so we will have a look at the access token.  A JWT is composed of a header, payload, and signature.  I will leave explaining the claims within the payload to identity experts.
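The module's decoder is not reproduced here, but a minimal sketch of extracting the payload from a compact-serialized JWT looks like this (ConvertFrom-JwtPayload is a hypothetical function name, not part of the module):

```powershell
function ConvertFrom-JwtPayload
{
    param([Parameter(Mandatory=$true)][string]$Jwt)
    #The payload is the second dot-delimited segment, Base64Url encoded
    $Payload = $Jwt.Split('.')[1].Replace('-','+').Replace('_','/')
    #Restore the padding that Base64Url encoding strips
    switch ($Payload.Length % 4) { 2 { $Payload += '==' } 3 { $Payload += '=' } }
    #Decode to JSON and emit the claims as an object
    [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($Payload)) | ConvertFrom-Json
}
```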

Now that we have a token, let's use it for something useful; in this case we will ask Azure (ARM) for our associated subscriptions.
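A minimal sketch of that call, assuming $Token holds the raw access token string obtained above:

```powershell
#List subscriptions via ARM using the bearer token; api-version 2016-06-01 was current at the time
$Headers = @{ Authorization = "Bearer $Token" }
$Uri = 'https://management.azure.com/subscriptions?api-version=2016-06-01'
$Subscriptions = (Invoke-RestMethod -Uri $Uri -Headers $Headers).value
$Subscriptions | Select-Object subscriptionId, displayName, state
```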

Examining the OAuth2 Flow

If you are not interested in what is going on behind the scenes, feel free to skip ahead.  Each application exposes a standard set of endpoints; I will not discuss the v2.0 endpoint as I do not have enough experience using it.  There are two endpoints in particular to make note of, https://login.microsoftonline.com/{tenantid}/oauth2/authorize and https://login.microsoftonline.com/{tenantid}/oauth2/token, where {tenantid} represents the tenant id (guid or domain name), e.g. yourcompany.com, or common for multi-tenant applications.  Azure AD obviously supports federation, and directing traffic to the appropriate authorization endpoint is guided by a user realm detection API (of various versions) at https://login.microsoftonline.com/common/UserRealm.  If we inspect the result for a fully managed Azure AD account, we see general tenant detail.
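A sketch of querying the realm detection API directly; the account name is a placeholder, and the exact property names vary between API versions:

```powershell
$UserName = 'username@yourdomain.com'   #placeholder account
$Realm = Invoke-RestMethod -Uri "https://login.microsoftonline.com/common/UserRealm/$($UserName)?api-version=1.0"
#account_type distinguishes Managed from Federated tenants
$Realm | Select-Object account_type, domain_name
```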

If we take a look at a federated user we will see a little difference, the AuthURL property.

userrealm federated

This shows us the location of our federated authentication endpoint. The token will actually be requested via a SAML user assertion received from an STS, in this case ADFS.

The OAuth specification uses the request parameter collection for token and authorization code responses. In the fully managed public client scenario, a username and password combination can be used to directly request a token.

A POST request can go directly to the Token endpoint with the following form-encoded parameters:

client_id – The Application Id

resource – The Resource URI to access

grant_type – password

username – The username

password – The password
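Put together, such a request might look like the following sketch (every value here is a placeholder, and the call itself is commented out since it requires a live tenant and a public client registration):

```powershell
# Sketch of a direct password-grant token request (all values are placeholders).
$TenantId='yourcompany.com'
$TokenEndpoint="https://login.microsoftonline.com/$TenantId/oauth2/token"
$Body=@{
    client_id='{Application Id}'
    resource='https://management.core.windows.net/'
    grant_type='password'
    username='user@yourcompany.com'
    password='{Password}'
}
# Requires a live tenant and a public client registration:
# $TokenResult=Invoke-RestMethod -Uri $TokenEndpoint -Method Post -Body $Body
```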

The ADFS/WSTrust flow entails sending a SOAP request to the WSTrust endpoint to authenticate, then using that response to create the assertion that is exchanged for an access token.  Through user realm detection we can find the ADFS username/password endpoint.  A SOAP envelope can be sent to that endpoint to receive a security token response containing the assertions needed.

A POST request is sent to the Username/Password endpoint for ADFS with the following envelope, with notable values encased in {}:

<s:Envelope xmlns:s='http://www.w3.org/2003/05/soap-envelope' 
    xmlns:a='http://www.w3.org/2005/08/addressing' 
    xmlns:u='http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd'>
    <s:Header>
        <a:Action s:mustUnderstand='1'>http://docs.oasis-open.org/ws-sx/ws-trust/200512/RST/Issue</a:Action>
        <a:messageID>urn:uuid:{Unique Identifier for the Request}</a:messageID>
        <a:ReplyTo>
            <a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address>
        </a:ReplyTo>        <!-- The Username Password WSTrust Endpoint -->
        <a:To s:mustUnderstand='1'>{Username/Password Uri}</a:To>
        <o:Security s:mustUnderstand='1' 
            xmlns:o='http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd'>            <!-- The token length requested -->
            <u:Timestamp u:Id='_0'>
                <u:Created>{Token Start Time}</u:Created>
                <u:Expires>{Token Expiry Time}</u:Expires>
            </u:Timestamp>            <!-- The username and password used -->
            <o:UsernameToken u:Id='uuid-{Unique Identifier for the Request}'>
                <o:Username>{UserName to Authenticate}</o:Username>
                <o:Password>{Password to Authenticate}</o:Password>
            </o:UsernameToken>
        </o:Security>
    </s:Header>
    <s:Body>
        <trust:RequestSecurityToken xmlns:trust='http://docs.oasis-open.org/ws-sx/ws-trust/200512'>
            <wsp:AppliesTo xmlns:wsp='http://schemas.xmlsoap.org/ws/2004/09/policy'>
                <a:EndpointReference>
                    <a:Address>urn:federation:MicrosoftOnline</a:Address>
                </a:EndpointReference>
            </wsp:AppliesTo>
            <trust:KeyType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Bearer</trust:KeyType>
            <trust:RequestType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Issue</trust:RequestType>
        </trust:RequestSecurityToken>
    </s:Body>
</s:Envelope>

The token response is inspected for SAML assertion types (urn:oasis:names:tc:SAML:1.0:assertion or urn:oasis:names:tc:SAML:2.0:assertion) to find the matching token used for the OAuth token request.

<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" 
    xmlns:a="http://www.w3.org/2005/08/addressing" 
    xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
    <s:Header>
        <a:Action s:mustUnderstand="1">http://docs.oasis-open.org/ws-sx/ws-trust/200512/RSTRC/IssueFinal</a:Action>
        <o:Security s:mustUnderstand="1" 
            xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
            <u:Timestamp u:Id="_0">
                <u:Created>2016-01-03T01:34:41.640Z</u:Created>
                <u:Expires>2016-01-03T01:39:41.640Z</u:Expires>
            </u:Timestamp>
        </o:Security>
    </s:Header>
    <s:Body>
        <trust:RequestSecurityTokenResponseCollection xmlns:trust="http://docs.oasis-open.org/ws-sx/ws-trust/200512">            <!-- Our Desired Token Response -->
            <trust:RequestSecurityTokenResponse>
                <trust:Lifetime>
                    <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2016-01-03T01:34:41.622Z</wsu:Created>
                    <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2016-01-03T02:34:41.622Z</wsu:Expires>
                </trust:Lifetime>
                <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
                    <wsa:EndpointReference xmlns:wsa="http://www.w3.org/2005/08/addressing">
                        <wsa:Address>urn:federation:MicrosoftOnline</wsa:Address>
                    </wsa:EndpointReference>
                </wsp:AppliesTo>
                <trust:RequestedSecurityToken>                    <!-- The Assertion -->
                    <saml:Assertion MajorVersion="1" MinorVersion="1" AssertionID="_e3b09f2a-8b57-4350-b1e1-20a8f07b3d3b" Issuer="http://adfs.howtopimpacloud.com/adfs/services/trust" IssueInstant="2016-08-03T01:34:41.640Z" 
                        xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion">
                        <saml:Conditions NotBefore="2016-01-03T01:34:41.622Z" NotOnOrAfter="2016-01-03T02:34:41.622Z">
                            <saml:AudienceRestrictionCondition>
                                <saml:Audience>urn:federation:MicrosoftOnline</saml:Audience>
                            </saml:AudienceRestrictionCondition>
                        </saml:Conditions>
                        <saml:AttributeStatement>
                            <saml:Subject>
                                <saml:NameIdentifier Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">130WEAH65kG8zfGrZFNlBQ==</saml:NameIdentifier>
                                <saml:SubjectConfirmation>
                                    <saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:bearer</saml:ConfirmationMethod>
                                </saml:SubjectConfirmation>
                            </saml:Subject>
                            <saml:Attribute AttributeName="UPN" AttributeNamespace="http://schemas.xmlsoap.org/claims">
                                <saml:AttributeValue>chris@howtopimpacloud.com</saml:AttributeValue>
                            </saml:Attribute>
                            <saml:Attribute AttributeName="ImmutableID" AttributeNamespace="http://schemas.microsoft.com/LiveID/Federation/2008/05">
                                <saml:AttributeValue>130WEAH65kG8zfGrZEFlBQ==</saml:AttributeValue>
                            </saml:Attribute>
                        </saml:AttributeStatement>
                        <saml:AuthenticationStatement AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password" AuthenticationInstant="2016-08-03T01:34:41.607Z">
                            <saml:Subject>
                                <saml:NameIdentifier Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">130WEAH65kG8sfGrZENlBQ==</saml:NameIdentifier>
                                <saml:SubjectConfirmation>
                                    <saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:bearer</saml:ConfirmationMethod>
                                </saml:SubjectConfirmation>
                            </saml:Subject>
                        </saml:AuthenticationStatement>
                        <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
                            <ds:SignedInfo>
                                <ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#" />
                                <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1" />
                                <ds:Reference URI="#_e3b09f2a-8b57-4350-b1e1-20a8f07b3d3b">
                                    <ds:Transforms>
                                        <ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature" />
                                        <ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#" />
                                    </ds:Transforms>
                                    <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" />
                                    <ds:DigestValue>itvzbQhlzA8CIZsMneHVR15FJlY=</ds:DigestValue>
                                </ds:Reference>
                            </ds:SignedInfo>
                            <ds:SignatureValue>gBCGUmhQrJxVpCxVsy2L1qh1kMklVVMoILvYJ5a8NOlezNUx3JNlEP7wZ389uxumP3sL7waKYfNUyVjmEpPkpqxdxrxVu5h1BDBK9WqzOICnFkt6JPx42+cyAhj3T7Nudeg8CP5A9ewRCLZu2jVd/JEHXQ8TvELH56oD5RUldzm0seb8ruxbaMKDjYFuE7X9U5sCMMuglU3WZDC3v6aqmUxpSd9Kelhddleu33XEBv7CQNw84JCud3B+CC7dUwtGxwv11Mk/P0t1fGbfs+I6aSMTecKq9YmscqP9tB8ZouD42jhjhYysOQSdulStmUi6gVzQz+c2l2taa5Amd+JCPg==</ds:SignatureValue>
                            <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
                                <X509Data>
                                    <X509Certificate>MIIC4DCDAcigAwIBAgIQaYQ6QyYqcrBBmOHSGy0E1DANBgkqhkiG9w0BAQsFADArMSkwJwYDVQQDEyBBREZTIFNpZ25pbmcgLSBhZGZzLmNpLmF2YWhjLmNvbTAgFw0xNjA2MDQwNjA4MDdaGA8yMTE2MDUxMTA2MDgwN1owKzEpMCcGA1UEAxMgQURGUyBTaWduaW5nIC0gYWRmcy5jaS5hdmFoYy5jb20wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDH9J6/oWYAR8Y98QnacNouKyIBdtZbosEz0HyJVyrxVqKq2AsPvCEO3WFm9Gmt/xQN9PuLidZpgICAe8Ukuv4h/NldgmgtD64mObFNuEM5pzAPRXUv6FWlVE4fnUpIiD1gC0bbQ7Tzv/cVgfUChCDpFu3ePDTs/tv07ee22jXtoyT3N7tsbIX47xBMKgF9ItN9Oyqi0JyQHZghVQ1ebNOMH3/zNdl0WcZ+Pl+osD3iufoH6H+qC9XY09B5YOWy8fJoqf+HFeSWZCHH5vJJfsPTsSilvLHCpMGlrMFaTBKqmv+m9Z3FtbzOcnKHS5PJVAymqLctkH+HbFzaDblaSRhhAgMBAAEwDQYJKoZIhvcNAQELBQADggEBAFB0E2Cj+O24aPM61JsCXLIAB28q4h4qLxMwV+ypYjFxxcQ5GzgqaPJ7BARCnW1gm3PyvNfUut9RYrT9wTJlBVY9WDBoX33jsS87riMj+JONXJ7lG/zAozxs0xIiW+PNlFdOt7xyvYstrFgPJS1E05jhiZ2PR8MS20uSlMNkVPinpz4seyyMQeM+1GbpbDE1EwwtEVKgatJN7t6nAn9mw8cHIk1et7CYOGeWCnMA9EljzNiD8wEwsG51aKfuvGrPK8Q8N/G89SPgstpe0Te5+EtWT6latXfpCwdNWxvinH49SKKa25l1VoLLNwKiQF6vK1Iw0F7dP7QkO5YdE7/MTDU=</X509Certificate>
                                </X509Data>
                            </KeyInfo>
                        </ds:Signature>
                    </saml:Assertion>
                </trust:RequestedSecurityToken>
                <trust:RequestedAttachedReference>
                    <o:SecurityTokenReference k:TokenType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV1.1" 
                        xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" 
                        xmlns:k="http://docs.oasis-open.org/wss/oasis-wss-wssecurity-secext-1.1.xsd">
                        <o:KeyIdentifier ValueType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.0#SAMLAssertionID">_e3b09f2a-8b57-4350-b1e1-20a8f07b3d3b</o:KeyIdentifier>
                    </o:SecurityTokenReference>
                </trust:RequestedAttachedReference>
                <trust:RequestedUnattachedReference>
                    <o:SecurityTokenReference k:TokenType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV1.1" 
                        xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" 
                        xmlns:k="http://docs.oasis-open.org/wss/oasis-wss-wssecurity-secext-1.1.xsd">
                        <o:KeyIdentifier ValueType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.0#SAMLAssertionID">_e3b09f2a-8b57-4350-b1e1-20a8f07b3d3b</o:KeyIdentifier>
                    </o:SecurityTokenReference>
                </trust:RequestedUnattachedReference>
                <trust:TokenType>urn:oasis:names:tc:SAML:1.0:assertion</trust:TokenType>
                <trust:RequestType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Issue</trust:RequestType>
                <trust:KeyType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Bearer</trust:KeyType>
            </trust:RequestSecurityTokenResponse>
        </trust:RequestSecurityTokenResponseCollection>
    </s:Body>
</s:Envelope>

A POST request is sent to the Token endpoint with the following parameters:

client_id – The Application Id

resource – The Resource URI to access

assertion – The base64 encoded SAML token

grant_type – urn:ietf:params:oauth:grant-type:saml1_1-bearer or urn:ietf:params:oauth:grant-type:saml2-bearer (matching the assertion type)

scope – openid
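A sketch of that exchange (the endpoint and every value are placeholders; the assertion would be the <saml:Assertion> element extracted from the RSTR above):

```powershell
# Sketch: exchanging the SAML assertion for an OAuth token (placeholders throughout).
$TokenEndpoint='https://login.microsoftonline.com/yourcompany.com/oauth2/token'
# $Saml stands in for the <saml:Assertion> element text from the RSTR
$Saml='{SAML Assertion XML}'
$Body=@{
    client_id='{Application Id}'
    resource='https://management.core.windows.net/'
    assertion=[Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($Saml))
    grant_type='urn:ietf:params:oauth:grant-type:saml1_1-bearer'
    scope='openid'
}
# Requires a live federated tenant:
# $TokenResult=Invoke-RestMethod -Uri $TokenEndpoint -Method Post -Body $Body
```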

A GET request is sent to the Authorize endpoint with some similar query parameters:

client_id – The Application Id

redirect_uri – The location within the application to handle the authorization code

response_type – code

prompt – login, consent, or admin_consent

scope – optional scope for access (app uri or openid scope)
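Composing that URL is simple string work; a sketch with placeholder values:

```powershell
# Sketch: composing the authorization code request URL (placeholder values).
$TenantId='yourcompany.com'
$Params=[ordered]@{
    client_id='{Application Id}'
    redirect_uri='urn:ietf:wg:oauth:2.0:oob'
    response_type='code'
    prompt='login'
}
# URL-encode each value and join into a query string
$Query=($Params.GetEnumerator()|ForEach-Object {"$($_.Key)=$([Uri]::EscapeDataString($_.Value))"}) -join '&'
$AuthorizeUri="https://login.microsoftonline.com/$TenantId/oauth2/authorize?$Query"
# Open $AuthorizeUri in a browser (or hosted WebBrowser control) to complete login and capture the code.
```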

The endpoint should redirect you to the appropriate login screen via user realm detection.  Once the user login is completed, the code is added to the redirect address as either query parameters (default) or a form POST.  Once the code is retrieved it can be exchanged for a token. A POST request is sent to the Token endpoint as demonstrated before with some slightly different parameters:

client_id – The Application Id

resource – The Resource URI to access

code – The authorization code

grant_type – authorization_code

scope – previous scope

client_secret – required if confidential client
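The exchange itself is another form POST; a sketch with placeholder values (note that the redirect_uri must match the one used in the authorize request):

```powershell
# Sketch: exchanging the authorization code for an access token (placeholders).
$TokenEndpoint='https://login.microsoftonline.com/yourcompany.com/oauth2/token'
$Body=@{
    client_id='{Application Id}'
    resource='https://management.core.windows.net/'
    code='{Authorization Code}'
    grant_type='authorization_code'
    redirect_uri='urn:ietf:wg:oauth:2.0:oob'
}
# Confidential clients also include the secret:
# $Body['client_secret']='{Application Secret}'
# $TokenResult=Invoke-RestMethod -Uri $TokenEndpoint -Method Post -Body $Body
```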

Tying it All Together

To try to show some value for your reading time, let's explore how this can be used as the solutions you support and deploy become more tightly integrated with the Microsoft cloud.  We'll start by creating a new Native application in the legacy portal.

appnative1
appnative2

I used https://itdoesnotmatter here, but you might as well follow the guidance of using urn:ietf:wg:oauth:2.0:oob.  We will now grant permissions to Azure Active Directory and Azure Service Management (for ARM too).

ADPermissions
ADServiceMgmt

I will avoid discussing configuring the application to be multi-tenant, as the process I outline is identical; it is simply a matter of the targeted tenant.  You should end up with something looking like this.

Native

Let's now try to get a token for our new application and put it to use.  This should look exactly the same as retrieving the previous token.



nativefirstattempt

Epic failure!  Unfortunately we run into a common annoyance: the application must be consented to interactively, and I do not know of any tooling that exists to make this easy.  I added a function to make this a little easier; it supports an AdminConsent switch to approve the application for all users within the tenant.  We can use it to step through the consent process and receive an authorization code.


$AuthCode=Approve-AzureADApplication -ClientId $NewClientId -RedirectUri 'https://itdoesnotmatter/' -TenantId sendthewolf.com -AdminConsent

Approve
Approve App

Once the authorization code is obtained it can be exchanged for a token, for which I provided another function.  That token can now be used in the exact same manner as the Azure Cmdlet application.


$TokenResult=Get-AzureADAccessTokenFromCode 'https://management.core.windows.net/' -ClientId $NewClientId -RedirectUri 'https://itdoesnotmatter/' -TenantId sendthewolf.com -AuthorizationCode $AuthCode

Authorize2

If we want to handle some Azure Active Directory objects, we can target a different audience and execute actions appropriate to the account's privilege level.   In the following example we will create a new user.


$GraphBaseUri="https://graph.windows.net/"
$GraphUriBuilder=New-Object System.UriBuilder($GraphBaseUri)
$GraphUriBuilder.Path="$TenantId/users"
$GraphUriBuilder.Query="api-version=1.6"
$NewUserJSON=@"
{
    "accountEnabled": true, 
    "displayName": "Johnny Law", 
    "mailNickName" : "thelaw", 
    "passwordProfile": { 
        "password": "Password1234!", 
        "forceChangePasswordNextLogin": false 
    }, 
    "userPrincipalName": "johnny.law@$TenantId" 
}
"@
$AuthResult=Get-AzureADUserToken -Resource $GraphBaseUri -ClientId $NewClientId -Credential $Credential -TenantId $TenantId
$AuthHeaders=@{Authorization="Bearer $($AuthResult.access_token)"}
$NewUser=Invoke-RestMethod -Uri $GraphUriBuilder.Uri -Method Post -Headers $AuthHeaders -Body $NewUserJSON -ContentType "application/json"

If we want to continue the “fun” with Office 365, we can apply the exact same approach with the Office 365 SharePoint Online application permissions.  In the interest of moving along, and with no regard for constraining access, we will configure the permissions in the following manner.

sharepoint

We’ll now do some querying of the Office 365 SharePoint video API with some more script.


$SharepointUri='https://yourdomain.sharepoint.com/'
$SpUriBuilder=New-Object System.UriBuilder($SharepointUri)
$SpUriBuilder.Path="_api/VideoService.Discover"
$AuthResult=Get-AzureADUserToken -Resource $SharepointUri -ClientId $NewClientId -Credential $Credential
$Headers=@{Authorization="Bearer $($AuthResult.access_token)";Accept="application/json";}
$VideoDisco=Invoke-RestMethod -Uri $SpUriBuilder.Uri -Headers $Headers
$VideoDisco|Format-List
$VideoChannelId="306488ae-5562-4d3e-a19f-fdb367928b96"
$VideoPortalUrl=$VideoDisco.VideoPortalUrl
$ChannelUrlBuilder=New-Object System.UriBuilder($VideoPortalUrl)
$ChannelUrlBuilder.Path+="/_api/VideoService/Channels"
$ChannelOData=Invoke-RestMethod -Uri $ChannelUrlBuilder.Uri -Headers $Headers
$ChannelRoot=$ChannelUrlBuilder.Path
foreach ($Channel in $ChannelOData.Value)
{  
    $VideoUriBuilder=New-Object System.UriBuilder($Channel.'odata.id')
    $VideoUriBuilder.Path+="/Videos"
    Invoke-RestMethod -Uri $VideoUriBuilder.Uri -Headers $Headers|Select-Object -ExpandProperty value
}

We should see some output that looks like this:

spvideos

I’ve had Enough! Please Just Show me the Code.

For those who have endured, or even skipped straight here, I present the following module for any use you dare apply.  The standard liability waiver applies, and it is presented primarily for educational purposes.  It came from a need to access the assortment of Microsoft cloud APIs in environments where we could not always ensure the plethora of correct cmdlets were installed.  Initially, being a .Net guy, I just wrapped standard use cases around ADAL .Net.  I wanted to make sure that I really understood OAuth and OpenId Connect authorization flows as they relate to Azure Active Directory.  The entire theme of this lengthy tome is to emphasize the importance of having a relatively advanced understanding of these concepts.  Regardless of your milieu, if it has a significant Microsoft component, the demand to both integrate and support the integration(s) of numerous offerings will only grow larger.  The module is primarily targeted at the Native Client application type; however, there is support for the client secret and implicit authorization flows.  There are also a few utility methods that are exposed, as they may have some diagnostic use or otherwise.  The module exposes the following methods, all of which support Get-Help:

  • Approve-AzureADApplication

    • Approves an Azure AD Application interactively and returns the Authorization Code

  • ConvertFrom-EncodedJWT

    • Converts an encoded JWT to an object representation

  • Get-AzureADAccessTokenFromCode

    • Retrieves an access token from a consent authorization code

  • Get-AzureADClientToken

    • Retrieves an access token as an OAuth confidential client

  • Get-AzureADUserToken

    • Retrieves an access token as an OAuth public client

  • Get-AzureADImplicitFlowToken

    • Retrieves an access token interactively for a web application with OAuth implicit flow enabled

  • Get-AzureADOpenIdConfiguration

    • Retrieves the OpenId Connect configuration for the specified application

  • Get-AzureADUserRealm

    • Retrieves the aggregate user realm data for the specified user principal name(s)

  • Get-WSTrustUserRealmDetails

    • Retrieves the WSFederation details for a given user principal name

Get it here: Azure AD Module

I hope you find it useful and remember not to fear doing things the hard way every so often.