New Blog Location

Hi all,

It has taken quite a long time, but I finally have my new blog up and running. I will be writing about all things Government cloud with my colleague Dennis Guzy. Dennis is a Senior Government Cloud Specialist with many years of experience.

Make sure to bookmark the new URL:

Looking forward to seeing you there!


Microsoft Bound

I have some exciting news! I have taken a job with Microsoft as an Account Technology Strategist. I will be keeping this blog active for the legacy content but will be moving new posts over to a new site in the coming weeks. I have some exciting new collaborative projects in the pipeline so stay tuned for an updated URL to my new projects.


The Important Things

I have been contemplating whether to start a new blog or just add this to my technical blog. Then, after chatting with one of my mentors, I decided there really isn’t a defining line between my technical blog and the reasons I do it. So while this may not bring any new revelations about the Cloud, it is important to me, so feel free to read on or just wait for my next super-nerdy tech post.

Early this year I heard news of a tragedy that shook me to the very core. I was travelling in upstate New York when I got a call from my wife. A family we know had a house fire that took not only their house, but two of their beautiful daughters, and left another of their daughters with severe burns. It was one of those days where everything was moving along normally; work was challenging and I was in a groove. But in that instant the only thing I wanted to do was drop everything and fly home to hold my wife and children. All of the work, the accolades, the blogs and speaking engagements were suddenly and completely unimportant to me. I will never forget that feeling, and as I sit here writing this I can feel the tears building and the emotions overwhelming me. The truth is everything, and I mean EVERYTHING, comes in a distant second to seeing and being with my family. The whole reason I sit in hotels late at night blogging away after a long day with customers, thousands of miles away from my family, is to give them a better life and provide for them.

Fast forward to this past weekend. It was a normal weekend for our family: my son had a birthday party to go to, my girls were planning on enjoying the beautiful weather we have been having, and we’d wrap up the weekend with our company picnic, which is always a blast. My wife told me about a special event for Sophie and Madigan, the two girls who were lost in the fire, at a pumpkin patch on the other side of town. So after the birthday party I took my son and his friend to the pumpkin patch to honor the little girls and to let the kids all play together for the afternoon. The event ended with a ceremony where thousands of balloons were released by the children, friends and families of Sophie and Madigan.

It was a beautiful afternoon temperature-wise, but there had been a thin layer of clouds all day. Then, about 30 minutes before the ceremony, something happened that I can only categorize as amazing. The clouds slowly parted and the sun shone through. It was the most beautiful sunset I have ever seen. So, standing there with my son and our friends, we each took a balloon and released them into the heavens for Sophie and Madigan. As we were standing there my son tapped me on the arm. As I leaned down, my son looked at me, said ‘I love you, daddy,’ and gave me a big hug and kiss.

In that moment I was so happy and sad at the same time. I texted my wife immediately and told her I loved her and to hug our two little angels sitting at home with her. In my world, where I have sacrificed so much in the name of my family, I realized all of the awards, the air miles, the blog posts and the books don’t matter. What matters is the six-year-old little boy standing next to his daddy, just happy to be there in the moment with me. I feel so honored to have such an amazing family and friends; I am truly blessed.

My hope in writing this is that those who follow me will take a moment to savor the important things in life: your families, your friends and your loved ones. Life is a beautiful and special thing; no matter how crazy it gets, don’t ever forget it…I know I won’t.


Fun with PowerShell

Today I wanted to show some examples of how PowerShell can make your life easier. I will say I am not a PowerShell master by any means, and I am sure someone out there can tweak these scripts to be much more efficient, but they do the job. First let’s take a look at licensing. Several posts ago I shared a method for using PowerShell to license your users. Before you begin licensing users you will need to get the specific account SKU for your tenant. To do this you’ll need the MSOnline cmdlets, which come with the Windows Azure Active Directory Module for Windows PowerShell. Once you have this installed, open PowerShell and run the following commands:

Import-Module MSOnline

# Sign in with your tenant admin credentials when prompted
Connect-MsolService

# List the licensed account SKUs for your tenant
Get-MsolAccountSku

This will return your licensed account SKUs:

Once you have this information you can add it to the script below for the $AccountSkuID variable:

Import-Module MSOnline

# Global admin credentials (fill in your admin UPN; the password here is a placeholder)
$powerUser = ""
$powerPass = "Password"
$password = ConvertTo-SecureString $powerPass -AsPlainText -Force
$adminCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $powerUser,$password

Connect-MsolService -Credential $adminCredential

$AccountSkuId = "sku:PARTNUMBER"
$UsageLocation = "US"
$LicenseOptions = New-MsolLicenseOptions -AccountSkuId $AccountSkuId

# Set the usage location, then assign the license to each user in the CSV
$Users = Import-Csv c:\Scripts\Users.csv
$Users | ForEach-Object {
    Set-MsolUser -UserPrincipalName $_.UserPrincipalName -UsageLocation $UsageLocation
    Set-MsolUserLicense -UserPrincipalName $_.UserPrincipalName -AddLicenses $AccountSkuId -LicenseOptions $LicenseOptions
}

You will notice I am getting my users from a CSV file:

$Users = Import-Csv c:\Scripts\Users.csv
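The only column this script actually reads is UserPrincipalName, so a minimal Users.csv can be as simple as the following (the addresses are made-up examples):

UserPrincipalName
jsmith@contoso.com
mjones@contoso.com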

This can easily be changed to license everyone in the tenant by changing the line to:

$Users = Get-MsolUser -All

Or you could filter and license by some specific attribute like department:

$Users = Get-MsolUser -All | Where {$_.Department -eq "IT"}

Now, as you can imagine, this licenses your users for a full bundle, which includes all the sub-services. So if you were licensing the SKU above, the user would be licensed for a full E3 SKU including Exchange, SharePoint, Lync, etc. But what about cases where you only want to license your users for some of the components of a particular package? Honestly, the first time a customer brought this to me I had never considered it, but since then I have had many customers request it. In order to do this we need some information: first, we need to know what the part numbers are for the components of a full bundle SKU. To do this you need to run the following commands:

*You will note the SkuPartNumber is pulled from the Get-MsolAccountSku command. I took the second half of the account SKU, after the ':'.

$ServicePlans = Get-MsolAccountSku | Where {$_.SkuPartNumber -eq "ENTERPRISEPACK"}

# The ServiceStatus property holds the individual service plans in the bundle
$ServicePlans.ServiceStatus

This will return the individual part numbers like below:

Now you have to figure out what everything means. Once you have figured that out you can plug in the appropriate values. In the example below I am just licensing Exchange Online. The trick here is when you define your ‘package’ you tell it what components are disabled within the plan.

Import-Module MSOnline

$powerUser = ""
$powerPass = "XXXX"

$password = ConvertTo-SecureString $powerPass -AsPlainText -Force
$adminCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $powerUser,$password

Connect-MsolService -Credential $adminCredential

#The below AccountSKU needs to be adjusted based on the specific tenant SKU name
$AccountSkuId = "AccountSKU:ENTERPRISESKU"
$UsageLocation = "US"

#The below DisabledPlans list should be updated with the specific disabled components required,
#using the part numbers returned by the O365get-AccountSkuParts.ps1 script above.
#These example names are the non-Exchange components of the E3 (ENTERPRISEPACK) bundle.
$MyO365Sku = New-MsolLicenseOptions -AccountSkuId $AccountSkuId -DisabledPlans "OFFICESUBSCRIPTION","MCOSTANDARD","SHAREPOINTWAC","SHAREPOINTENTERPRISE"

$Users = Import-Csv c:\Scripts\Users.csv

$Users | ForEach-Object {
    Set-MsolUser -UserPrincipalName $_.UserPrincipalName -UsageLocation $UsageLocation
    Set-MsolUserLicense -UserPrincipalName $_.UserPrincipalName -AddLicenses $AccountSkuId -LicenseOptions $MyO365Sku
}

The last thing I’ll leave you with is a quick tip for the security nuts. I know I put my global admin credentials in my scripts; my theory is the server should be locked down, and I only run the scripts from a secure location. But sure, I can see why some people may not like this. One way to avoid it is to prompt for the credentials at the beginning of your script. To do this, change the first two lines of the script:


From:

$powerUser = ""
$powerPass = "XXXX"

To:

$powerUser = Read-Host "Enter Your Global Admin Username"
$powerPass = Read-Host "Enter Your Global Admin Password"
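Another option, shown here as a minimal sketch rather than something from my original scripts, is to use Get-Credential, which prompts for both values in a single dialog and never holds the password as plain text:

# Prompt for the global admin username and password together
$adminCredential = Get-Credential
Connect-MsolService -Credential $adminCredential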

DirSync Filtering

I realize DirSync filtering has been around for a while now, but I recently worked with a customer that wanted to do two specific types of filtering, so I thought I would do a little write-up about it.

For a little bit of background, this customer had four domains in their forest and was in the process of consolidating down to a single domain. However, they had two interesting issues:

  • They had several hundred administrative accounts that essentially mirrored the user’s standard account, only with elevated permissions
  • They had duplicate accounts in some of the old domains that still had not been cleaned up after the migration

This customer was running Domino as their mail server, and we leveraged the Quest tools to migrate them to Office 365. The tool itself has several shortcomings, one of them being the way it provisions mail attributes into existing AD accounts. The first bullet point above was a case where the ‘admin’ account had matching SMTP attributes even though in reality it didn’t have a mailbox. During DirSync these objects were often grabbed first and tagged with the SMTP and UPN values, leaving the true AD accounts the users accessed mail with stuck with the dreaded ‘onmicrosoft.com’ address. There are multiple ways to skin a cat; however, we decided to use an unused attribute in AD to filter the admin accounts out of the sync to Office 365. Below is how we accomplished this:

  • Launch the MIISAdmin console on the Directory Synchronization server
  • Click Management Agents in the top section, right-click Source AD and select Properties

  • Click Configure Connector Filter and select User from the list on the right-hand side

  • Select the attribute you want to filter by and the criteria (equals, begins with, etc.), set the value, then click Add Condition

  • Once all conditions have been configured, click OK
  • Confirm by scrolling down the list of filters and validating that your new attribute filter exists

This satisfied our issue with the admin accounts. In addition, we modified the internal identity provisioning tool the customer used so that it adds the special attribute when creating new admin accounts, which avoided the need for administrators to use the attribute editor and automated the process for future accounts. For the existing accounts, the attribute can be stamped in bulk, as shown in the sketch below.
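Here is a rough sketch of how that bulk stamping could be done with the ActiveDirectory module. This was not the customer’s actual tooling, and the attribute (extensionAttribute15), the "NoSync" value, and the "adm-*" naming convention are all made-up examples:

Import-Module ActiveDirectory

# Stamp every account matching the admin naming convention with the filter value
Get-ADUser -Filter 'SamAccountName -like "adm-*"' |
    Set-ADUser -Replace @{extensionAttribute15 = "NoSync"}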

The second thing we wanted to do was exclude OUs from the sync to avoid the old accounts in the domains that were to be retired. In order to do this we created the following filter:

  • Launch the MIISAdmin console from the Directory Sync server
  • Click Management Agents in the top section, right-click Source AD and select Properties

  • Select Configure Directory Partitions, highlight the appropriate directory partition from the list on the right, then click Containers

  • Enter the credentials for a standard domain user account in the domain you are modifying and click OK

  • Uncheck the OUs you want to exclude from Directory Synchronization and click OK

Now, you have to remember that when you filter, you run the risk of an administrator accidentally orphaning an Office 365 identity by placing it in the wrong OU, so use care when setting this up. If a user is accidentally put into the ‘dumpster’ in Office 365 because they were filtered out, you can reconnect them for up to 30 days from the time of the initial orphaning, as shown below.
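Reconnecting a single dumpster user is a one-liner with the MSOnline module (the UPN here is a placeholder):

# Restore a filtered-out user from the Office 365 recycle bin
Restore-MsolUser -UserPrincipalName "user@contoso.com"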

The 30-day orphaning window can also cause issues, as it did with my customer. Even though the admin accounts were now in the dumpster, the proper user accounts still wouldn’t update properly because of the dumpster items. So rather than waiting 30 days, I came up with a script to enumerate the users in the dumpster and then forcefully purge them. Based on the paragraph above, be sure you don’t need to recover any of these accounts: when you run this you will NOT be able to recover the accounts, and Microsoft cannot restore them. They will need to be recreated, and any pre-existing Office 365 data like the user’s mailbox will be gone. If you want to double-check what is in the dumpster first, see the quick sketch below.
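A quick way to review the dumpster contents before committing to the purge, assuming you are already connected via Connect-MsolService:

# List every user currently sitting in the dumpster
Get-MsolUser -ReturnDeletedUsers -All | Select-Object UserPrincipalName, DisplayName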

Here is the purge script, run from Windows PowerShell 2.0:

Import-Module MSOnline
$powerUser = ""
$powerPass = "Password"

$password = ConvertTo-SecureString $powerPass -AsPlainText -Force
$adminCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $powerUser,$password

Connect-MsolService -Credential $adminCredential

# Enumerate every user in the dumpster and permanently purge them
$users = Get-MsolUser -ReturnDeletedUsers -All
$users | ForEach-Object {
	Remove-MsolUser -UserPrincipalName $_.UserPrincipalName -RemoveFromRecycleBin -Force
}


Office 365 Fast Track Network Analysis

First off, I’d like to apologize to all of my faithful blog followers. The past six months have been incredibly busy for me both in business and in my personal life, so in order to maintain sanity I have been lying low on the posts in recent months. Hopefully, as I gear up for some exciting new projects, I’ll have some time to update my posts on a more regular basis. Now on to the fun stuff: new toys!

Microsoft recently released the Office 365 Fast Track Network Analysis web page, located here. Note this is for North America; if you want the tool for other locations, it can be found below:

The tool itself is a Java-based applet (yes, I know, I don’t like the fact that it uses Java either) that runs multiple tests and provides a quick, easy-to-read report on several factors related to your network. To get started (after you accept the Java applet) you need to add your domain; I’ll use my O365 domain as an example:

The funny thing is, after entering this bit of information everything else is done for you. As you can see from the left-hand side of the screen, you have 5 tabs (which turn into 10 tabs after the tool has finished analyzing the network) to cycle through, and they get populated as the various tests complete. I have highlighted a few of these tests below to show how valuable this tool can be.

Service Connectivity validates that no ports are being blocked by your corporate firewalls:

The Route tab displays several sub-tabs that provide information on the path packets will take from your network and can identify bottlenecks along the route. Below is the summary tab:

Speed and Capacity tests:

Again, these are just a few of the reports, but you can see how valuable they can be during the early stages of an assessment to determine an organization’s viability for cloud services. The last sections I want to highlight are the Summary and Advanced tabs.

The summary page is a quick go/no-go report on the key areas around network viability. As you can see from my report, the location I am working from has some issues that may cause consistency problems and poor QoS for VoIP traffic. Additionally, the tool found a slight amount of jitter, which may cause some additional ‘unpredictability’ with VoIP service.

The Advanced tab provides more concrete information on packet size, loss, latency and route speed. Additionally, this information can easily be exported to a text file to be inserted into an assessment report.

This tool has been a long time coming, and I have been very impressed with the wide variety of tests it performs with almost no input or work required on my part. For your next O365 project it is a must-have! Enjoy!


Making the Quest Tools Work with Office 365 Wave 15

Wave 15 is relatively new, and some bumps with third-party software were to be expected. I recently hit one of those bumps on a project I am doing: a migration from Domino to Office 365 Wave 15 for just under 10K users. My plan was to use the Quest Notes Migrator for Exchange (NME) tool to sync mail attributes from Domino to Active Directory, let the DirSync tool grab those objects and move them into Office 365, and finally migrate the newly linked identities from Domino to Office 365 using the NME tool.

Unfortunately, during the configuration I ran into an issue connecting to the Office 365 tenant. When I would add the Office 365 credentials into the Exchange Server configuration page of the NME tool, I would get an error saying the Outlook Anywhere configuration was invalid and to fix it or uncheck the box. Unfortunately, when you select Office 365 as your target the Outlook Anywhere box is grayed out, so you can’t make any changes. So I opened a ticket and was pointed to this workaround:

So I excitedly read through the article hoping I had a workaround, but then read the last paragraph:

NOTE: Using this workaround, you will not be able to Provision Users or Groups with NME. This Microsoft article discusses importing users into Office 365 using a CSV file. Also, the Office 365 Admin Pool feature will not work in this configuration. Migrating more than 5 users during a migration run could cause the migration to fail.

Yup, that is right, kids: if you use this workaround you can’t provision your users in AD using the Notes tool AND you can’t use the pooling feature, meaning you can only migrate 5 accounts at a time per workstation. So migrating 10,000 users would take about 5 years.

After berating the Quest support team for a horrible workaround, I was told there is an interim patch that fixes the pooling issue…OK, great, now I can migrate larger batches of users but still can’t actually provision my mail attributes into AD. The primary cause of the issue is that you can’t populate the password field in the INI file because it is hashed, and you can’t set it in the GUI because when you click ‘Apply’ it fails with the same connection issue to Wave 15.

So, as I was wallowing in sadness, an idea came to me. In the Global Default Settings INI file you can see the hashed password value for the Notes ID. What if I were to set my Notes password to the same as my Office 365 password and copy the resulting hash? Hmmm…

So here is what you do to get this to work:

Log into your Notes client on your migration workstation, click File > Security > User Security…

Enter your existing password

Click Change Password

Enter your existing password again

Enter and confirm your new password and click OK

Now go back to the NME tool and click Notes Server under Edit Default Settings, update the password and click Apply

Now click Edit and select Global Default Settings…

In the INI file, find the section that starts with [Notes], locate the ~~Password= field and copy the entire value after the ‘=’.

Now find the [Exchange] header, locate its ~~Password= field and paste the value you copied from the [Notes] section.
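Here is a mock-up of what the two sections end up looking like after the copy:

[Notes]
~~Password=3A7F9C2E81D4B6F0A5C3E9D1

[Exchange]
~~Password=3A7F9C2E81D4B6F0A5C3E9D1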

*Note: the snippet above has been simplified; the actual INI file will have many more attributes listed, which I cleared out for simplicity. I also made up this hash.

I haven’t been able to get an ETA on when Quest expects their code to be fixed, so by the time you read this it may be fixed; however, I don’t expect that to be the case. If/when I get confirmation the problem has been fixed and have tested it, I will update this post.
