Tuesday, March 13, 2018

Export Exchange Properties From Resource Forest.

This is part of an ongoing series on migrating mailboxes from an on-premise Exchange 2010+ environment to Microsoft's O365. See the overview post here.

Creating the Identity Package - Mailboxes

As an Exchange admin, one of your first steps will be to sync the mailbox properties from the resource forest into the authentication domain. These mailbox properties are applied to each mailbox's linked master account and synchronized to O365.

When performing a migration from on-premise Exchange to O365, the migration batch/move-request process will look for specific properties (Exchange GUID, Primary SMTP Address) and fail the moves if they do not match. In addition, you'll want to modify the UserPrincipalName on the AD objects so that clients log on to OWA using their email address.
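As a sketch of that UPN change (assuming the RSAT ActiveDirectory module; the OU and domain names here are placeholders), setting each user's UPN to their email address might look like:

Import-Module ActiveDirectory
#Sketch only: set UserPrincipalName to the mail attribute so OWA logons
#accept the email address. SearchBase is a hypothetical OU.
Get-ADUser -Filter 'mail -like "*"' -SearchBase "OU=Staff,DC=contoso,DC=local" -Properties mail |
    ? { $_.UserPrincipalName -ne $_.mail } |
    % { Set-ADUser $_ -UserPrincipalName $_.mail }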

Exchange properties and ADObject name:

  1. Name (name)
  2. DisplayName (displayName)
  3. SamAccountName (sAMAccountName)
  4. WindowsEmailAddress (mail)
  5. PrimarySMTPAddress (from proxyAddresses)
  6. LegacyExchangeDN (legacyExchangeDN)
  7. EmailAddresses (proxyAddresses)
  8. ExchangeGUID (msExchMailboxGUID)
  9. GrantSendOnBehalfTo (publicDelegates)
  10. ExternalEmailAddress (targetAddress)
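A minimal sketch of pulling those raw attributes straight from AD (attribute names from the list above; the filter is a placeholder):

Import-Module ActiveDirectory
#Sketch: read the AD attributes behind the Exchange properties listed above.
#The primary SMTP address is the proxyAddresses entry with an upper-case "SMTP:" prefix.
Get-ADUser -Filter 'mail -like "*"' -Properties displayName, mail, proxyAddresses, legacyExchangeDN, msExchMailboxGUID, publicDelegates, targetAddress |
    Select-Object Name, SamAccountName, displayName, mail,
        @{n="PrimarySMTPAddress"; e={($_.proxyAddresses | ? {$_ -clike "SMTP:*"}) -replace "^SMTP:"}},
        legacyExchangeDN, msExchMailboxGUID, targetAddress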

Part 1: Exporting Exchange properties from Exchange:

This script (testing out Pastebin) is the latest revision of my script to do this. When it is run in your RF, you'll need to provide it a list of mailbox objects, the current accepted email domain, and the tenant email domain. The script will export a CSV file in the current directory containing all of the required properties.

$ExportNames = Get-Mailbox -OrganizationalUnit "Finance" -ResultSize Unlimited
.\Export-RFToAD.ps1 -identity $ExportNames -acceptedDomain "Example.COM" -O365Domain "" -department "Finance"

The script:

Part 2: Importing the Exchange Properties

Once the CSV has been created, copy it over to the Exchange box in the authentication domain. The admin account running the import will need sufficient permissions to modify all objects in this domain.

Point the following script at the CSV and the Organizational Unit where new user objects should be created. Note: new users will typically be resource-type mailboxes in the Exchange environment. They can be moved after creation to the correct OUs in your AD structure. Just make sure that this other OU is also synced up to the O365 tenant.

.\Import-O365IdentityPackageCSV.ps1 -CSVPath .\Finance.CSV -NewUserOU "Contoso.local\Resource Accounts"

If the script is unable to finish the steps, it will generate a new CSV called ".\FailedToImport.csv". Review each of these users in the authentication AD to see why they may have failed.
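One hedged way to start that review (column names assumed to match the export CSV):

#Sketch: for each failed row, check what the matching AD object currently looks like.
Import-Csv .\FailedToImport.csv | % {
    $u = Get-User $_.SamAccountName -ErrorAction SilentlyContinue
    [pscustomobject]@{
        User          = $_.SamAccountName
        FoundInAD     = [bool]$u
        RecipientType = $u.RecipientTypeDetails
    }
}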

Possible Failure scenarios:

  1. Legacy Exchange properties - Unable to apply new Exchange properties because the object still has attributes from a former Exchange install. Typical errors: "Can't enable mail user as it's already a mailbox", or "Can't contact (some server long gone) to mail enable user". I attempt to resolve this in the script by checking for the setting (~ line 92). If that does not resolve the issue, check the ADUC Attribute Editor for references to the legacy environment and remove them if you have to.
  2. Non-inherited permissions - If the script was unable to modify an account ("ACCESS DENIED"), the issue is typically that the account is not inheriting permissions. To fix:
    1. Open Active Directory Users and Computers.
    2. Enable Advanced Features (View menu).
    3. Open the properties of the AD account for the failed user.
    4. Security tab -> Advanced button.
    5. Check the "Include inheritable permissions from this object's parent" checkbox.
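Rather than clicking through ADUC for every failed account, here is a sketch that flags blocked inheritance in bulk (assumes the ActiveDirectory module and its AD: drive; the OU is a placeholder):

Import-Module ActiveDirectory
#Sketch: list accounts whose ACL does not inherit from the parent OU.
Get-ADUser -Filter * -SearchBase "OU=Resource Accounts,DC=contoso,DC=local" |
    ? { (Get-Acl "AD:$($_.DistinguishedName)").AreAccessRulesProtected } |
    Select-Object SamAccountName, DistinguishedName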
For a quick check, I recommend doing two things:
  1. Make sure all of the objects that you mail-enabled match the CSV. We had an odd issue where a GUID got applied to the wrong mailbox. Not a big deal, except that after migration the missed GUID is still on-premise and the other account now has the wrong displayname/email address. For example:

    import-CSV .\csvpath | ?{(get-mailuser $_.exchangeguid).displayname -ne $_.displayname}
  2. Make sure mail-enabled accounts are in OUs that are sync'd via AAD Connect.

    import-CSV .\csvpath | %{get-mailuser $_.exchangeguid} | group organizationalUnit

Friday, March 2, 2018

Migration from on-premise Exchange 2010 resource forest to EXO.

We've just finished a year-long project migrating 80,000 mailboxes from our single on-premise Exchange 2010 solution to multiple O365 tenants. The on-premise environment is/was an Exchange resource forest. This means that users logged into their local authentication domain, then, via an AD trust, gained access to their mailbox hosted with us.

The only difference from this diagram: instead of the Internet cloud, picture a WAN cloud. Clients could only access mail via Outlook over the VPN to our data center. I plan to use this space to post each of the scripts that I generated for this project.

image source: MS Blog - you had me at ehlo

The high-level plan to migrate from a shared, multi-tenant, on-premise resource forest Exchange environment to individual O365 tenants:

  1. Create O365 tenant
  2. Configure Authentication domain
    1. Update AD to 2012 R2 or better. 
    2. Install Exchange 2013/2016 
      1. Clean-up legacy Exchange properties
      2. Extend schema for Exchange
      3. Install Exchange software
      4. Remove SCP record reference for auto-discover
    3. Cleanup 
      1. Consolidate (if possible) OUs so that they are easy to manage.
      2. Remove dead accounts.
      3. Cleanup mailboxes in Exchange.
      4. Update workstations so they're running the latest Outlook (2013/2016) available through Windows Update. Don't forget other Office applications.
      5. Public Folders
  3. Configure ADFS and AAD Connect to O365 tenant.
  4. Do Mailbox Identity Sync
    1. Export identity information from resource forest (RF). 
    2. Import identity info into similar objects in account forest.
  5. Sync mail enabled accounts and groups from account forest to O365.
    1. Review mailboxes for permissions assigned to Auth domain security groups. Include these groups in the sync.

  6. Migrate mailboxes from RF to O365.
    1. Create migration end-point from O365 to Exchange on-premise
    2. Create migration batches/move requests 
    3. Monitor / complete move requests.
  7. License mailboxes once migrated. 
    1. Convert any resource mailboxes that came over as USERMAILBOX to shared/equipment so as to avoid using a mailbox license.
  8. Create groups and external contacts in O365 tenant. 
  9. Review/ fix shared mailbox permissions (security groups)
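For step 7.1, a sketch of the conversion in Exchange Online (the CustomAttribute1 marker is hypothetical; use whatever you tag resource accounts with):

#Sketch: flip migrated resource mailboxes to Shared so they don't consume a license.
Get-Mailbox -RecipientTypeDetails UserMailbox -ResultSize Unlimited |
    ? { $_.CustomAttribute1 -eq "Resource" } |   #hypothetical marker attribute
    Set-Mailbox -Type Shared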


  • 3.14 - extended outline, added page for Identity Sync scripts.

Thursday, February 15, 2018

Powershell: Indeterminate Inline Progress Bar

I am currently working on removing a replica from all of our on-premise public folders. There are quite possibly 40,000 public folders nested inside the structure. We are looking to remove 2 copies from all replicas, and the Exchange 2010 tools, while working, didn't provide any progress indication.

So, I grabbed all the 'good' replicas.

$firstPF = get-publicfolder ".\Top" 

$GoodReplicas = $firstPF.Replicas | ?{$_ -notlike "remove this db name"} 

Get-PublicFolder -recurse | ?{$_.replicas -ne $GoodReplicas} | Set-PublicFolder -replicas $GoodReplicas

This was taking forever and I couldn't tell if it was even running. I modified one of my previous progress bars so that it would simply count to 100, then reset back to 1. Next I modified the code to return the object currently being parsed. This allows me to put the progress bar in-line with my pipeline above while it keeps running.

Get-PublicFolder -recurse | ?{$_.replicas -ne $GoodReplicas} | %{wp3 -passthru $_} | Set-PublicFolder -replicas $GoodReplicas

You can see I use a few global variables. This helps maintain the counter between iterations. It also means the bar will start where you left off on the last run.
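To restart the bar from zero between runs, clear those globals (the job name here is assumed; use whatever $JobName you passed):

#Reset the counter and pass count for a job assumed to be named "wp3".
Remove-Variable -Name "wp3","wp3_count" -Scope Global -ErrorAction SilentlyContinue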

function wp3 {
 [CmdletBinding()] param(
  #Param block reconstructed from the function body; the default job name is assumed.
  [string]$JobName = "wp3",
  $Passthru
 )
 $envVar = Get-Variable -Name $JobName -Scope Global -ErrorAction SilentlyContinue -ValueOnly
 $Times = $JobName + "_count"
 $envVar2 = Get-Variable -Name $Times -Scope Global -ErrorAction SilentlyContinue -ValueOnly
 if ($envVar -eq $null) {
  #Global variable doesn't exist, create one named based on $JobName
  $env_WPIndex = 0
  New-Variable -Name $JobName -Scope Global -Value 0 #-Visibility Private
 } else {
  #Use current global variable value.
  $env_WPIndex = [double]$envVar
 }
 if ($envVar2 -eq $null) {
  #Global variable doesn't exist, create one named based on $Times
  $envTimeThru = 0
  New-Variable -Name $Times -Scope Global -Value 0 #-Visibility Private
 } else {
  #Use current global variable value.
  $envTimeThru = [double]$envVar2
 }
 Write-Progress -Activity ($JobName+" ("+$envTimeThru+")") -Status ([string]$env_WPIndex+"%") -PercentComplete $env_WPIndex
 $env_WPIndex = $env_WPIndex + 1
 if ($env_WPIndex -lt 100) {
  #If less than the max object count, increment the global counter by one
  Set-Variable -Name $JobName -Scope Global -ErrorAction SilentlyContinue -Value $env_WPIndex
 } else {
  #Already at the max: reset the counter and bump the pass count
  $envTimeThru = $envTimeThru + 1
  Set-Variable -Name $JobName -Scope Global -Value 0 # -ErrorAction SilentlyContinue
 }
 Set-Variable -Name $Times -Scope Global -ErrorAction SilentlyContinue -Value $envTimeThru
 return $Passthru
}

Monday, January 8, 2018

MegaMillions Script - i.e. playing with Invoke-webrequest

My co-worker likes to read through the powershell reddit and found this interesting little challenge. 

I had an idea for a function to generate Powerball and Megamillions numbers - multiple ticket generation segmented into separate objects and then converted into valid JSON; I did come up with something but figured this would be a nice short script challenge since it doesn't rely on an external API.
Enjoy! (source)
I thought I'd take a crack at it. So far, of the 12 responses, most are variants of the random number generator. After one post about 'unique' numbers, the submissions started adding error checking.

Results returned looks like this:
Balls:  1(16%) 58(12%) 6(16%) 61(12%) 64(12%) Mega: 22

$MegaMillionsWebPage = Invoke-WebRequest
#Extract table from web page
$mmTable = ($MegaMillionsWebPage.ParsedHtml.getElementsByTagName("TABLE") | % {$_.innertext} | % {$_.split("`n")})[1..25]
#Return mid-five columns from table (ball column)
#DrawDate Balls MegaBall Megaplier Details
$balls = $mmTable | % {$_.split(" ")[1..5]}
#Get only megaball values
$Megaball = $mmTable | % {$_.split(" ")[6]}
#Group appearance of mega ball by appearance on table
$GrpBalls = $balls | group | Sort-Object -property count -Descending 
#Base line statistics of mega ball number occurence.
$BallStats = $GrpBalls | Measure-Object -Property count -min -max -average
# Have won more than 'average' number of times. 
$avg = [int]$BallStats.average
While ($mostPopular.count -lt 5 -and $avg -gt 0) {
    $mostPopular = $GrpBalls | ? {$_.count -gt $avg} | select -ExpandProperty Name
    $avg-- #Broaden the scope if fewer than 5 results are returned.
}
#Return 5 numbers from the most popular results.
$MyBalls = $mostPopular | Sort-Object {Get-Random} | select -first 5 | % {[int]$_} | Sort-Object
#Show number of appearance of each value.. 
$WeightedBalls = $GrpBalls | ? {$myballs -eq $_.name} | select name, @{Name = "Weight"; Expression = {[string](($_.count / 25) * 100) + "%"}}
$BallReport = $WeightedBalls | sort -Property name | %{$_.name+"("+$_.weight+")"}
write-host -NoNewline "Balls: ", ($BallReport -join (" "))
$megaGrp = $megaball  | group | Sort-Object -property count -Descending 
$MegaBallStats = $megaGrp | Measure-Object -Property count -min -max -average
$MegaAVG = [int]$MegaBallStats.average
#Randomly pick one of the most popular mega numbers.. 
$MegamostPopular = $megaGrp | ? {$_.count -eq $MegaBallStats.maximum} | select -ExpandProperty Name | Sort-Object {Get-Random} | select -first 1
write-host " Mega:",$MegamostPopular

Monday, September 11, 2017

Powershell CSV Join

I am currently working on a fairly large project migrating mailboxes from on-premise Exchange 2010 up to Microsoft's O365. One of the issues that we're having is that mailbox permissions don't always migrate correctly. So I've developed a process to capture the mailbox permissions in one form or another in a CSV prior to migration, then reapply them after the move.

One of the issues experienced is that on-premise account information doesn't always match up with what's in O365. For example, I have the local Exchange/AD account information, but not necessarily what the account looks like in O365. So I wrote this script to join two CSV files based on similar properties. This way I can join a CSV containing mailbox information (i.e. SamAccountName) to the Get-MailboxPermission "USER" field.

.\CSVJoin.PS1 -csv1Name "c:\bin\All-Mailboxes.CSV" -csv1Property Samaccountname -Csv2Name "c:\bin\All-MailboxPermissions.csv" -csv2Property USER -JoinedCSV "c:\BIN\Joined-CSV.csv"

The script will export a CSV containing all fields from CSV1 plus all the fields from CSV2 that don't overlap in name.

param(
    [parameter(Mandatory=$true,HelpMessage="Path and filename for source CSV 1")][ValidateScript({Test-Path $_ })][string]$csv1Name,
    [parameter(Mandatory=$true,HelpMessage="Property from CSV1 to join files on.")][string]$csv1Property,
    [parameter(Mandatory=$true,HelpMessage="Path and filename for source CSV 2")][ValidateScript({Test-Path $_ })][string]$Csv2Name,
    [parameter(Mandatory=$true,HelpMessage="Property from CSV2 to join files on.")][string]$csv2Property,
    [parameter(Mandatory=$true,HelpMessage="Path and name for combined CSV file")][string]$JoinedCSV
)

$csv1Data = Import-CSV $csv1Name | ?{$_.$CSV1Property -ne $null}
$csv2Data = Import-csv $Csv2Name | ?{$_.$CSV2Property -ne $null}

#Capture all the column values for each CSV file and compare them.
$csv1Members = $csv1Data[0] | Get-Member | ?{$_.membertype -eq "NoteProperty"}
$csv2Members = $csv2Data[0] | Get-Member | ?{$_.membertype -eq "NoteProperty"}
$AddCSV2members = Compare-Object $csv1Members $csv2Members | ?{$_.sideindicator -eq "=>"} | %{$_.inputobject}

#Populate HashTable with First CSV based on JOIN fields
$csv1HashTable = @{}
$csv1Data | %{$csv1HashTable[$_.$csv1Property.trim()] = $_}

#Loop through Second CSV and join fields to first CSV. 
$newCSV = @()
ForEach ($c in $csv2Data) {
    $Row = $csv1HashTable[$c.$csv2property.trim()]
    if ($Row) {
        ForEach ($m in $AddCSV2members) {
            $Row | Add-Member -MemberType NoteProperty -Name $m.Name -Value $c.$($m.Name) -Force
        }
        $newCSV += $Row
    }
}

if ($newCSV) {
    $newCSV | Export-Csv $JoinedCSV -NoTypeInformation
}

Thursday, July 6, 2017

Delete and Compress old Log Files using DOS Batch file.

On all of our Exchange Client Access servers, I've been running my Compress and Delete script. Unfortunately, the scheduled task will randomly fail to run the script and the log files don't get purged. This requires kicking off the process by hand to avoid a server meltdown.

I suspect that PowerShell may be partially at fault on some of these boxes. Permissions or run-time exceptions may be causing the script to fail. So I've managed to put together this DOS batch script that does basically the same thing.

  • Deletes all log (and zip) files in the folder older than 30 days
  • (if finds 7-zip) It will compress all log files older than 7 days, then delete them
The one disadvantage that I see is date-stamping. In PowerShell, I was going through and stamping the original log file's 'last modified' date onto the ZIP. This allowed me to easily trigger 30-day deletes on any file in the folder because it would maintain its original date. I figure that if I run this scheduled task daily, it will only offset the date by 7 days (i.e. an 8-day-old log file will take today's date).
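A sketch of that PowerShell date-stamping approach (the file path is a placeholder):

#Sketch: zip a log, then copy the log's LastWriteTime onto the archive so
#the 30-day delete still fires based on the original date.
$log = Get-Item "C:\inetpub\logs\LogFiles\W3SVC1\u_ex180101.log"
Compress-Archive -Path $log.FullName -DestinationPath "$($log.FullName).zip"
(Get-Item "$($log.FullName).zip").LastWriteTime = $log.LastWriteTime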

I am saving the following as "DOSPurgeOldLogFiles.CMD" and running it from a daily scheduled task.

@Echo off
REM Folder for Log Files
Set InetLogsFolder=c:\inetpub\logs\LogFiles\W3SVC1
if NOT EXIST %InetLogsFolder% Goto :NoLogs

REM Purge all files in log folders older than 30 days old
forfiles -p %InetLogsFolder% /s /m *.* /d -30 /c "cmd /c del @path"

if NOT EXIST "c:\program files\7-Zip\7z.exe" Goto :NoZIP

REM ZIP all files older than 7 days old
REM Archive name reconstructed: each log is zipped next to itself as <name>.zip
for /F %%G in ('forfiles -p %InetLogsFolder% /s /m *.LOG /d -7') DO "c:\program files\7-Zip\7z.exe" a -tzip -mtc=on "%InetLogsFolder%\%%~nG.zip" "%InetLogsFolder%\%%~G"

REM delete all files that are now ZIP'd
forfiles -p %INETLogsFolder% /s /m *.log /d -7 /c "cmd /c del @path"
Goto :EOF

:NoLogs
Echo Cannot locate log files at %InetLogsFolder%
Goto :EOF

:NoZIP
Echo Install 7-ZIP to compress log files.


Friday, May 12, 2017

Pull All WU Patches from Servers

With all the concern in the news lately, we went through all of our on-premise servers and reviewed the patching. This script reads all of our Exchange servers (you could replace that with a CSV of server names) and does a remote call for the Windows Update history. The script returns an object containing each installed patch and whether it was successful or not.
$scriptBlock = {
 $Session = New-Object -ComObject "Microsoft.Update.Session"
 $Searcher = $Session.CreateUpdateSearcher()
 $historyCount = $Searcher.GetTotalHistoryCount()
 $Searcher.QueryHistory(0, $historyCount) | ?{$_.title -notlike "*definition update for*"} | Select-Object Title, Description, Date,
 @{name="Operation"; expression={switch($_.operation){
    1 {"Installation"}; 2 {"Uninstallation"}; 3 {"Other"} }}},
 @{name="Status"; expression={switch($_.resultcode){ 1 {"In Progress"}; 2 {"Succeeded"}; 3 {"Succeeded With Errors"}; 4 {"Failed"}; 5 {"Aborted"} }}}
}

$serverList = get-exchangeserver
$Patching = @();$serverCount = $serverList.count;$index=1

forEach ($server in $ServerList) {
    write-progress -activity "reading Windows Update" -Status $server.name -percentcomplete (($index/$serverCount)*100); $index++
    $LastUpdates = Invoke-Command -ScriptBlock $scriptBlock -ComputerName $server.name -ErrorVariable failedWINRM
    $Patching += $LastUpdates
}
return $Patching

.\report-WindowsUpdatePatching.ps1 | ?{$_.title -like "*4012212*" -and $_.status -eq "Failed"}