Monday, August 4, 2014

Quickie Mailbox Statistics Report

So I support an Office 365 environment. One of the reports my manager asked me to run gathers mailbox size information for our entire population. Sadly, due to RBAC controls, you can't get a numeric value back from Get-MailboxStatistics, only a string.

TotalItemSize
-------------------------
75.33 MB (78,991,829 bytes)

Using string manipulation, I am able to pull out the mailbox size.

"75.33 MB"

Then, using PowerShell's own Invoke-Expression, I convert that down to a number (in bytes).

78989230.08
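A minimal sketch of those two steps, using the sample string from the output above:

```powershell
$sizeString = "75.33 MB (78,991,829 bytes)"

# Step 1: string manipulation - keep everything before the "(" and drop the spaces
$trimmed = $sizeString.Split("(")[0].Replace(" ","")   # "75.33MB"

# Step 2: let Invoke-Expression evaluate "75.33MB" as a numeric size literal
$bytes = Invoke-Expression $trimmed                    # 78989230.08
```

Note the result is 75.33 * 1MB (binary megabytes), which is why it differs slightly from the byte count in the original string.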

Now, I can use the Measure-Object command to give me statistics.

Get-Mailbox -ResultSize 100 | Select-Object @{Name="Size";Expression={(Invoke-Expression ((Get-MailboxStatistics -Identity $_.Identity).TotalItemSize.ToString().Split("(")[0].Replace(" ",""))) / 1MB}} | Measure-Object -Property Size -Minimum -Maximum -Average -Sum

Count : 100
Average : 28.4768
Sum : 2847.68
Maximum : 500.8
Minimum : 0
Property : Size

Tuesday, July 22, 2014

Cmdlet Binding Options

I just located this awesome post that details all the various options for cmdlet binding. For example, you can validate parameters to a script as the user inputs them:
ValidateScript Validation Attribute
The ValidateScript attribute specifies a script that is used
to validate a parameter or variable value. Windows PowerShell
pipes the value to the script, and generates an error if the
script returns "false" or if the script throws an exception.
When you use the ValidateScript attribute, the value
that is being validated is mapped to the $_ variable. You can
use the $_ variable to refer to the value in the script.
In the following example, the value of the EventDate parameter
must be greater than or equal to the current date. 
Param
(
[parameter()]
[ValidateScript({$_ -ge (get-date)})]
[DateTime]
$EventDate
)
In the following example, the value of the variable $date must be
greater than or equal to the current date and time.
[DateTime][ValidateScript({$_ -ge (get-date)})]$date = (get-date)
I could see using this in a script where I absolutely need the identifier of a user mailbox. Using ValidateScript, I would do a simple Get-Mailbox and validate that I get a legitimate answer.
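A sketch of that idea (the parameter name is my own, and it assumes an Exchange session is loaded; Get-Mailbox throws on an unknown identity, which ValidateScript treats as a validation failure):

```powershell
Param
(
 [parameter(Mandatory=$true)]
 # Validation fails if Get-Mailbox throws or returns nothing
 [ValidateScript({ [bool](Get-Mailbox -Identity $_ -ErrorAction Stop) })]
 [string]
 $MailboxIdentity
)
```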



Wednesday, July 9, 2014

Compress and Delete old IIS Logs

We have a single server that hosts OWA and ActiveSync services for the entire organization. This server generates IIS log files on a daily basis, averaging around 1.5GB in size. After a month of web traffic, these log files quickly fill up the drive space and bring it to its knees.

The purpose of this script is to clean-up the log files folder and keep it at a manageable size.

  1. Log files older than 7 days will be compressed into ZIP format. 
  2. ZIP files older than 30 days will be deleted.
If run interactively, the script can use the native Windows compression tools, but I wanted it to run as a scheduled task. Without 7-Zip, the script would create the ZIP file but never copy the log into it, leaving a 1 KB ZIP file in place. I believe this is because the copy method in the native compression runs as a separate task, which is never spawned under the scheduler. 7-Zip is able to accomplish this for me: if it's installed, the script will favor 7-Zip over the native tools (update the path if you didn't use the 32-bit version).

After running the script the first time, you should find the following items in your "C:\inetpub\logs\LogFiles\W3SVC1\" folder:
  • ZIPs containing logs over 7 days old. 
  • The original log files that are in those ZIPs. 
Second run:
  • ZIPs over 30 days deleted.
  • Logs over 7 days with existing ZIP over 1kb deleted.


If I take out the support for native compression, I can safely have the script delete the old log files after compression. Until then, the script will always leave the old log files until a later run. To accommodate this, I've scheduled the script to run both Sunday and Monday. 



<#
.SYNOPSIS
   Compress then purge old IIS logs
.DESCRIPTION
   Looks for log files over a certain age:
     - Compress files older than one date to ZIP format.
     - Delete ZIP files that exceed a longer date.
   Uses 7-Zip if installed - needed if running as a scheduled task.
     Download: http://7-zip.org/
#>

#variables for folder management

#Folder path to where the IIS logs reside.
$FolderPath = "C:\inetpub\logs\LogFiles\W3SVC1\"

#LOGS older than this date (-7 days ago) will be compressed.
$CompressLogsDate = (Get-Date).AddDays(-7)

#ZIPs older than this date (-30 days ago) will be deleted.
$OldestZIPDate = (Get-Date).AddDays(-30)


#Create ZIP scripts
function New-Zip {
 param([string]$zipfilename)
 set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
 (  dir $zipfilename).IsReadOnly = $false
}

function Add-Zip {
 param([string]$zipfilename)

 if(-not (test-path($zipfilename))) {
  set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
  (   dir $zipfilename).IsReadOnly = $false 
 }

 $shellApplication = new-object -com shell.application
 $zipPackage = $shellApplication.NameSpace($zipfilename)

 foreach($file in $Input) {
  $zipPackage.CopyHere($file.FullName)
  Start-sleep -milliseconds 500
 }
}

if (!(Test-Path $FolderPath)) {
 write-host "no log folder found at $folderpath"
 break
}

#Check to see if 7Zip installed.
$7ZipPath = "${Env:ProgramFiles(x86)}"+"\7-Zip\7z.exe" #32-bit path
#$7ZipPath = "$Env:ProgramFiles\7-Zip\7z.exe"   #64-bit path
$Use7Zip = (test-path $7ZipPath)

#Delete ZIPs over a specific date
$ZipPath = $FolderPath +"*.zip"
$RemoveZips = Get-ChildItem $ZipPath |?{$_.lastwritetime -lt $OldestZIPDate}
if ($RemoveZips) { #Delete ZIP files older than $OldestZIPDate.
 ForEach ($zip in $RemoveZips) {
  Remove-Item $zip.FullName
 }
}  

#Find Logs that are older than specific date.
$LogPath = $FolderPath +"*.log"
$Oldlogs = Get-ChildItem $LogPath |?{$_.lastwritetime -lt $CompressLogsDate}

if ($Oldlogs) { #Process old log files.
 ForEach ($Log in $Oldlogs) {
  $LogFileStr = $Log.FullName
  $TempZipName = $LogFileStr.replace(".log",".zip")
  if (!(Test-Path $tempZipName)) { #Create a new ZIP if one doesn't exist
   if ($Use7Zip ) {
    #Use 7Zip if using a scheduled task. 
    set-alias sz $7ZipPath
    sz a -tzip $TempZipName $LogFileStr  
   } else {
    #Found this method doesn't work as a scheduled task. This only works if ran interactively.
    New-zip $tempZipName
    Get-Item $LogFileStr | add-zip $tempZipName
   }
  } else {
   $newZipFile = Get-Item $TempZipName 
   $CreatedTodayBool = $newZipFile.creationtime -gt (get-date).addhours(-1)
   if ($newZipFile.length -gt 1kb) { 
    #Delete original LOG file if ZIP is larger than creation size. 
    if (Test-Path $log) {Remove-Item $Log}
   } else {
    #Make sure not deleting zips created in last hour. Might still be compressing files. 
    if (!$CreatedTodayBool) {
     #Delete ZIP files that are only 1KB in size. 
     Remove-Item $tempZipName 
    }
   }
  } 
 }
}

Wednesday, May 28, 2014

Monitoring Invoke-Command on Remote Servers

So, I've been playing around with Invoke-Command to search event logs on remote servers.
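The $ScriptBlock, $NumberOfHours, and $servers variables referenced below aren't shown in the post; a hypothetical setup (names and event log choice are mine) might be:

```powershell
$NumberOfHours = 4
$ScriptBlock = {
 param($hours)
 #Runs on each remote server: pull recent error events
 Get-EventLog -LogName Application -EntryType Error -After (Get-Date).AddHours(-$hours)
}
$servers = "SERVER01","SERVER02"  #hypothetical server list
```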

$j = Invoke-Command -ComputerName $servers -ScriptBlock $ScriptBlock -ArgumentList $NumberOfHours -AsJob
Do {
 $jobcount = $j.childjobs.count
 $index = ($j.childjobs | ?{$_.state -eq "completed"}).count
 Write-Progress -Activity "reading remote logs" -status $j.state -PercentComplete (($index / $jobcount)*100)
 Start-Sleep -Seconds 1  #avoid spinning the CPU while polling
} while ($j.state -ne "completed")

By adding the -AsJob switch to Invoke-Command, the variable $j contains the job object. It updates dynamically as the job progresses, detailing which servers have completed and which have not. Now I am able to create a progress bar that updates based on the percentage of completed servers.

When the script is complete, I just need to gather the data from the request.

$Results = Receive-Job $J



Wednesday, May 21, 2014

Activate Earlier DBs

Last Friday night we patched each of our Exchange 2010 DAGs with the latest Windows OS patches. As a practice, I start with node 1, and work down the list to node 6.


  1. Run $ExScripts\StartDagMaintenance.ps1 to put DAG in maintenance mode and move active DBs off to passive nodes. 
  2. Run batch file to disable ForeFront integration on this 2010 server. 
  3. Start Windows Update process (do not let WU reboot server!!)
  4. Reverse step 2, re-integrate FF.
  5. Reverse step 1, run $ExScripts\StopDagMaintenance.ps1 script; reboot server
Now, unfortunately, the server no longer has any of its active DBs still running on it. Sure, I'll run the rebalance DB script when the entire process completes, but at 2 AM I don't always like waiting until the very end. Plus, the StartDagMaintenance script can take quite a while when hitting servers 4, 5, and 6 in the process. 

So, Friday, I wrote up this little script to 'activate earlier dbs' on my current server. 

$s = $env:Computername;Get-MailboxDatabase -Server $s | ?{$_.activationpreference[0].key.name -eq $s -and $_.server.tostring() -lt $s} | Move-ActiveMailboxDatabase -ActivateOnServer $s


So, if you are on server 3 of the DAG and you run this script, it will check whether any DBs that prefer server 3 are currently active on servers 1 or 2. If so, it will activate the local copies. 

Assumption: your DAG node names sort alphabetically in activation order. For example:
  • DAG101
  • DAG102
  • DAG103
  • DAG104
  • DAG105
  • DAG106
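The one-liner relies on a plain string comparison of server names (`$_.server.tostring() -lt $s`), which is why the alphabetical assumption matters. For illustration, with the current server being DAG103:

```powershell
#String comparison against the current server decides "earlier"
"DAG101" -lt "DAG103"  # True  - a DB active there gets moved here
"DAG105" -lt "DAG103"  # False - left alone
```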

Test Transport Servers

Using native tools, I have been looking for a way to test the transport services on all my Exchange 2010 servers. This one-liner will attempt to send an email via each transport server, and if it gets any error during the process, it will send an email to the admin reporting which ones failed.

$str="";$Invalid=@();get-transportserver | %{send-mailmessage -to invalid@example.com -from $($_.name+"@example.com") -Subject $_.name -SmtpServer $_.name -ErrorVariable ERRORS -erroraction silentlycontinue;if ($errors) { [array]$Invalid += $_.name}};If($invalid){$invalid | %{Get-ExchangeServer $_}  | select name, site | convertto-html | %{$str+= $_}; Send-MailMessage -to "Admin@Example.com" -from "SMTP-DOWN@Example.COM" -Subject "SMTP ERRORS" -SmtpServer SMTP_Server_Name -Body $str -bodyashtml }


Now, it makes one assumption: that the server you send the alert through is itself healthy.

Part 1:  Initialize variables
$str="";$Invalid=@();

Part 2: Send a test message, if get error response, capture it and record it in $INVALID
get-transportserver | %{send-mailmessage -to invalid@example.com -from $($_.name+"@example.com") -Subject $_.name -SmtpServer $_.name -ErrorVariable ERRORS -erroraction silentlycontinue; if ($errors) { [array]$Invalid += $_.name}}  # $INVALID will contain the TransportServer name when found invalid.

Part 3: Process server names, create message body, send email.
If($invalid) {$invalid | %{Get-ExchangeServer $_}  | select name, site | convertto-html | %{$str+= $_}; Send-MailMessage -to "Admin@Example.com" -from "SMTP-DOWN@Example.COM" -Subject "SMTP ERRORS" -SmtpServer SMTP_Server_Name -Body $str -bodyashtml }

The message received contains the servername and the AD site the server resides in.


Monday, March 3, 2014

Exchange 2010 - Outlook Client versions Report

So, I have been investigating remoting as a method of speeding up the processing of long and tedious reports. My first (and probably best) test case, is to read each of the various RPC Client Access Logs on all the Client Access Servers in my Org and generate this report.