What I like about PowerShell

First of all, I don't consider myself a professional, let alone an expert, on PowerShell. However, I think the learning curve of PowerShell is very appealing, especially if you have some background in programming.

The first programming language I learned was Java… way back in 2005 at university. Well, frankly speaking, my first programming language was BASIC on a Commodore C128 computer… but back then, as a teen, I didn't really get a deep understanding and ended up retyping listings from books… so that doesn't count. As opposed to the Linux shell, PowerShell feels very similar to coding Java. It doesn't matter how you format the code… you don't have to pay attention to things like line endings and carriage returns, and you don't have to use vim at all :-P. Well… enough of that admittedly biased and poorly educated comparison… which is probably not what Rob asked for.

Reading the topic for TSQL Tuesday I thought, OK, let's take a look at the last PowerShell scripts I wrote and pick out some features I especially like. The project was about automated backup testing: a separate server should restore all SQL Server backups and, if they restored successfully, put them onto a tape for archiving. For the restoring part of the solution I chose to write TSQL procedures, as a lot of it involved working in the database and that seemed the more natural choice. However, for the second part (getting the backups onto tape) I chose PowerShell. I ended up with four PowerShell scripts for the job. This blog post is about the script createTapeBackupJobForSqlBackups: it does what it says… it creates a backup job for copying the tested backups to tape (with Symantec BackupExec 10.5).

So let's take a closer look at the script and the nice things it does with PowerShell:

Logging for beginners like me

Every script run creates a log file. The log file path itself is simply stored in a (shame on me… untyped) variable:

$logFolder = 'X:\Scripts\log'; # folder to store logging information from script execution

# set logfile name
$logFile = $logFolder + "\createTapeBackupJobForSQLBackups_$(Get-Date -Format yyyyMMdd_HHmmss).log";

echo "$(Get-Date -Format "yyyyMMdd HH:mm:ss") [INFORMATION] Starting script execution. Threshold for tape backup is set to $fileSizeThresholdForBackupJobMB MB. Database is $databaseServer\$databaseName." > $logFile; 

I hope you, dear reader, are not yawning yet… because what I am writing about here is so simple and fundamental…

I have a variable $logFolder for the logging path. Within that path I generate a file name consisting of the script name and a timestamp. Finally, I just write information into that file. This is achieved using the Linux-like redirection > $logFile, which creates the file and writes the first line to it. Subsequent logging uses >> $logFile in order to append to the file.

Scripting pros like Rob could argue that it would be better to write a separate function for logging purposes and just call that. I am fine with doing it this way, as it doesn't involve much more code than a function call (just repeating the Get-Date part over and over again).
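For the curious, such a logging function could look roughly like this. This is just a minimal sketch; the function name Write-Log and its parameters are my own invention and not part of the original script:

```powershell
# Hypothetical logging helper: writes one timestamped line and appends it to the log file
function Write-Log {
    param(
        [string]$Message,
        [string]$Level = 'INFORMATION',
        [string]$Path = $logFile
    )
    "$(Get-Date -Format 'yyyyMMdd HH:mm:ss') [$Level] $Message" >> $Path;
}

# usage:
# Write-Log -Message 'Starting script execution.';
```

With that in place, each log call shrinks to one short line instead of repeating the Get-Date expression everywhere.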

Querying SQL Server and iterating through results

The next one is about querying SQL Server tables. For this purpose I use the module sqlps, which you have to import first:

Import-Module sqlps -DisableNameChecking; # module sqlps is needed to execute queries on sql server

Then I prepare my statement in a variable called $sql, as in the following example, and execute it:

 # query to determine new backups ready to be added to a job: BackupExec can process a maximum of 800 files within a backup job
$sql = "
    SELECT  TOP $fileCountMaximum *, CAST(NULL AS NUMERIC(8,2)) AS FileSizeMB 
    FROM    dbo.CommandLogForTapeBackup 
    WHERE   TapeBackupJobId IS NULL
    ORDER BY COALESCE(RecoveryTestEndTime, RecoveryTestStartTime) /* skipped backups have RecoveryTestEndTime NULL */;";

# execute query and read results into an object
$backupFileList = Invoke-Sqlcmd -ServerInstance $databaseServer -Database $databaseName -IncludeSqlUserErrors -OutputSqlErrors $True  -Query $sql ; 

A side note on what this is all about: I am using Ola Hallengren's backup solution. For the purpose of backup testing I added some additional columns and a view. This view holds all relevant data on successfully tested backups. Backups which are not on tape yet (more precisely: for which no tape backup job has been created yet) are selected. Last of all, I am using a variable for the TOP part… as Backup Exec 10.5 supports a maximum of 800 files within one job. I add a column FileSizeMB in order to populate it later on. For better understanding, here's a screenshot of the query results:

The SQL is executed using Invoke-Sqlcmd, the PowerShell counterpart of good old sqlcmd.exe. Naturally you have to provide the instance name, the database name and the query itself. This one uses integrated security… so no credentials are involved.
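If integrated security is not an option, Invoke-Sqlcmd also accepts a SQL Server login. A quick sketch (the login name and password below are made up for illustration; in a real script you would not hard-code a plain-text password):

```powershell
# Same query, but with SQL Server authentication instead of integrated security (hypothetical login)
$backupFileList = Invoke-Sqlcmd -ServerInstance $databaseServer -Database $databaseName `
    -Username 'backupReader' -Password 'SomeSecret' -Query $sql;
```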

Next let’s iterate through the result set and do something with the query results.

# iterate through all files, prepare the selection list for tape backup and determine the file size
    ForEach ($backupFile in $backupFileList)
    {
       $backupFile.FileSizeMB = (Get-Item Microsoft.PowerShell.Core\FileSystem::$($backupFile.BackupFile)).length/1024/1024; # the prefix Microsoft.PowerShell.Core\FileSystem is needed for PowerShell to handle UNC paths smoothly...works with classic local paths (like C:\directory) as well
       # assemble selection list for job script file...each file has to be numbered for the list
       # PATH1=E:\SQL-Backup\... --> file to be backed up
       # INCLUDE1=1 --> include this file into tape backup job
       # SUBDIRS1=0 --> don't process subdirectories (because it's a single file :-))
       $counter = $counter + 1;
       #$selectionList = "$selectionList`r`nPATH$counter=$($backupFile.BackupFile -replace "\\\\3CDNVCB01","E:")";
       $selectionList = "$selectionList`r`nPATH$counter=$($backupFile.BackupFile -replace "\\\\3CDNVCB01\\SQL-Backup2","X:\MSSQL12.MSSQLSERVER\MSSQL\Backup\SQL-Backup")";
       $selectionList = "$selectionList`r`nINCLUDE$counter=1";
       $selectionList = "$selectionList`r`nSUBDIRS$counter=0";
    }

Iterating itself is quite elegant using ForEach. I have the result of my query in the variable $backupFileList. ForEach extracts the current record in the loop into the variable $backupFile. Now I can access each column using <variable>.<columnname> (for example $backupFile.FileSizeMB). That's great! I don't know whether the correct PowerShell term for the variable is "object"… but that's pretty much what I think it is and how it looks compared to Java. It's possible to assign new values to the properties of the object or to just read them. Here I determine the size of each file and save it into the property FileSizeMB. Additionally I add each file name to a new variable $selectionList, which is in the format Backup Exec expects the file selection list to be in.

Summing up values

Remember, I have the object $backupFileList with the result of my SQL query. The property FileSizeMB holds the size of each backup file. Now I would like to know the total size of the files I am about to back up. This can be achieved using the Measure(-Object) cmdlet:

$totalFileSizeMB = ($backupFileList | Measure FileSizeMB -Sum).Sum; 

How easy is that?

I just combine the object with the cmdlet using the pipe (|) symbol, then tell it which property to measure and which calculations to perform (there can be several at once, like -Sum -Average). This returns another object… in order to get at the value, I just read its property Sum, and that's it! Great and easy, isn't it?
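To illustrate the multiple-calculation case, here is a small self-contained sketch. The sample data is made up and not from the backup script:

```powershell
# Measure-Object can compute several statistics in one pass (sample data is hypothetical)
$sizes = [PSCustomObject]@{FileSizeMB = 10}, [PSCustomObject]@{FileSizeMB = 30};
$stats = $sizes | Measure-Object FileSizeMB -Sum -Average -Maximum;
$stats.Sum;      # 40
$stats.Average;  # 20
$stats.Maximum;  # 30
```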

Formatting convenience

PowerShell is great at formatting as well. I have processed all my backup files and now would like to add a table output to my log file. That can be achieved using the following commands:

$backupFileList | Select ID, DatabaseName, BackupFile, @{Expression={[System.Math]::Round($_.FileSizeMB, 2)}; Label = "FileSizeMB"}  | Sort DatabaseName, ID | Format-Table -Property ID, BackupFile, FileSizeMB -Autosize >> $logFile;

If I just use $backupFileList to get the output of my object it is listed as follows:

The cmdlet Select(-Object) enables you to choose specific properties for further processing. As you can see in this example, it is even possible to do some calculations using expressions. In this case that is just for keeping the file size output small by shortening it to two decimal places.

Afterwards Sort(-Object) helps with sorting the list.

The best part of it all is the Format-Table cmdlet. This generates a nicely formatted and aligned text table from all the values.

Be careful with comparison vs. assignment

Well, this last paragraph is quite a basic one. Throughout the examples you have already seen variable assignment. This is quite straightforward using the "=" operator.

For example: $continueProcessing = $FALSE; sets the variable $continueProcessing to the boolean value false.

However, if you want to check a variable for a certain value, you have to use -eq as the comparison operator.

The following checks whether the variable $continueProcessing is set to the Boolean value true:

If ($continueProcessing -eq $TRUE)

For a full list of comparison operators have a look at the documentation.
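As a quick illustration of why this matters, here is a sketch of the classic pitfall (the variable is the one from above): using "=" inside a condition assigns instead of compares.

```powershell
$continueProcessing = $FALSE;

If ($continueProcessing -eq $TRUE) { 'compared' }  # correct: compares, block is skipped

If ($continueProcessing = $TRUE) { 'assigned!' }   # bug: assigns $TRUE, so the condition is always true
```

After the second If, $continueProcessing has silently been flipped to $TRUE, which is exactly the kind of bug that is hard to spot later.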

What do you think?

That's my first contribution to TSQL Tuesday. I hope you found something valuable to take along. As already mentioned, I am no PowerShell expert and far from programming PowerShell every day. So if you are more experienced than me, let me know what I could do better next time.
