Measure twice

In this article I demonstrate the difference between using the InputObject parameter in the Measure-Object command and piping the same object into it. We get different results, and this is the explanation.

Measure twice, cut once

One of my other hobbies is furniture making. It's a well-known mantra among woodworkers to "measure twice, cut once". We can avoid wasting wood and time by making sure our measurements are accurate. When we measure twice and get the same measurement both times, we can be more confident we did it right. When I ran into some weirdness with Measure-Object and consistently got different measurements, this parallel immediately came to mind.


The file share where we keep our backups was running out of space, and I wanted to find out where we could reclaim some. All SQL backups are made using MaintenanceSolution.sql by Ola Hallengren. The space taken per server or per database is determined not only by the database size, but also by the backup interval and the retention period. We use backup compression on all backups. I needed to compare the space taken per database, and couldn't find a ready-to-use solution, so I wrote a little function.
To add up the file sizes of the .bak files, I turned to Measure-Object, and noticed this cmdlet returned different results depending on the syntax I used.


On this share, there's a directory for every SQL Server instance in the company, with subdirectories for every database, which have subdirectories 'FULL', 'DIFF' and 'LOG' where applicable.

Backup Directories

The retention period as defined by the 'CleanupTime' in the backup procedure differs from instance to instance. As a first step I wanted to look at the full backups and see how many backup files I had for every database and what the total size per database was. The number of files is easily found by the '.count' property of the 'FileInfo' collection we get from Get-ChildItem, but to get the total file size, we need either a loop to add to a variable, or Measure-Object.
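In a quick sketch (the path below is a placeholder; point it at one of your own database directories), that looks like this:

```powershell
# Collect the full backup files of one database (hypothetical path)
$BUFiles = Get-ChildItem -Path '\\BackupServer\SQLBackups\MyServer\MyDatabase\FULL' -Filter *.bak

# Number of backup files: the Count property of the collection
$BUFiles.Count

# Total size in bytes: sum the Length property of every file
($BUFiles | Measure-Object -Property Length -Sum).Sum
```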

I always start by exploring the objects I'm going to work with. There's nothing complicated about Get-ChildItem, but for the sake of completeness, here's what it returns:

File properties

We are interested in the 'Length' property, as that gives us "the size, in bytes, of the current file", as you can also find on MSDN.
For my first test, I used a directory that I knew held 4 files of 22 GB each, and yet something totally different came out:

Measure inputobject

Strange, count 1? length 4?
Usually, I send objects to Measure-Object through the pipeline. But surely, this should be the same? Let's try:

Measure pipeline

Totally not the same! This one seems correct: I have 4 files and the 'length' adds up to 94 billion bytes or 88 GB.
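To show both behaviours side by side without screenshots, here is a small sketch; the variable name and path are placeholders:

```powershell
# $BUFiles holds four FileInfo objects (hypothetical directory)
$BUFiles = Get-ChildItem -Path '\\BackupServer\SQLBackups\MyServer\MyDatabase\FULL'

# Parameter input: the collection is treated as ONE object.
# Count is 1, and the measured 'Length' is that of the array itself: 4.
Measure-Object -InputObject $BUFiles -Property Length -Sum

# Pipeline input: the collection is unrolled into four FileInfo objects,
# so Count is 4 and Sum is the total file size in bytes.
$BUFiles | Measure-Object -Property Length -Sum
```

With parameter input, Measure-Object sees a single object, the array, whose own Length property happens to be the number of elements; hence a Count of 1 and a 'length' of 4.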

The Mystery revealed

So, is the -InputObject parameter broken? Not really, but we need to be aware of the logic behind it.
Maybe we are used to working with -ComputerName for a lot of cmdlets and functions, and we blindly rely on the ability of the command to handle whatever collection we provide. We know this executes the chosen action, like Get-DbaSqlService or Get-DbaOperatingSystem, separately for every computer in the collection. The collection is 'folded out', 'unpacked', 'split'.

-InputObject doesn't do that! Is that wrong? Not necessarily; it may be a design choice, so that we can inspect the properties of the collection itself as well as those of its 'members' or 'children'. It's just a surprise that the behaviour differs between pipeline input and parameter input.
That said, what would the result be if we compared the Get-Member output of both methods, since that cmdlet has an -InputObject parameter too?

collection properties

There we go: exactly the same behaviour, meaning the output is totally different again. Compare this output with what we got earlier from

$BUFiles | Get-Member -MemberType Properties

and we see again that this version, using -InputObject, returns properties of the $BUFiles collection, while the first version, using the pipeline, returns properties of the child objects!
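You can reproduce the contrast with any collection; here is a minimal sketch using a plain array of integers:

```powershell
$nums = 1..4

# Pipeline input: Get-Member reports the members of the items (System.Int32)
($nums | Get-Member).TypeName | Select-Object -Unique

# Parameter input: Get-Member reports the members of the array itself (System.Object[])
(Get-Member -InputObject $nums).TypeName | Select-Object -Unique
```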

One more trick ;-)

,$BUFiles | Get-Member -MemberType Properties

We add just one little ',' and get the behaviour of the -InputObject technique. Doesn't the automatic 'unpacking' work when we prepend a comma?
Sure it does. Only, instead of sending a collection $BUFiles holding FileInfo objects, we created an array with one child: the $BUFiles collection itself.
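The same trick works with any collection; a minimal sketch:

```powershell
$nums = 1,2,3

# The pipeline unrolls the array: three objects are measured
($nums | Measure-Object).Count    # 3

# The unary comma wraps $nums in a one-element array; the pipeline
# unrolls that outer array and finds a single object: $nums itself
(,$nums | Measure-Object).Count   # 1
```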

The complete script

Just in case you're wondering, here's the function I used to give me the information on the backup files per database:

Import-Module sqlserver

function Get-FullBackupsOnShare {
    # $DBAinstance and $DBADB should hold the instance and database where your DBA inventory lives
    $DBs = Invoke-Sqlcmd -ServerInstance $DBAinstance -Database $DBADB -Query "SELECT HostName,DatabaseName FROM dbo.Databases WHERE HostName NOT IN ('TestComputer','FakeHost') AND DatabaseName NOT LIKE '%tempdb';"
    foreach ( $DB in $DBs ) {
        $HostName = $DB.HostName
        $DBName = $DB.DatabaseName
        $SharedBackupFiles = Get-ChildItem \\\SQLBackups\$HostName\$DBName\FULL\ -Recurse
        $LatestBU = $SharedBackupFiles | Sort-Object LastWriteTime -Descending | Select-Object -First 1
        $BUSize = ($SharedBackupFiles | Measure-Object -Property Length -Sum).Sum / 1GB
        [PSCustomObject]@{
            ComputerName   = $HostName
            Database       = $DBName
            LastBackedUp   = $LatestBU.LastWriteTime
            LastFullBackup = $LatestBU.FullName
            FullBackups    = $SharedBackupFiles.Count
            FullBackupSize = $BUSize
        }
    }
}

Get-FullBackupsOnShare | Out-GridView

As you can see, we have our SQL Instances and Databases listed in a DBA database. You may have to adapt this function to your situation, and find your own way to feed your databases to the function.
The location of the file share is hard coded in the function; you may want to alter that too: either replace it with your own path, or have the function accept a parameter.


$BUFiles | Get-Member -...
$BUFiles | Measure-Object -...

return properties and measurements on the objects in the collection,

Get-Member -InputObject $BUFiles
Measure-Object -InputObject $BUFiles

return properties and measurements on the collection itself.


Simply put, we need to be aware that some commands behave differently when using the pipeline. The fact that a parameter accepts pipeline input doesn't mean it behaves the same way when you pass the value to the parameter directly.
