Recently, I received a task that required me to run a particular command on several thousand servers. Since the command takes some time to execute, it only makes sense to run it in parallel, and PowerShell Background Jobs are the first thing that comes to mind.
But my PC's resources are limited — it cannot run more than 100 jobs simultaneously. Unfortunately, PowerShell doesn't yet have built-in functionality for throttling background jobs.
I'm not the first to hit this problem, though: the official "Hey, Scripting Guy!" blog has introduced a queue based on .NET Framework objects. But I couldn't get that solution to work and needed something simpler. After all, all we need is:
- a loop
- a counter of how many jobs are currently running
- a variable that allows the next job in the queue to start
Eventually, I came up with a piece of code like this:
```powershell
$maxConcurrentJobs = 100  # Max. number of simultaneously running jobs

foreach ($Object in $Objects) {  # $Objects is a collection of objects to process — a list of computers, for example
    $Check = $false  # Loop endlessly until the number of running jobs drops below $maxConcurrentJobs
    while ($Check -eq $false) {
        if ((Get-Job -State 'Running').Count -lt $maxConcurrentJobs) {
            $ScriptBlock = {
                # Insert the code of your workload here
            }
            Start-Job -ScriptBlock $ScriptBlock
            $Check = $true  # Stop looping and proceed to the next object in the list
        }
    }
}
```
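One detail the loop above leaves open: when the last object is queued, up to $maxConcurrentJobs jobs may still be running. A common follow-up (not part of the original snippet, so treat this as a sketch built from the standard job cmdlets) is to wait for the stragglers, collect their output, and clean up:

```powershell
# After the throttling loop: wait for any jobs still running,
# then gather their output and remove them from the session.
Get-Job | Wait-Job | Out-Null     # block until every job has finished
$Results = Get-Job | Receive-Job  # collect the output of all jobs
Get-Job | Remove-Job              # clean up completed jobs
```

Note also that the inner while loop spins at full speed while it waits; a short `Start-Sleep -Milliseconds 500` inside it would keep CPU usage down at the cost of a slight delay before the next job starts.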
My version is similar to a solution proposed on StackOverflow (which, of course, I only found AFTER completing my own), but the SO version suffers from a bug where some items in the queue may be skipped.
While PS Jobs are easy to play with, Boe Prox claims that runspaces work much faster. Check out his blog to see how you can use them to accelerate your job queue.
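For comparison, here is a minimal sketch of the runspace-pool pattern (simplified from the approach Boe Prox describes; the variable names, the pool size of 100, and the placeholder workload are mine). The key difference is that the pool itself enforces the concurrency limit, so no manual throttling loop is needed:

```powershell
# Create a pool that runs at most 100 runspaces at once.
$pool = [runspacefactory]::CreateRunspacePool(1, 100)
$pool.Open()

$handles = foreach ($Object in $Objects) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($obj) <# workload here #> }).AddArgument($Object)
    # BeginInvoke queues the work asynchronously; the pool
    # schedules it as soon as a runspace is free.
    [pscustomobject]@{ PowerShell = $ps; Async = $ps.BeginInvoke() }
}

# Harvest results once everything is queued.
$Results = foreach ($h in $handles) {
    $h.PowerShell.EndInvoke($h.Async)
    $h.PowerShell.Dispose()
}
$pool.Close()
```

Runspaces avoid the per-job process overhead that makes Start-Job slow, which is where the speedup comes from.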