
I have a rootDir with 3 directories under it. I just want to compress each directory and collect the results using multithreaded code, so I tried the following:

Set-Location "C:\test"
$sw = [Diagnostics.Stopwatch]::StartNew()
Get-Job | Remove-Job
$rootDirectory = $PWD
$dirs = $(Get-ChildItem -Path $rootDirectory -Directory).Name
# $dirs = d1, d2, d3
$sb = {
    Param($init)
    Compress-Archive -Path $init -DestinationPath "$init.zip" -CompressionLevel NoCompression
}

$jobs = @()
$dirs | ForEach-Object {
    $jobs += Start-Job -ScriptBlock $sb -ArgumentList $_
}

Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs

I got

The path 'd1' either does not exist or is not a valid file system path.
    + CategoryInfo          : InvalidArgument: (d1:String) [Compress-Archive], InvalidOperationException
    + FullyQualifiedErrorId : ArchiveCmdletPathNotFound,Compress-Archive
    + PSComputerName        : localhost

If I run the compress command serially, without Start-Job, everything works.

1 Answer


The jobs don't inherit their working directory from your script, so the relative path 'd1' can't be resolved inside the job. Put the Set-Location inside the scriptblock and pass the directory as a parameter:

$rootdir = 'C:\test'
...
$jobs = Get-ChildItem -Path $rootdir -Directory | ForEach-Object {
    Start-Job -ScriptBlock {
        Param($init, $dir)
        Set-Location $dir
        Compress-Archive -Path $init -DestinationPath "${init}.zip" -CompressionLevel NoCompression
    } -ArgumentList $_, $rootdir
}
...

Better yet, pass the full path as the parameter, not just the name:

$rootdir = 'C:\test'
...
$jobs = Get-ChildItem -Path $rootdir -Directory | ForEach-Object {
    Start-Job -ScriptBlock {
        Param($init)
        Compress-Archive -Path $init -DestinationPath "${init}.zip" -CompressionLevel NoCompression
    } -ArgumentList $_.FullName
}
...
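As a quick sanity check of the root cause, you can ask a background job where it starts (a minimal sketch, assuming PowerShell 5.1 or later; the exact path returned varies by platform and profile, so no particular value is asserted here):

```powershell
# Ask a background job for its working directory.
$job = Start-Job -ScriptBlock { (Get-Location).Path }
Wait-Job -Job $job | Out-Null

# On Windows PowerShell this is typically the user's Documents folder,
# not the caller's current location -- which is why relative paths like
# 'd1' fail inside the job.
Receive-Job -Job $job
Remove-Job -Job $job
```

This is why passing `$_.FullName` is the more robust fix: an absolute path resolves the same way regardless of the job's working directory.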