
I have a network share hosted by a server (\\SERVER) which is being accessed by other servers/clients.

\\SERVER\SHARE\Folder\File

If I wanted to compress Folder and everything in it, is it possible to do using PowerShell WITHOUT having the files be downloaded to the machine that is running the command?

The files in question are large, and there is not enough room on the C drive to download and compress the files on the client. So for example if I navigated to the network share from the client using Windows File Explorer, and selected the folder to compress, it would start downloading the files to the client and then fail due to insufficient free space on the client.

What about PowerShell's Invoke-Command Option?

I do have the option to Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) that has plenty of space, though. I would have to tell PowerShell to compress the files onto that drive somehow, instead of defaulting to the C drive.

Error when running the below PowerShell Command

Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"

Exception calling "Write" with "3" argument(s): "Stream was too long."
At C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820 char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : IOException


1 Answer


The problem is caused by a limitation of Compress-Archive: its maximum file size is 2 GB. The documentation mentions this:

The Compress-Archive cmdlet uses the Microsoft .NET API System.IO.Compression.ZipArchive to compress files. The maximum file size is 2 GB because there's a limitation of the underlying API.

As for a solution, either compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module available for it, though manual compression is not that complex either. As 7-Zip is not a native tool, install either it or the PowerShell module first.

# Assumes 7-Zip is installed in its default location
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"

$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
sz a $tgt $src
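If a .zip file is specifically needed rather than 7-Zip's native format, the archive type can be selected with the `-t` switch; 7-Zip's zip implementation supports archives larger than 2 GB. A minimal sketch, reusing the `sz` alias and the example paths above:

```
# -tzip selects the zip format; the output extension should match
$tgt = "D:\otherdir\archive.zip"
sz a -tzip $tgt $src
```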

If the source files are small enough that no single file will ever create an archive larger than the limit, consider compressing each file by itself. For example:

$srcDir = "C:\somedir"
$dstDir = "D:\archivedir"

# List all the files, not subdirs
$files = gci $srcDir -Recurse | ? { -not $_.PSIsContainer }

foreach ($f in $files) {
  # Create a new name for each compressed archive: include the file path,
  # but replace \ and : with _ so there are no name collisions.
  $src = $f.FullName
  $dst = Join-Path $dstDir ($src.Replace('\', '_').Replace(':', '_') + ".zip")
  # Remove -WhatIf to actually perform the compression
  Compress-Archive -WhatIf -Path $src -DestinationPath $dst
}

As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server. There you can use local paths, though UNC paths should also work well; they are processed via loopback, so the data doesn't travel over the network.
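A minimal sketch of the Invoke-Command approach, assuming 7-Zip is installed on \\SERVER and that the share lives at D:\Share (adjust the paths to match the actual layout):

```
# Runs entirely on SERVER, so nothing is downloaded to the client
# and the archive is written to the D drive, not C.
Invoke-Command -ComputerName SERVER -ScriptBlock {
    & "$env:ProgramFiles\7-Zip\7z.exe" a "D:\Share\OtherFolder\Archive.7z" "D:\Share\Folder"
}
```

Note that the remote session runs under your credentials; the account needs local access to D: on the server, but no second network hop is involved since the paths are local.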