
I have a web application that generates and uploads documents (Word, PDF, etc.) on the server. Now I am writing a PowerShell script that first transfers these files to an FTP folder. That part is done. The second script will run on my backup server every day and download all the folders/files from that FTP server to my backup server. The problem is that I am unable to download folders. It's easy to download files, but I can't seem to find any piece of code that lets me either download an entire folder (with contents) or at least iterate through the folder contents.

If I knew the folder names I would not have a problem; I could just hard-code a function for every folder. But the documents are uploaded to dynamic folders (folder names are generated by the application), and I only know the primary folder, so I need to iterate through it. Zipping is not a good option either, as the folder is around 1 GB and I don't want to transfer 1 GB over the wire every day.

Any help in this regard would be greatly appreciated.

Thanks.

-----------UPDATE--------

gci (Get-ChildItem) only works on the local file system (I could be way wrong here). I am referring to iterating through the FTP folders and files.

Here's how I gci through files locally:

    $GciFiles = gci C:\ParentFolder\SubFolder\ -Recurse | Where { $_.Attributes -ne "Directory" }
    foreach ($file in $GciFiles) { }

Now this is all dandy, since the foreach gives me the folder name plus the file name as well. However, I want to do the same with the FTP code:

    $ftphost = "ftp://ftplocation/ParentFolder/"
    # credentials go here; make an FtpWebRequest object
    $ftprequest = [System.Net.FtpWebRequest]::Create($ftphost)
    $ftprequest.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
    $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    $ftpresponse = $ftprequest.GetResponse()
    $responseStream = $ftpresponse.GetResponseStream()
    $readStream = New-Object System.IO.StreamReader $responseStream

    while (($line = $readStream.ReadLine()) -ne $null) {
        # $line here gives us the name of the file
        # function to download the file
    }
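One way to tell folders apart from files (a sketch only; the listing format depends on the FTP server) is to request ListDirectoryDetails instead of ListDirectory. On Unix-style servers each detail line starts with "d" for a directory, and the entry name is the last token. These two hypothetical helpers show the parsing; the sample listing lines in the comments are assumptions:

```powershell
# Sketch: parse one line of a Unix-style FTP detail listing, e.g.
#   drwxr-xr-x   2 ftp ftp    4096 Jan 01 12:00 0f8fad5b-d9cb
#   -rw-r--r--   1 ftp ftp  102400 Jan 01 12:00 report.pdf
# (Names containing spaces would need smarter parsing.)

function Test-IsFtpDirectory([string]$line) {
    # a leading "d" marks a directory in Unix-style listings
    return $line.StartsWith("d")
}

function Get-FtpEntryName([string]$line) {
    # the entry name is the last whitespace-delimited token
    return ($line -split "\s+")[-1]
}
```

With these, you keep the same ReadLine loop but switch the request method to ListDirectoryDetails, then branch on Test-IsFtpDirectory for each $line.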

Now this is all fine PROVIDED I KNOW THE FOLDER I WANT TO DOWNLOAD THE FILES FROM. However, as I said before, I only know the parent folder, which has an unknown number of subfolders (their names are GUIDs created by the application), and each GUID folder has some files.

Now how on earth am I supposed to GCI with that code???

What I did was, in the same code, first get the name of each folder, then call another function (the same one, except now I know the folder name) and download the files in that function's while loop. However, it doesn't work; the script just keeps on executing. I am in deep water! No clue what on earth to do!
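The two-function approach can be collapsed into one recursive function: list the parent, recurse into each subfolder, and download plain files as you meet them. The following is an untested sketch; the host URL, credentials, and the assumption of a Unix-style detail listing (leading "d" for directories, name as the last token) are all placeholders, not verified against your server:

```powershell
# Sketch of a recursive FTP mirror. Assumes Unix-style ListDirectoryDetails
# output and hypothetical credentials; not run against a live server.
function Get-FtpFolder([string]$ftpUrl, [string]$localPath, $credentials) {
    if (-not (Test-Path $localPath)) {
        New-Item -ItemType Directory -Path $localPath | Out-Null
    }
    $request = [System.Net.FtpWebRequest]::Create($ftpUrl)
    $request.Credentials = $credentials
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $response = $request.GetResponse()
    $reader = New-Object System.IO.StreamReader -ArgumentList $response.GetResponseStream()
    while (($line = $reader.ReadLine()) -ne $null) {
        $name = ($line -split "\s+")[-1]
        if ($name -eq "." -or $name -eq "..") { continue }
        if ($line.StartsWith("d")) {
            # directory (e.g. one of the GUID folders): recurse into it
            Get-FtpFolder "$ftpUrl$name/" (Join-Path $localPath $name) $credentials
        } else {
            # plain file: download it
            $client = New-Object System.Net.WebClient
            $client.Credentials = $credentials
            $client.DownloadFile("$ftpUrl$name", (Join-Path $localPath $name))
        }
    }
    $reader.Close()
    $response.Close()
}

# Example call (placeholder host, path, and credentials):
# $cred = New-Object System.Net.NetworkCredential("user", "password")
# Get-FtpFolder "ftp://ftplocation/ParentFolder/" "C:\Backup\ParentFolder" $cred
```

Because the GUID folder names come back in the listing itself, nothing needs to be hard-coded; each recursive call builds the next URL and the matching local path from the listed name.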

I'm not sure I understand. Get-ChildItem should iterate through folders as well as files. Can you post a code snippet of how you access your ftp site? – tenpn

1 Answer


tenpn has it. Just use the -Recurse option to Get-ChildItem.

Yeah, it seems that taking this route doesn't really provide you with the functionality you're looking for. I suspect there's a hacky way to get around it by pulling the directory name on the server side into a string and then creating an identical directory on the client end.
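That client-side mirroring step is the easy part. A minimal sketch, assuming $serverRelativePath is whatever name you parsed out of the FTP listing (the GUID below is just an illustrative value):

```powershell
# Recreate a server-side relative path under a local backup root.
# $serverRelativePath is assumed to come from the FTP listing.
$backupRoot = Join-Path ([System.IO.Path]::GetTempPath()) "ftp-mirror-demo"
$serverRelativePath = "0f8fad5b-d9cb-469f-a165-70867728950e"
$localDir = Join-Path $backupRoot $serverRelativePath
# -Force makes this a no-op if the folder already exists
New-Item -ItemType Directory -Force -Path $localDir | Out-Null
```

Once the local directory exists, each file downloaded from that server folder can be written straight into $localDir.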

To be honest, though, I think you'd probably fare better using something like the net-cmdlets.

I believe there's a hobbyist license, or if you're doing this through work, the full version is under 100 bucks.