
I have a question. I have some .zip files on an FTP location, and I want to decompress them and copy the contents to ADLS. The documentation explains this clearly:

"Read .zip file from FTP server, decompress it to get the files inside, and land those files into Azure Data Lake Store. You define an input FTP dataset with the compression type JSON property as ZipDeflate."

I tried this, but the file that lands in ADLS is still compressed. I also tried setting file format properties (delimiter and so on) and I still get a compressed file in Data Lake Store. I think the problem is how I define the output dataset. Are there any rules for defining the output dataset when the input dataset is .zip files from FTP?


1 Answer


You most likely defined your output dataset's compression as ZipDeflate as well; that's why you are getting a zipped file on ADLS. Try changing your output dataset (the one where you configure the path in ADLS) so it doesn't use compression. In your copy activity you should have the input dataset, where you configure the FTP source with compression, and the output dataset, where you configure the Data Lake sink, without compression.

This way you are telling Data Factory to read a zipped file and save it unzipped in ADLS.
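For reference, here is a minimal sketch of what the two dataset definitions could look like in Data Factory (v1-style) JSON. The linked service names, folder paths, and file name below are placeholders for illustration only; the important part is that the compression property appears only on the input dataset.

Input dataset (FTP, reads the .zip and tells Data Factory to decompress it):

{
    "name": "FtpZipInput",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": "FtpLinkedService",
        "typeProperties": {
            "folderPath": "incoming",
            "fileName": "data.zip",
            "compression": { "type": "ZipDeflate" }
        },
        "external": true,
        "availability": { "frequency": "Day", "interval": 1 }
    }
}

Output dataset (ADLS, with no compression section, so the files land decompressed):

{
    "name": "AdlsOutput",
    "properties": {
        "type": "AzureDataLakeStore",
        "linkedServiceName": "AdlsLinkedService",
        "typeProperties": {
            "folderPath": "landing/unzipped"
        },
        "availability": { "frequency": "Day", "interval": 1 }
    }
}

The copy activity then simply references FtpZipInput as its input and AdlsOutput as its output.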

Hope this helped!