2 votes

I have a model called Image. Images have files attached using Dragonfly that are stored in S3.

I have a requirement that I need to zip up all images.

I'm using:

Zip::ZipFile.open(tmp_zip, Zip::ZipFile::CREATE) do |zipfile|
  zipfile.add("image.jpg", image_path)
end

The problem I'm running into is that this only works if image_path is local. When the file has to be fetched from S3, image_path is a remote URL, such as http://example.s3.amazonaws.com/foo/image.jpg, and I don't think there is a RubyZip method that handles that.

I'm debating writing something that creates a temp file from the remote path, adds that temp file to the zip, and then deletes the temp file.

But before I do that, does anyone know if RubyZip or some other zip library handles zipping up remote files? Or is there a better/easier method?

Thanks!

I ended up going with the custom method that downloads the file via net/http, saves a temp file, puts the temp file in the zip, and then disposes of the temp file. – Corey

2 Answers

4 votes

I faced the same issue and found a solution, so I'm sharing it in case it helps someone.

You can add a remote file to the zip directly, without saving it to a temp file, reading it back, and finally deleting the temp file.

Create the zip and stream the remote files into it:

require 'open-uri' # needed for URI#read on remote URLs

Zip::OutputStream.open(tmp_zip) do |zos|
  zos.put_next_entry("image.jpg")
  zos.print(URI.parse(image_url).read)
end

If you also want to add local files to the same temp_zip, you can open it again:

zipfile = Zip::File.open(tmp_zip)
zipfile.add("report.pdf", my_pdf_path)
zipfile.close
0 votes

One option would be to mount S3 locally. There are various ways to do this with FTP-like clients, and there are dedicated programs as well; it depends on the OS you're running.

I don't see a way to stream a file into a zip directly from a remote URL.