0 votes

I have an S3 bucket that keeps getting updated with .zip and .gz files. I have read-only permissions on that bucket. I want to extract the compressed files and move them to another S3 bucket, but I don't see any command for extracting compressed files in the AWS CLI. Can I use Unix commands to do this? I have gone through the AWS S3 CLI docs: http://docs.aws.amazon.com/cli/latest/userguide/using-s3-commands.html

1
You can download the files to your machine and then work with them locally using any Unix commands you want; or, if the files are text files, you can use zcat or zgrep with piping. – Avihoo Mamka
Download the file with S3 GetObject, decompress it on your machine, and then upload the decompressed file to S3 with PutObject. You can find many "compression" libraries in Java/Ruby/C++. – anshul410
Hi anshul410, I actually don't want to download the file to my local machine. I would rather have AWS unzip it for me and save the extracted files in some other bucket. Is there a way to accomplish this? – Khuzema bohra
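As a sketch of the zcat/zgrep approach mentioned above (the bucket and key names here are hypothetical, and this assumes the AWS CLI is configured with credentials for the source bucket):

```shell
# Stream a gzipped text file straight from S3 and search it without
# writing the full file to disk. The "-" destination tells `aws s3 cp`
# to write the object to stdout.
aws s3 cp s3://source-bucket/logs/app.log.gz - | zcat | grep "ERROR"
```

This only works for read-style operations on text files; it does not put anything back into a second bucket.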

1 Answer

1 vote

S3 is 'dumb': you can't ask it to do things for you, like creating an image thumbnail or compressing/decompressing files.

You will need to download the compressed file and do the work yourself. This is much faster if you run it on an EC2 instance (rather than fetching the file from outside Amazon's network).
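A minimal sketch of the download/decompress/re-upload flow. The bucket names and paths are hypothetical; the S3 transfer steps are shown as AWS CLI commands in comments, while the local decompression uses only the Python standard library:

```python
import gzip
import os
import shutil
import zipfile


def decompress(path: str, out_dir: str) -> list[str]:
    """Decompress a local .zip or .gz file into out_dir.

    Returns the paths of the extracted files.
    """
    os.makedirs(out_dir, exist_ok=True)
    if path.endswith(".zip"):
        with zipfile.ZipFile(path) as zf:
            zf.extractall(out_dir)
            return [os.path.join(out_dir, name) for name in zf.namelist()]
    if path.endswith(".gz"):
        # a.txt.gz -> out_dir/a.txt
        target = os.path.join(out_dir, os.path.basename(path)[:-3])
        with gzip.open(path, "rb") as src, open(target, "wb") as dst:
            shutil.copyfileobj(src, dst)
        return [target]
    raise ValueError(f"unsupported archive: {path}")


# Around this helper, the S3 transfers would be plain CLI calls
# (bucket names hypothetical):
#   aws s3 cp s3://source-bucket/data.zip /tmp/data.zip
#   decompress("/tmp/data.zip", "/tmp/out")
#   aws s3 cp /tmp/out s3://dest-bucket/data/ --recursive
```

Running this on an EC2 instance in the same region keeps both transfers inside Amazon's network.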