I have about 50 GB of files stored in S3. Yesterday I stupidly added a lifecycle rule to transition files more than 30 days old from S3 to Glacier, not realising that this would break the public links to the original files.
I really need these files to stay in S3, as they are images and drawings that are linked from our website.
I have requested a restore of the files from Glacier; however, as far as I understand, the restored copies are only available for a limited number of days before they revert to Glacier.
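For context, this is roughly how I requested the restore for each object (bucket and key names are placeholders, and I'm not sure I picked sensible values for `Days` or the retrieval tier):

```shell
# Request a temporary restored copy of one archived object.
# "Days" is how long the restored copy stays available before
# the object reverts to being Glacier-only.
aws s3api restore-object \
  --bucket my-bucket \
  --key images/drawing-001.png \
  --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'
```

I ran a variant of this in a loop over the keys the lifecycle rule had already transitioned.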
I was thinking that I will have to create a new bucket, copy the files across to it, and then point my website at the new bucket.
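If I do end up going the new-bucket route, this is the sort of command I was planning to run once the restores complete (both bucket names are placeholders; my understanding is that `--force-glacier-transfer` is needed so the CLI will copy objects whose storage class is still Glacier but which have a temporary restored copy, though I may have that wrong):

```shell
# Copy everything from the old bucket into a fresh bucket.
# --force-glacier-transfer tells the CLI to go ahead and copy
# Glacier-class objects that have been temporarily restored.
aws s3 sync s3://my-old-bucket s3://my-new-bucket \
  --force-glacier-transfer
```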
My questions:
Is there a way to do this without having to copy my files to a new bucket?
If I just change the storage class of a file once it is back in S3, will that stop it from going back to Glacier?
If I do have to copy the files to a new bucket, am I right in assuming those copies won't be transitioned back to Glacier?
I'm quite new to S3 (as you can probably tell by my bone-headed mistake), so please be gentle.