
When I attempt to use gsutil to copy a directory of JPGs to my bucket, gsutil reports that every file uploaded without errors. However, when I run gsutil ls or check the directory in the web console, only some of the files are listed. This happens both with and without the -m flag.
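One way to see exactly which files are missing is to diff a sorted local listing against a sorted bucket listing. A minimal sketch (the `RGB` path and bucket path below are just the ones from this question; the stand-in listings make the comparison itself runnable without gsutil):

```shell
# Sketch: find files that exist locally but never showed up in the bucket.
# In practice the two listings would come from something like:
#   (cd RGB && find . -type f -name '*.jpg' | sort) > local.txt
#   gsutil ls -r gs://[bucket]/Seth | sort > remote.txt
# Stand-in listings are written here so the comparison step is self-contained.
printf '%s\n' 001.jpg 002.jpg 003.jpg | sort > local.txt
printf '%s\n' 001.jpg 003.jpg | sort > remote.txt

# comm -23 prints lines unique to the first file: files present locally
# but missing from the bucket listing.
comm -23 local.txt remote.txt
```

With the stand-in listings above, this prints `002.jpg`, the file that is present locally but absent from the bucket.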

I'm testing to see if the same files upload correctly just using the cloud console, but as this takes much longer, I'd prefer to use gsutil.

For reference, I'm using gsutil 3.38 installed through pip on OS X 10.9. I'm attempting to upload to a Cultural Institute syndicate bucket, so I don't have normal admin access to the bucket. However, I have used gsutil to upload this way previously, so I know this command at least used to work.

Could you show an example gsutil command where the objects you're expecting don't show up? – jterrace
gsutil -m cp -R RGB/ gs://[bucket]/Seth
Not sure if it's in good taste to put my bucket ID up, but I can if I need to. Also, I just added the folder to the bucket using the Cloud Console on the web. Everything uploaded correctly, but when I refreshed the page, only a csv that was in the folder was actually in the bucket. I'm actually starting to wonder if the CI guys have imposed some sort of file size restriction on their buckets... – Seth

1 Answer


This turned out not to be a Cloud Storage issue. The Cultural Institute now deletes files from Cloud Storage after they have been processed into their database. Problem solved.