
I have a file listing a set of paths, one per line. Each path points to an object in Google Cloud Storage.

I am currently using a bash script that reads the input file one line at a time and runs the gsutil cp command to download each file to my local hard disk.

Is there a more efficient way to download all of these files to my local hard disk? FYI, the paths point to different directories in Google Cloud Storage, not a single directory.
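For reference, the per-line approach described above looks roughly like this (a sketch; `files.txt` and `./downloads/` are hypothetical names, and `echo` stands in for the real gsutil call so the sketch runs as a dry run):

```shell
# Hypothetical input: one gs:// URI per line.
printf 'gs://bucket-a/one.txt\ngs://bucket-b/dir/two.txt\n' > files.txt
mkdir -p downloads
# One gsutil invocation per line -- each copy runs serially.
# ('echo' is a dry-run stand-in; drop it to actually download.)
while IFS= read -r uri; do
    echo gsutil cp "$uri" ./downloads/
done < files.txt
```

The cost here is one full gsutil startup per file, plus strictly sequential transfers.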


1 Answer


The -I option of gsutil cp lets you read the list of object names to copy from stdin, so the whole list can be handled by a single invocation. See the gsutil cp documentation for details.
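A minimal sketch (hypothetical names: `filelist.txt` holds the gs:// paths, `./downloads/` is the destination; the leading `echo` is a dry-run stand-in so the sketch runs without the Cloud SDK installed):

```shell
# Hypothetical list of object URIs, one per line.
printf 'gs://bucket-a/one.txt\ngs://bucket-b/dir/two.txt\n' > filelist.txt
mkdir -p downloads
# Single invocation: -I reads the URIs from stdin, and the top-level -m
# flag performs the copies in parallel. Drop 'echo' to run it for real.
echo gsutil -m cp -I ./downloads/ < filelist.txt
```

Unlike the per-line loop, this starts one gsutil process for the whole list, and `-m` lets transfers from the different source directories overlap.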