2 votes

I want to restore a very large database dump directly from a Google Cloud Storage object, without downloading and saving it to a local file first, because of disk space constraints.

I have database dumps produced by mongodump (with and without --gzip).

When I try to pipe the database dump data to mongorestore using the following command (I referred to Streaming transfers):

gsutil cp - gs://<bucket>/<object_path> | mongorestore --uri=<connection_uri> --archive

the process gets stuck at the following output:

Copying from <STDIN>...
/ [0 files][    0.0 B/    0.0 B]

I'm not sure whether this is a gsutil issue or a mongorestore issue.

The file is a valid mongodump archive: I downloaded a small database dump and could successfully restore it with mongorestore --uri=<connection_uri> --archive=<local_file_path>.

gsutil version: 4.57

mongorestore version: 100.2.0


1 Answer

2 votes

The dash in your gsutil command is in the place of the source argument, not the destination, so it's trying to copy from STDIN to the gs:// path (which would overwrite your object!).

If you want to pipe the contents of your GCS object to another program, you can use gsutil cp with the dash character as the destination argument, or more simply, just use gsutil cat gs://...
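For example, either of these should stream the object straight into mongorestore (using the same placeholders as in your command; add --gzip to mongorestore if the dump was taken with --gzip):

gsutil cat gs://<bucket>/<object_path> | mongorestore --uri=<connection_uri> --archive

gsutil cp gs://<bucket>/<object_path> - | mongorestore --uri=<connection_uri> --archive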