40
votes

I am trying to copy files from my instance to my local directory using the following command:

gcloud compute scp <instance-name>:~/<file-name> ~/Documents/

However, it fails with the error below:

$USER/Documents/: Is a directory

ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].

Copying from the local directory to GCE works fine.

I have checked Stanford's tutorial and Google's documentation as well.

I have another instance that does not have this issue.

I suspect it might be an issue with the SSH keys.

What might have gone wrong?

Is there a link to the Stanford tutorial that you've mentioned? – baxx
And you might as well link to whatever Google documentation you've been reading. – ComputerScientist

7 Answers

34
votes

Your command is correct if your source and destination paths are correct

The command as you've posted in your question works for me when copying a file from the Google Compute Engine VM to my local machine.

$ gcloud compute scp vm1:~/.bashrc ~/Documents/
.bashrc                                          100% 3515     3.4KB/s   00:00

I also tried the copy in the other direction (i.e. from my local machine to the GCE VM) and it works:

$ gcloud compute scp ~/Documents/.bashrc vm1:~/temp/
.bashrc                                          100% 3515     3.4KB/s   00:00

$ gcloud compute scp ~/Documents/.bashrc vm1:~/.bashrc-new
.bashrc                                          100% 3515     3.4KB/s   00:00

gcloud relies on an scp executable being present in your PATH. The arguments you provide to gcloud compute scp are passed through to that scp binary. Assuming your source and destination paths are correct, it should work.
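If you want to rule out a problem with the underlying binary, a quick sanity check (a minimal sketch; the exact path will vary per system) is to confirm that an scp executable really is on your PATH:

$ which scp
/usr/bin/scp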

Recursive copying using scp

Based on your particular error message, though, I've only seen that variation appear when the source path you're copying from is a directory rather than a file. For that case, you can pass the --recurse flag (similar to the -r flag supported by regular scp), which recursively copies all files and directories under the specified directory.

gcloud compute scp --recurse SRC_PATH DEST_PATH
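For example, a sketch with made-up names (a directory ~/assignment1 on a VM called vm1, copied into the local Documents folder):

$ gcloud compute scp --recurse vm1:~/assignment1 ~/Documents/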

30
votes

To copy files from the VM to your desktop, you can simply SSH into the VM from the browser console; in the top-right corner there is a settings button with a "Download file" option, where you just enter the path of the file.

If it is a folder, zip it first and then download the archive.
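For instance, from an SSH session on the VM (assuming a folder named myFolder in your home directory, and that the zip package is installed), you could create the archive like this and then download the resulting myFolder.zip through the same menu:

# install zip first on Debian/Ubuntu images if it is missing: sudo apt-get install zip
$ zip -r myFolder.zip myFolder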

11
votes

Everything was fine, except that I was running these commands in the terminal connected to the GCE instance instead of my local terminal.

oyashi@oyashi-torch-instance:~$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/

/home/oyashi/Documents/: Is a directory
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].

But when I ran the same command from my local terminal, it worked:

oyashi@oyashi:~/Documents$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/

spring1617_assignment1.zip 100% 42KB 42.0KB/s 00:00

Thank you, everyone, for the comments and help. I know it's a silly mistake on my end, but I posted this answer so that others might learn from it.
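A quick way to catch this mistake before running the command is to check which machine the shell is actually on; the hostname should be your local machine (oyashi in the prompts above), not the instance name:

$ hostname
oyashi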

4
votes

If you need to pass the zone and project name explicitly, the following worked for me (the instance name is the name you chose for the instance in GCP):

gcloud beta compute scp --project "project_name" --zone "zone_name" instance_name:~jupyter/file_name /home/Downloads
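If you are not sure which values to use, you can look them up first with standard gcloud commands (gcloud config list shows the active project; gcloud compute instances list shows instance names and their zones):

$ gcloud config list
$ gcloud compute instances list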
1
votes

I ran into the same problem. The point is that you should run the scp command from a local terminal, not from the terminal of the cloud instance itself.

0
votes

To copy a file from an Ubuntu VM to your local machine:

For example, suppose you have an instance named bhk.
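Following the pattern from the accepted answer, the command run on your local machine would presumably look something like this (a sketch only; <file-name> and the local Downloads directory are placeholders, not the author's actual values):

$ gcloud compute scp bhk:~/<file-name> ~/Downloads/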

0
votes

Run a basic nginx server on the VM and copy the files you want into /var/www/html (nginx's serving directory); then from your local machine simply run:

wget <vm's IP>/<your file path>

For example, if my VM's IP is 1.2.3.4 and I want to copy /home/me/myFolder/myFile, I simply copy that file into /var/www/html and then run wget 1.2.3.4/myFile from my local machine.
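Putting the steps together, a rough sketch (assuming a Debian/Ubuntu VM, that HTTP traffic on port 80 is allowed by the firewall, and the example paths above):

# on the VM
$ sudo apt-get install -y nginx
$ sudo cp /home/me/myFolder/myFile /var/www/html/

# on your local machine
$ wget 1.2.3.4/myFile

Keep in mind that this serves the file to anyone who can reach the VM's external IP, so it is best suited for non-sensitive files.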