380
votes

I'd like to copy files from/to a remote server, in different directories. For example, I want to run these 4 commands at once.

scp remote:A/1.txt local:A/1.txt
scp remote:A/2.txt local:A/2.txt
scp remote:B/1.txt local:B/1.txt
scp remote:C/1.txt local:C/1.txt

What is the easiest way to do that?

16
When I made a script, I had to enter the password for each command. Can I avoid that? – user987654
Avoid repeating the password this way: scp remote:"A/1.txt A/2.txt B/1.txt C/1.txt" local:./ – JohnMudd
stackoverflow.com/a/23748561/874188 (nominated as a duplicate of this one) has a nice additional technique. – tripleee
I would suggest that you have a look at rsync; maybe it can help you in this case and many upcoming cases. Then, to avoid entering passwords (let alone multiple times), you should read about ssh public/private keys, e.g. digitalocean.com/community/tutorials/how-to-set-up-ssh-keys--2 – mnagel
Example based on @JohnMudd's answer: scp [email protected]:'/etc/openvpn/ca.crt /etc/openvpn/client/client0.crt /etc/openvpn/client/client0.key /etc/openvpn/client/ta.key' ./ – Eduardo Lucio
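A minimal sketch of the rsync route mnagel suggests, assuming the same host and paths as in the question; -R (--relative) recreates the A/, B/ and C/ directories locally, and a single rsync run uses a single ssh connection, so the password is asked for at most once:

# sources after the first may omit the host name; all must live on the same host
rsync -avR remote:A/1.txt :A/2.txt :B/1.txt :C/1.txt ./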

16 Answers

494
votes

Copy multiple files from remote to local:

$ scp [email protected]:/some/remote/directory/\{a,b,c\} ./

Copy multiple files from local to remote:

$ scp foo.txt bar.txt [email protected]:~
$ scp {foo,bar}.txt [email protected]:~
$ scp *.txt [email protected]:~

Copy multiple files from remote to remote:

$ scp [email protected]:/some/remote/directory/foobar.txt \
[email protected]:/some/remote/directory/

Source: http://www.hypexr.org/linux_scp_help.php

223
votes

From local to server:

scp file1.txt file2.sh [email protected]:~/pathtoupload

From server to local:

scp -T [email protected]:"file1.txt file2.txt" "~/yourpathtocopy"

(The -T flag disables scp's strict filename checking, which newer OpenSSH clients would otherwise apply to the quoted multi-file argument.)

83
votes

You can copy whole directories using the -r switch, so if you can isolate your files into their own directory, you can copy everything at once.

scp -r ./dir-with-files user@remote-server:upload-path

scp -r user@remote-server:path-to-dir-with-files download-path

So, for instance:

scp -r [email protected]:/var/log ~/backup-logs

Or if there are just a few of them, you can use:

scp 1.txt 2.txt 3.log user@remote-server:upload-path
57
votes

As Jiri mentioned, you can use scp -r user@host:/some/remote/path /some/local/path to copy files recursively. This assumes that there's a single directory containing all of the files you want to transfer (and nothing else).

However, SFTP provides an alternative if you want to transfer files from multiple different directories, and the destinations are not identical:

sftp user@host << EOF
  get /some/remote/path1/file1 /some/local/path1/file1
  get /some/remote/path2/file2 /some/local/path2/file2
  get /some/remote/path3/file3 /some/local/path3/file3
EOF

This uses "here document" syntax to define a sequence of SFTP input commands. As an alternative, you could put the SFTP commands into a text file and execute sftp -b batchFile.txt user@host.
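For example, batchFile.txt for the transfer above would simply hold the same get commands:

# contents of batchFile.txt
get /some/remote/path1/file1 /some/local/path1/file1
get /some/remote/path2/file2 /some/local/path2/file2
get /some/remote/path3/file3 /some/local/path3/file3

Prefixing a command with - (e.g. -get ...) tells sftp to keep going even if that command fails.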

32
votes

The answers using {file1,file2,file3} work only when the shell (remote or local) supports brace expansion, e.g. bash.

The shell-independent way is:

scp user@remote:'/path1/file1 /path2/file2 /path3/file3' /localPath
21
votes

After playing with scp for a while I have found the most robust solution:

(Beware of the single and double quotation marks)

Local to remote:

scp -r "FILE1" "FILE2" HOST:'"DIR"'

Remote to local:

scp -r HOST:'"FILE1" "FILE2"' "DIR"

Notice that everything after "HOST:" is sent to the remote host and parsed there, so we must make sure it is not processed by the local shell; that is where the single quotation marks come in. The double quotation marks are used to handle spaces in the file names.

If the files are all in the same directory, we can use * to match them all, for example:

scp -r "DIR_IN"/*.txt HOST:'"DIR"'
scp -r HOST:'"DIR_IN"/*.txt' "DIR"

Compared to the "{}" syntax, which is supported only by some shells, this approach is universal.
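For instance, a sketch with hypothetical file names that contain spaces (on recent OpenSSH clients you may also need the -T flag, shown in an earlier answer, to relax strict filename checking):

# the local destination directory must already exist
scp HOST:'"my file 1.txt" "my file 2.txt"' "local dir with spaces"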

16
votes

The simplest way is

local$ scp remote:{A/1,A/2,B/3,C/4}.txt ./

The {...} list can include directory components (A, B and C here are directories; 1.txt, 2.txt, etc. are file names in those directories).

Note, though, that it copies all four files into one local directory, which may or may not be what you wanted.

In the above case you end up with the remote files A/1.txt, A/2.txt, B/3.txt and C/4.txt copied over to a single local directory, as ./1.txt, ./2.txt, ./3.txt and ./4.txt.

14
votes

Problem: Copying multiple directories from a remote server to the local machine using a single SCP command, while retaining each directory as it is on the remote server.

Solution: SCP can do this easily. This solves the annoying problem of entering the password multiple times when using SCP with multiple folders. Consequently, it also saves a lot of time!

e.g.

# copies folders t1, t2, t3 from `test` to your local working directory
# note that there shouldn't be any space in between the folder names;
# we also escape the braces.
# please note the dot at the end of the SCP command

~$ cd ~/working/directory
~$ scp -r [email protected]:/work/datasets/images/test/\{t1,t2,t3\}  .

PS: Motivated by this great answer: scp or sftp copy multiple files with single command


Based on the comments, this also works fine in Git Bash on Windows

12
votes

Copy multiple directories:

scp -r dir1 dir2 dir3 [email protected]:~/
7
votes

You can do it this way:

scp hostname@serverNameOrServerIp:/path/to/files/\{file1,file2,file3\}.fileExtension ./

This will download all the listed filenames to whatever local directory you're in.

Make sure not to put spaces between the filenames; separate them with a comma (,) only.

3
votes

NOTE: I apologize in advance for answering only a portion of the above question. However, I found these commands to be useful for my current Unix needs.

Uploading specific files from a local machine to a remote machine:

~/Desktop/dump_files$ scp file1.txt file2.txt lab1.cpp etc.ext [email protected]:Folder1/DestinationFolderForFiles/

Uploading an entire directory from a local machine to a remote machine:

~$ scp -r Desktop/dump_files [email protected]:Folder1/DestinationFolderForFiles/

Downloading an entire directory from a remote machine to a local machine:

~/Desktop$ scp -r [email protected]:Public/web/ Desktop/

3
votes

In my case, I am restricted to using only the sftp command.
So I had to use a batch file with sftp. I created a script such as the following. It assumes you are working in the /tmp directory and that you want to put the files into destdir_on_remote_system on the remote system. It also only works with a non-interactive login: you need to set up public/private keys so you can log in without entering a password. Change as needed.

#!/bin/bash

cd /tmp
# start script with list of files to transfer
ls -1 fileset1* > batchfile1
ls -1 fileset2* >> batchfile1

# prefix each filename with the sftp "put" command
sed -i -e 's/^/put /' batchfile1
echo "cd destdir_on_remote_system" > batchfile
cat batchfile1 >> batchfile
rm batchfile1

sftp -b batchfile user@host
3
votes

In the specific case where all the files share the same base name but have different suffixes (say, rotated log file numbers), you can use the following:

scp [email protected]:/some/log/folder/some_log_file.* ./

This will copy all files named some_log_file.* from the given folder on the remote host, i.e. some_log_file.1, some_log_file.2, some_log_file.3, and so on.

3
votes

It is simpler without using scp:

tar cf - file1 ... file_n | ssh user@server 'tar xf -'

This also lets you do things such as compressing the stream (ssh's -C option) or, since OpenSSH 7.3, using -J to hop through one or more proxy servers.
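A sketch combining those options; jumphost, the file names and the destination directory are placeholders:

# ssh -C compresses the stream, -J hops through the proxy host first
tar cf - file1.txt file2.txt dir1 | ssh -C -J jumphost user@server 'cd /destination/dir && tar xf -'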

You can avoid using passwords by copying your public key to ~/.ssh/authorized_keys with ssh-copy-id.

Posted also here (with more details) and here.

2
votes
scp remote:"[A-C]/[12].txt" local:
1
votes

scp uses ssh for data transfer, with the same authentication, and provides the same security as ssh.

A best practice here is to set up SSH keys and public key authentication. With this, you can write your scripts without worrying about authentication. Simple as that.

See WHAT IS SSH-KEYGEN
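A minimal sketch of that setup, assuming an ed25519 key and the user@remote-server address used in earlier answers:

# generate a key pair locally (accept the defaults, optionally set a passphrase)
ssh-keygen -t ed25519

# append the public key to ~/.ssh/authorized_keys on the remote host
ssh-copy-id user@remote-server

# scp/sftp/ssh now authenticate with the key instead of asking for the account password
scp user@remote-server:A/1.txt ./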