156
votes

I am trying to download the files for a project using wget, since the SVN server for that project isn't running anymore and I can only access the files through a browser. The base URL for all the files is the same, like:

http://abc.tamu.edu/projects/tzivi/repository/revisions/2/raw/tzivi/*

How can I use wget (or any other similar tool) to download all the files in this repository, where the "tzivi" folder is the root folder and there are several files and sub-folders (up to 2 or 3 levels deep) under it?

8
You can't do that if the server has no web page with a list of links to all the files you need. – Eddy_Em
Do you know the names of the files? – Karoly Horvath
No, I don't know the names of all the files. I tried wget with the recursive option, but it didn't work either. Is that because the server doesn't have an index.html file that lists all the inner links? – code4fun
Did you try the mirroring option of wget? – Tomasz Nguyen

8 Answers

219
votes

You may use this in a shell:

wget -r --no-parent http://abc.tamu.edu/projects/tzivi/repository/revisions/2/raw/tzivi/

The parameters are:

-r // recursive download

and

--no-parent // don't download anything from the parent directory

If you don't want to download the entire content, you may use:

-l1 just download the directory (tzivi in your case)

-l2 download the directory and all level-1 subfolders ('tzivi/something' but not 'tzivi/something/foo')

And so on. If you insert no -l option, wget will use -l 5 automatically.

If you insert -l 0, you'll download the whole Internet, because wget will follow every link it finds.
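For example, a sketch combining the options above to fetch tzivi and only its first-level subfolders (the URL is the one from the question):

# depth 2 = tzivi itself plus one level of subfolders;
# --no-parent keeps wget from climbing up to /raw/
wget -r -l2 --no-parent http://abc.tamu.edu/projects/tzivi/repository/revisions/2/raw/tzivi/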

21
votes

You can use this in a shell:

wget -r -nH --cut-dirs=7 --reject="index.html*" \
      http://abc.tamu.edu/projects/tzivi/repository/revisions/2/raw/tzivi/

The parameters are:

-r recursively download

-nH (--no-host-directories) cuts out the hostname directory

--cut-dirs=X cuts out X leading directories; here X=7 matches the seven path components of projects/tzivi/repository/revisions/2/raw/tzivi, so everything is saved relative to the tzivi folder itself
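To illustrate the effect (sub/file.c is a hypothetical file in the repository):

# Without -nH --cut-dirs=7, wget would save the file as:
#   abc.tamu.edu/projects/tzivi/repository/revisions/2/raw/tzivi/sub/file.c
# With them, the hostname and the 7 leading directories are dropped:
#   sub/file.c
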
5
votes

This link just gave me the best answer:

$ wget --no-clobber --convert-links --random-wait -r -p --level 1 -E -e robots=off -U mozilla http://base.site/dir/

Worked like a charm.
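For reference, here is what each of those flags does (an annotation of the command above; base.site/dir is the answer's placeholder URL):

# --no-clobber      skip files that already exist locally
# --convert-links   rewrite links so the local copy browses offline
# --random-wait     randomize the delay between requests
# -r                recurse into links
# -p                also fetch page requisites (images, CSS, scripts)
# --level 1         limit recursion to one level
# -E                save HTML pages with an .html extension
# -e robots=off     ignore robots.txt restrictions
# -U mozilla        send a Mozilla-style User-Agent header
wget --no-clobber --convert-links --random-wait -r -p --level 1 -E -e robots=off -U mozilla http://base.site/dir/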

4
votes

Use the command:

wget -m www.ilanni.com/nexus/content/

-m (--mirror) turns on options suitable for mirroring: recursion with infinite depth and timestamping.

3
votes
wget -r --no-parent URL --user=username --password=password

The last two options are needed only if the download requires a username and password; otherwise you can omit them.

You can also find more options at https://www.howtogeek.com/281663/how-to-use-wget-the-ultimate-command-line-downloading-tool/
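If you'd rather not leave the password in your shell history, wget can prompt for it instead (a sketch; substitute your own username and URL):

# Prompt interactively for the password rather than passing it inline
wget -r --no-parent --user=username --ask-password URL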

2
votes

You can also use this command:

wget --mirror -pc --convert-links -P ./your-local-dir/ http://www.your-website.com

This gives you an exact mirror of the website you want to download.
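Spelled out flag by flag (an annotation of the command above; -pc is just -p and -c combined, and the URL and directory are the answer's placeholders):

# --mirror          shorthand for -r -N -l inf --no-remove-listing
# -p                also fetch page requisites (images, CSS, scripts)
# -c                resume partially downloaded files
# --convert-links   rewrite links so the local copy browses offline
# -P ./your-local-dir/   save everything under this directory
wget --mirror -p -c --convert-links -P ./your-local-dir/ http://www.your-website.com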

-1
votes

This works:

wget -m -np -c --no-check-certificate -R "index.html*" "https://the-eye.eu/public/AudioBooks/Edgar%20Allan%20Poe%20-%2"

Here -np is --no-parent, and -R "index.html*" rejects the auto-generated directory-listing pages.

-1
votes

This will help:

wget -m -np -c --level 0 --no-check-certificate -R "index.html*" http://www.your-websitepage.com/dir