I have a website: http://www.example.com/

This website has a subdirectory: http://www.example.com/images/

This subdirectory contains numerous subdirectories (01/, 02/, etc.), each with images inside it.

When I try wget on the images directory, it downloads each subdirectory (01/, 02/, etc.) and an index.html inside each one, but none of the images. For example:

wget -r http://www.example.com/images/    <- doesn't download PNGs

wget -r http://www.example.com/images/01/ <- does download PNGs

How can I use wget to download all the PNGs from all the subdirectories of the images/ directory without going through each subdirectory (01/, 02/, 03/, etc.) one by one?

Yep, that helped. Thank you! – Aaron

1 Answer


You just need to look through man wget for the options you want. A quick search turns up these:

-A acclist --accept acclist         
       Specify comma-separated lists of file name suffixes or patterns to
       accept or reject.

--no-directories
       Do not create a hierarchy of directories when retrieving recursively.
       With this option turned on, all files will get saved to the current directory, 
       without clobbering (if a name shows up more than once, the filenames will get extensions .n).

Recursive Retrieval Options
-r
--recursive
       Turn on recursive retrieving. The default maximum depth is 5.

Combining these, you can put together a command such as:

wget -nd -r -A png http://www.example.com/images/
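For reference, here is the same command with each flag annotated, plus --no-parent as an extra precaution (my addition, not required for the question) so wget never climbs above the images/ directory:

# -nd : --no-directories, flatten all files into the current directory
# -r  : --recursive, follow links downward (default max depth is 5)
# -A  : accept list; extend as needed, e.g. -A png,jpg,gif
# -np : --no-parent, never ascend to the parent directory
wget -nd -r -np -A png http://www.example.com/images/

Note that -A is matched against file name suffixes, so index.html pages are still fetched temporarily to discover links, but they are deleted after being parsed.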