I need to download a big folder, with an unknown number of nested subfolders, to my Linux (Red Hat) server.
I went to the web version and it tells me the file is "too large to download". People say you can simply retrieve the download URL and wget will do the job. However, looking at that folder, I don't see any way to retrieve a URL other than the "share link".
Can anyone share with me the best practice for recursively downloading a big folder to a Linux server?
wget -r http://website/folder

- IT_User
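For anyone hitting the same wall: a bare `wget -r` will only work if the folder is exposed as a plain HTTP directory listing, and a "share link" usually is not one. Below is a sketch of a fuller recursive invocation, assuming you do have a direct listing URL (the URL here is a placeholder, and the extra flags are commonly used options, not something from the original post). The command is built into a variable and echoed so you can inspect it before running it.

```shell
# Placeholder URL -- replace with the real directory-listing URL of your folder.
URL="http://website/folder/"

# -r            recurse into subdirectories
# -np           --no-parent: never ascend above the starting folder
# -nH           don't create a hostname-named top-level directory
# --cut-dirs=1  strip one leading path component from saved paths
# -R 'index.html*'  skip the auto-generated directory-listing pages
CMD="wget -r -np -nH --cut-dirs=1 -R 'index.html*' $URL"

# Inspect the command before executing it yourself.
echo "$CMD"
```

Note that if the folder lives behind a service that only offers a share link (Google Drive, SharePoint, etc.), wget alone generally cannot recurse through it; you would need a service-specific tool instead.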