Download all files from a web directory that requires a login ID and password, using the command line (Linux).
4.3 years ago
eDNAuRNA ▴ 20

I have a web directory that requires a login and password for access. I need to download multiple data files from it using the Linux command line, but I don't know how to get the web links to all the files in the directory, or how to pass a list of links to wget. Any help will be appreciated.

Thanks

linux • web directory

In addition to @Alex's solution below, you may need the -r (recursive) and -A (accept-list) options to get all files of a given kind.
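A minimal sketch, assuming basic authentication and a hypothetical URL and file pattern:

    # Hypothetical example: recursively fetch only .fastq.gz files
    # -r  recurse through the directory listing
    # -A  accept only files matching the pattern
    wget -r -A "*.fastq.gz" --user=myuser --password=mypass "https://example.com/data/"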


And -np (no-parent: don't ascend above the starting directory) and -nd (no-directories: don't recreate the remote hierarchy locally).
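Putting the suggested options together, a sketch under the same assumptions (hypothetical credentials, URL, and pattern):

    # Hypothetical example: recurse, stay below the starting directory (-np),
    # flatten the output into the current directory (-nd), keep only .fastq.gz files
    wget -r -np -nd -A "*.fastq.gz" --user=myuser --password=mypass "https://example.com/data/"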


Thanks a lot Alex.

For getting the list of links, is there a solution on the Linux command line?


Please use ADD COMMENT/ADD REPLY when responding to existing posts to keep threads logically organized. This comment belongs under @Alex's answer.

SUBMIT ANSWER is for new answers to the original question.

4.3 years ago
Alex

If the site uses basic authentication, you can use wget --user=<user> --password=<password> "<url>" to download the specified URL.
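A minimal sketch with hypothetical credentials and URLs; the -i (--input-file) option addresses the second part of the question, reading one URL per line from a file:

    # Hypothetical example: download a single protected file
    wget --user=myuser --password=mypass "https://example.com/data/sample1.fastq.gz"

    # Hypothetical example: download every URL listed in links.txt
    wget --user=myuser --password=mypass -i links.txt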

wget only makes requests; it won't give you a list of the files in a directory. To collect the links, you could use Python requests with a scraping library: https://blog.hartleybrody.com/web-scraping-cheat-sheet/
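As a command-line alternative to a Python scraper, a rough sketch that fetches the directory index and extracts the href targets, assuming the server returns a plain HTML listing:

    # Hypothetical example: save the links found on the index page to links.txt
    wget -qO- --user=myuser --password=mypass "https://example.com/data/" \
      | grep -oE 'href="[^"]+"' \
      | sed 's/^href="//; s/"$//' > links.txt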
