I have a web directory which requires login and password for access.
I need to download multiple data files from this web directory using the Linux command line, but I don't know how to get the web links to all the files in the directory, or how to pass a list of links to the wget command.
Any help will be appreciated.
Thanks
In addition to @Alex's solution below, you may need the -r and -A options to get all files (of a kind), along with -np and -nd.
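Putting those flags together, a minimal sketch might look like the following. The URL, username, password, and the *.csv accept pattern are all hypothetical placeholders; substitute your own values. The command is echoed rather than executed here so you can inspect it first:

```shell
# Hypothetical values -- replace with your real directory URL and credentials
BASE_URL="https://example.com/data/"
USER="myuser"
PASS="mypass"

# -r  : recurse into the directory
# -np : never ascend to the parent directory
# -nd : do not recreate the remote directory tree locally
# -A  : accept only files matching the pattern (here, CSV files)
CMD="wget -r -np -nd -A '*.csv' --user=$USER --password=$PASS $BASE_URL"

# Echoed for inspection; run the wget line directly to actually download
echo "$CMD"
```

Note that --user/--password work for HTTP basic authentication; other login schemes (e.g. form-based logins) need a different approach, such as passing session cookies.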
Thanks a lot Alex.
For getting the list of links, can we have a solution in Linux?
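One common Linux approach (a sketch only, with a hypothetical index snippet, URL, credentials, and file names) is to fetch the directory's index page, extract the href links, and feed the resulting list to wget with -i:

```shell
# Hypothetical snippet of a directory index page; in practice fetch it first,
# e.g.: curl -s -u myuser:mypass https://example.com/data/
INDEX='<a href="file1.csv">file1.csv</a> <a href="file2.csv">file2.csv</a>'

# Pull out the href targets, one per line, into a link list
echo "$INDEX" | grep -oE 'href="[^"]+"' | sed -E 's/^href="|"$//g' > links.txt

cat links.txt
# wget can then consume the list:
#   wget -i links.txt --user=myuser --password=mypass
```

If the index page uses relative links (as above), prepend the directory URL to each line before passing the list to wget, or use wget's --base option.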
Please use ADD COMMENT/ADD REPLY when responding to existing posts, to keep threads logically organized. This comment belongs under @Alex's answer. SUBMIT ANSWER is for new answers to the original question.