I have a website, and I want to download all of the pages/links within it. Essentially, I want to run a wget -r on its URL.
None of the links go "outside" of this specific directory, so I'm not worried about downloading the entire internet.
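For reference, this is roughly the command I have in mind (the URL is just a placeholder, and I've added --no-parent since nothing links above the directory):

# recursive fetch, staying at or below this directory
wget -r --no-parent http://example.com/somedir/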
As it turns out, the pages I want are behind the password-protected section of the site. While I could use wget to do that cookie negotiation manually, it would be a lot easier to just log in from the browser and then use some Firefox plugin to recursively download everything.
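(For the record, I believe the wget version would look something like the two commands below, where the login URL and form field names are just my guesses at what the site expects:

# submit the login form and save the session cookie
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'user=USER&pass=PASS' \
     http://example.com/login.php

# reuse the saved cookie for the recursive crawl
wget --load-cookies cookies.txt -r --no-parent http://example.com/members/

...but figuring out the actual form fields by hand is exactly the step I'd like to skip.)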
Is there an extension or something that will let me do this? Many extensions focus on grabbing media/pictures from the page (heh, heh), but I'm interested in all of the content: HTML and everything.
Suggestions?
Thanks!
Edit
DownThemAll seems like a cool suggestion. Can it do a recursive download? As in, download all of the links on the page, then download all of the links contained within each of those pages, and so on, so that I basically mirror the entire directory tree by following links? Kinda like the -r option of wget?