Wget is a command-line tool that lets you download individual web
pages or an entire web site over HTTP or FTP. The tool copes well
with slow and unreliable connections and can run in the background,
making it an ideal tool for backing up your entire web site.
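To run a download in the background, add the -b option; wget then
detaches from the Terminal and writes its progress to a log file
(wget-log by default):

    wget -b http://www.beforever.com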
To start a download, all you have to do is type wget followed by
the URL in the Terminal. For example, the command wget
http://www.beforever.com downloads the front page of the site.
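Run on its own, the command saves the front page to the current
directory as a file called index.html:

    wget http://www.beforever.com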
If you use the recursive option (wget -r http://www.beforever.com),
you can download all of the web pages and image files on the site.
Note that some sites are very large, and downloading a complete
site can consume all of your free disk space.
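To keep a recursive download from running away with your disk and
bandwidth, you can cap the recursion depth with -l and throttle the
transfer speed with --limit-rate; the values below are just
illustrative:

    wget -r -l 2 --limit-rate=200k http://www.beforever.com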
Wget creates a subdirectory named after the host in the URL you are
downloading from, so if you download from several sites, wget drops
the files from each site into a separate directory to avoid
confusion.
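For example, after recursively downloading two sites (the second
site here is just a hypothetical illustration), you end up with one
directory per host:

    wget -r http://www.beforever.com
    wget -r http://www.example.com
    ls
    www.beforever.com    www.example.com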
Wget includes many other options, such as supplying a username and
password for accessing protected sites, limiting the depth of a
recursive retrieval, and more.
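For example, to fetch a password-protected page you can pass
credentials with --user and --password; NAME, PASS, and the
/private/ path below are placeholders you would replace with your
own values:

    wget --user=NAME --password=PASS http://www.beforever.com/private/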