How to download a complete website

If you ever need to download an entire website, wget can do the job:

wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --adjust-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains example.com \
     --no-parent http://example.com/thefolder/

The options are:

--recursive download the entire website.

--domains example.com don't follow links outside example.com.

--no-parent don't follow links outside the directory thefolder/.

--page-requisites get all the elements that compose the page (images, CSS and so on).

--adjust-extension save files with the .html extension.

--convert-links convert links so that they work locally, off-line.

--restrict-file-names=windows modify filenames so that they will work in Windows as well.

--no-clobber don't overwrite any existing files (used in case the download is interrupted and resumed).

Also note that in wget 1.12, --html-extension was renamed to --adjust-extension.
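Because the flag name depends on the installed wget version, a script meant to run on different machines may want to pick the right one automatically. A minimal sketch, where pick_ext_flag is a hypothetical helper written for this example, not part of wget:

```shell
# Choose the extension flag based on a wget version string.
# pick_ext_flag is a hypothetical helper, not a wget feature.
pick_ext_flag() {
  ver=$1
  major=${ver%%.*}   # text before the first dot
  rest=${ver#*.}     # text after the first dot
  minor=${rest%%.*}  # second component only (handles e.g. 1.21.4)
  # wget 1.12 and later use --adjust-extension
  if [ "$major" -gt 1 ] || { [ "$major" -eq 1 ] && [ "$minor" -ge 12 ]; }; then
    echo --adjust-extension
  else
    echo --html-extension
  fi
}

# The installed version could be extracted from the first line of
# `wget --version`, for example:
#   ver=$(wget --version | sed -n '1s/.*Wget \([0-9.]*\).*/\1/p')
pick_ext_flag 1.11    # prints --html-extension
pick_ext_flag 1.21.4  # prints --adjust-extension
```

The chosen flag can then be substituted into the mirror command above in place of --adjust-extension.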

The original resource: http://www.linuxjournal.com/...