How to retrieve an entire site via the command line using wget

wget is a small, non-interactive network retriever that is part of the GNU Project.

This simple but powerful tool can easily download a website to a local directory, recursively recreating its directory structure and fetching the HTML pages, images, and other files from the server to your computer:

wget -rkpNl5 https://example.com/

-r – Retrieve recursively
-k – Convert the links in the document to make them suitable for local viewing
-p – Download everything (inlined images, sounds, and referenced stylesheets)
-N – Turn on time-stamping
-l5 – Set the maximum recursion depth to 5
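The combined single-letter flags can be hard to remember, so a sketch of the same command spelled with wget's long options may help. The URL https://example.com/ is a placeholder (substitute the site you actually want to mirror), and the command is printed for review here rather than executed:

```shell
# Long-option equivalent of `wget -rkpNl5`; https://example.com/ is a
# placeholder URL -- replace it with the site you want to mirror.
cmd='wget --recursive --convert-links --page-requisites --timestamping --level=5 https://example.com/'

# Print the command so it can be inspected before running it.
printf '%s\n' "$cmd"
```

Running the printed command downloads the site into a directory named after the host (e.g. example.com/), with subdirectories mirroring the site's structure.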