Download a working local copy of a webpage [closed]
I would like to download a local copy of a web page and get all of the CSS, images, JavaScript, etc. In previous discussions (e.g. here and here, both of which are more than two years old), two suggestions are generally put forward: `wget -p` and `httrack`. However, both suggestions fail. I would very much appreciate help using either of these tools to accomplish the task; alternatives are also welcome.

**Option 1: `wget -p`**

`wget -p` successfully downloads all of the web page's prerequisites (CSS, images, JS). However, when I load the local copy in a web browser, the page is unable to load
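For context, the variant of `wget` commonly suggested for this adds link conversion, so the saved page points at its locally downloaded prerequisites instead of the originals. A sketch of that invocation (the URL is a placeholder):

```shell
# -p  download page requisites (CSS, images, JS)
# -k  convert links in the saved HTML to point at the local copies
# -E  save HTML/CSS with the proper .html/.css extension
# -H  also fetch requisites hosted on other domains
# -K  keep the original (unconverted) files alongside the converted ones
# https://example.com/ is a placeholder, not a specific target.
wget -E -H -k -K -p https://example.com/
```

Without `-k`, the downloaded HTML still references the resources by their original paths, which is one common reason a plain `wget -p` copy renders incorrectly offline.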