What's the best way to save a complete webpage on a linux server?

挽巷 2021-01-31 20:57

I need to archive complete pages, including any linked images etc., on my Linux server. Looking for the best solution. Is there a way to save all the assets and then relink them all so the saved copy works locally?

4 Answers
  •  有刺的猬
    2021-01-31 21:14

    If all the content in the web page were static, you could handle this with something like wget:

    # -r: recurse into linked pages, -l 10: limit recursion depth to 10, -p: fetch page requisites (images, CSS, JS)
    $ wget -r -l 10 -p http://my.web.page.com/
    

    or some variation thereof.
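
    Since the question also asks about relinking the assets, it is worth noting that wget can rewrite links so the saved copy works offline. A minimal sketch, assuming a static site and reusing the placeholder URL from above:

    # --mirror:           recursive download with timestamping, unlimited depth
    # --page-requisites:  also fetch the images, CSS and JS needed to render each page
    # --convert-links:    rewrite links in the saved HTML to point at the local copies
    # --adjust-extension: append .html to pages served without an extension
    # --no-parent:        never ascend above the starting directory
    # --wait=1:           pause between requests to go easy on the server
    $ wget --mirror --page-requisites --convert-links --adjust-extension --no-parent --wait=1 http://my.web.page.com/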

    Since you also have dynamic pages, you cannot in general archive such a web page using wget or any simple HTTP client. A proper archive needs to incorporate the contents of the backend database and any server-side scripts. That means that the only way to do this properly is to copy the backing server-side files. That includes at least the HTTP server document root and any database files.
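
    As a minimal sketch of such a server-side backup, assuming an Apache-style document root at /var/www/html and a MySQL database named mysite (both names are assumptions; adjust them to your setup):

    # Dump the database as SQL text; -p prompts for the password of the (hypothetical) backup user
    $ mysqldump -u backup -p mysite > /var/backups/mysite-$(date +%F).sql
    # Archive the document root together with the fresh database dump
    $ tar czf /var/backups/mysite-$(date +%F).tar.gz /var/www/html /var/backups/mysite-$(date +%F).sql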

    EDIT:

    As a workaround, you could modify your website so that a suitably privileged user can download all the server-side files, as well as a text-mode dump of the backing database (e.g. an SQL dump). Take extreme care not to open any security holes through this archiving mechanism; at a minimum, require authentication, as sketched below.
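
    For example, one way to lock down such a download directory, assuming Apache and an .htaccess file (the paths and names here are illustrative):

    # Require a valid username/password from the htpasswd file before serving anything in this directory
    AuthType Basic
    AuthName "Site backups"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user

    Credentials for it can be created with htpasswd -c /etc/apache2/.htpasswd someuser, and the directory should only be served over HTTPS.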

    If you are using a virtual hosting provider, most of them offer some kind of web interface for backing up the whole site. If you run your own server, there is a wide range of backup solutions you could install, including a few web-based ones aimed at hosted sites.
