Download/Get/Scrape/Crawl all static files from a website using Wget

We will use the website as a good example of this usage: it only contains static files, so everything can be hosted as-is, but it needs multiple recursive crawls to download the pages in all languages.

This is the base technique used to host the mirror, plus some sed/grep commands to remove certain unwanted parts from all translated pages (such as the Tweet button).

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --domains example.com --no-parent https://example.com/

Replace example.com and https://example.com/ with the domain and start URL of the target site. --recursive follows links, --no-clobber skips files that were already downloaded, --page-requisites also fetches the CSS/JS/images needed to render each page, --html-extension saves pages with a .html suffix, --convert-links rewrites links so the copy browses locally, --domains restricts the crawl to the listed domains, and --no-parent never ascends above the start directory.
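The cleanup pass mentioned above can be sketched with find and sed. This is only a sketch: the example.com directory name and the twitter-share-button marker are assumptions standing in for the real downloaded tree and the real Tweet-button markup, which you should locate first with grep.

```shell
# Sketch of the post-download cleanup pass. The directory name and the
# "twitter-share-button" marker are assumptions; inspect the real pages
# first (e.g. grep -r twitter example.com) to find the actual markup.

# Create a tiny sample page standing in for a downloaded file.
mkdir -p example.com
cat > example.com/index.html <<'EOF'
<p>Page content to keep</p>
<a href="https://twitter.com/share" class="twitter-share-button">Tweet</a>
EOF

# Delete every line containing the unwanted marker, in place,
# across all downloaded HTML files (GNU sed syntax for -i).
find example.com -name '*.html' -exec sed -i '/twitter-share-button/d' {} +
```

Deleting whole lines with sed only works when the unwanted widget sits on its own line; for markup spread over several lines, a range expression such as /start/,/end/d (or an HTML-aware tool) is needed instead.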
docu/csheet/sysadm/script/bash/wget_all_website.txt · Last modified: 2021/01/13 23:47 by admin