I want to download a site's HTML, CSS, and JS files.
Website: www.horstdiekgerdes.com
Command: wget --page-requisites -r -l 10 www.horstdiekgerdes.com
I only get four files: about, index.html, robots.txt, favicon.ico.
What's wrong?
1) + check the site's sitemap for the full URL list: www.horstdiekgerdes.com/sitemap.xml
2) + add the "ignore robots" option: -e robots=off (the site's robots.txt may be blocking the crawl)
3) + masquerade as a browser: set a browser-like User-Agent with --user-agent (some servers refuse wget's default one)
4) - wget does not execute JavaScript, so any content the site loads dynamically via JS will not be downloaded
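Putting points 2 and 3 together, a sketch of the combined command (the User-Agent string is an arbitrary example, and --convert-links is an optional extra that rewrites links for offline viewing):

```shell
# Recursive download, ignoring robots.txt and presenting a browser-like User-Agent.
wget --recursive --level=10 \
     --page-requisites \
     --convert-links \
     -e robots=off \
     --user-agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36" \
     www.horstdiekgerdes.com
```

Even with this, pages whose markup is built by JavaScript will come back as empty HTML shells (point 4); for those you would need a tool that actually renders the page, such as a headless browser.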