I often run wget -m a_website.com if I think a site may not be online for much longer... How can I copy just the latest updates? If I cd into the directory and run wget -m a_website.com again, it copies the whole thing again.
Recommendations?
Thanks.
I think wget's -m option already does what you want: it only re-downloads files whose timestamps have changed on the server (if it fetches everything again, the server is probably reporting changed timestamps).
You could also try the options -r with -nc -nv, but without -m and -N:
wget -r -nc -nv site
You may also try 'httrack --update'.
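A minimal sketch of the update workflow, assuming the mirror was originally created in ~/mirrors (that path and the hostname are only placeholders):

# run from the same working directory as the original mirror, so wget finds
# the existing host-named subdirectory and can compare file timestamps;
# re-running from inside that subdirectory starts a fresh copy instead
cd ~/mirrors
wget -m a_website.com          # -m implies -N: only newer files are fetched

# alternative: skip any file that already exists locally, changed or not
wget -r -nc -nv a_website.com

# or, if the site was mirrored with httrack, update that project in place
cd ~/mirrors/a_website.com && httrack --update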
Your title says you are looking for diffs, which would not be possible without transferring the full content of one of the files. But if you just mean downloading the pages/files that have actually changed, then yes, '-m' is the way. Note that this depends on the timestamps the server provides for the files - it is entirely up to the server when/if these are updated, and dynamic content (e.g. php pages) will always be considered new.
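A quick way to check whether the server actually provides usable timestamps is to inspect the response headers for a specific file (curl is used here only as an example; the URL is a placeholder):

# fetch headers only; a Last-Modified line means -m/-N can compare dates
curl -I http://a_website.com/some/page.html

# dynamically generated pages usually omit Last-Modified or change it on
# every request, so wget re-downloads them every time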
"UNIX is simple and coherent" - Dennis Ritchie; "GNU's Not Unix" - Richard Stallman