Hello,
1) What do you use for downloading websites?
The problem is not just saving a single page; I'm looking for something that can "scan" a website and automatically save it to a defined depth. I mean, I want to save e.g. www.xyz.com and all the articles in that directory, along with all the internal links. Is there anything like that? (Because I have found nothing.)
2) What do you use to print to PDF?
I'm using KDE with its KOffice, and the built-in PDF export is very poor. The CUPS PDF module isn't much better. OpenOffice exports great, but I don't want to install OOo only for exporting PDFs. Is there any alternative to OOo for printing PDFs?
1) wget... I've never actually tried this... man wget has a lot of information... the examples section has some examples of what you may want to do...
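For example, something like this should mirror a site a few levels deep (just a sketch from the man page, untested; the URL and the depth are only placeholders based on the first post):
wget --recursive --level=3 --page-requisites --convert-links --no-parent http://www.xyz.com/
--level limits how deep it follows links, --convert-links rewrites them so the local copy browses offline, and --no-parent keeps it from wandering above the starting directory.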
syd wrote:Here in NZ we cant spell words with more than 5 letters. So color will have to do.
You must be very special then because "letters" has 7
Try httrack.
It's a web site downloader, available for Arch.
Just install via pacman:
pacman -Sy httrack
Further info can be found at www.httrack.com.
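If it helps, a basic invocation might look like this (untested sketch; the URL and output directory are only placeholders):
httrack "http://www.xyz.com/" -O ./xyz-mirror -r3
-O sets the output directory and -r limits the mirror depth.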
hello,
I found krawlsite in the AUR, which is a good option for GUI/KDE users.
Httrack is great /simple, does what it has to do, ... /
Thanks for the quick replies!
And what about my second /offtopic/ question? Is there any solution?
For exporting to PDF, you could first print to a file (PostScript format),
then convert the PostScript file to PDF with:
ps2pdf name_of_ps_file
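For example, if KDE's print dialog saved the job as document.ps (a hypothetical filename), then:
ps2pdf document.ps document.pdf
leaves document.pdf in the same directory; if you omit the second argument, ps2pdf names the output after the input file.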
Last edited by JAwuku (2007-06-07 02:30:11)