Website downloader

A man

I'm looking for a website downloader so I can view websites offline.
Currently I use Httpget, but when I want to update my offline copy it
downloads the whole thing again, when I just want to download the new
pages. (There is an option to "Update the website", but when I watch
it, it really just downloads the whole thing again. A 40 MB website
over a 56k modem causes a problem.) Is there any downloader where I can
get just changed pages, or just new pages?

Thanks.

--
Freezone Freeware: 1100+ applications
http://chuckr.bravepages.com
http://chuckr.freezone.org
http://freezone.darksoft.co.nz
http://home.att.net/~chuckr30/index.html
 
Livewire

Is there any downloader where I can get just changed pages, or
just new pages?
Try HTTrack, from

http://www.httrack.com

It's very flexible and will allow you to download just what you want.
 
Iain Cheyne


I think this is a problem with all website downloaders.

From the HTTrack FAQ - www.httrack.com/html/faq.html#Q100

"Q: I want to update a mirrored project, but HTTrack is retransfering all
pages. What's going on?
A: First, HTTrack always rescans all local pages to reconstitute the
website structure, and it can take some time. Then, it asks the server if
the files that are stored locally are up-to-date. On most sites, pages
are not updated frequently, and the update process is fast. But some
sites have dynamically-generated pages that are considered as "newer"
than the local ones.. even if they are identical! Unfortunately, there is
no possibility to avoid this problem, which is strongly linked with the
server abilities."
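(As a sketch of what "update" means here in practice: re-running HTTrack against an existing project directory is what triggers that rescan-and-recheck pass. The URL and paths below are placeholders, and `--update` is HTTrack's switch for refreshing a mirror without prompting.)

```shell
# First run: create the mirror under ./mysite
httrack "http://www.example.com/" -O ./mysite

# Later runs: refresh the same project. HTTrack rescans the local
# files, then asks the server only for pages newer than the mirror.
httrack "http://www.example.com/" -O ./mysite --update
```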
 
Jörg Volkmann

Is there any downloader where I can get just changed pages, or
just new pages?

Hello,
Try these:

1.) Httrack http://www.httrack.com
2.) Getleft http://personal1.iddeo.es/andresgarci/getleft/english/

Jörg

JV
 
Tone Marie Berg

Is there any downloader where I can get just changed pages, or
just new pages?

I'd go for wget and its -N flag, myself. It's about as powerful as a
downloader can be, and it gives you a verbose log of what it's doing.
It's not for the command line timid, though -- you'll *have* to go
through the manual in some detail to get the desired results in a
complex case such as this. Taking the time to set it up *will* get you
the *exact* result you want, though, as long as the server behaves.
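A minimal sketch of that setup (flags per the wget manual; www.example.com stands in for the real site):

```shell
# Mirror the site; on later runs -N (timestamping) re-fetches only
# files whose server timestamp is newer than the local copy.
#   -r -l 5      recurse, at most five levels deep
#   -p           also fetch images/CSS needed for offline viewing
#   -k           convert links in the mirror for local browsing
#   -K           back up originals, so -N comparisons still work
#                after -k has rewritten the downloaded files
#   -o wget.log  keep the verbose log in a file
wget -N -r -l 5 -p -k -K -o wget.log http://www.example.com/
```

As noted above, the server has to behave: -N relies on honest Last-Modified headers, so dynamically generated pages may be re-fetched every time anyway.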

A Windows binary of wget is available from
<URL:http://xoomer.virgilio.it/hherold/>. You will probably want
<URL:ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-
complete.zip>. A manual can be found at
<URL:http://www.gnu.org/manual/wget-1.8.1/> if you want to look before
you download. Don't worry, it's not as scary as it looks. :)

I really do recommend giving wget a try if you feel competent to handle
it, and I really recommend getting something simpler with a nice GUI if
you do not.

Tone
 
A man

I really do recommend giving wget a try if you feel competent to handle
it, and I really recommend getting something simpler with a nice GUI if
you do not.

No problem, I grew up with DOS 3.1... ah, those were the days.

Does wget need any other packages besides the ssl binaries?

If I specify http://www.some.com/dir/ will it get the index.html file
as a starting point?

Thanks.

--
Freezone Freeware: 1200+ applications
http://chuckr.bravepages.com
http://chuckr.freezone.org
http://freezone.darksoft.co.nz
http://home.att.net/~chuckr30/index.html
 
