Alexander Fischer
Hello,
I'm looking for a utility, running on Windows, which can perform this
task: take a given URL (plus possibly some filters/restrictions), crawl
it up to a certain limit (e.g., 3 levels deep), and create a simple text
or HTML file that lists all the files that exist on the site.
E.g., input: http://cnn.com
Output:
http://cnn.com/images/index.htm
http://cnn.com/images/logo1.gif
http://cnn.com/images/logo2.gif
http://cnn.com/images/logo3.gif
http://cnn.com/images/information.zip
etc.
This must be easy, but I'm not finding anything. Can anybody help me?
To clarify: I do NOT want a tool like HTTrack, which will DOWNLOAD all
these files; I only want to see which files exist.
Thank you!
Alex
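
In case no ready-made tool turns up, here is a minimal sketch in Python
(which also runs on Windows) of the task described above. START_URL and
MAX_DEPTH are placeholders taken from the example; existence of non-HTML
files is checked with an HTTP HEAD request, so their contents are never
downloaded, and only HTML pages are fetched to find further links. This
is a rough sketch, not a polished utility.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START_URL = "http://cnn.com"   # placeholder from the example above
MAX_DEPTH = 3                  # e.g., 3 levels deep

class LinkParser(HTMLParser):
    """Collects href/src attribute values from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value.split("#")[0])  # drop fragments

def content_type(url):
    """HEAD request: confirms the URL exists without downloading the body."""
    with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
        return resp.headers.get("Content-Type", "")

def crawl(start_url, max_depth):
    site = urlparse(start_url).netloc
    seen = set()
    queue = [(start_url, 0)]       # breadth-first, (url, depth) pairs
    while queue:
        url, depth = queue.pop(0)
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            ctype = content_type(url)
        except Exception:
            continue               # unreachable or HEAD refused; skip it
        print(url)                 # the URL exists: list it
        if "html" not in ctype:
            continue               # a file, not a page: don't fetch it
        try:
            with urlopen(url, timeout=10) as resp:
                page = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue
        parser = LinkParser()
        parser.feed(page)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == site:   # stay on this site
                queue.append((absolute, depth + 1))

if __name__ == "__main__":
    crawl(START_URL, MAX_DEPTH)

Redirecting the output to a file (python crawler.py > files.txt) would
give the simple text listing asked for; note that some servers reject
HEAD requests, so a real tool would need a fallback.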