If you have Python installed (
http://www.python.org), the following script
will do what you want.
-----begin-----
import urllib

# Read the list of urls, one per line
urlfile = open('c:\\temp\\url.txt', 'r')
for line in urlfile:
    try:
        # Build an output filename from the url by replacing slashes with dashes
        outfilename = line.replace('/', '-')
        # Strip the trailing newline, drop the 7-character "http://" prefix,
        # and save the page into c:\temp as a .txt file
        urllib.urlretrieve(line.strip('\n'),
                           'c:\\temp\\' + outfilename.strip('\n')[7:] + '.txt')
    except:
        # Skip any url that cannot be retrieved
        pass
-----end-----
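To run it, save the script to a file (the name does not matter; "geturls.py" is only an example) and launch it from a command prompt with "python geturls.py". The script uses the urllib module as it ships with Python 2; if you happen to be on Python 3, the same function is available as urllib.request.urlretrieve, and the rest of the logic is unchanged.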
Note that the list of URLs must be a plain text file with one URL per line, and each URL must begin with
"http://" (without the quotes). I would suggest that you make a folder
called "temp" on your C: drive and name your list of URLs "url.txt"; then you
can use the script unaltered (a short example of what the file and the saved pages
look like follows below). If you wish to use this and need help, post back here.
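For example (the addresses below are only placeholders to show the naming scheme), if url.txt contains:
-----begin-----
http://www.python.org
http://www.python.org/index.html
-----end-----
the script will save the pages as c:\temp\www.python.org.txt and
c:\temp\www.python.org-index.html.txt, i.e. the "http://" prefix is dropped and any
remaining slashes are replaced by dashes.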
Louis
Shani said:
I need a program that can take a list of URLs and save their source
page (HTML code) on my computer as a text file. Does anyone know of
any such program.