Brent
I'm hoping to grab some 70,000 text files over HTTP using the code
below. For the most part, the files are a few KB in size and download
quickly. The code meets its Waterloo, however, on a rare behemoth of
3 MB. I bumped the Timeout property up to 10 minutes (or at least I
think I did) in hopes it wouldn't fail, but it still fails.
I'd appreciate any tips to fix the problem!
--Brent
==================================
public string get13f(string strURL)
{
    try
    {
        HttpWebRequest oRequest = (HttpWebRequest)WebRequest.Create(strURL);
        oRequest.Timeout = 10 * 60000;          // 10 minutes, in milliseconds; covers GetResponse()
        oRequest.ReadWriteTimeout = 10 * 60000; // separately covers reads from the response stream
        oRequest.UserAgent = "Web Client";

        // using blocks guarantee the response and streams are closed even
        // if ReadToEnd throws, so the connection goes back to the pool
        using (HttpWebResponse oResponse = (HttpWebResponse)oRequest.GetResponse())
        using (Stream myStream = oResponse.GetResponseStream())
        using (StreamReader sr = new StreamReader(myStream))
        {
            return sr.ReadToEnd();
        }
    }
    catch
    {
        return "0";
    }
}
==================================
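Two things worth checking on HttpWebRequest: Timeout only covers the call to GetResponse(), while time spent reading the response stream is governed by the separate ReadWriteTimeout property; and with 70,000 requests, every response must be disposed, or the default limit of two connections per host will leave later requests stalled until they time out. If all you need is the body as a string, WebClient handles the stream and cleanup for you. A minimal sketch under those assumptions (note WebClient exposes no Timeout property of its own, so the default applies; the error handling mirrors the original's return of "0"):

```csharp
using System;
using System.Net;

public static class Downloader
{
    public static string Get13fSimple(string strURL)
    {
        try
        {
            // using guarantees the WebClient and its connection are released
            using (WebClient client = new WebClient())
            {
                client.Headers[HttpRequestHeader.UserAgent] = "Web Client";
                return client.DownloadString(strURL);
            }
        }
        catch (WebException)
        {
            return "0"; // same failure sentinel as the original routine
        }
    }
}
```

If you need the 10-minute timeout with WebClient, you would have to subclass it and override GetWebRequest to set Timeout and ReadWriteTimeout on the request it creates.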