No_Excuses
All,
I am interested in reading the text of a web page and parsing it.
After searching this newsgroup I decided to use the following:
******************************* START OF CODE ************************
String sTemp = "http://cgi3.igl.net/cgi-bin/ladder/teamsql/team_view.cgi?ladd=teamknights&num=238&showall=1";
WebRequest myWebRequest = WebRequest.Create(sTemp);
WebResponse myWebResponse = myWebRequest.GetResponse();
Stream myStream = myWebResponse.GetResponseStream();
// default encoding is utf-8
StreamReader SR = new StreamReader( myStream );
Char[] buffer = new Char[2048];
// Read up to 2000 characters at a time.
int count = SR.Read( buffer, 0, 2000 );
//while (count > 0)
//{
// do some processing - may read all or part
// count = SR.Read(buffer, 0, 2000);
//}
SR.Close(); // Release the resources
myWebResponse.Close();
******************************* END OF CODE ************************
This code should look very familiar because it is all over the
newsgroup and Microsoft support help pages.
The web page has a big table on it and it takes a while to download
(even with a cable modem).
Here is what I observe. If I open the page and read all of the data
(i.e. until count > 0 fails), then stepping over SR.Close() in the
debugger is immediate. If I read only 2000 characters, as the example
above shows, stepping over SR.Close() takes a long time (around 10-15
seconds for me). This may be a coincidence, but it seems to take about
the same amount of time as reading all of the data. I am starting to
believe that SR.Close() does not abort the download; instead it blocks
until the entire web page has been received. That is not what I want:
I parse the data as it arrives and would like to terminate loading
early, because the full download is slow and usually unnecessary.
Does anyone know how to terminate the loading of the page so I can
eliminate the delay? I had implemented this in C++ with MFC using
CInternetSession.OpenURL() and did not have this problem.
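One thing I am considering trying (untested; I am assuming that
HttpWebRequest.Abort() actually drops the connection rather than
draining the remaining data the way SR.Close() appears to) is to
keep a reference to the request cast as HttpWebRequest and abort it
before closing the reader:

```csharp
using System;
using System.IO;
using System.Net;

class PartialDownload
{
    static void Main()
    {
        string url = "http://cgi3.igl.net/cgi-bin/ladder/teamsql/team_view.cgi?ladd=teamknights&num=238&showall=1";

        // Cast to HttpWebRequest so Abort() is available later.
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        WebResponse response = request.GetResponse();
        StreamReader reader = new StreamReader(response.GetResponseStream());

        char[] buffer = new char[2048];
        int count = reader.Read(buffer, 0, 2000);
        Console.WriteLine("Read {0} characters", count);

        // Abort the request before closing the reader; the hope is that
        // this tears down the connection instead of blocking while the
        // rest of the page is received.
        request.Abort();
        reader.Close();
        response.Close();
    }
}
```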
Thanks in advance.
Todd