Catching web exception

  • Thread starter: Andrés G. Aragoneses

Andrés G. Aragoneses

Hello.

If I just want to check whether a specific URL returns an error (404, 403,
500...) or not, I do this:

HttpWebResponse oResponse = null;
try
{
    oResponse = (HttpWebResponse)oRequest.GetResponse();
}
catch (WebException oException)
{
    // an error status (404, 403, 500...) ends up here
}
But examining the net traffic, I have found that if the URL doesn't
return an error and corresponds to a big file (4 MB, for instance), the
program is retrieving around 1 MB of data. I don't want my program to
generate this traffic; I only want to check whether the URL is OK. How can
I improve this?

Thanks in advance.

Andrew [ knocte ]

try:

oRequest.Method = "HEAD";

this will make sure the server does not send you the content of the page.

hth,
Baileys
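A rough sketch of Baileys's suggestion, wrapped into a helper (the `UrlIsOk` name and the status-code handling are my own assumptions, not from the thread):

```csharp
using System;
using System.Net;

class UrlChecker
{
    // Checks whether a URL responds successfully without downloading
    // the body, by issuing a HEAD request instead of a GET.
    public static bool UrlIsOk(string url)
    {
        try
        {
            HttpWebRequest oRequest = (HttpWebRequest)WebRequest.Create(url);
            oRequest.Method = "HEAD"; // ask the server for headers only

            using (HttpWebResponse oResponse = (HttpWebResponse)oRequest.GetResponse())
            {
                return (int)oResponse.StatusCode < 400;
            }
        }
        catch (WebException)
        {
            // 404, 403, 500 etc. (and connection failures) land here
            return false;
        }
    }
}
```

Since no body is transferred, this avoids the ~1 MB of traffic the original GET was generating.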
 
Thus wrote Baileys,
try:

oRequest.Method = "HEAD";

this will make sure the server does not send you the content of the
page.

HEAD will do the trick if the original requests are all GETs. If they are POSTs,
HEAD likely won't work, as HEAD is technically just a special GET.

Cheers,
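For the POST case, one option is to send the POST as usual but dispose of the response as soon as the status code is read, without ever touching the body stream. A sketch under that assumption (`PostIsOk` is a hypothetical helper; note the HTTP stack may still buffer part of the response internally):

```csharp
using System;
using System.IO;
using System.Net;

class PostChecker
{
    // Sends a POST and reports success/failure from the status line alone,
    // disposing the response without reading its body.
    public static bool PostIsOk(string url, byte[] body)
    {
        try
        {
            HttpWebRequest oRequest = (HttpWebRequest)WebRequest.Create(url);
            oRequest.Method = "POST";
            oRequest.ContentType = "application/x-www-form-urlencoded";
            oRequest.ContentLength = body.Length;

            using (Stream s = oRequest.GetRequestStream())
            {
                s.Write(body, 0, body.Length);
            }

            // the status code is known once the headers arrive;
            // dispose immediately so the body is not read
            using (HttpWebResponse oResponse = (HttpWebResponse)oRequest.GetResponse())
            {
                return (int)oResponse.StatusCode < 400;
            }
        }
        catch (WebException)
        {
            // error statuses and connection failures land here
            return false;
        }
    }
}
```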
 
Joerg Jooss wrote:
Thus wrote Baileys,


HEAD will do the trick if the original requests are all GETs. If they are
POSTs, HEAD likely won't work, as HEAD is technically just a special GET.

Cheers,

Thanks very much to both of you. Currently my needs fit well with only
GET :)

Regards,

Andrew [ knocte ]

