HTTP streaming

Vince Panuccio

Hi,

Is there any way I can stream large documents to a client's web browser
without putting a huge load on the web server's memory?

The only way to access these documents is the HTTP protocol.

A direct link is out of the question as the user name and password
have to be supplied in the URL query string. We have no control
over this.

What we do have working, though, is the ability to download the file onto
the web server and then stream it to the client once the download is
complete, using a little C# code. Not hard.

But I'd like a way to download it in chunks and send each chunk off to
the client as it arrives, to keep the memory footprint on the web server
down. I'm guessing this will help.

Keep in mind, HTTP is the only protocol available to get images from
this other system.

Any ideas?

Thanks in advance,
 
Marc Gravell

HTTP is pretty much designed with this in mind. As long as you disable
buffering, you should be able to run your mid-server (broadly) as a
proxy - i.e. a "handler" (assuming ASP.NET) that connects to the other
system and reads into a short buffer, then pours that buffer into the
local response - looping until you run out of data...
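
A minimal sketch of the kind of handler Marc describes might look something like this - the handler name, content type, remote URL and 4 KB buffer size are illustrative assumptions, not anything from the thread; the key points are BufferOutput = false and a single reused buffer:

using System.IO;
using System.Net;
using System.Web;

// Hypothetical proxy handler: streams the remote document through to the
// client in small chunks without buffering the whole file in memory.
public class DocumentProxyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Disable response buffering so each chunk is flushed to the client
        // instead of being accumulated in the worker process.
        context.Response.BufferOutput = false;
        context.Response.ContentType = "application/msword";

        // Placeholder URL; the real one would carry the required credentials.
        WebRequest request = WebRequest.Create("http://remote-system/document.doc");

        using (WebResponse response = request.GetResponse())
        using (Stream remote = response.GetResponseStream())
        {
            byte[] buffer = new byte[4096];   // one small buffer, reused for every chunk
            int read;
            while ((read = remote.Read(buffer, 0, buffer.Length)) > 0)
            {
                context.Response.OutputStream.Write(buffer, 0, read);
            }
        }
    }

    public bool IsReusable { get { return true; } }
}

Registered as an .ashx handler, something along these lines should keep the web server's memory use at roughly one buffer per concurrent download rather than one whole document.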

Re the username/password, another option here is a valid-once token
(perhaps a GUID) that is verified (perhaps by a module) before
performing the download.
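
A rough sketch of that valid-once-token idea as an HTTP module - the module name, the download-path check and the in-memory token store are all illustrative assumptions, and issuing the tokens (adding them to the store) is assumed to happen elsewhere, e.g. when the page link is generated:

using System;
using System.Collections.Generic;
using System.Web;

// Hypothetical module: lets a download request through only if it carries a
// one-time token that is still outstanding.
public class OneTimeTokenModule : IHttpModule
{
    private static readonly HashSet<Guid> issuedTokens = new HashSet<Guid>();

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext ctx = ((HttpApplication)sender).Context;

            // Illustrative path check for the download endpoint.
            if (!ctx.Request.Path.EndsWith("download.ashx", StringComparison.OrdinalIgnoreCase))
                return;

            Guid token;
            bool valid = Guid.TryParse(ctx.Request.QueryString["token"], out token);
            lock (issuedTokens)
            {
                // Remove() succeeds only the first time the token is presented.
                valid = valid && issuedTokens.Remove(token);
            }

            if (!valid)
            {
                ctx.Response.StatusCode = 403;
                ctx.Response.End();
            }
        };
    }

    public void Dispose() { }
}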

Examples of both approaches all over the 'net.

Marc
 
Anders Borum

If you decide to do a local cache on the midtier webserver, make sure you
look into the Response.TransmitFile() (available from the HttpContext
instance). It's designed to work closely with IIS and is able to transmit
really large files.
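
For reference, a minimal sketch of how TransmitFile() is typically called, assuming the document has already been copied to a local path on the web server (the handler name, path, file name and content type are placeholders):

using System.Web;

// Hypothetical handler: serves a file that already exists on local disk.
// TransmitFile hands the file to IIS, which streams it from disk without
// loading it into the worker process's memory.
public class CachedFileHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/msword";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=report.doc");
        context.Response.TransmitFile(@"C:\DocumentCache\report.doc");
    }

    public bool IsReusable { get { return true; } }
}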
 
Vince Panuccio

Hi Marc,

I have already done what you mentioned. Here is some of my code:

byte[] httpData = new byte[4096];
int numBytesRead = 0;

// Grab a chunk of the remote document over HTTP.
WebRequest request = WebRequest.Create(@"http://www.c-sharpcorner.com/UploadFile/mahesh/WebRequestNResponseMDB12012005232323PM/WebRequestNResponseMDB.aspx");
WebResponse response = request.GetResponse();
Stream responseStream = response.GetResponseStream();

numBytesRead = responseStream.Read(httpData, 0, 4096);

this.Response.Clear();
this.Response.ContentType = "application/msword";

// Pass the chunk straight on to the client.
this.Response.OutputStream.Write(httpData, 0, numBytesRead);


I thought this would act as a proxy, but what is happening is that the
ASP.NET worker process's memory consumption is going through the roof and
I'm not sure why. It should just grab the data and pass it on in small
chunks, but that's not happening.

Anders,
I'm not sure the TransmitFile() function would work if the file
exists on another server. I would have to copy it locally first,
transmit it, and then delete the file. We may have up to 200 users on the
system at any one time, which would hammer the server's hard drive! :)

The code above is placed inside a loop; I'm just illustrating what I'm
doing.
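
For what it's worth, here is a sketch of what the complete loop might look like with response buffering disabled and a single reused buffer - the page class, event handler, URL and content type are illustrative only, not taken from the thread:

using System;
using System.IO;
using System.Net;
using System.Web.UI;

public partial class DownloadPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Clear();
        Response.BufferOutput = false;                 // flush each chunk instead of accumulating the whole response
        Response.ContentType = "application/msword";

        WebRequest request = WebRequest.Create("http://remote-system/document.doc");
        using (WebResponse response = request.GetResponse())
        using (Stream remote = response.GetResponseStream())   // raw bytes - no StreamReader needed
        {
            byte[] httpData = new byte[4096];          // allocated once, outside the loop
            int numBytesRead;
            while ((numBytesRead = remote.Read(httpData, 0, httpData.Length)) > 0)
            {
                // Write only the bytes actually read, straight to the binary output stream.
                Response.OutputStream.Write(httpData, 0, numBytesRead);
            }
        }

        Response.End();
    }
}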
 
Marc Gravell

Can you post more of the loop? I suspect the devil may be in the
detail... for instance, are you allocating lots of buffers...

Marc
 
