Limit on SQLDataReader w/ Blobs


Garth Wells

I use a filtered SELECT to populate a SqlDataReader (rdr) with
a filename and a BLOB (a PDF). I then use File.WriteAllBytes to write
each PDF to disk.

----------------------------------------
rdr = command.ExecuteReader();

while (rdr.Read())
{
    byte[] BinaryImage = (byte[])rdr["attachment_file"];
    File.WriteAllBytes("\\" + rdr["fn"].ToString(), BinaryImage);
}
----------------------------------------

I tested the code on a small resultset and it works as expected. My
client tried it on the live database (I can't get access to that
database), and he thinks it has some built-in upper limit on the number
of rows it can process, because it doesn't seem to write all the files
returned by the SELECT. Instead of adding rows to my test database,
I was hoping someone might be able to provide insight on any upper
limit this approach might have and suggest a workaround.

Thanks
 
Garth said:
I tested the code on a small resultset and it works as expected. [...]
he thinks it has some built-in upper limit on the number of rows it can
process, because it doesn't seem to write all the files returned by the
SELECT.

A SQL Server BLOB (called IMAGE) can be up to 2 GB.

There are lots of things that can prevent you from reading
such a big data item in a single call.

You can try using a loop: read 100 KB or 1 MB at a time via rdr.GetBytes
and write each chunk to the file.
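
In code, that loop might look something like this (a rough sketch, not a
complete program; it assumes ChunkSize is defined, a FileStream fs has
already been opened for the current row, and the BLOB is column 1 as in
your SELECT):

----------------------------------------
long offset = 0, read;
byte[] buffer = new byte[ChunkSize];   // re-usable chunk buffer

// GetBytes returns the number of bytes copied; 0 means the BLOB is done
while ((read = rdr.GetBytes(1, offset, buffer, 0, ChunkSize)) > 0)
{
    fs.Write(buffer, 0, (int)read);    // write only the bytes actually read
    offset += read;                    // advance the position within the BLOB
}
----------------------------------------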

Arne
 
Arne said:
You can try using a loop: read 100 KB or 1 MB at a time via rdr.GetBytes
and write each chunk to the file.

Additional: when handling large BLOBs, it is a good idea to pass
CommandBehavior.SequentialAccess when creating the reader; this lets it
work more efficiently, as it doesn't try to load the entire row into
memory - just the bytes you are reading at that moment. Of course,
with the GetBytes() approach you are forced to use the slightly longer
version of writing, using a FileStream.
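
Something like this (an assumed variant of the original code, not a
complete program; note that with SequentialAccess the columns must also
be read in ordinal order, so the filename column has to come before the
BLOB in the SELECT):

----------------------------------------
using (SqlDataReader rdr = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (rdr.Read())
    {
        string fileName = rdr.GetString(0); // column 0 (fn) first...
        // ...then stream column 1 (attachment_file) via GetBytes, as above
    }
}
----------------------------------------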
Arne said:
called IMAGE

Or varbinary(max) ;-p

Marc
 
Arne said:
A SQL Server BLOB (called IMAGE) can be up to 2 GB.

You can try using a loop: read 100 KB or 1 MB at a time via rdr.GetBytes
and write each chunk to the file.

I reviewed all the file sizes and the largest was 4 MB. There must be
a maximum SqlDataReader size, but a quick Google search did not turn up
the value.
 
It is conceivable that the limit is in the line:
rdr["attachment_file"];

For BLOBs, I would be using GetBytes() successively with increasing
offsets (into a re-usable buffer)

Marc
 
Marc said:
It is conceivable that the limit is in the line:
rdr["attachment_file"];

For BLOBs, I would be using GetBytes() successively with increasing
offsets (into a re-usable buffer)

Can you point me to an example of this approach?

Thanks
 
OK: http://weblogs.asp.net/cazzu/archive/2003/08/27/25568.aspx

The context here is streaming an image from the database to a browser
- however, the BLOB reading is the same.

Personally, I think the "read" step could be neater - it makes the
not-strictly-true assumption that the buffer must be filled on each
call; I'd go with:

long idx = 0, size;
byte[] buffer = new byte[ChunkSize];
while ((size = r.GetBytes(1, idx, buffer, 0, ChunkSize)) > 0)
{
    context.Response.OutputStream.Write(buffer, 0, (int)size);
    idx += size;
}

(and remove the "last bytes" stuff down to the end-brace, since we
have dealt with them above).

"Replace context.Response" with your own stream / writer...

Marc
 
