Garth Wells
I use a filtered SELECT to populate a SqlDataReader (rdr) with
a filename and a blob (a PDF), then use File.WriteAllBytes to write
each PDF to disk.
----------------------------------------
rdr = command.ExecuteReader();
while (rdr.Read())
{
    // Cast the blob column to a byte array and write it to disk.
    byte[] BinaryImage = (byte[])rdr["attachment_file"];
    File.WriteAllBytes("\\" + rdr["fn"].ToString(), BinaryImage);
}
----------------------------------------
I tested the code on a small result set and it works as expected. My
client ran it against the live database (which I can't access), and he
suspects the code has some built-in upper limit on the number of rows
it can process, because not all of the files returned by the SELECT
appear to be written to disk. Rather than bulk-loading rows into my
test database, I was hoping someone could tell me whether this
approach has any upper limit and suggest a workaround.
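For what it's worth, SqlDataReader itself imposes no fixed row limit, so a likely explanation is an exception partway through the loop (a NULL blob fails the byte[] cast, or an invalid filename fails the write), or duplicate "fn" values silently overwriting each other, since File.WriteAllBytes replaces existing files. Below is a diagnostic sketch of the same loop with per-row error handling and a count; the method signature and output-folder parameter are my own additions, not part of the original code:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

static void ExportAttachments(SqlCommand command, string outputDir)
{
    int rows = 0, written = 0;
    using (SqlDataReader rdr = command.ExecuteReader())
    {
        while (rdr.Read())
        {
            rows++;
            try
            {
                // A NULL blob would throw on the cast; skip and log instead.
                if (rdr["attachment_file"] is DBNull)
                {
                    Console.WriteLine($"Row {rows}: NULL blob, skipped.");
                    continue;
                }
                byte[] blob = (byte[])rdr["attachment_file"];

                // Path.Combine replaces the bare "\\" prefix, which resolves
                // to the root of the current drive rather than a chosen folder.
                string path = Path.Combine(outputDir, rdr["fn"].ToString());
                File.WriteAllBytes(path, blob);
                written++;
            }
            catch (Exception ex)
            {
                // Log and continue so one bad row doesn't end the export.
                Console.WriteLine($"Row {rows}: {ex.Message}");
            }
        }
    }
    Console.WriteLine($"{written} of {rows} rows written to disk.");
}
```

Comparing the final row count against the number of distinct files on disk should show whether rows are failing or filenames are colliding.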
Thanks