Large Data Sets using DataReader

Charlie

I have a stored procedure that returns the first n records of a select
statement. I use a DataReader to retrieve the records and write them to
a file. When the SP returns only 10,000 records, it processes them at a
steady rate of 1,000 records every 2 seconds. When I get 50,000 records
back, the rate drops to 1,000 records every 18 seconds. What puzzles me
is that the first 1,000 records are the same no matter how many we get
back in the complete set. However, even though they're the same, they
take 9 times longer simply because there are a total of 50,000 rather
than 10,000. I thought a DataReader, since it deals with one row at a
time, would process at a constant rate regardless of the total number
of records retrieved. I've narrowed the growing time consumption to the
.GetString() method. It is NOT in the .Read() method!

Can anyone explain this to me or tell me what I'm doing wrong?

Thanks in advance.

Charlie
 

William (Bill) Vaughn

What are you doing wrong? First, you're using a query interface to do
bulk operations; I would use BCP or DTS to handle bulk data. As for your
problem, I expect the GC is kicking in as the strings are destroyed.
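One way to test that theory is to cut the per-row string allocations.
A sketch, assuming column 0 is a varchar no wider than the buffer (it
reuses the cmd and writer from the sketch quoted above):

// Read the column into a reusable char buffer instead of allocating
// a new string for every row.
char[] buffer = new char[8000];
using (SqlDataReader reader =
    cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader.Read())
    {
        long len = reader.GetChars(0, 0, buffer, 0, buffer.Length);
        writer.Write(buffer, 0, (int)len);
        writer.WriteLine();
    }
}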

--
____________________________________
Bill Vaughn
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
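For the BCP route Bill mentions, bcp.exe can export the results of a
query straight to a file from the command line. A sketch; the server,
database, table, column, and row count below are placeholders:

bcp "SELECT TOP 50000 SomeColumn FROM MyDb.dbo.MyTable" queryout output.txt -c -S MYSERVER -T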
 
