Fastest way to work with DataTable


Pascal Berger

What's the fastest way to accomplish the following tasks:

- Copy all data from one DataTable to another. Currently I work with
CreateDataReader() / Load(). Is there any faster approach?
- Add a lot of rows (>10'000) to a DataTable (one by one). Currently I
use NewRow() / Rows.Add(). It's painfully slow. I know I can disable
indexing during the import. Any other possible optimizations?
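
(For reference, one commonly suggested pattern for bulk-adding rows is to
wrap the loop in BeginLoadData()/EndLoadData() and call LoadDataRow(),
which suspends per-row notifications, index maintenance and constraint
checking. A minimal sketch -- the table and column names are made up for
illustration:)

```csharp
using System;
using System.Data;

class BulkAddDemo
{
    static void Main()
    {
        var table = new DataTable("Items");
        table.Columns.Add("Id", typeof(Guid));
        table.Columns.Add("Name", typeof(string));

        // BeginLoadData suspends notifications, index maintenance and
        // constraint checking while rows are bulk-loaded.
        table.BeginLoadData();
        for (int i = 0; i < 10000; i++)
        {
            // LoadDataRow avoids the separate NewRow()/Rows.Add() pair per row.
            table.LoadDataRow(new object[] { Guid.NewGuid(), "row " + i },
                              LoadOption.Upsert);
        }
        table.EndLoadData();

        Console.WriteLine(table.Rows.Count); // 10000
    }
}
```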

Thanks in advance
Pascal
 

Pascal Berger

Kevin said:
DataTable.Copy()
Thanks for your answer. But DataTable.Copy() creates a new DataTable
instance. This doesn't work in my case, since I already have several
references to the existing instance, which would otherwise become invalid.
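
(One way to work around this, sketched below, is to refresh the existing
instance in place: Clear() it, then Load() from the source table's reader,
so existing references to the destination table stay valid. The table
setup here is made up for illustration:)

```csharp
using System;
using System.Data;

class CopyIntoExisting
{
    static void Main()
    {
        var source = new DataTable("Source");
        source.Columns.Add("Id", typeof(int));
        source.Rows.Add(1);
        source.Rows.Add(2);

        // dest stands in for the instance other code already references.
        var dest = source.Clone();            // same schema, no rows

        dest.Clear();                         // drop any existing rows
        dest.Load(source.CreateDataReader()); // refill the same instance

        Console.WriteLine(dest.Rows.Count);   // 2
    }
}
```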
 

Kevin Spencer

You *did* say "to another" didn't you? I assumed that meant "to another
DataTable instance". What am I misunderstanding?

--
HTH,

Kevin Spencer
Microsoft MVP
Software Composer
http://unclechutney.blogspot.com

If the Truth hurts, wear it.
 

Pascal Berger

Kevin said:
You *did* say "to another" didn't you? I assumed that meant "to another
DataTable instance". What am I misunderstanding?
Sorry about the confusion. I already have two DataTable instances (both
with the same structure).
I now need to copy all rows from one DataTable to the other.

Thanks!
pascal
 

Pascal Berger

Kevin said:
Just taking a blind stab now. How about DataTable.Merge or
DataTable.ImportRow?
Currently I use the following:
destDataTable.Load(sourceDataTable.CreateDataReader());

This seems to be as fast as destDataTable.Merge(sourceDataTable). But in
both cases it takes about 10 seconds to copy 7000 rows.

For DataTable.ImportRow I need to loop through the whole source table,
right? I've tried this as well. Copying 7000 rows took about 90 seconds
this way...
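
(For reference, the ImportRow loop in question looks roughly like this;
wrapping it in BeginLoadData()/EndLoadData() may recover some of the
per-row overhead, though it is unlikely to match the DataReader approach.
The table setup is made up for illustration:)

```csharp
using System;
using System.Data;

class ImportRowDemo
{
    static void Main()
    {
        var source = new DataTable("Source");
        source.Columns.Add("Id", typeof(int));
        for (int i = 0; i < 7000; i++)
            source.Rows.Add(i);

        var dest = source.Clone();

        // Suspending notifications and constraint checks can take some
        // of the sting out of the per-row ImportRow cost.
        dest.BeginLoadData();
        foreach (DataRow row in source.Rows)
            dest.ImportRow(row); // preserves each row's RowState and values
        dest.EndLoadData();

        Console.WriteLine(dest.Rows.Count); // 7000
    }
}
```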

Thanks
Pascal
 

Kevin Spencer

Hi Pascal,

If you're using a DataReader to do the copying, you're getting the maximum
performance you can from the ADO.NET objects. With 7000 rows of data to
copy, it's going to take some time, particularly if the size of the
individual records is at all large.

 

chanmm

Please bear in mind that what your CreateDataReader is reading at the
back end might be affecting your performance as well.

chanmm
 

Pascal Berger

Kevin said:
If you're using a DataReader to do the copying, you're getting the maximum
performance you can from the ADO.Net objects. With 7000 rows of data to
copy, it's going to take some time, particularly if the size of the
individual records is at all large.
The size of the records is really small: just two GUID fields and one
string field. No constraints or primary keys either.
I'm coming from the Delphi world. There you have a Data property which
contains the binary data of a DataTable, and you can achieve great
performance by just copying this property (10'000 records in ~1 second).
Does something similar exist in ADO.NET?
I also use the DataTables purely as in-memory tables (no binding to a
database). Are there any faster components for this task? Any options to
set? Would performance be much better if I rewrote everything using
ArrayLists or Collections?

Thank you!
Pascal
 
