Can't seem to speed up Rows.Add in DataTable


Bryan

Hello,
I am trying to speed up the process of updating a DataTable with new rows.

First of all, I have two DataTables, ServerTable and ClientTable. ServerTable
comes from a remote data source and contains new and updated records.
ClientTable exists on the client and needs the new/updated records from
ServerTable. So I have created a loop that runs through all the records in
ServerTable and, for each record, looks up the primary key in ClientTable to
get the matching DataRow, like below:

object[] key = { drServer["UPC"].ToString(), drServer["ItemNum"].ToString() };

DataRow drClient = DataClass.dsGrids.Tables["ClientTable"].Rows.Find(key);

If it finds a DataRow, it changes certain fields in drClient like this:

drClient["field1"] = drServer["field1"];

drClient["field2"] = drServer["field2"];

If there is no match (drClient is null), I do a Rows.Add on the ClientTable.

This is where it slows down considerably: it takes more than 3 minutes just
to add 15,000 records to the ClientTable. Then, after the loop, applying the
update to the database (I am even using the new 2.0 batch update) takes
another 3 or more minutes, although it did seem to run a little quicker once
I switched to the batch update. Please see the full example code below:

foreach (DataRow drServer in dtServer.Rows) // iterate through the new records; loop takes more than 3 minutes for 15,000+ records
{
    object[] key = { drServer["UPC"].ToString(), drServer["ItemNum"].ToString() };

    DataRow drClient = DataClass.dsGrids.Tables["ClientTable"].Rows.Find(key);

    if (drClient != null) // Update row
    {
        drClient["field1"] = drServer["field1"];
        drClient["field2"] = drServer["field2"];
        // ... 20+ more fields to update ...
    }
    else // Add row
    {
        drClient = DataClass.dsGrids.Tables["ClientTable"].NewRow();
        drClient["field1"] = drServer["field1"];
        drClient["field2"] = drServer["field2"];
        // ... 44 more fields to add ...

        // was Rows.Add(drN) in the original post; drN is undefined,
        // the row being built here is drClient
        DataClass.dsGrids.Tables["ClientTable"].Rows.Add(drClient);
    }
}


DataClass.daClientTable.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;
DataClass.daClientTable.InsertCommand.UpdatedRowSource = UpdateRowSource.None;

DataClass.daClientTable.UpdateBatchSize = 20;

// Update takes more than 3 minutes for 15,000+ records
// (the original post called daAllSigns.Update here, presumably a paste slip
// for the daClientTable adapter configured above)
int updated = DataClass.daClientTable.Update(DataClass.dsGrids.Tables["ClientTable"]);



Does anyone know what I am doing wrong? Surely it can be quicker?

Any help would be great!

Thanks, Bryan
 

Kevin Yu [MSFT]

Hi Bryan,

I have a question regarding this issue.

Why do you need two tables to store the data? Are you getting it from some
other source, or did you design two tables to increase performance? IMO,
this might be the bottleneck for updating the database, because comparing
the two tables with the Find method can take a lot of time. If you get a
table from the DataAdapter directly, you can modify/insert directly on that
table. The Update method automatically looks for changes and will update
only the changed rows. So please call Update directly on the ServerTable.
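A minimal sketch of the approach Kevin describes: fill a single table from the DataAdapter, edit it in place, and let Update send only the changed rows. The connection string, SELECT statement, and names below are placeholders for illustration, not Bryan's actual code:

```csharp
using System.Data;
using System.Data.SqlClient;

// Placeholder connection string -- substitute your own.
string connStr = "...";
DataTable serverTable = new DataTable("ServerTable");

using (SqlDataAdapter da = new SqlDataAdapter("SELECT * FROM Items", connStr))
using (SqlCommandBuilder cb = new SqlCommandBuilder(da)) // generates INSERT/UPDATE/DELETE commands
{
    da.Fill(serverTable);   // rows arrive with RowState = Unchanged

    // ... modify existing rows / add new rows directly on serverTable ...

    // Update inspects each row's RowState and sends only the
    // Added/Modified/Deleted rows to the database.
    da.Update(serverTable);
}
```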

Kevin Yu
Microsoft Online Community Support

============================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
============================================================

(This posting is provided "AS IS", with no warranties, and confers no
rights.)
 

Bryan

It is from a remote source as XML that I read into a datatable
(ServerTable). Any more ideas?
 

Cor Ligthert [MVP]

Bryan,

I would have asked the same question as Kevin, and your answer does not, at
least for me, really answer it.

But beside that: it is not workable to loop the larger table and do a Find
in the smaller one. (I assume your ServerTable is the larger, since nobody
keys 15,000 records into a client in one day.) Looping the smaller table and
then doing a Find in the larger one is more workable.

Cor

Bryan said:
<snip>
 
Cor Ligthert [MVP]

A correction to my previous post: where I wrote "is it not usable", that
should read "it is not usable".

Cor

Bryan said:

Bryan said:
<snip>
 

Bryan

Ok, let me make this clearer. Is there any way to make what I have shown
quicker?

The server DataTable comes from a call to a remote server. I then read
through that DataTable and check for hits in the client DataTable: if there
is a match I do an update, if not I do an add.

Surely there must be something faster than the approach I am using?


Bryan said:
<snip>
 

Cor Ligthert [MVP]

Bryan,

This is not so clean, but it will probably work (I have never done this
myself, but you can try).

I assume that every row in your ClientTable has the RowState Added:

http://msdn2.microsoft.com/en-us/library/system.data.datarowstate.aspx

If not, you have to retrieve (fill) this DataTable with

AcceptChangesDuringFill = false
http://msdn.microsoft.com/library/d...aadapterclassacceptchangesduringfilltopic.asp

First you do an insert of all rows from your ClientTable to the database
server (you have to make your own InsertCommand for that; I assume the
schemas are the same). During that update (insert) you set this on the
DataAdapter:
http://msdn.microsoft.com/library/d...ataadapterclasscontinueupdateonerrortopic.asp

Then you can update all rows whose error says the row already exists, except
now you put an UPDATE SQL string in the InsertCommand of the DataAdapter.
http://msdn2.microsoft.com/en-us/library/system.data.datarow.rowerror.aspx

(You have to copy these rows to a separate DataTable using a foreach, and be
careful to keep the RowState the same.)

The problem is that you cannot set the RowState directly; otherwise you
could have used the update part of the DataAdapter instead of the insert
part as I describe here.

I see no reason why this would not work, but as I said, I have never tried
it, so you will have to try it yourself. If it works, please reply.

Cor
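A rough sketch of this insert-then-retry scheme (untried, as Cor says; the adapter and table names here, clientAdapter, updateAsInsertAdapter, and clientTable, are hypothetical):

```csharp
using System.Data;

// Push every row as an INSERT, then retry the ones that failed
// with a key violation as UPDATEs.
clientAdapter.ContinueUpdateOnError = true;   // don't stop at the first duplicate key
clientAdapter.Update(clientTable);            // attempt to INSERT all rows

// Collect the rows the insert rejected (their RowError is set) ...
DataTable retryTable = clientTable.Clone();
foreach (DataRow row in clientTable.Rows)
{
    if (row.RowError.Length > 0)
    {
        retryTable.ImportRow(row);            // ImportRow preserves the RowState
        row.ClearErrors();
    }
}

// ... and run them again through an adapter whose InsertCommand
// actually contains an UPDATE statement, as Cor suggests.
updateAsInsertAdapter.Update(retryTable);
```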




Bryan said:
<snip>
 

Kevin Yu [MSFT]

Hi Bryan,

You can modify the ServerTable directly. As Cor mentioned, each DataRow has
a RowState property that indicates whether the row is newly added, modified,
or originally in the table. You don't need to track this yourself; simply
call DataAdapter.Update(ServerTable). All the original rows will be ignored,
and only new and changed rows will be written to the database. This is the
fastest way to update.
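A small illustration of the RowState transitions Kevin is relying on:

```csharp
using System;
using System.Data;

DataTable t = new DataTable();
t.Columns.Add("Id", typeof(int));

DataRow r = t.NewRow();
r["Id"] = 1;
t.Rows.Add(r);
Console.WriteLine(r.RowState);   // Added     -> Update would INSERT this row

t.AcceptChanges();               // what Fill does by default after loading
Console.WriteLine(r.RowState);   // Unchanged -> Update skips this row

r["Id"] = 2;
Console.WriteLine(r.RowState);   // Modified  -> Update would UPDATE this row
```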

Kevin Yu
Microsoft Online Community Support

 

Kevin Yu [MSFT]

Hi Bryan,

I'd like to know whether this issue has been resolved yet, and whether there
is anything I can help with. I'm still monitoring it. If you have any
questions, please feel free to post them in the community.

Kevin Yu
Microsoft Online Community Support

 

Bryan

Thanks for your reply. While looking into AcceptChangesDuringFill = false, I
came across BeginLoadData() and EndLoadData(), and these sped up the
performance quite considerably! The only problem is that BeginLoadData turns
OFF notifications, which means that the DataGrid will not see changes to the
ClientTable, which has to happen in my case. Oh well, I guess those are the
limitations if I want events sent to the DataGrid.
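For reference, the BeginLoadData/EndLoadData pattern described here looks roughly like this (a sketch against the table names from the original post); it suspends notifications, index maintenance, and constraint checking while rows are bulk-loaded, which is why bound grids stop seeing the changes until EndLoadData runs:

```csharp
using System.Data;

DataTable clientTable = DataClass.dsGrids.Tables["ClientTable"];

clientTable.BeginLoadData();    // suspends events, indexes, and constraints
try
{
    foreach (DataRow drServer in dtServer.Rows)
    {
        // ... the same Find / update-or-add loop as before.
        // Caveat: Rows.Find may not see rows added while the indexes
        // are suspended, so this fits best when the loop is mostly
        // doing adds.
    }
}
finally
{
    clientTable.EndLoadData();  // re-enables events; the grid refreshes once here
}
```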

Bryan

Cor Ligthert said:
<snip>
 
