The fastest way to update a database

Mircea Pleteriu

Hi all,

I have to insert around 200 items into a table in my database. Currently, an
SQL stored procedure is invoked for each new item; the procedure does some
work and then inserts the item into the table.

The items to be inserted are passed to my code in a DataTable object, so each
row in that object must be inserted into the database.

To insert all the rows, I'm currently using a SqlDataAdapter with its insert
command pointing to the stored procedure and calling its Update() method.

It works fine; all the items are inserted successfully. The big problem is
that it takes around 2.5 - 3 seconds to insert 200 items.

Do you have any idea to make it faster?
Thanks,
 
David Browne

Mircea Pleteriu said:
Do you have any idea to make it faster?

The only thing you can try is to batch the procedure calls together to
reduce the number of round trips to the database. There are tradeoffs
there, though, so it might not be any faster.

And you can't use the DataAdapter for that; you have to iterate the DataTable
and build the batches by hand.
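A minimal sketch of the manual batching David describes: concatenate many EXEC calls into one command so 200 inserts cost a handful of round trips instead of 200. The procedure name `dbo.InsertItem`, the parameter names, and the column names are placeholders, not from the thread; adjust them to the real schema.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Text;

static void InsertInBatches(DataTable table, string connectionString, int batchSize)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        for (int start = 0; start < table.Rows.Count; start += batchSize)
        {
            StringBuilder sql = new StringBuilder();
            SqlCommand cmd = new SqlCommand();
            cmd.Connection = conn;

            int end = Math.Min(start + batchSize, table.Rows.Count);
            for (int i = start; i < end; i++)
            {
                DataRow row = table.Rows[i];
                // One EXEC per row, with parameter names made unique
                // by the row index so they can share one command.
                sql.AppendFormat("EXEC dbo.InsertItem @Name{0}, @Price{0};\n", i);
                cmd.Parameters.AddWithValue("@Name" + i, row["Name"]);
                cmd.Parameters.AddWithValue("@Price" + i, row["Price"]);
            }

            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery(); // one round trip for the whole batch
        }
    }
}
```

Note the tradeoff David mentions: SQL Server caps a command at 2100 parameters, so batchSize times parameters-per-row has to stay under that limit, and very large batches can be slower to parse than several smaller ones.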

David
 
Scott M.

Store all the new records in a DataSet and then call the DataAdapter's
Update method just once.
 
Angel Saenz-Badillos[MS]

Scott,
Currently this would result in two hundred separate inserts into the
database. We really don't have a good story for uploading data in our
current API. You can use DTS (see below) or you can create your own batch
manually.

For the Whidbey alpha we released two features that are going to greatly
improve this scenario:

Adapter bulk updates: you can specify the number of rows that you want to
update at a time.
SqlBulkCopy does a bulk load of data.

In a completely unofficial and flawed test I saw the following performance
differences for a 6k-row update with a decent amount of data per row:
current Adapter update: 1.06 minutes.
new Whidbey batch Adapter update: 15 seconds. Creating your own batch
manually would give very similar performance, and you can do that now.
new Whidbey SqlBulkCopy update: 0.4 seconds. // SQL Server specific

Before Whidbey you can get bulk copy performance for SQL Server outside of
ADO.NET by using Data Transformation Services (DTS).
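For reference, a sketch of the SqlBulkCopy API as it shipped in .NET 2.0 (Whidbey). Be aware it loads rows straight into the destination table and bypasses the stored procedure entirely, so any extra work the procedure does would have to be handled some other way. The table name `dbo.Items` is a placeholder.

```csharp
using System.Data;
using System.Data.SqlClient;

static void BulkInsert(DataTable table, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.Items";

            // Map DataTable columns to destination columns by name,
            // so column order in the DataTable doesn't matter.
            foreach (DataColumn col in table.Columns)
                bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);

            bulk.WriteToServer(table); // one bulk-load operation
        }
    }
}
```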
 
