Bulk Inserts Via DataSet

John B.

I am processing a great deal of information and need to make in excess
of 9000 inserts per work item. The table stores an audit trail of past
work operations. Now to my question: since I am trying to maximize
performance, would it be better to call adapter.Update with all 9000
rows in the DataSet, or to update every 1000 rows or so? If it is
better to break the updates into smaller 'chunks,' how do you determine
an ideal chunk size?

Not sure if it matters, but the database is Oracle 9i and the table
structure is:

VARCHAR2(8)
VARCHAR2(20)
VARCHAR2(8)
VARCHAR2(32)
DATE
NUMBER(1)
VARCHAR2(80)
NUMBER(3)
VARCHAR2(10)
VARCHAR2(10)
DATE
CHAR(1)
VARCHAR2(32)
NUMBER(4)
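The trade-off being asked about can be sketched outside ADO.NET. Below is a minimal chunked-insert sketch using Python's stdlib sqlite3 as a stand-in for the Oracle target (the `audit_trail` table and its two columns are hypothetical, not from the thread): each chunk costs one `executemany` round plus one commit, so chunk size trades per-commit overhead against memory and transaction-log growth.

```python
import sqlite3

def insert_in_chunks(conn, rows, chunk_size=1000):
    """Insert rows in commit-per-chunk batches; returns number inserted."""
    cur = conn.cursor()
    inserted = 0
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        cur.executemany(
            "INSERT INTO audit_trail (op_code, detail) VALUES (?, ?)", chunk)
        conn.commit()  # one commit's worth of overhead per chunk
        inserted += len(chunk)
    return inserted

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_trail (op_code TEXT, detail TEXT)")
rows = [("OP%04d" % i, "processed") for i in range(9000)]
print(insert_in_chunks(conn, rows))  # prints 9000
```

In ADO.NET terms each chunk would correspond to one adapter.Update call over a subset of the DataSet rows; the "ideal" size is usually found empirically by timing a few candidate sizes (500, 1000, 5000) against the real database.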
 
William (Bill) Vaughn

Do not use ADO.NET for bulk operations--it's not designed for this.
Use BCP or DTS--they ARE designed to move bulk data into and out of databases.

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
John B.

I do not believe that BCP or DTS will work for what I am trying to do.
These large inserts are done on a per-work-item basis, and I will be
processing over 10 million work items. Each work item must run in a
transaction so I can either pass or fail the work item.
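The pass-or-fail requirement described here is an all-or-nothing transaction per work item. A minimal sketch of that pattern, again using stdlib sqlite3 as a stand-in (the `audit_trail` table and `process_work_item` helper are hypothetical): commit the whole batch on success, roll back everything on any failure.

```python
import sqlite3

def process_work_item(conn, rows):
    """All-or-nothing insert for one work item: commit on success,
    roll back the entire batch on any failure."""
    try:
        conn.executemany(
            "INSERT INTO audit_trail (op_code, detail) VALUES (?, ?)", rows)
        conn.commit()
        return True   # work item passed
    except sqlite3.Error:
        conn.rollback()
        return False  # work item failed; no partial rows remain

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_trail (op_code TEXT, detail TEXT)")
ok = process_work_item(conn, [("OP0001", "pass")] * 9000)
bad = process_work_item(conn, [("too", "many", "columns")])  # wrong arity
```

Note this also answers the chunking question for this case: splitting one work item across several commits would forfeit the ability to fail the whole item atomically, so any chunking must stay within a single open transaction.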
 
William (Bill) Vaughn

Okay, in this case I would use VB to read the data from the inbound source
and filter it based on your own logic, but write it to a delimited file
which is passed periodically to DTS. It will be far faster than using ADO to
update the target DB.
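Bill's suggestion--filter in code, then hand a delimited file to a bulk loader--can be sketched with Python's stdlib csv module standing in for the VB side (the file name, column values, and pipe delimiter are hypothetical examples, not from the thread):

```python
import csv

def write_delimited(path, rows, delimiter="|"):
    """Write filtered rows to a delimited text file for a bulk
    loader (DTS, SQL*Loader, etc.) to pick up periodically."""
    with open(path, "w", newline="") as f:
        csv.writer(f, delimiter=delimiter).writerows(rows)

# Hypothetical example: two audit rows destined for the loader.
write_delimited("work_item_0001.txt",
                [("OP0001", "pass"), ("OP0002", "fail")])
```

The design point is that the application only does cheap sequential file writes, while the specialized bulk tool handles the expensive database-side work.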

 
Jim Hughes

DTS can participate in transactions. Each step can have Success | Failure
branches.

Depending on how many items are part of each work item, it may still be
worth investigating.

You may want to review some of the material at http://www.sqldts.com/
 
John B.

Thanks for the advice, but I am not going to be able to use DTS for
this service. Is it better to break up my updates into smaller subsets
or to insert all records at once?

Thanks
 
