Inserting Records into SQL Server

Carlos

I picked up a routine from an old employee that inserts about 1000 records
every 30 minutes. It loops until it finishes reading the file, calling an
insert for each record. That takes a while and is slow; the table has 22
million records!
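
Stripped down, the routine does something like this (table, file and column
names are made up here, not the real ones):

using System.Data.SqlClient;
using System.IO;

class RowByRowImport
{
    static void Main()
    {
        using (SqlConnection cn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;"))
        {
            cn.Open();
            using (StreamReader reader = new StreamReader(@"C:\import\data.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    string[] fields = line.Split('\t');
                    using (SqlCommand cmd = new SqlCommand(
                        "INSERT INTO dbo.MyTable (Col1, Col2) VALUES (@c1, @c2)", cn))
                    {
                        cmd.Parameters.AddWithValue("@c1", fields[0]);
                        cmd.Parameters.AddWithValue("@c2", fields[1]);
                        cmd.ExecuteNonQuery(); // one round trip per record
                    }
                }
            }
        }
    }
}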


Now my question is: would it be faster to load the file into a DataSet or
DataTable and pass it to SQL Server, so the insert procedure runs faster?

Thanks
 
sloan

One method to ship down "bulk inserts" is using Xml, especially on Sql
Server 2000 and 2005. On 2008 you can still use Xml, but it also adds the
option of sending down a DataTable.

This example
http://sholliday.spaces.live.com/Blog/cns!A68482B9628A842A!527.entry
should be useful for you.
You will see dramatic performance increases compared to the "row by row"
approach. One of the savings comes from index rebuilding: if you insert 1
row, it rebuilds the indexes after that 1 row; if you insert 1000 rows, it
rebuilds the indexes ONCE for those 1000 rows.
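
Very loosely, the Xml route looks like this (proc, table and column names
are invented; treat it as a sketch, not tested code):

// Assumes a proc like this on the server (OPENXML works on 2000/2005):
//   CREATE PROCEDURE dbo.uspImportFromXml @xmlDoc NTEXT AS
//   BEGIN
//       DECLARE @h INT
//       EXEC sp_xml_preparedocument @h OUTPUT, @xmlDoc
//       INSERT INTO dbo.MyTable (Col1, Col2)
//       SELECT Col1, Col2
//       FROM OPENXML(@h, '/rows/row', 1) WITH (Col1 INT, Col2 VARCHAR(50))
//       EXEC sp_xml_removedocument @h
//   END
using System.Data;
using System.Data.SqlClient;
using System.Text;

class XmlBatchImport
{
    static void SendBatch(SqlConnection cn, string[][] records)
    {
        // Build one Xml document for the whole batch; real code should
        // escape the values (e.g. with SecurityElement.Escape).
        StringBuilder xml = new StringBuilder("<rows>");
        foreach (string[] r in records)
            xml.AppendFormat("<row Col1=\"{0}\" Col2=\"{1}\" />", r[0], r[1]);
        xml.Append("</rows>");

        using (SqlCommand cmd = new SqlCommand("dbo.uspImportFromXml", cn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@xmlDoc", xml.ToString());
            cmd.ExecuteNonQuery(); // 1000 rows, one round trip, one index pass
        }
    }
}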

...................
 
vanderghast

It would be much faster, for execution speed, to write a stored procedure
in MS SQL Server which appends the 'file' you mention, and then, from your
application, ask the db engine to run that stored proc.

Sure, that assumes the intermediate storage you actually plan to have is not
used for other purposes (or, if it is, that it wouldn't be better to read it
back from the db after the import is done and the data fully validated by
the db).
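
As an illustration, something like this (proc name and file path are
invented, and the path must be visible to the server, not the client):

// T-SQL for the proc, as a sketch:
//   CREATE PROCEDURE dbo.uspAppendFile AS
//   BEGIN
//       BULK INSERT dbo.MyTable
//       FROM 'C:\imports\data.txt'
//       WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
//   END
using System.Data;
using System.Data.SqlClient;

class ServerSideImport
{
    static void Main()
    {
        using (SqlConnection cn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;"))
        {
            cn.Open();
            using (SqlCommand cmd = new SqlCommand("dbo.uspAppendFile", cn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.CommandTimeout = 0;  // imports can run long
                cmd.ExecuteNonQuery();   // the engine reads the file itself,
                                         // so there is no per-row traffic
            }
        }
    }
}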



Vanderghast, Access MVP
 
 
Paul

1000 inserts in 30 minutes is very slow.

You could try bulk insert/Integration Services (see the sketch below), but
I think you have a more fundamental problem here.
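
For the bulk insert itself, SqlBulkCopy (in .NET 2.0 and later) will take
the DataTable you mentioned in one shot; a rough sketch, with the table
name made up:

using System.Data;
using System.Data.SqlClient;

class BulkCopyImport
{
    static void Load(DataTable table, string connectionString)
    {
        using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.MyTable";
            bulk.BatchSize = 1000;     // rows per batch sent to the server
            bulk.WriteToServer(table); // streams the whole table in one call
        }
    }
}

As for the fundamental problem: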

What is the PK on the table?

Is that PK clustered?

What other indexes are there on the table, and if not the PK, are any of
them clustered?

The thing is, indexes are great for selects but can be a problem for
inserts: x number of inserts requires a rebuild of the indexes, and
rebuilding a clustered index over 22 million records will take a while.

You need to look at ways around the problem: maybe play with padding and
fill factor, and then rebuild the indexes out of hours in preparation for
your next bulk insert.

or

Add a surrogate key and cluster on that, so inserts are always added at
the end. This may cause you select issues though, so you need to play
around.
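
The out-of-hours rebuild can be as simple as running something like this
from a scheduled job (fill factor and names are only examples; ALTER INDEX
is 2005+, on 2000 use DBCC DBREINDEX instead):

using System.Data.SqlClient;

class IndexMaintenance
{
    static void Main()
    {
        using (SqlConnection cn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;"))
        {
            cn.Open();
            // FILLFACTOR = 90 leaves ~10% free space per page, so the next
            // batch of inserts causes fewer page splits.
            using (SqlCommand cmd = new SqlCommand(
                "ALTER INDEX ALL ON dbo.MyTable REBUILD WITH (FILLFACTOR = 90)", cn))
            {
                cmd.CommandTimeout = 0; // 22 million rows takes a while
                cmd.ExecuteNonQuery();
            }
        }
    }
}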
 
