How can I speed up doing loads of "INSERT INTO" statements

Chrisso

Hi All

I have to import data from a poorly laid out CSV file from another
system. I import the CSV file into a "raw data" table and then walk
over this to create the rows.

This results in me adding data to my MS Access 2003 DB one row at a
time using the "INSERT INTO" SQL statement.

This loads correctly, but it is *very* slow. I need to insert
200,000 - 500,000 rows.

Does anyone have any ideas how I can insert rows quickly into a table?
I am guessing the overhead comes from running the INSERTs one at a
time against a table with indices.

My first thought was to insert many rows at once, but it looks like you
cannot do this with MS Access.

Is there any other way to make this run quicker? Would it be quicker
if I use recordsets somehow?

Cheers for any advice or help,

Chrisso
 
Jeff Boyce

Chrisso

In many (maybe most) instances, you'll find that using a query is faster than
iterating through a recordset and inserting one record at a time. The query
works on the entire set.
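
For example, this is the kind of SQL an Access append query runs; a
minimal sketch, with placeholder table and field names:

    INSERT INTO tblTarget (Field1, Field2)
    SELECT Field1, Field2
    FROM tblRawImport;

Jet handles that as one set-based operation rather than thousands of
separate INSERTs.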

More info, please...

Regards

Jeff Boyce
Microsoft Access MVP

--
Disclaimer: This author may have received products and services mentioned
in this post. Mention and/or description of a product or service herein
does not constitute endorsement thereof.

Any code or pseudocode included in this post is offered "as is", with no
guarantee as to suitability.

You can thank the FTC of the USA for making this disclaimer
possible/necessary.
 
Chrisso

Hi Jeff

Thanks for your reply. Let me see if I can provide you with more
information by showing you the data.

The problem is that the data is not in a sensible format at all in the
CSV file - it reads more like a text report.

All the data is in two columns: a summary of where the information
comes from, then about 100-200 rows of readings (time vs. value) until
the next set of data, which I can identify by the text "Serial No."
appearing again.

Here is an example:

Serial No.                     020000001CE55421
Location
Foo
40360.95903     23
40360.97292     22.5
40360.98681     22.5
40361.00069     22
40361.01458     22
40361.02847     21.5
40361.04236     21.5
40361.05625     21.5

So I can read all of this (~300,000 - 500,000 rows) into a raw import
table, but then I have no choice (as I see it) but to walk over this
table, parse out the serial number and location, and then insert each
reading below it until I find the next block of data.

This is taking a long time because for each reading I run an SQL
"INSERT INTO ..." statement with the serial number, time and value.

Hope this helps explain my problem.

Cheers
Chrisso
 
Chrisso

Hi Jeff - I have fixed my problem with the idea you gave me.

I added an Autonumber ID field to the imported CSV data as above.

I changed my code so that, when walking over the blocks of data, it
records the ID at the start of each block and the ID at the end.

When I hit the end of a block I then ran a single append query
inserting the whole block of data at once.
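
The per-block append looks roughly like this (same invented names as
in my earlier sketch):

    ' a minimal sketch: append one block of readings with a single set-based statement
    Sub AppendBlock(serialNo As String, startID As Long, endID As Long)
        ' startID and endID are the autonumber IDs of the first and last reading rows
        CurrentDb.Execute "INSERT INTO Readings (SerialNo, ReadTime, ReadValue) " & _
                          "SELECT '" & serialNo & "', Col1, Col2 FROM RawImport " & _
                          "WHERE ID BETWEEN " & startID & " AND " & endID, dbFailOnError
    End Sub

So it is one Execute per block of 100-200 readings instead of one per
reading.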

This change sped the import up by about 99%!

Thanks for taking the time to comment.

Chrisso
 
