Opinions wanted: Bulk Insert vs Record Insert

Guest

I am developing an application to help a number of analysts within my
organization analyze data from several databases. As part of that
process, I am forced to duplicate the data to an Access database. Several of
the routines use bulk inserts to insert several thousand (in some cases
hundreds of thousands of) rows at a time. Unfortunately, these inserts also
perform some data transformations, and tend to lock up the user's PC
for long periods (up to 30 minutes). I'm wondering whether it would be more
"user friendly" to do this insert one record at a time, placing a DoEvents in
the loop to allow the user to perform other tasks during these
upload/transformation actions. I realize it would take longer to do the
uploads this way, but the user could be productively performing other tasks
while the upload is in progress.
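
The record-at-a-time approach I have in mind would look roughly like this
(a minimal sketch only; the table and field names here are placeholders,
not from my actual application):

```vba
' Sketch of a record-at-a-time insert loop that yields to the UI.
' Table/field names (tblSourceData, tblTarget, RawName, CleanName)
' are hypothetical.
Dim dbs As DAO.Database
Dim rsIn As DAO.Recordset, rsOut As DAO.Recordset
Dim lngCount As Long

Set dbs = CurrentDb
Set rsIn = dbs.OpenRecordset("tblSourceData", dbOpenSnapshot)
Set rsOut = dbs.OpenRecordset("tblTarget", dbOpenDynaset)

Do Until rsIn.EOF
    rsOut.AddNew
    ' ... per-record transformation goes here ...
    rsOut!CleanName = Trim(rsIn!RawName)
    rsOut.Update

    lngCount = lngCount + 1
    ' Yield to the UI every 100 records so the PC stays responsive.
    If lngCount Mod 100 = 0 Then DoEvents
    rsIn.MoveNext
Loop

rsIn.Close
rsOut.Close
```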
 
Last night I did a demonstration where I imported 557,357 voter records
from a text file into Access 2003. It took under 4 minutes, mostly
because I was working from an old CD-ROM drive onto a 7-year-old machine
(400 MHz) with only 256 MB of RAM. I then added a few indexes and was able to
run many queries in under 2 seconds.

The longest query, which had to do 2 table scans through all the records
without using the indexes, took about 35 seconds.

Do your importing after having dropped the indexes. Reestablish the indexes
after the import. Make sure that you actually need to do the data
transformations. You may be able to simply add columns to show the
reformatted data: run a query to extract only the data you need, then run a
second query to display it with the formatting.
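
In Access you can drop and recreate an index with DDL executed from code,
along these lines (the index, table, and column names here are examples
only):

```vba
' Drop the index, run the bulk import, then rebuild the index.
' idxLastName / tblVoters / LastName are example names.
CurrentDb.Execute "DROP INDEX idxLastName ON tblVoters"

' ... run the bulk insert here ...

CurrentDb.Execute "CREATE INDEX idxLastName ON tblVoters (LastName)"
```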
--
Arvin Meyer, MCP, MVP
Microsoft Access
Free Access downloads
http://www.datastrat.com
http://www.mvps.org/access
 
Thanks, Arvin.

I forgot about the performance hit associated with the indexes. I'll take a
look at that.

Is there an easy way to loop through the indexes in a table, store the
pertinent data, drop each index, and then loop back through the stored
information to reestablish the indexes?
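
The kind of loop I'm imagining would be something like this with DAO (a
sketch only: the table name is a placeholder, error handling is omitted, and
it assumes none of the indexes is enforcing a relationship):

```vba
' Sketch: save each index's definition, drop it, then rebuild it later.
Dim dbs As DAO.Database
Dim tdf As DAO.TableDef
Dim idx As DAO.Index, idxNew As DAO.Index
Dim fld As DAO.Field
Dim i As Long, j As Long
Dim astrNames() As String, astrFields() As String
Dim abPrimary() As Boolean, abUnique() As Boolean
Dim astrParts() As String

Set dbs = CurrentDb
Set tdf = dbs.TableDefs("tblVoters")   ' table name is a placeholder

' 1. Store the pertinent data, then drop each index.
ReDim astrNames(tdf.Indexes.Count - 1)
ReDim astrFields(tdf.Indexes.Count - 1)
ReDim abPrimary(tdf.Indexes.Count - 1)
ReDim abUnique(tdf.Indexes.Count - 1)

For i = tdf.Indexes.Count - 1 To 0 Step -1
    Set idx = tdf.Indexes(i)
    astrNames(i) = idx.Name
    astrFields(i) = ""
    For Each fld In idx.Fields
        astrFields(i) = astrFields(i) & ";" & fld.Name
    Next fld
    astrFields(i) = Mid(astrFields(i), 2)   ' trim leading ";"
    abPrimary(i) = idx.Primary
    abUnique(i) = idx.Unique
    tdf.Indexes.Delete idx.Name
Next i

' ... run the bulk import here ...

' 2. Reestablish the indexes from the stored data.
For i = 0 To UBound(astrNames)
    Set idxNew = tdf.CreateIndex(astrNames(i))
    idxNew.Primary = abPrimary(i)
    idxNew.Unique = abUnique(i)
    astrParts = Split(astrFields(i), ";")
    For j = 0 To UBound(astrParts)
        idxNew.Fields.Append idxNew.CreateField(astrParts(j))
    Next j
    tdf.Indexes.Append idxNew
Next i
```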

Dale
 