Speed is of the essence when updating Access tables

Guest
XP SP2/ MSO 2003 Pro

I have an Access database that reads info from Excel worksheets that may get
revised over time. The Excel worksheets are all kept in folders on a network
drive segregated by year. I have run the Access update routine both from my
local PC and from the network share with no appreciable improvement.

Since the content of the Excel workbooks may change over time (even the
workbook name could change), each is assigned a unique 7-9 character control
number that forms the first 7-9 characters of the file name.

Within the Access database I have three tables. The first is the file list,
containing file name and date last modified; the second has some common data
that appears in all Excel workbooks (customer name, equipment type, etc.); and
the third has data that occurs only in specific types of Excel files,
distinguished by the name of the first worksheet. All are linked, of course, by
the assigned control number.

My current "update" routine goes like this:

Set up file locations using FSO:

For each file in the Excel directory (For Each fil In FilDir):
    Verify that it has a 7-9 character control number in the proper format at
    the beginning of the file name -> if not, go to error report
    Extract the control number and Seek it in the file list table
        If found, check for a matching last-modified date
            If the dates match, go for the next fil (I've got current data already)
            If not, "Edit" the Access file list table, the common data
            table and, if appropriate, the specific data table
    If I did not find that control number in the file list table, "AddNew"
    a record to the 2 or 3 tables as appropriate
    Go get the next fil in FilDir
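
The loop above might be sketched in DAO/VBA roughly as follows. This is only
an illustration, not the poster's actual code: the table name, index name,
folder path, and the helper routines (ExtractControlNumber, LogError,
AddNewRecords, UpdateRecords) are all assumptions standing in for the real ones.

```
' Sketch of the update loop, assuming DAO, a table-type recordset on the
' file-list table, and an index on the control number -- names illustrative.
Dim fso As Object, fil As Object
Dim rs As DAO.Recordset
Dim ctl As String

Set fso = CreateObject("Scripting.FileSystemObject")
Set rs = CurrentDb.OpenRecordset("tblFileList", dbOpenTable)
rs.Index = "ControlNum"                    ' assumed index name

For Each fil In fso.GetFolder("\\server\share\2006").Files
    ctl = ExtractControlNumber(fil.Name)   ' your 7-9 character parse
    If Len(ctl) = 0 Then
        LogError fil.Name                  ' bad file name -> error report
    Else
        rs.Seek "=", ctl
        If rs.NoMatch Then
            AddNewRecords ctl, fil         ' AddNew to the 2 or 3 tables
        ElseIf rs!LastModified <> fil.DateLastModified Then
            UpdateRecords ctl, fil         ' Edit file list + data tables
        End If                             ' dates match: skip, data is current
    End If
Next fil
rs.Close
```

Seek on an indexed table-type recordset is about as fast as Access lookups
get, so the per-file cost here is almost entirely in the Excel side.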

The system works fine except that it takes like 3 days shy of forever to
complete, and as of now I only have 478 Excel Workbooks and 5-10 are being
added almost daily. The last run took 18 minutes.

How do I speed this up?

The fields I need out of the Excel workbooks are all "named," so the read is
by range name. Would it be quicker to read the specific cell (e.g.
datatbl![Customer Name].Value = .Cells(1, 2).Value) as opposed to the current
datatbl![Customer Name].Value = .Range("Customer").Value?

ANY suggestions would be truly appreciated!

TIA

BAC
 
Well, the one detail you don't mention is how many rows are in each sheet?

I think the fastest way would be to actually setup a link to the excel
sheets. Another way would be to consider importing the excel data into a
temp
mdb file..and process that.

Using a whole bunch of commands to grab data from cells via automation to
Excel is going to be dead slow, and is not an approach I would suggest.

So, either import those sheets one by one into a temp table, or consider
setting up a link to each excel sheet you process.

478 sheets is a lot, and if you can do 3 per minute, then you are still
talking about 478/3 = 159 minutes = 2.6 hours.

I suspect that, with linking or importing of data, you might get up to 12 per
minute...which cuts the time down to about 40 minutes.

If you can process one sheet in 4 seconds, then that is 15 per minute. That
means 478/15 = about 32 minutes.

Linking to the sheets might get down to 1 second per sheet...and that is 60
per minute...and = 8 minutes....

So, do testing on 2 or 3 sheets, and time it...(no need to process a whole
bunch). I would test/try linking vs importing to see which is faster...
(I think linking would be the fastest).
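
The import route suggested above could be sketched like this. It is only a
sketch: the temp table name, file path, and range name are placeholders, and
it assumes the workbook's data sits in a rectangular named range.

```
' Sketch: import one workbook's named range into a temp table, then
' process it with queries instead of cell-by-cell automation.
' "tmpImport", the path, and "DataBlock" are illustrative names.
DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel9, _
    "tmpImport", "\\server\share\2006\ABC1234 Widget.xls", _
    False, "DataBlock"   ' False = no header row; last arg = named range

' Linking instead of importing is the same call with acLink in place
' of acImport; the linked table then reads the sheet in place.
```

Either way, the win is that Access/Jet pulls the whole block in one operation
rather than making one cross-process automation call per cell.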
 
Thanx for your input..

The consolidated data sheet (the first sheet, consolidating from the 3-4
following sheets) has 54 rows for the RAF type and 62 rows for the REM type.
The problem with linking/importing is that the data is on an Excel worksheet
set up as a Form with merged cells (both horizontally and vertically) and not
in a "table" format. That's why I retrieve the data by range name rather than
cell reference.

What I came up with yesterday (but haven't tried yet) is to track the last
time the tables were updated. Then my first FSO test would be against the date
last updated, and that would probably speed up the rejection process some...
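
That rejection test could be sketched as below. The settings table, field
name, and file path are assumptions; the point is that an unchanged file is
skipped using only an FSO date check, before any Excel automation starts.

```
' Sketch: skip any workbook not modified since the last table update.
Dim fso As Object, lastRun As Date
Set fso = CreateObject("Scripting.FileSystemObject")
lastRun = Nz(DLookup("LastUpdated", "tblSettings"), #1/1/1900#)  ' assumed table
If fso.GetFile("\\server\share\2006\ABC1234 Widget.xls") _
        .DateLastModified <= lastRun Then
    ' unchanged since the last run: reject without opening Excel at all
End If
```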

Ya think?




bac
 
set up as a Form with merged cells (both horizontally and vertically) and not
in a "table" format. That's why I retrieve the data by range name rather than
cell reference.

Actually, that range approach is not going to be real fast either (but it is
better than cell by cell).
What I came up with yesterday (but haven't tried yet) is to track the last
time tables were updated. Then my 1st FSO test would be against date last
updated and that would probably speed up the rejection process some...

Yes..that is a good idea.

I think the real trick here is to ensure you can select/grab the block of
data as quickly as possible.
Try to ensure that only one range selection occurs...and that is it....any
more is too expensive in terms of time. Each reference to data is done by
automation...and is VERY SLOW. So, if you can pull the range once, and only
once, that should help...
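
Pulling the range once might look like this sketch (the path and the range
name "DataBlock" are placeholders): a single automation call copies the whole
block into a local Variant array, and every read after that is in-process.

```
' Sketch: one cross-process call fetches the whole block; all further
' reads are against a local array and cost essentially nothing.
Dim xlApp As Object, wb As Object, v As Variant
Set xlApp = CreateObject("Excel.Application")
Set wb = xlApp.Workbooks.Open("\\server\share\2006\ABC1234 Widget.xls", _
                              ReadOnly:=True)

v = wb.Worksheets(1).Range("DataBlock").Value   ' the ONE automation data call

' v is a 1-based 2-D array: v(row, column). Note that a merged cell
' reports its value only in its top-left cell; the rest come back Empty.

wb.Close False
xlApp.Quit
```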
 
Again, thanx for your input..

I tried the "last update" approach and that actually works quite well:

it updates the list of 500 potential files in about 3.5 to 4 minutes.
 