Growing Database size, how to correct?


Frederick Wilson

Hello,

I have a database that goes out and periodically checks for new CSV
files to bring into the database. If there is one, it imports that text
file into a temp table, deletes all records in the permanent table,
copies the info from the temp table into the permanent table, and
finally deletes the temp table.
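In outline, the routine is something like this (the table and file
names are made-up placeholders, not my actual ones):

```vba
' Sketch of the import cycle described above.
' tmpImport / tblPermanent are illustrative names only.
Sub ImportLatestCsv(strCsvPath As String)
    Dim db As DAO.Database
    Set db = CurrentDb()

    ' 1. Pull the CSV into a temp table
    DoCmd.TransferText acImportDelim, , "tmpImport", strCsvPath, True

    ' 2. Empty the permanent table
    db.Execute "DELETE FROM tblPermanent", dbFailOnError

    ' 3. Copy the temp rows across
    db.Execute "INSERT INTO tblPermanent SELECT * FROM tmpImport", dbFailOnError

    ' 4. Drop the temp table (the space it used is not reclaimed
    '    until the database is compacted - hence the bloat)
    db.Execute "DROP TABLE tmpImport", dbFailOnError
End Sub
```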

The record count in these text files is over 65,000 now.

When the import happens the database more than doubles in size until I
compact the database.

Is this normal?

Is there anything I can do about this?

Instead of importing the text files, would I gain anything from linking
to them and then importing directly into the permanent tables?
If so how do I link programmatically?

Thank you,
--
Frederick Wilson

_____________________________________
for multimedia design services visit
http://www.legalanimatics.com
 

Marshall Barton

Frederick said:
I have a database that goes out and periodically checks for new CSV
files to bring into the database. If there is one, it imports that text
file into a temp table, deletes all records in the permanent table,
copies the info from the temp table into the permanent table, and
finally deletes the temp table.

The record count in these text files is over 65,000 now.

When the import happens the database more than doubles in size until I
compact the database.

Is this normal?

Is there anything I can do about this?

Instead of importing the text files, would I gain anything from linking
to them and then importing directly into the permanent tables?
If so how do I link programmatically?


Growth during this kind of operation is normal. How much
depends on the size of the text file and temp table.

You can modify the Connect property of a TableDef for a linked
table (a text file or whatever). The easiest way to figure out what
you need in the Connect property is to link manually using the
File - Get External Data menu item and then view the new table's
Connect property:

MsgBox CurrentDb.TableDefs("textfiletable").Connect

The general idea of the code for this is something like:

Dim db As DAO.Database
Dim tdf As DAO.TableDef
Set db = CurrentDb()
Set tdf = db.TableDefs("textfiletable")
' ". . ." stands for the connect string you noted above
tdf.Connect = ". . .;Database=" & pathtotextfile
tdf.RefreshLink

If that has an unacceptable speed penalty, another approach
is to use your temp table approach, but put the temp table
in a temp database so it has no impact on your real
database. See
http://www.granite.ab.ca/access/temptables.htm
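For the temp-database variant, a rough sketch follows - all the paths
and table names are assumptions, and the [Text;...] source syntax may
need adjusting for your Jet version and CSV layout:

```vba
' Rough sketch of the temp-database idea; names/paths illustrative.
Sub ImportViaTempDb(strCsvPath As String)
    Dim strTempMdb As String
    strTempMdb = Environ("TEMP") & "\ImportScratch.mdb"

    ' Fresh scratch database each run, so any bloat happens there
    If Dir(strTempMdb) <> "" Then Kill strTempMdb
    DBEngine.Workspaces(0).CreateDatabase(strTempMdb, dbLangGeneral).Close

    ' Land the CSV in a temp table inside the scratch file.
    ' Jet can read a text file directly via the Text IISAM.
    Dim strFolder As String, strFile As String
    strFolder = Left(strCsvPath, InStrRev(strCsvPath, "\") - 1)
    strFile = Mid(strCsvPath, InStrRev(strCsvPath, "\") + 1)
    CurrentDb.Execute _
        "SELECT * INTO tmpImport IN '" & strTempMdb & "' " & _
        "FROM [Text;HDR=Yes;Database=" & strFolder & "].[" & strFile & "]", _
        dbFailOnError

    ' Refresh the permanent table from the scratch copy
    CurrentDb.Execute "DELETE FROM tblPermanent", dbFailOnError
    CurrentDb.Execute "INSERT INTO tblPermanent " & _
        "SELECT * FROM tmpImport IN '" & strTempMdb & "'", dbFailOnError

    Kill strTempMdb
End Sub
```

The point of the IN clause is that the 65,000-row temp table is
created and dropped in the scratch MDB, so only the scratch file
bloats, and it is simply deleted afterwards.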
 

Frederick Wilson

Marshall said:
Growth during this kind of operation is normal. How much
depends on the size of the text file and temp table.

You can modify the Connect property of a TableDef for a linked
table (a text file or whatever). The easiest way to figure out what
you need in the Connect property is to link manually using the
File - Get External Data menu item and then view the new table's
Connect property:

MsgBox CurrentDb.TableDefs("textfiletable").Connect

The general idea of the code for this is something like:

Dim db As DAO.Database
Dim tdf As DAO.TableDef
Set db = CurrentDb()
Set tdf = db.TableDefs("textfiletable")
' ". . ." stands for the connect string you noted above
tdf.Connect = ". . .;Database=" & pathtotextfile
tdf.RefreshLink

If that has an unacceptable speed penalty, another approach
is to use your temp table approach, but put the temp table
in a temp database so it has no impact on your real
database. See
http://www.granite.ab.ca/access/temptables.htm


Thank you very much, I will look into this.

--
Frederick Wilson

_____________________________________
for multimedia design services visit
http://www.legalanimatics.com
 

Piri

I'm with Marshall on this one.
I use a temp database extensively now for number crunching, temp
storage of tables, etc. I have set it up so that the temp database is
created on the user's hard drive (based on their log-in or CurrentUser
info). No good for dumb terminals, I guess, but that's not our situation.
The temp database, when created, also includes a range of static lookup
reference tables. That keeps query traffic off our busy network and
away from the master data file - well, that's the theory anyway, and
queries execute dramatically faster.

Using the temp database means I only move the final data into the
permanent back-end (BE) database when it is ready.

FWIW, when my report FE is loaded it checks for the existence of the
temp database - if it exists, it renames it ####_BKU.MDB and recreates
a fresh one (which keeps any of the static reference data updated). If
the BKU already exists, it is deleted before the previous one is
renamed. That way I have at least one previous session's temp data
available if there is a screw-up. Optional, obviously.
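That rotate-and-recreate step might look roughly like this (the file
names are illustrative, not my actual scheme):

```vba
' Sketch of the rotate-and-recreate step: keep exactly one
' previous session's temp database as a backup.
Sub RotateTempDb(strTempMdb As String)
    Dim strBackup As String
    strBackup = Left(strTempMdb, Len(strTempMdb) - 4) & "_BKU.MDB"

    If Dir(strTempMdb) <> "" Then
        ' Drop the older backup, then last session's file becomes it
        If Dir(strBackup) <> "" Then Kill strBackup
        Name strTempMdb As strBackup
    End If

    ' Recreate a fresh temp database (static lookup tables would be
    ' repopulated here as well)
    DBEngine.Workspaces(0).CreateDatabase(strTempMdb, dbLangGeneral).Close
End Sub
```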

Using a temp database has virtually eliminated the bloat.

Good luck!
Piri
 
