Access Table Sizes

Guest

Hello All...

I am trying to set up an Access database to query a large .csv file.
When fully normalised, there are 80 million rows in the database.

I need to create 2 reports from the data. I have created 2 tables for these
reports; these contain 4 and 17 million rows respectively.

Can anyone advise whether Access can handle tables of this size and work
with them efficiently? Can anyone pass on any tips for working well with
tables of this size?

Thank you
 
Tim Ferguson

I am trying to set up an Access database to query a large .csv file.
When fully normalised, there are 80 million rows in the database.

This does not make a lot of sense: normalisation has to do with the number
of columns, not the number of rows. My guess is that a single table (i.e.
the equivalent of a single .csv file) is likely to benefit from
normalisation, but obviously I cannot tell from here.
Can anyone advise whether Access can handle tables of this size and work
with them efficiently? Can anyone pass on any tips for working well with
tables of this size?

Access is limited only by the size of the .mdb file, which has to be less
than 2 GB. The same limit applies to MSDE too. With very tall tables, good
indexing improves reporting time but slows down updates. Remember that
Access installations run on anything from 133 MHz machines to 3 or 4 GHz
ones, so it's not really possible to talk sensibly about the speed of an
application.
Does that help?

Tim F
 
John Vinson

Hello All...

I am trying to set up an Access database to query a large .csv file.
When fully normalised, there are 80 million rows in the database.

I need to create 2 reports from the data. I have created 2 tables for these
reports; these contain 4 and 17 million rows respectively.

Can anyone advise whether Access can handle tables of this size and work
with them efficiently? Can anyone pass on any tips for working well with
tables of this size?

Access will of course not benefit from any indexing or JET query
optimization on a linked .csv file: no matter the query, it will have
to scan every single row!
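
If the file is only linked, one workaround is to pull it into a native Jet
table first, so that indexing and the query optimizer can apply at all. A
minimal VBA sketch; the path, table name and header flag are assumptions
about your file:

' Sketch: import the delimited file into a local table.
Sub ImportCsv()
    ' acImportDelim = delimited text; True = first row holds field names
    DoCmd.TransferText acImportDelim, , "tblRaw", "C:\Data\export.csv", True
End Sub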

If you have appended the .csv data into Access tables, check your
database to be sure it's not pushing the 2 GByte size limit too
closely.
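
A quick way to check is just to read the size of the .mdb file itself, for
example from the Immediate window. A rough sketch:

' Sketch: report how much of the 2 GB Jet ceiling the current file uses.
Sub CheckDatabaseSize()
    Dim fso As Object
    Dim dblBytes As Double
    ' FileSystemObject avoids the Long overflow FileLen would risk
    ' right at the 2 GB boundary.
    Set fso = CreateObject("Scripting.FileSystemObject")
    dblBytes = fso.GetFile(CurrentDb.Name).Size
    Debug.Print Format(dblBytes / 1024 ^ 3, "0.00") & " GB of the 2 GB limit"
End Sub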

JUDICIOUSLY add indexes on the fields used for searching and sorting
for the reports. For this size of database you're walking a tightrope
- adding an index improves *search* performance, but will cause the
size of the database to increase and can markedly slow the process of
loading data into the database from the file.
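
One common way to walk that tightrope is to drop the report indexes before a
big load and recreate them afterwards. A sketch, again with placeholder
names:

' Sketch: keep the bulk load fast by removing the index first, then
' rebuilding it once after the data is in.
Sub ReloadReportTable()
    CurrentDb.Execute "DROP INDEX idxCustomerDate ON tblReport1;", dbFailOnError

    ' ... bulk append from the imported .csv data goes here ...

    CurrentDb.Execute _
        "CREATE INDEX idxCustomerDate ON tblReport1 (CustomerID, SaleDate);", _
        dbFailOnError
End Sub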


John W. Vinson[MVP]
 
