Parsing normalized Excel data into multiple Access tables

Jabberwocky

Using Access 97 (and strongly considering SQL Server Express)

I have a data warehouse project that requires daily import of large
Excel files. The files come normalized. My issue is that I'm not
sure how to take the Excel file and parse it into all the sub-tables
within Access.

Let me give some background:
The Excel files contain a unique patient ID#. The ID# is the first
field in the "primary" table in Access. This primary table also
includes static information like name, birthdate, and SSN. A one-to-many
relationship exists with several other Access tables that will retain
a historical record of changes.

Here is my rough plan:
To start:
· Get the latest Excel file -- this will be the original data.
· Manually parse the original data into a "master table" called Patients
and "sub-tables" called Events.

Then repeat these steps daily:
· Import the next day's Excel table into a temporary Access table.
· Using a macro-derived append query:
o Compare the new (temporary) data with the existing data.
o Discard a new record when the entire old record is identical to the
new one (no changes made).
o If any field has changed, add an additional event with a datestamp.
o If the original ID# does not exist, add a new record to the master
table and all event data.
o If the original ID# exists and the new ID# does not, add a discharge
event to the Events table.
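In outline, the daily compare-and-append steps might look like this (a minimal sketch in Python, with SQLite standing in for the Access tables; the table and column names here are illustrative assumptions, not the poster's actual schema):

```python
import sqlite3

# SQLite stand-ins for the Access master, event, and temporary tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patients (patient_id TEXT PRIMARY KEY, name TEXT, birthdate TEXT, ssn TEXT);
CREATE TABLE events   (patient_id TEXT, event_type TEXT, event_date TEXT);
CREATE TABLE staging  (patient_id TEXT, name TEXT, birthdate TEXT, ssn TEXT);
""")

def daily_merge(con, today):
    cur = con.cursor()
    existing = {row[0]: row for row in cur.execute("SELECT * FROM patients")}
    incoming = {row[0]: row for row in cur.execute("SELECT * FROM staging")}

    for pid, rec in incoming.items():
        if pid not in existing:
            # New ID#: add to the master table plus an initial event.
            cur.execute("INSERT INTO patients VALUES (?, ?, ?, ?)", rec)
            cur.execute("INSERT INTO events VALUES (?, ?, ?)", (pid, "admit", today))
        elif rec != existing[pid]:
            # Some field changed: update the master and log a datestamped event.
            cur.execute(
                "UPDATE patients SET name = ?, birthdate = ?, ssn = ? WHERE patient_id = ?",
                (rec[1], rec[2], rec[3], pid))
            cur.execute("INSERT INTO events VALUES (?, ?, ?)", (pid, "change", today))
        # else: entire record identical -- discard, nothing to do.

    for pid in existing.keys() - incoming.keys():
        # ID# in the master but missing from today's file: record a discharge.
        cur.execute("INSERT INTO events VALUES (?, ?, ?)", (pid, "discharge", today))
    con.commit()
```

The same three-way split (new ID, changed record, missing ID) can be done in Access with three append/update queries joining the staging table to the master on ID#.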

I can eventually work out the macro language, but I'm troubled by the
idea of comparing a long Excel record with relational tables. It
would be much easier to avoid relational tables altogether, but I
prefer to limit size where I can. Can anyone point me in the right
direction?
 

Arvin Meyer [MVP]

Depending upon how much data you have, I'd stay with Access until the
compacted database size is just under 500 MB, only because it is far easier
to deal with Jet than SQL Server. I'd still use the Access front end if you
decide to move to a SQL Server database engine.

In Access, use TransferSpreadsheet to get the data from your Excel files
into Access tables.
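As a rough language-neutral sketch of that staging import (Python with SQLite standing in for the temporary Access table; it assumes the daily Excel file has been exported to CSV, and the column names are illustrative assumptions):

```python
import csv
import sqlite3

def load_staging(con, csv_path):
    """Replace the staging table with today's rows from the exported file."""
    con.execute("DROP TABLE IF EXISTS staging")
    con.execute(
        "CREATE TABLE staging (patient_id TEXT, name TEXT, birthdate TEXT, ssn TEXT)")
    with open(csv_path, newline="") as f:
        # DictReader takes field names from the first row, much like the
        # HasFieldNames argument of TransferSpreadsheet.
        for row in csv.DictReader(f):
            con.execute("INSERT INTO staging VALUES (?, ?, ?, ?)",
                        (row["patient_id"], row["name"], row["birthdate"], row["ssn"]))
    con.commit()
```

In Access itself the equivalent is a single DoCmd.TransferSpreadsheet call targeting the temporary table, run once per daily file.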
--
Arvin Meyer, MCP, MVP
http://www.datastrat.com
http://www.mvps.org/access
http://www.accessmvp.com

