Drop Table Columns

Guest

I am importing Excel data into an Access table. The Excel file, however, has
many columns I do not need, which greatly increase the size of my Access DB.

I am using the DoCmd.TransferSpreadsheet method to get the data into Access.
I cannot change the source Excel file.

I am thinking the simplest approach is to simply import all the columns, and
then "drop" those columns I do not want (I then will do some other processing
to clean up other aspects of the imported data).

Note:
- The table I am importing to initially contains all the columns, even those
I do not want, to make the transfer/import easy. I cannot change the source
Excel file (else I might delete the columns before import, but I would prefer
to manage everything from Access).
- After I process the imported data, I merge the data into a different
table and then delete all rows in the import table (i.e., the import table is
only a temporary holding location) in preparation for the next cycle of
importing.

Any ideas on how to dynamically change the table structure by dropping the
unwanted imported columns?
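
What I'm picturing is roughly this (the table, file, and column names below
are just placeholders, and I'm not sure the ALTER TABLE part is the right way
to do the drop, which is really what I'm asking):

    Sub ImportAndTrim()
        ' Pull the whole worksheet into the holding table
        DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel9, _
            "tblImport", "C:\Data\Source.xls", True

        ' Drop the columns I don't need (is DDL like this the way to go?)
        CurrentDb.Execute "ALTER TABLE tblImport DROP COLUMN ExtraCol1", dbFailOnError
        CurrentDb.Execute "ALTER TABLE tblImport DROP COLUMN ExtraCol2", dbFailOnError
    End Sub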

Pat
-
 
George Nicholson

One approach:
- Import the whole thing, as is, warts and all, into a temp table.
- Scrub the data
  - Delete any "bad" records/rows (by sorting the table directly or via
query, etc.)
  - Create an Append query (or a Make-table query). Only include the
fields you want. Use CStr(), CLng(), Nz(), Format(), etc. functions to
coerce the imported data into the proper types, formats, etc. There's a
rough sketch of such a query below this list. (Note that a saved Append
query can be re-used for the next import...)

- Delete the temp table (or just all the records in it, leaving a "shell"
for future importing) once you get the results you want
- Compact the mdb to recover the space taken up by the records/table you've
deleted.
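
The Append query itself might look something like this in SQL view (all of
the table and field names here are just placeholders for whatever yours are):

    INSERT INTO tblMaster (CustomerID, OrderDate, Amount)
    SELECT CLng(Nz([CustomerID], 0)),
           CDate([OrderDateText]),
           CCur(Nz([Amount], 0))
    FROM tblImport
    WHERE [CustomerID] Is Not Null;

Save it once and you can simply run it again after each import.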

Note that the record scrubbing might be the only manual step, requiring
careful eyeballing the first time or two, but depending on your data, even
that can probably be codified into one or more delete queries. The idea is
to validate the imported data before it gets added to your "real" table.

I much prefer working with a permanent "shell" table (as you've suggested)
because if you open a saved query in design view that contains references to
a table that doesn't exist at that moment in time, it's a huge pain to fix
the broken field references. IMO, better if the temp table is always
resident as a shell, only occasionally containing data. That way table &
field refs within queries always remain valid.

HTH,
 
Guest

The approach you noted is what I had kinda resigned myself to. Would be nice
if, programmatically, I could drop the table, but then, as you note, I would
have to re-create it next time.

I won't have to do any editing of the data (eyeballing, as you note), as the
data source for the Excel spreadsheet is actually another DB that "exports"
nice, clean data, so that part is OK, with one exception: the person who
created the DB table I am exporting to Excel from created the date fields as
Text fields, so I do have to (as you say) coerce them into the proper type
(text -> date) for two of the fields. I have to figure out how to do that,
but first things first.

What I am doing in this exercise (and this is my next challenge, as I am new
to this) is to merge that temp table into the master table. The master
table contains records with a primary key (unique) that will, in some cases,
also exist in the temp table. When I run into this, I want to do a wholesale
replacement of the record in the master table with the record from the temp
table (and of course, as we have discussed, the temp table has more columns
than I want), so that is my next challenge.

If you have any suggestions on that, I would appreciate it. And I
sincerely thank you for the response!

Pat
 
George Nicholson

I'm thinking the Append should be in two steps. First, Update (not Append)
the records with matching pre-existing keys. Then Append the rest (the
non-matches).
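
Roughly, with placeholder table and field names again:

Update query (overwrite the matches):

    UPDATE tblMaster INNER JOIN tblImport
        ON tblMaster.CustomerID = tblImport.CustomerID
    SET tblMaster.OrderDate = CDate(tblImport.[OrderDateText]),
        tblMaster.Amount = tblImport.Amount;

Append query (everything not already in the master):

    INSERT INTO tblMaster (CustomerID, OrderDate, Amount)
    SELECT tblImport.CustomerID,
           CDate(tblImport.[OrderDateText]),
           tblImport.Amount
    FROM tblImport LEFT JOIN tblMaster
        ON tblImport.CustomerID = tblMaster.CustomerID
    WHERE tblMaster.CustomerID Is Null;

Save both, run the Update first and then the Append, and you can re-use them
every import cycle.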

The extra columns aren't a big deal. When you create your Append/Update
queries, just select the columns you need to include and ignore the rest.
You're importing them into the temp table simply because it's less of a
hassle than jumping through hoops trying to exclude them. Once the data
is imported you can ignore them. Unpopped kernels at the bottom of a bag of
popcorn are a bigger nuisance than they are :)


CDate() or DateValue() might be able to convert your text into a valid date,
depending on whether VB reads that text as a "valid" date. Otherwise,
you'll have to parse the text with Left(), Mid(), Right(), etc. and feed it
to DateSerial() in your Update/Append queries.
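
For example, the calculated field in your query could be as simple as

    OrderDate: CDate([OrderDateText])

or, if the text is something like "20041231" (just a guess at the format;
adjust the slicing to whatever yours really is):

    OrderDate: DateSerial(Left([OrderDateText],4), Mid([OrderDateText],5,2), Right([OrderDateText],2))

(OrderDateText is a placeholder for your actual field name.)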

HTH,
 
Guest

I agree...let me see if I can figure this out...I have some Access
books...not many on VBA, which is how I would like to do all of this...ideally
in one routine someone can select, insulating them from the sausage factory
inside. Wish me luck!

Cheers,
Pat
 
