Access or Other Question

Group,
I'm a C programmer by trade and a novice VBA programmer. Recently I've
been deluged by gobs of data: I may have over 20,000 rows of data, and in
the very near future more than 255 columns of data. My question is, will
Access meet my needs? My purpose is to find the Min, Max, Std Deviation,
Cpk, etc. Can Access do this, and will Access make charts? If this isn't
something Access can do, can you recommend another application?

Tony

The "more than 255" is a dealkiller here: an Access table is limited
to 255 fields (and often more critically, to 2000 bytes ACTUALLY USED
in any single record).

Other than that, Access can indeed do all of these things. 20,000 rows
is a small table (20,000,000 is a big table). If you can find some way
to Normalize this data, Access may indeed be able to do the job.
What's the nature of the data, and its current format? Text files,
spreadsheets, something else?

John W. Vinson[MVP]
 
ajocius said:
John,
The data is text, separated by commas. The first 6 pieces of data
on a row are socket, time stamp, serial number, execution time,
manufactured location, and customer. After these, it's test, result,
test, result, test, result... etc. The results can be numbers,
letters, or both.

That data should be stored in (at least) two tables so that the tests are in
rows rather than fields.

Test    Result
test    result
test    result
test    result

The above table would include a field to identify the entity being tested. In
your case I'd guess that this would be a combination of the other fields you
mention or a surrogate (AutoNumber) id that exists in the primary table and
would be a foreign key in the test table.
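
For illustration (the field names and values here are made up), the rows
in that test table would look something like:

SerialNumber  Test        Result
A1001         Continuity  Pass
A1001         Vdd         4.97
A1002         Continuity  Fail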
 
John,
The data is text, separated by commas. The first 6 pieces of data on a
row are socket, time stamp, serial number, execution time, manufactured
location, and customer. After these, it's test, result, test, result,
test, result... etc. The results can be numbers, letters, or both.

I'd suggest using some VBA code to import this flat file into a
normalized table structure:

Sockets
    Socket <Primary Key? is there one row per socket?>
    TimeStamp Date/Time
    SerialNumber <or is this the Primary Key?>
    ExecutionTime <appropriate Number, not a date/time if it's a duration>
    LocationID <link to Locations, or maybe a text location field>
    CustomerID <link to Customers>

Customers
    CustomerID Autonumber Primary Key
    <customer name & contact info fields>

Tests
    TestNo <Primary Key>
    Description

Results
    Socket <or Serial Number, link to Sockets PK>
    TestNo <link to Tests>
    Result


You'll need to read in each row, and use looping code to migrate the
results into the tall-thin Results table. You'll then be able to run
totals queries, charts, etc. based on the data.
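
Here's a rough, untested sketch of that loop using DAO. The file path
is made up, the Sockets and Customers housekeeping is omitted, and I'm
assuming the test identifier from the file goes straight into TestNo:

Sub ImportTestResults()
    Dim db As DAO.Database
    Dim rsResults As DAO.Recordset
    Dim intFile As Integer
    Dim strLine As String
    Dim varField As Variant
    Dim i As Long

    Set db = CurrentDb
    Set rsResults = db.OpenRecordset("Results", dbOpenDynaset)

    intFile = FreeFile
    Open "C:\Data\TestLog.txt" For Input As #intFile  ' made-up path

    Do While Not EOF(intFile)
        Line Input #intFile, strLine
        varField = Split(strLine, ",")
        ' varField(0) to varField(5) hold socket, time stamp, serial
        ' number, execution time, location, and customer -- add those
        ' to the Sockets (and Customers) tables here.
        ' From varField(6) on, the fields alternate test, result.
        For i = 6 To UBound(varField) - 1 Step 2
            rsResults.AddNew
            rsResults!Socket = varField(0)      ' or SerialNumber
            rsResults!TestNo = varField(i)      ' test identifier
            rsResults!Result = varField(i + 1)  ' number or text
            rsResults.Update
        Next i
    Loop

    Close #intFile
    rsResults.Close
    Set rsResults = Nothing
    Set db = Nothing
End Sub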

John W. Vinson[MVP]
 
I'd suggest using some VBA code to import this flat file into a
normalized table structure:

Tony, if you're new to VBA it might be easier to write a script or
utility in a familiar language to normalise each wide text file into
several narrow ones matching John's proposed structure. The normalised
data could then be imported using Access's built-in import routines.

Or if you use Perl, the txtnrm.pl utility at
http://www.j.nurick.dial.pipex.com/Code/Perl/index.htm may help.
 
John,
It seems as though I'm constantly working with text. Requirements
change all the time, but parsing text files continues unabated. Are there
any text tools available that can scrub the data file and prepare it for a
DB? Can anyone suggest an application here?

Tony
 
Hi Tony,

For munging text files I use mostly Perl and occasionally the GNU
textutils. Both are free downloads (http://www.activeperl.com and
http://unxutils.sourceforge.net).

The txtnrm.pl utility at
http://www.j.nurick.dial.pipex.com/Code/Perl/index.htm will convert an
arbitrarily wide text file into a tall narrow one. Append queries can
then be used to move the data into normalised Access tables. I wrote it
when I had to analyse data in a bundle of text files with well over a
thousand fields.
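
For example, once the tall file has been imported into a staging table
(call it ImportedTall, with fields matching John's Results table; the
names here are just for illustration), the append query can be run from
VBA in one statement:

CurrentDb.Execute _
    "INSERT INTO Results (Socket, TestNo, Result) " & _
    "SELECT Socket, TestNo, Result FROM ImportedTall;", _
    dbFailOnError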

If you're not comfortable using the Windows command line, take a look at
TextPipe from DataMystic and Monarch from Datawatch. I'm sure there are
other tools out there as well.

