Architecture question on parsing a large text file

James A. Fortune

This is invaluable "low level" stuff that might come in handy later.
But since my data is in a text file, it might be easier just to read
each file and import the data into a SQL table, no? That is, if you go
the table route. If you go the XML route, you just save to XML. Why
bother with the Excel intermediary step?

RL
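
For what it's worth, the direct text-to-table route is not much code.
Here is a rough sketch in C#, assuming SQL Server, a tab-delimited file
and a made-up Readings table (the table name, columns, file path and
connection string are all invented for illustration):

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class TextToSqlImport
{
    static void Main()
    {
        // Shape a DataTable to match the (hypothetical) Readings table.
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Value", typeof(double));
        table.Columns.Add("Label", typeof(string));

        // Parse each line of the text file into a row.
        foreach (string line in File.ReadLines(@"C:\data\input.txt"))
        {
            string[] fields = line.Split('\t');
            table.Rows.Add(int.Parse(fields[0]),
                           double.Parse(fields[1]),
                           fields[2]);
        }

        // Bulk-insert everything in one shot instead of row-by-row INSERTs.
        using (SqlConnection conn =
            new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        {
            conn.Open();
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "Readings";
                bulk.WriteToServer(table);
            }
        }
    }
}

No Excel in the middle; the only intermediary is the DataTable held in
memory.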

CSV, Excel and XML all suffer from extra overhead when reading data.
It's not a lot, but it can add up when there are lots of records.
Using Excel was simply an easy way to get the data into some kind of
data store that doesn't have the same performance hit as those
formats. Of course, Arne's idea of storing the data in RAM is much,
much faster than using a database (once the data is read, that is),
provided you don't mind handling things like invalid, malformed or
duplicate data yourself.
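
As a rough sketch of what that in-memory route could look like, with
the sort of checks I mean for malformed and duplicate records (the file
layout, field names and duplicate rule are all made up for
illustration):

using System;
using System.Collections.Generic;
using System.IO;

class Record
{
    public int Id;
    public double Value;
    public string Label;
}

class InMemoryLoad
{
    static void Main()
    {
        // Keyed on Id so duplicates are easy to detect.
        Dictionary<int, Record> records = new Dictionary<int, Record>();

        foreach (string line in File.ReadLines(@"C:\data\input.txt"))
        {
            string[] fields = line.Split('\t');

            // Invalid/malformed data: skip short lines and lines whose
            // numeric fields don't parse.
            int id;
            double value;
            if (fields.Length < 3 ||
                !int.TryParse(fields[0], out id) ||
                !double.TryParse(fields[1], out value))
            {
                continue;
            }

            // Duplicate data: keep the first occurrence only.
            if (!records.ContainsKey(id))
            {
                records.Add(id, new Record { Id = id, Value = value, Label = fields[2] });
            }
        }

        Console.WriteLine("Loaded {0} records.", records.Count);
    }
}

Once the data is in the Dictionary, lookups are effectively instant,
which is where the speed advantage over a database comes from.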

James A. Fortune
 
Arne Vajhøj

Hey, thanks Arne. If you have links handy and *if* these packages and
Off-The-Shelf libraries are free, please do feel free to post them
here in your free time, pun intended.

There are free databases available. I am not aware of any
free reporting tools for .NET, but there are commercial ones
available. There may also be free ones - I just don't know of any.
You could also switch to Java and use JasperReports, which is a
widely used free reporting tool.

SAS and SPSS cost money.
I'm not doing this commercially, so I don't want to pay for anything.
If the OTS stuff is popular I might get a copy from Piratebay.org if I
can find it being seeded, since, again, this program is for my own
internal use only and not commercial.

That approach I don't like. If you want commercial software
then pay for it.

Arne
 
