Loading TXT to DB (VB)

Data

I have large TEXT files (less than 50MB) that I want to load into
MSSQL. Each line has to be extracted, parsed and then inserted into the
DB. What's the best way of doing this?

Currently I extract each line, parse it and then INSERT it into the DB.
I am not using any DataSet/DataGrids. I have one db connection always
open for this process. The problem is that this takes almost half an
hour to finish. I'm wondering if using a DataSet would be beneficial
for me.

Any help would be appreciated!
 
Phill. W

Data said:
I have large TEXT files (less than 50MB) that I want to load into
MSSQL. Each line has to be extracted, parsed and then inserted into
the DB. What's the best way of doing this?

For anything more than a few thousand rows at a time, I'd tackle this
as a two-stage process:

1) Read the file, parse the data and write it to /another/ file,
ready for...
2) Use the bcp utility to read this second file and slam it into
SQL Server - much, *much* faster.
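
The two-stage idea above might look something like this in VB.NET. The file names, the comma delimiter and the ParseLine routine are placeholders to adapt to your own format:

```vb
Imports System.IO

Module TwoStageLoad
    Sub Main()
        ' Stage 1: parse the raw file into a clean, delimited file.
        Dim reader As New StreamReader("C:\data\raw.txt")
        Dim writer As New StreamWriter("C:\data\parsed.txt")
        Dim line As String = reader.ReadLine()
        While Not line Is Nothing
            ' ParseLine wraps your existing parsing logic and returns
            ' the fields joined with the bcp field delimiter.
            writer.WriteLine(ParseLine(line))
            line = reader.ReadLine()
        End While
        writer.Close()
        reader.Close()

        ' Stage 2: hand the parsed file to bcp, e.g. from a
        ' command prompt (table and server names are placeholders):
        '   bcp MyDb.dbo.MyTable in C:\data\parsed.txt -c -t, -S myserver -T
    End Sub

    Function ParseLine(ByVal raw As String) As String
        ' Placeholder: split/rearrange the raw line into
        ' comma-separated column values here.
        Return raw
    End Function
End Module
```

In the bcp command, -c means character (text) mode, -t sets the field terminator, and -T uses a trusted connection.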

HTH,
Phill W.
 
Jerry H.

You didn't mention which version of SQL Server you are using, but do
you need to go through an application at all? You could instead set up
a DTS package in SQL Server that imports the text file into your
database.

To run the DTS package, you could schedule it with SQL Server's Job
Scheduler, or invoke the DTS objects from your .NET application.
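
As a rough sketch (server and package names below are made up), one way to fire a DTS package from VB.NET without referencing the DTS object library is simply to shell out to the dtsrun command-line utility:

```vb
' Placeholder server/package names; /E means use a trusted connection.
Dim psi As New System.Diagnostics.ProcessStartInfo()
psi.FileName = "dtsrun"
psi.Arguments = "/S myserver /N MyImportPackage /E"
psi.UseShellExecute = False
Dim p As System.Diagnostics.Process = System.Diagnostics.Process.Start(psi)
p.WaitForExit()
```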
 
Mike Labosh

Data said:
I have large TEXT files (less than 50MB) that I want to load into
MSSQL. Each line has to be extracted, parsed and then inserted into the
DB. What's the best way of doing this?

Look up the BULK INSERT statement in SQL Server Books Online.

Make a "work table" that has the same structure as your TXT file - not
a temp table, a real one.
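
For example, the statement can be issued over your existing ADO.NET connection. The table name, file path and terminators here are assumptions, and note that BULK INSERT reads the file from the *server's* disk, not the client's:

```vb
Imports System.Data.SqlClient

Module BulkLoad
    Sub Main()
        Dim cn As New SqlConnection( _
            "Server=myserver;Database=MyDb;Trusted_Connection=yes")
        cn.Open()

        ' Load the raw file straight into the work table.
        Dim sql As String = _
            "BULK INSERT dbo.WorkTable " & _
            "FROM 'C:\data\input.txt' " & _
            "WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')"

        Dim cmd As New SqlCommand(sql, cn)
        cmd.CommandTimeout = 0   ' don't time out on large files
        cmd.ExecuteNonQuery()
        cn.Close()
    End Sub
End Module
```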
Data said:
I am not using any DataSet/DataGrids. I have one db connection always
open for this process. The problem is that this takes almost half an
hour to finish.

The system that I inherited processes .txt files in excess of 500MB in less
than a half hour. Our SQL Server is a 4 CPU XEON with 3 gigs of RAM. We
load several dozen of these files all at once, on a quarterly basis.

--
Peace & happy computing,

Mike Labosh, MCSD

"When you kill a man, you're a murderer.
Kill many, and you're a conqueror.
Kill them all and you're a god." -- Dave Mustaine
 
Data

The server is MSSQL. And unfortunately, I can't use bcp or DTS because
there are several text files that require parsing and are contingent
upon each other.

The solution has to be in such order:

File > Manipulate contents > DB
 
Data

Don't worry about this. I will post the method that I used so that
others can benefit from it.

I inserted the text file's contents into a single-column DataSet table
and then parsed the rows one by one into the database. I was able to do
a 32MB text file within 7 minutes.

By comparison, my original method - reading each line from the file,
parsing it and then inserting it into the DB - was taking around 20
minutes.
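
For anyone wanting the rough shape of this, the sketch below buffers every line into a one-column DataTable and then parses from memory on a second pass. ParseAndInsert stands in for the existing parse-and-INSERT logic; the file, table and column names are placeholders:

```vb
Imports System.IO
Imports System.Data

Module BufferedLoad
    Sub Main()
        ' First pass: read the whole file into an in-memory table.
        Dim table As New DataTable("LoadTable")
        table.Columns.Add("RawLine", GetType(String))

        Dim reader As New StreamReader("C:\data\input.txt")
        Dim line As String = reader.ReadLine()
        While Not line Is Nothing
            Dim row As DataRow = table.NewRow()
            row("RawLine") = line
            table.Rows.Add(row)
            line = reader.ReadLine()
        End While
        reader.Close()

        ' Second pass: parse each buffered row and insert it
        ' over the one open connection.
        Dim r As DataRow
        For Each r In table.Rows
            ParseAndInsert(CStr(r("RawLine")))
        Next
    End Sub

    Sub ParseAndInsert(ByVal raw As String)
        ' Placeholder for the existing parse + INSERT logic.
    End Sub
End Module
```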
 
