Read CSV


Guest

What do you think would be the best way to read a CSV file containing
thousands of records, validate those records, and transfer them to a SQL
Server database? I mean, should I use a StreamReader and then a
DataSet/DataTable, or something else?
 
PK,

Does this only have to be done once, or multiple times by uploading the file
to the web? If it only has to be done once, use SQL Server itself. It has
tools built in for importing CSV files.
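For the one-off case, something like this BULK INSERT, run here from VB.NET for consistency, would do it. The connection string, file path, and table name are placeholders; note that BULK INSERT reads the file from the SQL Server machine's disk, not the client's, and the login needs bulk-load rights:

Imports System.Data.SqlClient

Module OneOffImport
    Sub Main()
        ' Placeholder connection string; adjust server, database, and security.
        Using conn As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI;")
            conn.Open()
            Dim sql As String = _
                "BULK INSERT dbo.ImportTable " & _
                "FROM 'C:\data\records.csv' " & _
                "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)"
            Using cmd As New SqlCommand(sql, conn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module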

If you need to upload the file to a website and then convert it from there,
I would use a solution similar to this one found on Experts Exchange:

http://www.experts-exchange.com/Databases/Microsoft_SQL_Server/Q_21700652.html?qid=21700652#15743678

(Register on the site, for free, to view the solution)

--
Sincerely,

S. Justin Gengo, MCP
Web Developer / Programmer

www.aboutfortunate.com

"Out of chaos comes order."
Nietzsche
 
DTS might not be an option. We might have to do everything in a VB.NET
class/application. So in that case I wanted to know what is best to use:
a StreamReader, a DataReader, or a DataSet, considering there are
validations too.
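Something like this is what I had in mind with the StreamReader option: read one line at a time, validate it, and insert through a parameterized command, so only the current row is ever in memory. The table, columns, and validation rule here are made up for the sketch, and the naive Split won't handle quoted commas:

Imports System.IO
Imports System.Data
Imports System.Data.SqlClient

Public Class CsvImporter
    Public Sub Import(ByVal path As String, ByVal connStr As String)
        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using cmd As New SqlCommand( _
                "INSERT INTO dbo.Records (Name, Amount) VALUES (@name, @amount)", conn)
                cmd.Parameters.Add("@name", SqlDbType.VarChar, 100)
                cmd.Parameters.Add("@amount", SqlDbType.Decimal)
                Using sr As New StreamReader(path)
                    Dim line As String = sr.ReadLine()
                    While line IsNot Nothing
                        Dim fields As String() = line.Split(","c)
                        Dim amount As Decimal
                        ' Hypothetical validation: exactly two fields,
                        ' and the second must parse as a decimal.
                        If fields.Length = 2 AndAlso Decimal.TryParse(fields(1), amount) Then
                            cmd.Parameters("@name").Value = fields(0)
                            cmd.Parameters("@amount").Value = amount
                            cmd.ExecuteNonQuery()
                        End If
                        line = sr.ReadLine()
                    End While
                End Using
            End Using
        End Using
    End Sub
End Class

Row-by-row inserts are the slow part; if .NET 2.0 is available, the same loop could feed SqlBulkCopy instead for much better throughput.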
 
P K said:
What do you think would be the best way to read a CSV file containing
thousands of records, validate those records, and transfer them to a SQL
Server database? I mean, should I use a StreamReader and then a
DataSet/DataTable, or something else?

I did something like this last year.

I allowed the user to upload the CSV,
then opened the file and pushed all the info into a table,
asked the user to specify which fields were which,
then cleaned, transformed, and pushed the data into the tables the app can use.
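The load step looked much like the StreamReader sketch above, just into a loose all-varchar staging table. The nice part is that once the raw rows are in the database, the clean/transform/push step can be a single set-based SQL statement, roughly like this (table names, column mapping, and the crude email filter are invented for the example):

Imports System.Data.SqlClient

Public Class StagingTransform
    ' After the user has said which staging column is which field,
    ' clean and move the rows with one set-based statement.
    Public Sub MoveToLive(ByVal conn As SqlConnection)
        Dim sql As String = _
            "INSERT INTO dbo.Customers (Name, Email) " & _
            "SELECT LTRIM(RTRIM(Col1)), LTRIM(RTRIM(Col2)) " & _
            "FROM dbo.Staging WHERE Col2 LIKE '%@%'"
        Using cmd As New SqlCommand(sql, conn)
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Class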
 
You might look at the parser I sell, http://www.csvreader.com . It
allows for bulk importing of data directly into SQL Server at almost
DTS speeds using native SQL bulk loading. You can also validate the
data, manipulate the data, skip rows, and run inside a transaction. And
you don't need dbo rights on the db, just insert rights on the table
being imported into. I developed all this specifically for situations
like yours.

Bruce Dunwiddie
 
This sure looks cool, but I wanted to know the load on memory when the
dataset would have thousands of records which need to be
manipulated/validated and then written to SQL.
 
It's stream based, so the load on memory is only as large as the largest
row of data in your file, or 4 KB, whichever is larger, because that's the
initial size of the data buffer. The data is manipulated and validated on
its way into SQL. There are no DataSets involved.

Bruce Dunwiddie
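You can see the same behavior with the framework's own classes: a StreamReader with an explicit buffer only ever holds that buffer plus the current line, no matter how big the file is. The 4 KB below just mirrors the figure above, and the path is a placeholder:

Imports System.IO
Imports System.Text

Module StreamingDemo
    Sub Main()
        ' Explicit 4 KB buffer; only the buffer and the current line
        ' are ever held in memory, never the whole file.
        Using sr As New StreamReader("C:\data\records.csv", Encoding.Default, True, 4096)
            Dim line As String = sr.ReadLine()
            While line IsNot Nothing
                ' Validate/manipulate the row here and send it on to SQL;
                ' nothing accumulates in a DataSet.
                line = sr.ReadLine()
            End While
        End Using
    End Sub
End Module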
 
