CSV Import

Jim

I have an asp.net (vb) application in which users upload csv files to a
SQL Server 2000 database.


I've been using SQL Server DTS and stored procs to do this; however, I'd
like the solution to be a bit more flexible.


I've developed code to pull the csv file into a dataset (a datatable
would work as well) using
OleDbConnection/OleDbCommand/OleDbDataAdapter; however, I am unsure how
to insert the data into an existing SQL Server table.


So, I've got....


Dim strFileName As String = "MyFile.csv"
Dim strPath As String = "C:\"
Dim strCon As String = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
    "Data Source=" & strPath & ";" & _
    "Extended Properties='text;HDR=Yes;FMT=Delimited'"
Dim con As New OleDbConnection(strCon)
Dim cmdGetCsv As New OleDbCommand("SELECT * FROM " & strFileName, con)
Dim csvAdapter As New OleDbDataAdapter(cmdGetCsv)
'Dim csvReader As OleDbDataReader
con.Open()


Dim dsCsv As DataSet = New DataSet


csvAdapter.Fill(dsCsv)


....


Clearly I now need to make a connection to my SQL db but how do I
insert all the records from the dataset?


The csv file could contain thousands of records. Are there performance
issues that would dictate one method over another?


Thanks in advance for any assistance.
 
Guest

If you are going the DataSet route, you can grab the XML and send it to SQL
Server. In 2000 and 2005, SQL Server gives you the option of using XML as an
input and a simple method of parsing the XML, as a "table", for easy insert.
If you find the DataSet format a bit hard to work with, you can run an XSLT
transformation to mold the DataSet XML differently. This is one option.
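To make the XML route concrete, here is a rough sketch (assuming the dsCsv DataSet from the earlier code, System.Data.SqlClient imported, and placeholder names for the connection string, stored proc, table, and columns):

```vbnet
' Sketch: ship the DataSet's XML to SQL Server 2000 and shred it with
' OPENXML inside a stored procedure.
Dim sqlCon As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")
Dim xml As String = dsCsv.GetXml()

Dim cmd As New SqlCommand("usp_ImportCsvXml", sqlCon)  ' hypothetical proc
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.Add("@xml", SqlDbType.NText).Value = xml
sqlCon.Open()
cmd.ExecuteNonQuery()
sqlCon.Close()

' -- Inside usp_ImportCsvXml the T-SQL would look roughly like:
' DECLARE @h int
' EXEC sp_xml_preparedocument @h OUTPUT, @xml
' INSERT INTO MyTable (Col1, Col2)
'     SELECT Col1, Col2
'     FROM OPENXML(@h, '/NewDataSet/Table1', 2)
'          WITH (Col1 varchar(50), Col2 varchar(50))
' EXEC sp_xml_removedocument @h
```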

Another DataSet option is to create an empty DataSet against the database
table(s) you are inserting into. Create the CSV DataSet and move the info
from the CSV DataSet to the database DataSet. Run Update() on the DataAdapter
and let ADO.NET handle the update for you.
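That approach might look something like this (a sketch assuming the dsCsv DataSet from the earlier code, that the CSV column names match the table's, and placeholder connection-string and table names):

```vbnet
' Sketch: copy rows from the CSV DataSet into a DataSet bound to the
' target table, then let the DataAdapter generate the INSERTs.
Dim sqlCon As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")
Dim dbAdapter As New SqlDataAdapter("SELECT * FROM MyTable", sqlCon)
Dim builder As New SqlCommandBuilder(dbAdapter) ' builds the InsertCommand

Dim dsDb As New DataSet()
dbAdapter.FillSchema(dsDb, SchemaType.Source, "MyTable") ' schema only, no rows

For Each csvRow As DataRow In dsCsv.Tables(0).Rows
    Dim newRow As DataRow = dsDb.Tables("MyTable").NewRow()
    For Each col As DataColumn In dsCsv.Tables(0).Columns
        newRow(col.ColumnName) = csvRow(col.ColumnName) ' assumes matching names
    Next
    dsDb.Tables("MyTable").Rows.Add(newRow)
Next

dbAdapter.Update(dsDb, "MyTable") ' one INSERT per added row under the covers
```

Note that Update() issues a round trip per row, which is worth keeping in mind for files with thousands of records.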

DTS is still an option. It is quite easy to use Package2Class to run a DTS
package against your file. The downside here is that it is more time-consuming
to tailor the DTS package for differing file names.

You can also get away from the DataSet and use a StreamReader to read line
by line. You then call a custom SQL statement or Procedure to do the insert
for you.
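The StreamReader route might look like this (a sketch with placeholder connection-string, file, table, and column names; the naive Split() parse won't handle quoted fields containing commas):

```vbnet
' Sketch: read the file line by line and run a parameterized INSERT per row.
Dim sqlCon As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")
Dim reader As New StreamReader("C:\MyFile.csv")
Try
    reader.ReadLine() ' skip the header row (HDR=Yes)
    sqlCon.Open()
    Dim line As String = reader.ReadLine()
    While Not line Is Nothing
        Dim fields() As String = line.Split(","c)
        Dim cmd As New SqlCommand( _
            "INSERT INTO MyTable (Col1, Col2) VALUES (@p1, @p2)", sqlCon)
        cmd.Parameters.Add("@p1", SqlDbType.VarChar).Value = fields(0)
        cmd.Parameters.Add("@p2", SqlDbType.VarChar).Value = fields(1)
        cmd.ExecuteNonQuery()
        line = reader.ReadLine()
    End While
Finally
    reader.Close()
    sqlCon.Close()
End Try
```

Wrapping the loop in a SqlTransaction would make the import all-or-nothing and can speed up large files.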

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************
 
Jim

I use DTS now, which works great, and I can pass the filename in using a
dynamic property object and a stored proc.

The XML transformation sounds a little messy. BULK INSERT is an option.
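For reference, BULK INSERT can be issued from ADO.NET as an ordinary command (a sketch with placeholder names; note the file path is resolved on the SQL Server machine, not the web server, and the login needs bulk-load permissions):

```vbnet
' Sketch: run a T-SQL BULK INSERT against the uploaded file.
Dim sqlCon As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")
Dim bulkSql As String = _
    "BULK INSERT MyTable " & _
    "FROM 'C:\MyFile.csv' " & _
    "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)"

Dim cmd As New SqlCommand(bulkSql, sqlCon)
sqlCon.Open()
cmd.ExecuteNonQuery()
sqlCon.Close()
```

BULK INSERT is by far the fastest of the options discussed here for thousands of rows, but it offers no per-row field mapping, so the file layout must match the table.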

Ultimately I'd like to have the most flexible solution for defining the
field mappings so that the end user could import files with different
fields. DTS seems to be rigid in this regard.

Dataset is an option but I am concerned about performance.

Thanks for the replies.
 
