Bulk Insert Using SqlCommand


hharry

Hello All,

I am running a bulk load from a flat file to SQL Server using a
SqlCommand object. I have an issue when the load fails because a row
is longer than specified in the format file - the command fails and
exits.

Is it possible to skip errors and continue loading until the end of file?
I have tried setting the MAXERRORS option, but I get the same result.

Source code:

str_sql.Append("BULK INSERT " + MYTABLE + vbCrLf)
str_sql.Append("FROM '" + MYFLATFILE + "'" + vbCrLf)
str_sql.Append("WITH (FORMATFILE = '" + bcpFile + "'," + vbCrLf _
    + "BATCHSIZE = 1000," + vbCrLf _
    + "MAXERRORS = 1000" + vbCrLf _
    + ")")
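For reference, if MYTABLE, MYFLATFILE, and bcpFile held values such as dbo.MyTable, C:\data\load.txt, and C:\data\load.fmt (hypothetical names, just to illustrate), the statement this code sends to the server would look like:

```sql
BULK INSERT dbo.MyTable
FROM 'C:\data\load.txt'
WITH (FORMATFILE = 'C:\data\load.fmt',
BATCHSIZE = 1000,
MAXERRORS = 1000
)
```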


Thanks in advance
 

Guest

With the bulk load API, there are certain types of errors that cannot be
overcome, even if you specify a large value for the number of errors that
are acceptable. It appears an incorrect line length is one of those.

If you want exception handling while loading, DTS gives you the greatest
number of options. With DTS, you can build a multi-phase pump that handles
any problems encountered.

If DTS is not an option, consider ripping through the file first and pitching
any lines that are the wrong length (to an exception file?). This adds a bit of
time to the solution, but at least you get the data in. Note that the stream
readers and writers in .NET are very fast. If you are really concerned with
speed, you have the option of moving down to byte arrays, which can trim a
lot of time off the cleansing, but that is less maintainable, as it requires
a deeper understanding than using a StreamReader.
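A minimal sketch of that pre-filtering pass, assuming fixed-length rows and hypothetical file paths and row length (adjust the length check to match whatever your format file expects):

```vb
' Hypothetical values - replace with your own paths and expected row length.
Const ROW_LENGTH As Integer = 120

Dim reader As New IO.StreamReader("C:\data\load.txt")
Dim cleanWriter As New IO.StreamWriter("C:\data\load_clean.txt")
Dim errorWriter As New IO.StreamWriter("C:\data\load_errors.txt")

Dim line As String = reader.ReadLine()
While line IsNot Nothing
    ' Keep rows of the expected length; pitch the rest to the exception file.
    If line.Length = ROW_LENGTH Then
        cleanWriter.WriteLine(line)
    Else
        errorWriter.WriteLine(line)
    End If
    line = reader.ReadLine()
End While

reader.Close()
cleanWriter.Close()
errorWriter.Close()
```

You would then point the BULK INSERT at the clean file, and the exception file gives you the rejected rows to inspect or repair later.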

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************
 
