Is 40,000 rows too much for XML output from ADO.NET?

Rich

Hello,

I am trying to push 40,000 rows of data from an ADO.NET DataTable to a
SQL Server table. It takes me about 5 minutes to read the data from an
external data source (using a VB.NET project), but it takes over 10
minutes to write the data to SQL Server using
dataAdapter.Update(dataset1, "myTable"). I was thinking maybe I could
write the data out to disk as an XML file and use DTS in SQL Server to
retrieve it (I also considered writing out a plain text file, but there
are over 100 fields/columns, some of which are ntext - very hard to
delimit with a single delimiter; I would need 3 characters to delimit
columns). Would this be more efficient than dataAdapter.Update(...) for
40,000 or more rows? My other idea is to loop through the DataTable and
use a command object to insert each row into the SQL Server table, row
by row. Any suggestions appreciated.

TIA,
Rich
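
(For reference, dumping the DataSet to disk as XML for DTS to pick up is
essentially a one-liner; a sketch, with a hypothetical file path:)

' writes the data plus an inline schema so column types survive the trip
dataset1.WriteXml("C:\temp\myTable.xml", XmlWriteMode.WriteSchema)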
 
David Browne

Rich said:
I am trying to push 40,000 rows of data from an ADO.NET DataTable to a
SQL Server table. It takes me about 5 minutes to read the data from an
external data source (using a VB.NET project), but it takes over 10
minutes to write the data to SQL Server using
dataAdapter.Update(dataset1, "myTable"). I was thinking maybe I could
write the data out to disk as an XML file and use DTS in SQL Server to
retrieve it (I also considered writing out a plain text file, but there
are over 100 fields/columns, some of which are ntext - very hard to
delimit with a single delimiter; I would need 3 characters to delimit
columns). Would this be more efficient than dataAdapter.Update(...) for
40,000 or more rows?
Yes.

My other idea is to loop through the DataTable and use a command object
to insert each row into the SQL Server table, row by row.

That is what the DataAdapter is doing.

Try wrapping your update in a SqlTransaction. If you don't, SQL Server
must commit each row as it is inserted. It is more efficient to let SQL
Server write all 40,000 rows and commit once at the end. Moreover, if
something goes wrong during the load, you get a tidy rollback.
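
Something like this (a sketch; it assumes your open SqlConnection is
named conn and the adapter's commands are already configured):

Dim Trans As SqlTransaction = conn.BeginTransaction()
Try
    ' every command the adapter executes must be enlisted in the transaction
    dataAdapter1.InsertCommand.Transaction = Trans
    dataAdapter1.Update(dataset1, "myTable")
    Trans.Commit()   ' one commit for all 40,000 rows
Catch ex As Exception
    Trans.Rollback() ' undo the partial load
    Throw
End Try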

David
 
Rich

Thanks for your reply. Truth is, I just recently started using .NET.
May I ask how to set up a SqlTransaction? Do I do this in
dataAdapter.Update(...sqlTransaction...)? I confess that I am not
familiar with it, but it does sound like a solution.

Thanks,
Rich
 
Rich

I looked up SqlTransaction on Google to get some ideas. I am also using
a CommandBuilder object to build my commands for the dataAdapter - the
CommandBuilder is actually building my Insert command. So would
something like this be on the right track?

Dim Trans As SqlTransaction
dataAdapter1.InsertCommand.Transaction = Trans
....
dataAdapter1.Update(dataset1, "myTbl")
Trans.Commit()
 
Guest

I think that the DataAdapter has a lot of overhead. It might be
worthwhile to investigate building the SQL string yourself and executing
it with the DbCommand class.
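
For instance, a sketch of manual batching (hypothetical table myTbl with
a single text column notes; assumes Imports System.Data.SqlClient and an
open SqlConnection named conn; SQL Server runs the whole string as one
batch, so it is a single round trip - in practice you would flush it in
chunks of a few hundred rows):

Dim sb As New System.Text.StringBuilder()
Dim row As DataRow
For Each row In dataset1.Tables("myTbl").Rows
    ' double any embedded apostrophes so the literal stays valid
    sb.Append("INSERT INTO myTbl (notes) VALUES (N'" & _
              row("notes").ToString().Replace("'", "''") & "'); ")
Next
Dim cmd As New SqlCommand(sb.ToString(), conn)
cmd.ExecuteNonQuery()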
 
David Browne

Rich said:
I looked up SqlTransaction on Google to get some ideas. I am also using
a CommandBuilder object to build my commands for the dataAdapter - the
CommandBuilder is actually building my Insert command. So would
something like this be on the right track?

Dim Trans As SqlTransaction
dataAdapter1.InsertCommand.Transaction = Trans
...
dataAdapter1.Update(dataset1, "myTbl")
Trans.Commit()

If you use the CommandBuilder, you need to get it to create the Insert
command and then enlist that command in the transaction:

Dim Trans As SqlTransaction = conn.BeginTransaction()

Dim cb As New SqlCommandBuilder(dataAdapter1)

' GetInsertCommand returns the INSERT the CommandBuilder generated;
' enlist it in the transaction before calling Update.
dataAdapter1.InsertCommand = cb.GetInsertCommand()
dataAdapter1.InsertCommand.Transaction = Trans
...
dataAdapter1.Update(dataset1, "myTbl")
Trans.Commit()


David
 
Rich

Well, I got the SqlTransaction thing to work, but for 10,000 records it
took me 70 seconds to pull the records and 124 seconds to write them to
SQL Server. Currently I am using a VB6 app to pull the data and write it
to SQL Server using COM ADO. The data-pulling part is much faster in
ADO.NET because I can just write to memory, but writing from the .NET
DataTable to SQL Server seems to take too long. COM ADO appears to be
faster overall. If I could use DTS to pull the data from the DataTable,
that would really be cool (is that doable?).

I will try using just a command object and looping through the DataTable
to see if that is any faster than dataAdapter.Update(...). I really like
.NET, but I guess I will have to work with it for a while before I can
figure out how to get the best performance from it.

Rich
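
(A sketch of that command-object loop, with hypothetical columns id and
notes and an open SqlConnection named conn; the parameterized command is
built once and only the parameter values change per row:)

Dim cmd As New SqlCommand( _
    "INSERT INTO myTbl (id, notes) VALUES (@id, @notes)", conn)
cmd.Parameters.Add(New SqlParameter("@id", SqlDbType.Int))
cmd.Parameters.Add(New SqlParameter("@notes", SqlDbType.NText))

Dim Trans As SqlTransaction = conn.BeginTransaction()
cmd.Transaction = Trans

Dim row As DataRow
For Each row In dataset1.Tables("myTbl").Rows
    cmd.Parameters("@id").Value = row("id")
    cmd.Parameters("@notes").Value = row("notes")
    cmd.ExecuteNonQuery()
Next
Trans.Commit()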
 
Angel Saenz-Badillos[MS]

Rich,
As you have discovered, we are missing proper bulk-update functionality
in the current release of ADO.NET; this is one of the areas where we
have done the most work for our next release. We will include a way to
batch updates from your adapter instead of updating row by row, and, if
you are using SQL Server, a SqlBulkUpdate class that will do exactly
what you want: take data from a DataTable and write it to the server in
the fastest possible way.

Until the next release I would recommend using DTS for any bulk updates
to SQL Server. If you need to work with a different backend, using a
local transaction and manually batching your commands (assuming your
backend supports batching) is probably your best bet.
 
Rich

Thanks for the reply. This is enlightening (and encouraging). In the
meantime, I am experimenting with the ADO.NET command object. One nice
thing about ADO.NET is being able to read large quantities of data into
memory tables (DataTables); the hassle is then writing that data to SQL
Server. I have a feeling the ADO.NET command object is faster than the
COM ADO Recordset object. I am just having a hassle right now with
apostrophes in my data (yes, I know about doubling them to ''; for some
reason I am still having a problem - I just have to work it out).

Rich
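
(Worth noting: the apostrophe problem disappears entirely with a
parameterized command, because the value is never spliced into the SQL
text; a minimal sketch, with a hypothetical column notes:)

Dim cmd As New SqlCommand("INSERT INTO myTbl (notes) VALUES (@notes)", conn)
cmd.Parameters.Add(New SqlParameter("@notes", SqlDbType.NText))
cmd.Parameters("@notes").Value = "O'Brien's data"   ' no '' escaping needed
cmd.ExecuteNonQuery()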
