More efficient way to insert a dataset in .NET 2.0?

Guest

I always wished there was an efficient way to insert a dataset into a SQL
Server DB in .NET 1.0 rather than just iterating and doing an insert each
time. Is there any method in the new framework that will do this? Thanks!
 
Sahil Malik [MVP]

You can do this in the old .NET Framework using SqlXml (interoped) - refer to
Chapter 12 of my first book.

In .NET 2.0, you can do this using either SqlXml (managed), or you can set the
UpdateBatchSize property on the data adapter to batch up a number of commands
and prevent multiple hits to the database. This is explained in Chapters 9, 10,
and 11 of my upcoming ADO.NET 2.0 book.
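
For reference, the UpdateBatchSize route looks roughly like this - a sketch only, where "TableName", the connection string, and the batch size of 100 are placeholders:

```vb
' Sketch: batched inserts via SqlDataAdapter.UpdateBatchSize (ADO.NET 2.0).
Dim conn As New SqlConnection(connectionString)
Dim adap As New SqlDataAdapter("select * from TableName where 0 = 1", conn)
Dim bldr As New SqlCommandBuilder(adap)

adap.InsertCommand = bldr.GetInsertCommand()
' Batched commands may not try to refresh the row after executing.
adap.InsertCommand.UpdatedRowSource = UpdateRowSource.None
adap.UpdateBatchSize = 100 ' commands per round trip; 0 = as many as possible

adap.Update(ds) ' one database round trip per batch instead of per row
```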

- Sahil Malik [MVP]
http://codebetter.com/blogs/sahil.malik/
 
Mythran

Stefan Wrobel said:
I always wished there was an efficient way to insert a dataset into a SQL
Server DB in .NET 1.0 rather than just iterating and doing an insert each
time. Is there any method in the new framework that will do this?
Thanks!

There is in 1.1...

Dim conn As SqlConnection
Dim selectCmd As SqlCommand
Dim bldr As SqlCommandBuilder
Dim adap As SqlDataAdapter
Dim ds As DataSet = FillDataSet() ' Fills dataset with new rows.

conn = New SqlConnection(connectionString)
conn.Open()

selectCmd = New SqlCommand("select * from TableName where 0 = 1", conn)
adap = New SqlDataAdapter(selectCmd)
bldr = New SqlCommandBuilder(adap)

adap.InsertCommand = bldr.GetInsertCommand()

adap.Update(ds)
conn.Close()

The above code was written off the top of my head so there may be a syntax
error or something, but I believe it is close enough for you to grab info
from :)

Anywho, that's the .Net 1.1 way (afaik).

Mythran
 
Guest

Awesome, thanks for the response, I searched for a few hours one day and
couldn't find anything, so I assumed the functionality didn't exist.
 
Poohface

Yes but at least this is an actual answer for the guy and an attempt
at helping him rather than plug'n a book. Refer to my finger next to
index.
 
Sahil Malik [MVP]

Poohface, alright, I'll have to argue your point now. The original question
was

"I always wished there was an efficient way to insert a dataset into a SQL
Server DB in .NET 1.0 rather than just iterating and doing an insert each
time. Is there any method in the new framework that will do this? "

How in the world is DataAdapter.Update an answer to this question?

- Sahil Malik [MVP]
http://codebetter.com/blogs/sahil.malik/
 
Cor Ligthert

Sahil,
I think he is looking for a method that doesn't execute a SQL statement for
every changed row in the datatable.
DataAdapter.Update masks it for you, but internally it is still row by
row - and thus inefficient.

Can you think of a solution (leaving programming languages aside) for what you
propose?

Every instruction is executed by a processor one at a time, and in a database
the previous command can always influence the next (think, for instance, of
autoincrement).

So this would need a solution with optimistic multiprocessing. As far as I know,
optimistic multiprocessing is still not completely out of the laboratories, so
before these kinds of common, rather fast-to-process problems use those
techniques, I think we will have to wait at least 10 years.

(I count a multithreading processor as at least 2 processors.)

However just my thought,

Cor
 
Sahil Malik [MVP]

Cor,

The problem with executing every command one by one is the network latencies
you have to fight for each row that is changed. So if the data adapter is
working with a dataset that has 3 changed rows, the data adapter will make 3
trips to the database.

By using SqlXml, you can batch those 3 rows in one UpdateGram, and by using
UpdateBatchSize, you can batch the commands and send in the updates as one
single network round trip.

Even though commands might still be executed sequentially, the multiple
network roundtrips amount to a significant saving of time.
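
For the curious, an UpdateGram batching those 3 inserts looks roughly like this - a sketch, where "TableName" and the columns are placeholders (an empty before-block with rows in the after-block means insert):

```xml
<ROOT xmlns:updg="urn:schemas-microsoft-com:xml-updategram">
  <updg:sync>
    <updg:before />
    <updg:after>
      <TableName Col1="a" Col2="1" />
      <TableName Col1="b" Col2="2" />
      <TableName Col1="c" Col2="3" />
    </updg:after>
  </updg:sync>
</ROOT>
```

All three rows travel to the server in a single payload.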

- Sahil Malik [MVP]
http://codebetter.com/blogs/sahil.malik/
 
Cor Ligthert

Sahil,

Now I understand you and partially agree with you.

Will it really gain much performance? Most networks use Ethernet, which mostly
sends relatively small packets anyway.

Keeping all of this in mind leads, in my view, only to a strange twist on the
20/80 rule I often use: don't try to optimize the part that is not used so
often.

However, I have another rule, probably used even more in the past where a
network is involved: "don't try to optimize a part that is not the narrowest
part of the pipeline".

But another principle I often apply is that the user needs his natural moments
of rest. The update is, in my opinion, such a moment (the fill is not).

However just my thoughts,

(A nice discussion by the way)

:)

Cor
 
Sahil Malik [MVP]

Cor, it actually makes a TONNE of difference.

Let me explain how.

Okay, in 1995, when I was sitting here and I'd ping a server in Tokyo, it'd
take me about 1/2 a second. If I tried downloading a 30 MB file, it'd take
me an hour.
Today, if I ping a server in Tokyo, it still takes me 1/2 a second. But if I
try downloading a 30 MB file, it streams through in 2 minutes or less.

Also, in the future, while network download speeds continue to improve, network
latencies will stall (unless we can improve the speed of signal propagation,
which I believe in copper is about 0.667C, where C is the speed of light).
Optical fibers carry light at roughly 0.67C as well - fast, but still finite.
In short, every medium an EM wave travels through gives a speed of KC, where K
is a coefficient always less than 1. I know some guy will argue that
electricity is not an EM wave but electrons pushing each other to transmit a
finite charge, causing a resulting perpendicular magnetic field - but that is
everything typical of an EM wave.

Plus a MAJOR part of the latency comes from semiconductor transistors
switching over digital circuits, which is a painfully slow process.

So, in short - when you batch commands, they go through in almost the same
time, or with not much difference. But when you do not batch and go over the
network again and again, you are paying the latency - which you will not be
able to reduce in the near future - on every single datarow :). At 1/2 a second
per round trip, 1,000 unbatched inserts spend roughly 500 seconds on latency
alone; one batched round trip pays that cost once.

So yes, it makes a TONNE of difference. You should always try to make your
conversations on a distributed network chunky, not chatty.

The bigger question is - what has an EM wave discussion got to do with
ADO.NET? LOL :).

BTW - try out the UpdateBatchSize thing in ADO.NET 2.0 and run a few
performance metrics yourself on a LAN connection (remember to involve at least
two computers for a pronounced effect, though I think you might be able to see
it even on one computer).

- Sahil Malik [MVP]
http://codebetter.com/blogs/sahil.malik/
 
W.G. Ryan MVP

What is your problem, man? Sahil's answer WAS CORRECT and hardly a plug for
his book. But even assuming it was, where do you come off with your comment
about "my finger next to index"?
 
Sahil Malik [MVP]

Now *THAT* is a tough question :)

- Sahil Malik [MVP]
http://codebetter.com/blogs/sahil.malik/


Miha Markic said:
Why would anyone ping a server in Tokyo?
;-)

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
SLODUG - Slovene Developer Users Group www.codezone-si.info

Sahil Malik said:
<snip>
 
Cor Ligthert

Sahil,

Do you know that nice song (not originally his, though), best known as
performed by Harry Belafonte: "There's a Hole in the Bucket, dear Liza"? I
often like to use it as an example.

So see my message before this one. Then, as the narrowest part of the pipeline,
we come back to the processing. And I already tried to explain that with the
fact that there are a lot of dependencies in a database. I agree with you that
things can be improved (slightly) when a complete batch is processed in one go.
However, in my opinion, that should not directly give quicker access measured
over the total time.

In addition, is this not in fact the same question as avoiding concurrency
errors, which can be improved with pessimistic concurrency? How often will
this take place, and what will then be the total benefit in overall time (if
it is not even a disadvantage when measured)?

Just my thought,

Cor
 
Mythran

Sahil Malik said:
<snip>

I simply gave an answer so that the original poster does not have to
manually iterate through each row. . . and I don't mind being rebutted
about it.....even if Sahil does make comments referring to his book...it's
his right...he wrote it :)

Mythran
 
