Performance of ADO.NET dataset

Abdul Malik Said

Does anyone know the best way to run diagnostics on the data operations
performed by an ADO.NET DataSet? I am interested in measuring their speed so
I can compare it with performing the same operations on the same data
directly in a SQL database.

Any thoughts on the architecture of DB solutions that use ADO.NET DataSets?
I suspect it is slower to use an ADO.NET DataSet for operations such as
searching for certain keys or joining; SQL Server must be optimised to do
that kind of operation better. But I am not an expert. I am interested in
the fastest way to make complex calculations on data and then update the
database. Is it better to put the database operations in stored procs and
call them from the .NET code, or to perform the operations in a DataSet and
then update the SQL tables afterward?

Is there a way to do a bulk insert from an ADO.NET dataset? By this, I mean
a true bulk insert like in SQL Server, instead of looping through the
dataset to insert updated records.

Any discussion would be greatly appreciated.

Abdul Malik Said
 

Cowboy \(Gregory A. Beamer\) [MVP]

SQL Server will run circles around it. Having said that, perf is not the
only issue. With DataSets you end up with a very maintainable,
easy-to-understand system, and that is worth something.

To increase perf, here are some suggestions:

1. Use readers when you are displaying only. Reports are great for readers.
2. Consider strongly typed DataSets, as you have less chance of wasting
cycles figuring out data types.
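The first suggestion above can be sketched as follows; the connection string and table are hypothetical placeholders:

```csharp
using System;
using System.Data.SqlClient;

class ReaderExample
{
    static void Main()
    {
        // Hypothetical connection string and table -- adjust for your server.
        string connString =
            "Server=(local);Database=Northwind;Integrated Security=SSPI;";
        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlCommand cmd = new SqlCommand(
                "SELECT OrderID, CustomerID FROM Orders", conn);
            conn.Open();
            // SqlDataReader is forward-only and read-only: no change tracking
            // and no in-memory table to build, so it is cheaper than filling
            // a DataSet when you only need to display the rows.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}\t{1}",
                        reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}
```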
> Is there a way to do a bulk insert from an ADO.NET dataset?

No. At least not in the current incarnation. If you call Update, it loops.
Of course, you can take control of the update process by using XML to create
an updategram/diffgram type of system. This allows for "bulk" inserts of
sorts, as the faster SQL engine will figure out which pieces to run UPDATEs
on. Under SQL 2005, this gets even better.
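As a follow-on to the SQL 2005 remark: the 2.0 framework adds SqlBulkCopy, which can take a DataTable and stream it to the server over the bulk-load path instead of issuing one INSERT per row. A minimal sketch, with a hypothetical destination table:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void CopyTable(DataTable source, string connString)
    {
        // SqlBulkCopy (.NET 2.0) uses the same bulk-load mechanism as
        // BULK INSERT, rather than looping INSERT statements per row.
        using (SqlBulkCopy bulk = new SqlBulkCopy(connString))
        {
            bulk.DestinationTableName = "dbo.Orders"; // hypothetical target
            bulk.WriteToServer(source);
        }
    }
}
```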

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

************************************************
Think Outside the Box!
************************************************
 

Abdul Malik Said

Thank you very much for your response. I considered the XML technique with
an updategram/diffgram as you suggested, but I was told by a colleague that
this can also make inserts more difficult. Still, I am going to learn more
about that technique, so thanks a lot.

Your other advice, to use strongly typed DataSets and to use readers for
display-only data, is very useful.

I am more knowledgeable about SQL stored procedures, so I would rather just
do everything in SQL, but I need to work with a team that knows more about
.NET and wants to keep the business logic there.

I can see the reasons for keeping business logic in .NET assemblies, but it
seems from this discussion that the data operations themselves are much
faster when done in the database. For the sake of maintainability, would I
be correct in saying that the best architecture puts low-level data
operations in SQL queries, whereas the business logic itself goes into
.NET assemblies?

Abdul Malik Said
 
