DataSet Performance

  • Thread starter Klaus Aschenbrenner

Klaus Aschenbrenner

Hi!

I have a DataTable with 100 rows and 100 columns, and I'm updating each
column in each row. When I call the Update method on the DataAdapter, this
generates 10,000 UPDATE statements.
Is there a better approach that reduces the number of UPDATE statements?
Or how can I handle such large updates with SQL Server?

Thanks

Klaus Aschenbrenner
MVP Visual C#
www.csharp.at, www.anecon.com
http://weblogs.asp.net/klaus.aschenbrenner
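[Editor's note: one option worth mentioning here is ADO.NET 2.0's SqlDataAdapter.UpdateBatchSize, which groups the generated statements into fewer round trips. A minimal sketch, assuming .NET 2.0 and a placeholder connection string and table name:]

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical connection string and table; adjust to your environment.
string connStr = "Server=.;Database=Finance;Integrated Security=SSPI;";

using (SqlConnection conn = new SqlConnection(connStr))
{
    SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM PosVal", conn);
    SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

    DataTable table = new DataTable();
    adapter.Fill(table);

    // ... modify rows here ...

    // Group up to 100 statements per round trip (ADO.NET 2.0 and later).
    adapter.UpdateBatchSize = 100;
    adapter.Update(table);
}
```

Note that batching does not reduce the number of UPDATE statements, only the number of network round trips needed to execute them.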

David Portas

Use a stored procedure to do the update. This sounds very much like you
are using a table as an array though, which isn't generally a good way
to model data in SQL.

Klaus Aschenbrenner

I've already tried it with stored procedures and the performance isn't any
better.
The problem with the data model is that the 100 rows and 100 columns
represent a finance plan, so each column must store additional
information (like formats, formulas, ...).

So I have a table for the rows (called "Position"), and this table references
another table which stores the columns (called "PosVal") of each row. With
this data model the table "PosVal" can reference other tables which contain
the formats, formulas, and so on.

Or is there any other way to model this?

Thanks

Klaus Aschenbrenner
MVP Visual C#
www.csharp.at, www.anecon.com
http://weblogs.asp.net/klaus.aschenbrenner
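[Editor's note: for concreteness, here is a hypothetical sketch of the kind of schema being described. All table, column, and type choices are guesses based on the post, executed here from C# against an assumed connection string:]

```csharp
using System.Data.SqlClient;

// Hypothetical DDL for the Position/PosVal model described above.
// Names and types are assumptions, not the poster's actual schema.
string ddl = @"
CREATE TABLE Position (
    PositionId INT PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL
);
CREATE TABLE PosVal (
    PosValId   INT PRIMARY KEY,
    PositionId INT NOT NULL REFERENCES Position(PositionId),
    ColumnNo   INT NOT NULL,
    Value      DECIMAL(18, 4) NULL,
    FormatId   INT NULL,   -- would reference a Format table
    FormulaId  INT NULL    -- would reference a Formula table
);";

string connStr = "Server=.;Database=Finance;Integrated Security=SSPI;";
using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();
    new SqlCommand(ddl, conn).ExecuteNonQuery();
}
```

In this shape each spreadsheet cell becomes one PosVal row, which is exactly the row-per-cell layout that produces 10,000 single-row UPDATEs.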

David Portas

It seems like you are trying to model an abstraction ("rows", "columns"
and "formulae" from a hypothetical spreadsheet) instead of modelling
the actual data. Isn't your metadata static enough to create a proper
relational representation of it? If not then I suggest you need a
middle tier to present this data. The back end may be largely
irrelevant - I'm not sure what benefit you are hoping to get from
using SQL Server as the data store for this.

If you do have some real data to model, then A) Normalize your tables,
B) post a CREATE TABLE statement and your stored proc. Since your proc
can update an entire row at a time I would have expected 100 updates to
outperform 10,000 but that largely depends on how you are doing the
updates and what your data looks like.

Bob Barrows [MVP]

Klaus said:
Hi!

I have a DataTable with 100 rows and 100 columns, and I'm updating each
column in each row. When I call the Update method on the DataAdapter,
this generates 10,000 UPDATE statements.
Is there a better approach that reduces the number of UPDATE
statements? Or how can I handle such large updates with SQL Server?

Are you using the CommandBuilder to generate the code? It probably makes
more sense to write the code yourself.

You stated in a subsequent reply that you created a stored procedure to do
the update but that it did not improve performance. Could you elaborate on
what the procedure did? I'm assuming you created a procedure that accepted
parameters for each of the 100 columns and did the update for an entire row
at a time, requiring 100 calls to the procedure instead of 10000 calls to a
procedure that did 1 column at a time...
Is that correct?

Bob Barrows
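[Editor's note: the row-at-a-time pattern Bob describes could look like the sketch below. The procedure name and parameters are hypothetical, and only the first few of the 100 column parameters are shown:]

```csharp
using System.Data;
using System.Data.SqlClient;

string connStr = "Server=.;Database=Finance;Integrated Security=SSPI;";

// 'table' is assumed to be the DataTable holding the 100 modified rows.
DataTable table = GetModifiedPlan(); // hypothetical helper

using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand("dbo.UpdatePositionRow", conn); // hypothetical proc
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@PositionId", SqlDbType.Int);
    cmd.Parameters.Add("@Col1", SqlDbType.Decimal);
    cmd.Parameters.Add("@Col2", SqlDbType.Decimal);
    // ... @Col3 through @Col100 ...

    // One call per row: 100 round trips instead of 10,000.
    foreach (DataRow row in table.Rows)
    {
        cmd.Parameters["@PositionId"].Value = row["PositionId"];
        cmd.Parameters["@Col1"].Value = row["Col1"];
        cmd.Parameters["@Col2"].Value = row["Col2"];
        cmd.ExecuteNonQuery();
    }
}
```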

Sahil Malik

Klaus,

You can use SQLXML, which comes with MDAC, to reduce the number of round
trips to your database: you send an updategram instead of many individual
statements. With 2.0 you have a graceful upgrade path to the managed SQLXML
provider, so it's not a dead end.

This is obviously SQL Server specific, but Oracle has equivalent solutions.

- Sahil Malik
http://codebetter.com/blogs/sahil.malik/
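[Editor's note: an updategram describes a before/after image of the data and lets SQL Server apply the whole change set in one round trip. A sketch using the SQLXML managed classes, with a hypothetical connection string and a single PosVal change shown; the element and attribute names are assumptions based on the schema discussed above:]

```csharp
using Microsoft.Data.SqlXml; // ships with SQLXML, not the base framework

string updategram = @"
<ROOT xmlns:updg='urn:schemas-microsoft-com:xml-updategram'>
  <updg:sync>
    <updg:before>
      <PosVal PosValId='1' Value='10'/>
    </updg:before>
    <updg:after>
      <PosVal PosValId='1' Value='20'/>
    </updg:after>
  </updg:sync>
</ROOT>";

// SqlXmlCommand takes an OLE DB-style connection string.
SqlXmlCommand cmd = new SqlXmlCommand(
    "Provider=SQLOLEDB;Server=.;Database=Finance;Integrated Security=SSPI;");
cmd.CommandType = SqlXmlCommandType.UpdateGram;
cmd.CommandText = updategram;
cmd.ExecuteNonQuery();
```

All 10,000 cell changes could be expressed as before/after pairs in one such document and sent in a single call.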

Anthony Thomas

First and foremost, an RDBMS is NOT FOR PERFORMANCE. You gain performance
by using the tool correctly and using its native enhancements. That is not
to say that an RDBMS cannot be fast, on the contrary; however, that is not
its chief purpose.

If performance is your ONLY concern, use a flat file or an XML file.

You use an RDBMS to MODEL THE DATA, so that others can query it in a myriad
of ways and guarantee that their results are accurate. Therefore, you have
to use the RELATIONAL rules to model your data before you build the physical
database, and constrain it in order to provide DATA INTEGRITY. It is this
integrity that you build into an RDBMS. The system is optimized for
performance, but only after providing the foundation: a relational database
properly constrained.

Sincerely,


Anthony Thomas


Cor Ligthert

Klaus,

The rows are updated depending on their RowState.

It can be that the RowState of all rows is set to "Added" when you start,
while you don't want to update them all. (DataAdapter.Fill sets them
automatically to Unchanged unless you have set the AcceptChangesDuringFill
property to false; however, when you load them by hand, for instance using a
DataReader, they are all set to "Added".)

In that case you can set the RowState of all rows to Unchanged by calling
ds.AcceptChanges.

Maybe this helps?

Cor
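[Editor's note: Cor's point can be shown with an in-memory sketch that needs no database. The table and values here are made up for illustration:]

```csharp
using System;
using System.Data;

class RowStateDemo
{
    static void Main()
    {
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Value", typeof(decimal));

        // Rows added by hand (e.g. while looping over a DataReader)
        // start out in the Added state.
        table.Rows.Add(1, 10m);
        table.Rows.Add(2, 20m);
        Console.WriteLine(table.Rows[0].RowState); // Added

        // Mark everything Unchanged so Update() has nothing to send yet.
        table.AcceptChanges();
        Console.WriteLine(table.Rows[0].RowState); // Unchanged

        // Now only the rows you actually edit become Modified,
        // and DataAdapter.Update would send UPDATEs for those alone.
        table.Rows[0]["Value"] = 99m;
        Console.WriteLine(table.Rows[0].RowState); // Modified
    }
}
```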
 