Disconnected model - high memory cost


Guest

Hi all,

I am worrying a bit about the disconnected model, using the drag 'n drop approach. Fine, it generates code for me, but it looks very expensive in terms of memory. I started an application and had a look at it. When the Data Source is added, a DataSet class is created, and associated classes as well.

What I then discovered is that for each Windows Form created with the drag and drop approach, the DataSet and other objects are instantiated. It means that if, say, I have 5 windows opened (MDI application), the DataSet, DataTables, etc. are instantiated 5 times, and the database data is cached 5 times! If the database is large, that means a lot of memory is used. What bugs me is that the datasets may not contain the same data.

I think datasets are fine. Is there some way to use this model, benefit from the generated code, but have a single DataSet shared among Forms?

Thanks
 

Marina Levit [MVP]

Then you need to drag it onto only one form, and manually pass it around to the
other forms.

The designer is just doing what you tell it. You tell it to create a dataset
on each form, so it does.
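
Something along these lines is what I mean. The type names (NorthwindDataSet, CustomersTableAdapter, CustomersForm) are just placeholders for whatever the designer generated in your project, so treat this as a sketch, not as code you can paste in:

using System.Windows.Forms;

// The MDI parent owns the single DataSet and hands the same instance
// to every child form, so the data is only cached once.
public class MainForm : Form
{
    private NorthwindDataSet sharedData = new NorthwindDataSet();
    private CustomersTableAdapter customersAdapter = new CustomersTableAdapter();

    private void ShowCustomers()
    {
        if (sharedData.Customers.Count == 0)
            customersAdapter.Fill(sharedData.Customers);   // filled once, not once per form

        CustomersForm child = new CustomersForm(sharedData);
        child.MdiParent = this;
        child.Show();
    }
}

public class CustomersForm : Form
{
    private BindingSource bindingSource = new BindingSource();

    // The child form no longer creates its own DataSet; it binds to the shared one.
    public CustomersForm(NorthwindDataSet data)
    {
        bindingSource.DataSource = data;
        bindingSource.DataMember = "Customers";
        // customersDataGridView.DataSource = bindingSource;  // wire up controls as usual
    }
}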
 

Cor Ligthert [MVP]

Gilles,

What version are you using? This differs a lot between versions 1.x and 2.0.

Cor
 

Guest

Sorry Cor,

I forgot to mention that I am talking about ADO.NET 2.0. Having functions that generate
the code is fine, but there are some drawbacks:

- for each Form the dataset is instantiated, so data is cached more than once
- the BindingNavigator is great, but not very flexible
- there is so much code that it is quite impossible to modify, and any
modifications could be overwritten by the code generator.

I understand the issues with connected mode, but I would like to fine-tune the
dataset/disconnected-mode pair. I have read quite a good book, "Pro ADO.NET
2.0", that shows where the generated code is lacking, and I am trying to find
a better way to exploit disconnected mode. I would like to see figures which
show the links and flow of data between all the components like DataSets,
DataTables, DataAdapters and so on, so I can come up with my own "model".

For me, a figure is worth a thousand paragraphs of explanation.
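
To make it concrete, this is roughly the kind of hand-written flow I have in mind, with plain ADO.NET 2.0 objects instead of the generated code (the connection string, table and query are placeholders, so take it as a sketch):

using System.Data;
using System.Data.SqlClient;

public static class CustomerData
{
    const string ConnectionString = "...";   // your connection string here

    // DataAdapter -> DataSet/DataTable: Fill opens the connection, runs the
    // SELECT, loads the rows into the table and closes the connection again.
    public static DataSet Load(string country)
    {
        DataSet ds = new DataSet();
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName, Country FROM Customers WHERE Country = @country",
            ConnectionString))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@country", country);
            adapter.Fill(ds, "Customers");
        }
        return ds;
    }

    // DataSet -> database: Update walks the changed rows and issues the
    // INSERT/UPDATE/DELETE commands built by the SqlCommandBuilder.
    public static void Save(DataSet ds)
    {
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName, Country FROM Customers", ConnectionString))
        using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
        {
            adapter.Update(ds, "Customers");
        }
    }
}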
 

Miha Markic [MVP C#]

Hi Gilles,

Gilles Plante said:
Hi all,

I am worrying a bit about the disconnected model, using the drag 'n drop approach. Fine, it generates code for me, but it looks very expensive in terms of memory. I started an application and had a look at it. When the Data Source is added, a DataSet class is created, and associated classes as well.
Right.


What I then discovered is that for each Windows Form created with the drag and drop approach, the DataSet and other objects are instantiated. It means that if, say, I have 5 windows opened (MDI application), the DataSet, DataTables, etc. are instantiated 5 times,

That's fine.

and the database data is cached 5 times!

But you don't cache the entire database, do you? If you do, then you are doing
it all wrong.

If the database is large, that means a lot of memory is used. What bugs me is that the datasets may not contain the same data.

And that is good, too.
I think datasets are fine. Is there some way to use this model, benefit from the generated code, but have a single DataSet shared among Forms?

Generally, sharing the same dataset is the wrong approach. What if you share a
dataset between two forms and one form screws up the data of the other?
You should have a dataset per operation (normally this means per form), with
the data that is required for the operation.
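
As a rough sketch of what I mean (OrdersDataSet, OrdersTableAdapter and FillByOrderId are made-up names, not something the designer gives you by default), an edit form would load only its own operation's rows:

using System;
using System.Windows.Forms;

public class OrderEditForm : Form
{
    private OrdersDataSet data = new OrdersDataSet();
    private OrdersTableAdapter adapter = new OrdersTableAdapter();

    public OrderEditForm(int orderId)
    {
        // A parameterized query added to the TableAdapter keeps the local
        // cache down to the rows this operation actually needs.
        adapter.FillByOrderId(data.Orders, orderId);
    }

    private void saveButton_Click(object sender, EventArgs e)
    {
        adapter.Update(data.Orders);   // only this form's changes go back
    }
}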
 

Guest

Miha,
But you don't cache the entire database, do you? If you do, then you are doing
it all wrong.

I agree. But if you use the drag 'n drop approach, this is "kind of" what
happens. The TableAdapter wizard always generates a Fill method, by default,
that brings back all the data. I know that at design time you may change the
method, but if at run time you wish to get a subset of the data, then you
have to set a filter on the BindingSource, which simply hides part of the
data, the complete table being cached. And regarding caching the complete
database, it looks like the only data cached for a form is the data in the
tables needed for that form.
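
In other words, as far as I can tell it comes down to this (the field names follow the usual designer conventions, and FillByCountry is a hypothetical parameterized query you would have to add to the TableAdapter yourself):

// What the wizard gives you: the whole table is cached locally and the
// BindingSource filter only hides rows on the client side.
customersTableAdapter.Fill(northwindDataSet.Customers);
customersBindingSource.Filter = "Country = 'France'";

// What I would prefer: fill only the subset in the first place.
customersTableAdapter.FillByCountry(northwindDataSet.Customers, "France");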
Generally, sharing the same dataset is the wrong approach. What if you share a
dataset between two forms and one form screws up the data of the other?
You should have a dataset per operation (normally this means per form), with
the data that is required for the operation.

Hmm, I must say this makes some sense. I have always preached that data should
reside in only one place. This is what a database is all about. Now we bring a
copy of the data into our computer, maybe more than one copy. We play with
these copies for as long as we wish, modifying data, creating, deleting, and
it gets persisted only when we hit the Save button. This is against the rule
that the data lives in only one place. In the past, data was saved as soon as
the user moved to another record. All this bugs me. What I meant by "If the
database is large, that means a lot of memory is used. What bugs me is that
the datasets may not contain the same data." is that the two datasets may
contain different values for the same records.

Maybe a proper way to look at this would be to look at the Tables in ADO.NET
2.0 as a new kind of RecordSet. I would like to have more control over what
is going on. Right now, with the drag 'n drop approach, everything is tied
together in a large amount of generated code. Instantiating the form
automatically fills the fields, and you don't have control over the behaviour
of the BindingNavigator.

I would like to set the Fill method at run time and then fill the cache,
much like we did with the good old RecordSet, where we were in complete
control of the process. I would also like to write the code differently for
the BindingNavigator. Taking care of exceptions is not easy.
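
For example, I would rather write the navigator's Save handler myself, something along these lines (the field and handler names are only the sort the designer generates, adjust them to your project; this assumes using System.Data and System.Data.SqlClient):

private void bindingNavigatorSaveItem_Click(object sender, EventArgs e)
{
    try
    {
        this.Validate();                           // push pending edits from the controls
        this.customersBindingSource.EndEdit();     // commit the current row to the DataTable
        this.customersTableAdapter.Update(this.northwindDataSet.Customers);
    }
    catch (DBConcurrencyException)
    {
        MessageBox.Show("Somebody else changed this row. Reloading.");
        this.customersTableAdapter.Fill(this.northwindDataSet.Customers);
    }
    catch (SqlException ex)
    {
        MessageBox.Show("Save failed: " + ex.Message);
    }
}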

Maybe I just need to study the beast more deeply...
 

Grant

You don't have to use ADO.NET just because it is Microsoft's latest
offering. You can still use traditional ADO (by adding a reference
to ADODB) from .NET (with all the advantages and disadvantages of a
connected model). ADO.NET is really designed for applications with
large numbers of users that need to scale well (and so avoid consuming
precious connections). It's not necessarily a good choice for small
desktop applications which will only ever have a few users -
particularly if you want to be able to browse large data tables as you
could with traditional ADO. In this case ADO.NET's disconnected model is
a memory hog and a performance bottleneck.
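
To give you an idea, after adding a COM reference to the Microsoft ActiveX Data Objects library you can do something like the following from C#. The connection string and query are only examples, and the server-side cursor is the point: rows are fetched as you scroll instead of being cached in a DataSet.

ADODB.Connection conn = new ADODB.Connection();
conn.Open("Provider=SQLOLEDB;Data Source=.;Initial Catalog=Northwind;Integrated Security=SSPI",
          "", "", 0);

ADODB.Recordset rs = new ADODB.Recordset();
rs.Open("SELECT CustomerID, CompanyName FROM Customers", conn,
        ADODB.CursorTypeEnum.adOpenKeyset,        // server-side, scrollable cursor
        ADODB.LockTypeEnum.adLockOptimistic, 0);

while (!rs.EOF)
{
    Console.WriteLine(rs.Fields["CompanyName"].Value);
    rs.MoveNext();                                // rows come across as you move
}
rs.Close();
conn.Close();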

The one issue with using traditional ADO from .NET is that you can't
bind .NET controls to ADODB recordsets. Infralution has a product
(Virtual Data Objects) that allows you to do this. You can get more
information at:

www.infralution.com/virtualdata/html

Regards
Grant Frisken
Infralution
 

Cor Ligthert [MVP]

Miha,
Generally, sharing the same dataset is the wrong approach. What if you share a
dataset between two forms and one form screws up the data of the other?
You should have a dataset per operation (normally this means per form),
with the data that is required for the operation.
???

Are you saying that every UI should have its own data layer? I don't believe
that you are saying this; something must have gone wrong in this message.

:)

Cor
 
