Problems/Doubts: Implementing a Disconnected DB Concept (DataSet)

Guest

Hi All

I need to know:

1. Is there an upper limit to the amount of data and the number of tables in a DataSet that I can cache at the client? (Our requirements are such that we need to cache some 120+ tables on the client system in order to avoid round trips to the database, as the number of updates going to the database is huge. DataSets emerge as the clear answer to our requirement of a local (client) database, owing to the rich functionality they provide.) [Note: the application is a desktop (WinForms) application.]
2. I also want to know whether there are any implications to using DataSets in this manner.
3. If the amount of tables and data being cached in DataSets is not an issue, what precautions do we need to take in order to avoid problems such as memory leaks?

You can also pass on some good resources/links on the pros, cons, precautions, and best practices to follow when implementing such a disconnected database approach with DataSets.

Thanks in advance
Saurab
 
Miha Markic [MVP C#]

Hi,

saurabh said:
Hi All,

I need to know:

1. Is there an upper limit to the amount of data and the number of tables in a DataSet that I can cache at the client? (Our requirements are such that we need to cache some 120+ tables on the client system in order to avoid round trips to the database, as the number of updates going to the database is huge. DataSets emerge as the clear answer to our requirement of a local (client) database, owing to the rich functionality they provide.) [Note: the application is a desktop (WinForms) application.]

Limited only by available memory.
2. I also want to know whether there are any implications to using DataSets
in this manner.

Apart from stale data (concurrency issues), none that I know of.
3. If the amount of tables and data being cached in DataSets is not an
issue, what precautions do we need to take in order to avoid problems
such as memory leaks?

Nothing special. The garbage collector will take care of it. Just invoke
Dispose on the DataSet (not mandatory, but good practice) and release any
references to it when you are done with it. Or invoke DataSet.Clear if you
want to reuse it.
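A minimal sketch of the cleanup advice above, assuming nothing beyond the standard DataSet API: Clear() drops the rows but keeps the schema (for reuse), while Dispose() plus dropping the reference hands the object over to the GC:

```csharp
using System;
using System.Data;

// Build a small cache table (names here are illustrative only).
var cache = new DataSet("ClientCache");
var orders = cache.Tables.Add("Orders");
orders.Columns.Add("Id", typeof(int));
orders.Rows.Add(1);

Console.WriteLine(cache.Tables["Orders"].Rows.Count); // 1

// Reuse: Clear() removes the rows but keeps tables and columns.
cache.Clear();
Console.WriteLine(cache.Tables["Orders"].Rows.Count); // 0
Console.WriteLine(cache.Tables["Orders"].Columns.Count); // 1

// Done for good: Dispose() and let go of the reference so the GC can collect it.
cache.Dispose();
cache = null;
```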
You can also pass on some good resources/links on the pros, cons,
precautions, and best practices to follow when implementing such a
disconnected database approach with DataSets.

Visit William Ryan's www.knowdotnet.com - there are plenty of articles.
 
Patrice

You may want to explain your scenario in more detail.

In a disconnected scenario with such an amount of data, I would rather use
a "true" DBMS such as MSDE. It's not clear whether the disconnected model is
essential in your scenario or whether this is just to implement some kind of
"caching" (for example, if you keep this data for a long period, what happens
if the computer hangs? Are all updates lost?).

Patrice

 
Guest

Hi Patrice

You are right, caching such a huge block of data on the client will have issues. We are not considering the idea of using MSDE due to the extra performance burden involved.

Let me explain why we have to consider such a disconnected model.

The client applications (WinForms) need to perform a lot of calculations.
For example, the user is working on a particular form and modifies a value in a text control. On losing focus from this control, a lot of calculations and updates are performed (some 30 to 50, maybe more, fields in the database are to be modified). Many such text controls can be present on the same form, and they tend to affect a lot of tables (the calculations have to update all the dependent/related variables and controls on the same or, mostly, other forms).

Now, modifications made directly to the database would be very time consuming (with this many updates happening), would create a lot of network clutter, clog network bandwidth, and decrease overall application performance.

To avoid all of the above and get better performance, we need to cache the data on the client side (a disconnected data model) holding all such tables; the actual updates to the database would then happen, say, when switching from one form to another, and could be done partly in an asynchronous fashion.

I hope the basic picture is clear.
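The flush-on-form-switch idea can be sketched with the standard DataSet change-tracking API: local edits accumulate in the cached DataSet, and GetChanges() extracts just the dirty rows so one small batch goes to the server instead of 30-50 individual round trips. SaveChangesToDatabase below is a hypothetical stand-in for whatever DataAdapter.Update (or stored-procedure) code the real application would use:

```csharp
using System;
using System.Data;

// Set up a tiny cached table (table/column names are illustrative only).
var cache = new DataSet("ClientCache");
var t = cache.Tables.Add("Readings");
t.Columns.Add("Id", typeof(int));
t.Columns.Add("Value", typeof(decimal));
t.Rows.Add(1, 10.0m);
cache.AcceptChanges();                   // baseline: nothing pending

t.Rows[0]["Value"] = 42.5m;              // user edits a text box -> local update only

DataSet delta = cache.GetChanges();      // only the added/modified/deleted rows
Console.WriteLine(delta.Tables["Readings"].Rows.Count); // 1 changed row

SaveChangesToDatabase(delta);            // hypothetical: push the batch to the server
cache.AcceptChanges();                   // mark the cached rows as clean again
Console.WriteLine(cache.GetChanges() == null); // True: nothing left to send

static void SaveChangesToDatabase(DataSet changes)
{
    // Placeholder: real code would run DataAdapter.Update per table,
    // possibly on a background thread, as the asynchronous idea above suggests.
}
```

On failure the real code would simply skip AcceptChanges so the rows stay flagged as changed and the next flush retries them.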

Any pointers now?

thanks and regards
Saurabh
 
