Design Question

Lee

I am in the beginning stages of rewriting an application from Delphi to
C#/VS2005. In the current application, we cache much of the static
data on the local workstation in in-memory datasets. This works nicely
because it reduces network traffic and keeps the app very responsive.
There's an initial delay at startup (with a progress indicator), and
then while the app is running, database access is about 90% short,
transactional INSERTs and UPDATEs.

In the current software, there are about 20 in-memory datasets in use.
Most of them hold only 20-50 records, except for one that holds
product information: on average about 1,000 product records, although
in some extreme cases we've had customers with 5K records or more.

Now, with the increase in network speeds, faster computers with gobs
of memory, and the potential to run over Citrix or RDP, I'm wondering
if it's a good idea to rethink this particular setup, at least from a
due-diligence standpoint in light of my new development platform (JIT
vs. native EXEs, different memory management, etc.).

Are there any gotchas I should look out for that might be particular
to the .NET Framework, should I decide to stay with the "cached" way
of doing things?

--
Warm Regards,
Lee

"Upon further investigation it appears that your software is missing
just one thing. It definitely needs more cow bell..."
 
Lee

Lee enlightened me by writing:
> Are there any gotchas I should look out for that might be particular
> to the .NET Framework, should I decide to stay with the "cached" way
> of doing things?

BTW, I will be using objects and object collections instead of datasets
in the C# version...
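
Since objects and object collections are the plan, here is a minimal sketch of what such a read-mostly cache might look like in C# 2.0. The `Product` shape, `ProductCache`, and the loader are made-up names for illustration, not anything from the original app:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical static-data item; the real app's product record will differ.
public class Product
{
    public readonly int Id;
    public readonly string Name;

    public Product(int id, string name)
    {
        Id = id;
        Name = name;
    }
}

// Loaded once at startup, then read many times; writes go straight to the DB.
public class ProductCache
{
    private readonly Dictionary<int, Product> _byId =
        new Dictionary<int, Product>();

    public void Load(IEnumerable<Product> rows)
    {
        // In the real app the rows would come from a single SELECT at startup.
        foreach (Product p in rows)
            _byId[p.Id] = p;
    }

    public Product Find(int id)
    {
        Product p;
        return _byId.TryGetValue(id, out p) ? p : null;
    }

    public int Count
    {
        get { return _byId.Count; }
    }
}
```

Even 5K records of roughly this shape amount to well under a megabyte, so sheer memory size is unlikely to be the gotcha; the lifetime and staleness of the cached data usually matter more.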

 
Marius Groenendijk

Hi Lee, your app looks very much like ours.

We use Hashtables for our caches; they're extremely responsive.
Several hundreds or thousands of items are no problem, so no gotchas there.
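
As a rough sketch of that approach (the key and value types here are assumptions, not Marius's actual code), one .NET-specific detail worth knowing is that a plain `Hashtable` is documented as safe for multiple readers plus at most one writer; `Hashtable.Synchronized` returns a wrapper that serializes writers:

```csharp
using System;
using System.Collections;

// Simple read-mostly cache along the lines described above.
// Keys and values are illustrative; the real app would store its own items.
public class LookupCache
{
    // Hashtable.Synchronized wraps the table so writes are serialized;
    // a bare Hashtable only tolerates one writing thread at a time.
    private readonly Hashtable _items =
        Hashtable.Synchronized(new Hashtable());

    public void Put(object key, object value)
    {
        _items[key] = value;
    }

    // The Hashtable indexer returns null for a missing key instead of throwing.
    public object Get(object key)
    {
        return _items[key];
    }

    public int Count
    {
        get { return _items.Count; }
    }
}
```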

Some caches (dynamic ones) are better kept in datasets/datatables,
and that's where the gotchas start.

We have one DB table with potentially >10K, or even >100K, big items.
Forget about loading that into a dataset, presenting it in the GUI in
a datagrid, or UPDATEing those rows with individual SQL commands.

My €0.02 worth,
Marius.
 
