Caching dictionary on local drive


Andrus

I'm creating a WinForms five-language client-server application which allows
adding/changing translations at runtime.

I have a table on the server which contains translations, 9000 records:

MainLanguage CHAR(254) primary key,
Language1 CHAR(254),
Language2 CHAR(254),
Language3 CHAR(254),
Language4 CHAR(254)

Translators change this table at runtime. I have created a trigger which
stores the last change of this table to allow replication.

I need to create a single method

enum DestLanguage { Lang1, Lang2, Lang3, Lang4 };

static string Translate( string mainLanguageText,
                         DestLanguage destinationLanguage ) { ... }

which returns translated text in desired language.
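A minimal sketch of how such a method could be backed by an in-memory cache (the `cache` field, how it is populated, and the fall-back to the untranslated text are assumptions for illustration, not part of the original design):

```csharp
using System;
using System.Collections.Generic;

enum DestLanguage { Lang1, Lang2, Lang3, Lang4 }

static class Translator
{
    // Assumed in-memory cache: key is the MainLanguage text, value holds
    // the four translations in DestLanguage order.
    static readonly Dictionary<string, string[]> cache =
        new Dictionary<string, string[]>(StringComparer.Ordinal);

    public static string Translate(string mainLanguageText,
                                   DestLanguage destinationLanguage)
    {
        string[] translations;
        if (cache.TryGetValue(mainLanguageText, out translations))
            return translations[(int)destinationLanguage];

        // Fall back to the untranslated text when the key is not cached.
        return mainLanguageText;
    }
}
```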


My application may work over a slow internet connection. A single application
invocation usually uses 5% of the records from this table,
so it does not seem reasonable to make a round trip to the server for every
label and caption in the application which needs to be translated.

So I need to cache this table on the C: drive.
Any idea how to implement caching?

Installing and distributing SQLite or SQL Server Compact Edition with my
application seems to be overkill for this small task.
I'm looking for a lightweight data structure which allows storing and
searching data.

I can serialize and deserialize it as a dictionary object in isolated storage.
However, in this case I must load the whole dictionary (9000 items, approx.
6 MB) at application startup.
I'm afraid that this causes some delay at application startup and consumes
too much memory.

Is using serialization/deserialization for this amount of data a reasonable
solution?
Any idea how to implement this?

Andrus.
 

Nicholas Paldino [.NET/C# MVP]

Andrus,

You could use serialization, but I think the bigger issue is how much
data you want to cache and load into memory, vs the cost of getting that
information when it is needed.

You said it doesn't seem reasonable, but have you run some performance
tests to see just how long it would take to get that information? It might
be quicker than you think.

    Also, if you are going to cache this much data locally, you really are
better off using a database, like SQL Server (you could use SQL Server
Express, if you want).
 

Nicholas Paldino [.NET/C# MVP]

Actually, let me change that, 6MB of data isn't really that much. I
think you could get away with just serializing/deserializing it, but like
you said, you are going to take a hit when you first load it into memory,
and it's going to create ^some^ memory pressure (not a tremendous amount,
depending on the hardware, but it's definitely not insignificant).
 

Andrus

> Actually, let me change that, 6MB of data isn't really that much. I
> think you could get away with just serializing/deserializing it, but like
> you said, you are going to take a hit when you first load it into memory,
> and it's going to create ^some^ memory pressure (not a tremendous amount,
> depending on the hardware, but it's definitely not insignificant).

Thank you.
I will try serialization/deserialization first.

1. Usually a user translates only into a single language.
So maybe instead of

Dictionary<string, struct {string,string,string,string }> Languages;

should I use a separate dictionary for every language and load it on demand:

Dictionary<string,string> lang[4];

?
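A sketch of that on-demand idea (the `LoadLanguageFile` helper is hypothetical; here it just returns an empty dictionary where a real implementation would deserialize one per-language file from disk):

```csharp
using System.Collections.Generic;

enum DestLanguage { Lang1, Lang2, Lang3, Lang4 }

static class LazyLanguages
{
    // One dictionary per target language, created only on first use.
    static readonly Dictionary<string, string>[] lang =
        new Dictionary<string, string>[4];

    public static string Translate(string text, DestLanguage dest)
    {
        int i = (int)dest;
        if (lang[i] == null)
            lang[i] = LoadLanguageFile(i);   // deserialize on demand, once

        string result;
        return lang[i].TryGetValue(text, out result) ? result : text;
    }

    // Hypothetical helper: would read one cached language file from disk.
    static Dictionary<string, string> LoadLanguageFile(int index)
    {
        return new Dictionary<string, string>();
    }
}
```

This way only the roughly one-quarter of the data that the user actually needs is loaded.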

2. Is Dictionary&lt;TKey,TValue&gt; the best type for translation when TKey and
TValue are strings of 1 to 200 Unicode characters?

3. Which serialization should I use: binary, XML, SOAP, or DataContract-based
serialization?
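For comparison with the binary approach, DataContract-based serialization handles `Dictionary<string,string>` out of the box and produces XML, at the cost of a larger file; a sketch (the file path is an assumption):

```csharp
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.Xml;

static class XmlTranslationStore
{
    // DataContractSerializer supports Dictionary<TKey,TValue> natively,
    // writing it as a list of key/value pairs in XML.
    public static void Save(Dictionary<string, string> dict, string path)
    {
        var serializer = new DataContractSerializer(typeof(Dictionary<string, string>));
        using (XmlWriter writer = XmlWriter.Create(path))
            serializer.WriteObject(writer, dict);
    }

    public static Dictionary<string, string> Load(string path)
    {
        var serializer = new DataContractSerializer(typeof(Dictionary<string, string>));
        using (XmlReader reader = XmlReader.Create(path))
            return (Dictionary<string, string>)serializer.ReadObject(reader);
    }
}
```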

Andrus.
 
