caching and locking

laimis

For my middle layer objects, I have a base class that other classes
subclass. This base class houses properties that all of my objects (with
some exceptions) have.

Here is the problem. One of the properties the client can configure by

1) setting it himself, or
2) letting the accessor set it by reading it from the config file.

I am caching the value read from the config file using a key set by the
client. The same client can create different objects with different
keys. My concern is the code that reads the value from the cache, since
I am using the middle layer from the web and different ASP.NET worker
threads might want to set the property at the same time:

// cache is a static variable here

public string Key
{
    get;
    set;
}

public string SomeValue
{
    get
    {
        if (cache.Contains(Key))
            return cache[Key];
        else
        {
            lock (cache)
            {
                cache.Add(Key, readFromConfig(Key));
                return cache[Key];
            }
        }
    }
}

Is it ok that I use locking here? It might happen that multiple threads
don't find the value for the key, one adds it, and then another tries
to add it under the same key.

Is my approach ok here? I could also catch the duplicate-key exception
and assume that someone else already added the value.

What do you guys think?
Jon Skeet [C# MVP]

Is it ok that I use locking here? It might happen that multiple threads
don't find the value for the key, one adds it, and then another tries
to add it under the same key.

Is my approach ok here? I could also catch the duplicate-key exception
and assume that someone else already added the value.

What do you guys think?

You should lock around the checking as well - effectively make the
checking and potential adding atomic.
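
For illustration, one way to make the check and add atomic might look like
this (a minimal sketch only; the Hashtable cache, the lock object and the
method name are assumptions, and readFromConfig is the helper from the
original post):

// Requires using System.Collections; the field names here are made up.
private static readonly Hashtable cache = new Hashtable();
private static readonly object cacheLock = new object();

private string GetOrAddValue(string key)
{
    // The lookup and the add both happen while holding the lock,
    // so no two threads can both conclude that the key is missing.
    lock (cacheLock)
    {
        if (!cache.Contains(key))
            cache.Add(key, readFromConfig(key));
        return (string)cache[key];
    }
}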
 
laimis

You should lock around the checking as well - effectively make the
checking and potential adding atomic.

Actually, here is how I am doing the get part:

public string SomeValue
{
    get
    {
        if (cache.Contains(Key))
            return cache[Key];
        else
        {
            lock (cache)
            {
                // check again inside the lock in case another
                // thread added the entry in the meantime
                if (cache.Contains(Key))
                    return cache[Key];
                else
                {
                    cache.Add(Key, readFromConfig(Key));
                    return cache[Key];
                }
            }
        }
    }
}

This way, I don't take the lock the first time I check the cache, so
there is no delay when the key already exists. Inside the lock, I check
again to make sure that the key was not added in the meantime ...

So I guess, Jon, you are saying this would be a fine approach? No concerns?
 
Jon Skeet [C# MVP]

laimis said:
You should lock around the checking as well - effectively make the
checking and potential adding atomic.

Actually, here is how I am doing the get part:

public string SomeValue
{
    get
    {
        if (cache.Contains(Key))
            return cache[Key];
        else
        {
            lock (cache)
            {
                // check again inside the lock in case another
                // thread added the entry in the meantime
                if (cache.Contains(Key))
                    return cache[Key];
                else
                {
                    cache.Add(Key, readFromConfig(Key));
                    return cache[Key];
                }
            }
        }
    }
}

This way, I don't take the lock the first time I check the cache, so
there is no delay when the key already exists. Inside the lock, I check
again to make sure that the key was not added in the meantime ...

So I guess, Jon, you are saying this would be a fine approach? No concerns?

No, what I said was that you need a lock round the whole thing.

Locks are pretty cheap - don't risk thread safety by trying to avoid
them. There are *very* subtle concerns when it comes to using shared
data.

It's possible (according to the CLI spec) for the cache addition to
become visible to other threads before all the memory writes made in
readFromConfig. This would mean it's possible for one thread to be in
the process of adding an object to the cache, and another thread could
read that cache entry and see an incomplete object.

It almost certainly won't happen in the current implementation of the
CLI, but the performance gain you'll get from avoiding the lock is so
small that it's not worth the risk, IMO.

See http://www.pobox.com/~skeet/csharp/threads/volatility.shtml for
more details.
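
As a rough sketch of that fully locked approach (same assumptions as
above: a Hashtable cache and a dedicated lock object; readFromConfig is
the helper from the original post):

public string SomeValue
{
    get
    {
        // The check, the read from config and the add all happen inside
        // the lock. Releasing the lock also acts as a memory barrier, and
        // every reader takes the same lock, so whatever readFromConfig
        // wrote is fully visible before any other thread can see the new
        // cache entry.
        lock (cacheLock)
        {
            if (!cache.Contains(Key))
                cache.Add(Key, readFromConfig(Key));
            return (string)cache[Key];
        }
    }
}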
 
