Anders Borum
Hello!
I have a list of singleton classes (model managers) that store objects
internally using hashtables. Each of these classes uses a single hashtable
to store users, pages, elements, and so on (I have complete control of the
objects stored). I am currently using a set of abstract cache classes that
my model managers subclass.
The hashtable in a model manager could potentially store more than 25,000
objects (let's imagine 50,000 for the sake of discussion). The hashtables
are accessed very frequently (they store objects that are cloned and handed
to the client context services).
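
To make the setup concrete, the pattern looks roughly like this (a sketch
in Java for illustration only; the names are made up and not my actual
classes):

    import java.util.Hashtable;

    // Abstract cache that concrete model managers subclass. Hashtable
    // synchronizes every operation internally, so reads and writes are
    // thread-safe out of the box.
    abstract class AbstractCache<K, V> {
        private final Hashtable<K, V> table = new Hashtable<>();

        protected void store(K key, V value) {
            table.put(key, value);
        }

        // Hand out a copy so callers never mutate the cached instance.
        protected V fetchCopy(K key) {
            V cached = table.get(key);
            return cached == null ? null : copyOf(cached);
        }

        // Each manager knows how to clone its own model objects.
        protected abstract V copyOf(V original);
    }

    // One singleton manager per model type, e.g. users.
    final class UserManager extends AbstractCache<Integer, User> {
        private static final UserManager INSTANCE = new UserManager();
        private UserManager() {}
        static UserManager getInstance() { return INSTANCE; }

        @Override
        protected User copyOf(User u) {
            return new User(u.id, u.name); // shallow copy suffices here
        }
    }

    class User {
        final int id;
        final String name;
        User(int id, String name) { this.id = id; this.name = name; }
    }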
I am toying with the idea of creating a central cache, but have yet to
figure out when (or if) a hashtable breaks down, or at what point it
becomes a bad decision to go this way.
Obviously, using a central cache implicitly calls for synchronization
(which I also use in the abstract cache classes), but sharing the cache
could cost additional speed.
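
For example, a naive central cache might key everything in one shared
table (again a hypothetical Java sketch). Every lookup then goes through
a single lock, so a user lookup can block a concurrent page lookup even
though the two never collide; that is exactly the speed I'm worried about
giving up:

    import java.util.Hashtable;

    // Naive central cache: one table, one internal lock. Keys are
    // namespaced by type ("user:42", "page:7"), but every lookup still
    // serializes on the same monitor, unlike per-manager tables.
    final class CentralCache {
        private static final Hashtable<String, Object> SHARED =
                new Hashtable<>();

        static void store(String type, String id, Object value) {
            SHARED.put(type + ":" + id, value);
        }

        static Object fetch(String type, String id) {
            return SHARED.get(type + ":" + id);
        }
    }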
For testing purposes, I tried populating a hashtable with 650,000 instances
(it ate up around 75 MB of RAM), and it worked all right. I haven't done
any tests on how the size of the hashtable actually affects lookup times
(I believe I have read that lookup time is almost constant, regardless of
size).
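
If I were to measure it, something along these lines would do (a rough
Java micro-benchmark sketch; the absolute numbers depend entirely on the
runtime and hardware):

    import java.util.Hashtable;

    // Populate tables of increasing size and time repeated lookups.
    // With a sane load factor, the cost per lookup should stay roughly
    // constant as the table grows.
    public class LookupBench {
        public static void main(String[] args) {
            for (int size : new int[] { 50_000, 650_000 }) {
                Hashtable<Integer, String> table = new Hashtable<>(size);
                for (int i = 0; i < size; i++) {
                    table.put(i, "value-" + i);
                }
                int lookups = 1_000_000;
                long start = System.nanoTime();
                for (int i = 0; i < lookups; i++) {
                    table.get(i % size);
                }
                long elapsed = System.nanoTime() - start;
                System.out.printf("size=%d: %.0f ns/lookup%n",
                        size, (double) elapsed / lookups);
            }
        }
    }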
The caching mechanism in ASP.NET is built around a hashtable. I haven't
found any recommendations for the number of items stored there, so I
assume I am on the right track.
Any comments and ideas are very welcome!