Is there a capacity limit in hashtables?

  • Thread starter Cengiz Dincoglu
  • Start date

Cengiz Dincoglu

When I insert a long value as the key and an object into a Hashtable, it
fails at about 600,000 entries.

ht.Add(longKey, longValue)

We are developing a real-time application that needs to keep about 2.5
million objects in a hashtable, and each object must be readily available
to the app very quickly at any time.
Would it be better to come up with an algorithm that uses a hashtable of
hashtables (each holding 200,000 objects)?

If there is a limit, any other suggestions?

Thanks
 

Jon Skeet

Cengiz Dincoglu said:
When I insert a long value as the key and an object into a Hashtable, it
fails at about 600,000 entries.

How much memory do you have? Bear in mind that each long you add (each
of key and value) is going to take up 16 bytes when boxed, and there'll
also be a reference to each of them (4 bytes each). The hashtable
also remembers the hash code, which is another 4 bytes. The internal
capacity is also always larger than the count, but I still can't
understand why it would fail after 600,000 insertions... indeed,
it works fine on my box:

using System;
using System.Collections;

public class Test
{
    static void Main()
    {
        Hashtable t = new Hashtable();
        for (long i=0; i < 3000000000; i++)
        {
            if (i%100000==0)
            {
                Console.WriteLine
                    ("i={0}, memory={1}", i, GC.GetTotalMemory(true));
            }
            t[i] = i;
        }
    }
}

Which version of .NET are you using? In an old version of 1.0, there
was a problem with the large object heap not getting garbage collected
- might that be what's wrong?
ht.Add(longKey, longValue)

We are developing a real-time application that needs to keep about 2.5
million objects in a hashtable, and each object must be readily available
to the app very quickly at any time.
Would it be better to come up with an algorithm that uses a hashtable of
hashtables (each holding 200,000 objects)?

If there is a limit, any other suggestions?

Well, you'd save an awful lot of memory by writing your own custom
collection which *only* stored longs as the keys and values, if that's
really what you'll be doing.
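A minimal sketch of what such a custom collection might look like (all the
names here are illustrative, not from any poster's code): flat long arrays
with open addressing, so there's no boxing and no per-entry object
allocation.

```csharp
using System;

// Sketch of an open-addressing map from long to long. Each slot costs
// two longs plus a flag (~17 bytes) instead of the ~40+ bytes a boxed
// Hashtable entry can cost. Assumes the capacity given up front is enough.
public class LongLongMap
{
    private long[] keys;
    private long[] values;
    private bool[] used;
    private int count;

    public LongLongMap(int capacity)
    {
        // Size the table at roughly 2x capacity, rounded up to a power
        // of two so we can mask instead of taking a modulus.
        int size = 1;
        while (size < capacity * 2) size <<= 1;
        keys = new long[size];
        values = new long[size];
        used = new bool[size];
    }

    private int IndexOf(long key)
    {
        int mask = keys.Length - 1;
        int i = (int)(key ^ (key >> 32)) & mask;  // fold 64 bits to an index
        while (used[i] && keys[i] != key)
            i = (i + 1) & mask;                   // linear probing
        return i;
    }

    public void Add(long key, long value)
    {
        int i = IndexOf(key);
        if (!used[i]) { used[i] = true; keys[i] = key; count++; }
        values[i] = value;
    }

    public bool TryGetValue(long key, out long value)
    {
        int i = IndexOf(key);
        value = values[i];
        return used[i];
    }

    public int Count { get { return count; } }
}
```

For 2.5 million entries this keeps everything in three flat arrays, which
the garbage collector treats as a handful of large objects rather than
millions of small ones.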
 

Robert Jacobson

Have you considered other data structures? For example, you could store
your objects in an array (either a strongly-typed array or an ArrayList),
sort the array, and then do a binary search to look up the right items. If
you want to use the BinarySearch method of the array or ArrayList, you could
implement IComparable using the object's key value (which wouldn't be
hard). Otherwise, you could implement your own search algorithm.
However, this solution wouldn't be optimal if the list changes frequently,
since you'd have to re-sort the array each time it changes.
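When the keys are plain longs (which already implement IComparable), the
built-in Array.Sort overload that sorts one array as the keys for another,
plus Array.BinarySearch, gets you there without any custom comparer; a
small hypothetical example:

```csharp
using System;

public class SortedLookupDemo
{
    static void Main()
    {
        long[] keys   = { 30L, 10L, 20L };
        string[] vals = { "c", "a", "b" };

        // Sort both arrays, using keys to order vals as well.
        Array.Sort(keys, vals);

        // O(log n) lookup once the array is sorted.
        int i = Array.BinarySearch(keys, 20L);
        if (i >= 0)
            Console.WriteLine("key 20 -> {0}", vals[i]);  // prints "key 20 -> b"
    }
}
```

As Robert notes, the catch is the re-sort: every batch of insertions costs
O(n log n) before lookups work again.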

You could also try implementing a sorted binary tree, which would enable
both fast key lookups (using a binary search) and fast insertions and
deletions. There isn't one built into the .NET Framework, though, so
you'd have to build your own.
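A bare-bones version of such a tree, keyed on long (illustrative only; a
production version would want balancing, e.g. red-black, so lookups stay
O(log n) even on already-sorted input):

```csharp
using System;

// Minimal unbalanced binary search tree mapping long keys to object
// values. Sketch only: no balancing, no deletion.
public class TreeNode
{
    public long Key;
    public object Value;
    public TreeNode Left, Right;
}

public class LongTree
{
    private TreeNode root;

    public void Insert(long key, object value)
    {
        TreeNode parent = null, cur = root;
        while (cur != null)
        {
            if (key == cur.Key) { cur.Value = value; return; }
            parent = cur;
            cur = key < cur.Key ? cur.Left : cur.Right;
        }
        TreeNode node = new TreeNode();
        node.Key = key;
        node.Value = value;
        if (parent == null) root = node;
        else if (key < parent.Key) parent.Left = node;
        else parent.Right = node;
    }

    public object Find(long key)
    {
        TreeNode cur = root;
        while (cur != null)
        {
            if (key == cur.Key) return cur.Value;
            cur = key < cur.Key ? cur.Left : cur.Right;
        }
        return null;  // not found
    }
}
```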

--Robert Jacobson
 

Willy Denoyette [MVP]

Jon Skeet wrote:
||| Jon Skeet wrote:
||| .....
||||| Hashtable t = new Hashtable();
||||| for (long i=0; i < 3000000000; i++)
||||| {
||| .....
|||
||| Jon,
|||
||| Are you sure this runs on your system?
||
|| Oops - not to completion, no :)
||
|| However, it *does* get past the 2.5 million stage fairly easily.
||
|| --
|| Jon Skeet - <[email protected]>
|| http://www.pobox.com/~skeet/
|| If replying to the group, please do not mail me too

Otherwise you'd have to tell us what hardware and OS you ran this on ;-)

Willy.
 
