Memory Usage of DataSet

Finn Stampe Mikkelsen

Hi...

I've been looking around a bit and I can't seem to find a way to calculate the memory usage of a DataSet/DataTable...

I would like to get both the potential memory usage and the actual size of a specific DataTable/DataSet.

Can anyone help?

/Finn
 

Jason Keats

Finn said:
I would like to get both the potential memory usage and the actual size of a specific DataTable/DataSet.

I read somewhere that for a DataTable the memory used is roughly:

rows * (sum of the sizes of the types in all columns + 40)

where an empty DataRow is apparently about 40 bytes.

A DataSet can obviously contain several DataTables, as well as relations between tables (which also take some memory).

The size of numeric and fixed-length string fields/columns is known; however, varchar/nvarchar columns are not fixed length (though they do have a maximum length), so you would have to know the average length of the text stored in those fields/columns.

If, however, you've got text/ntext columns, then they can be any size, making estimation impossible.
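
Something like the following sketch (C#) applies that heuristic. The per-type sizes and the ~40-byte per-row overhead are the approximations described above, and avgStringLength is a figure you would have to supply yourself:

using System;
using System.Data;

static class DataTableSizeEstimator
{
    // Rough heuristic: rows * (sum of column type sizes + ~40 bytes per-row overhead).
    // avgStringLength is your own guess for variable-length string columns.
    public static long EstimateBytes(DataTable table, int avgStringLength)
    {
        long bytesPerRow = 40; // approximate overhead of an empty DataRow
        foreach (DataColumn col in table.Columns)
        {
            Type t = col.DataType;
            if (t == typeof(string))
                bytesPerRow += 2L * avgStringLength; // .NET strings are UTF-16: 2 bytes/char
            else if (t == typeof(bool) || t == typeof(byte))
                bytesPerRow += 1;
            else if (t == typeof(short))
                bytesPerRow += 2;
            else if (t == typeof(int) || t == typeof(float))
                bytesPerRow += 4;
            else if (t == typeof(long) || t == typeof(double) || t == typeof(DateTime))
                bytesPerRow += 8;
            else if (t == typeof(decimal))
                bytesPerRow += 16;
            else
                bytesPerRow += 8; // fallback guess for other types
        }
        return table.Rows.Count * bytesPerRow;
    }
}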

Alternatively, you could use a memory profiler to get some idea.
 

Arne Vajhøj

Finn said:
I would like to get both the potential memory usage and the actual size of a specific DataTable/DataSet.

That will be a very complex calculation.

A single DataTable will contain at least the following (you would need to check all the data structures in Reflector for the exact figures):

number of columns * average size of metadata per column
number of rows * (size of a reference + object overhead for each row object)
number of rows * number of columns * (size of a reference + object overhead + average size of the data)

The size of the data will depend on the data type:
integers/floating point - 2/4/8 bytes depending on the specific type
strings - 2 * actual character length (strings in .NET are UTF-16)
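
As a rough sketch of that breakdown in C#: the reference size and object overhead below are typical 64-bit figures, and the per-column metadata size and average data size are pure guesses you would replace with your own numbers:

using System;

class DataTableOverheadEstimate
{
    static void Main()
    {
        long rows = 100_000;
        long columns = 10;

        long refSize = 8;          // size of a reference on 64-bit
        long objectOverhead = 24;  // typical 64-bit object header + padding (assumption)
        long columnMetaData = 200; // average metadata per column: pure guess
        long avgDataSize = 8;      // average payload per cell (assumption)

        long estimate =
            columns * columnMetaData
            + rows * (refSize + objectOverhead)                 // one row object per row
            + rows * columns * (refSize + objectOverhead + avgDataSize); // one value per cell

        Console.WriteLine($"Estimated size: {estimate / (1024.0 * 1024.0):F1} MB");
    }
}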

For a few rows, the overhead will probably be a lot bigger than the data itself.

For many rows, the data will matter more.

For many rows with very long strings (or BLOBs), the data will completely dwarf the overhead.

It is probably easier to do some measurements on a specific DataTable than to try to calculate it.
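
For example, a crude measurement might look like the following sketch: force a full collection, snapshot GC.GetTotalMemory, fill the table, and snapshot again. The column layout and row count here are just placeholders:

using System;
using System.Data;

class MeasureDataTable
{
    static void Main()
    {
        long before = GC.GetTotalMemory(forceFullCollection: true);

        var table = new DataTable("Test");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        for (int i = 0; i < 100_000; i++)
        {
            table.Rows.Add(i, "Row number " + i);
        }

        long after = GC.GetTotalMemory(forceFullCollection: true);
        GC.KeepAlive(table); // keep the table alive past the second measurement

        Console.WriteLine($"Approximate DataTable size: {(after - before) / (1024.0 * 1024.0):F1} MB");
    }
}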

Or you can just assume that:

data table size in memory < 3 * raw data size

I am confident that will hold for most realistic data. For example, 100,000 rows of 50 bytes of raw data each (about 5 MB raw) should stay under roughly 15 MB in memory.

Arne
 
