The DataTable / DataSet size

Guest

Hi,

We do ADO.NET programming only on a time-to-time basis, so we don't have
really deep experience in this area.

Everywhere in MSDN and other documentation I read that the DataTable/DataSet
is designed for keeping disconnected data from a database. So far, so good.
But I have never seen any recommendation about how (or when) to control the
size of these objects.

For example, let's say we have a table that is small at the beginning, and
we keep this small number of records in a disconnected DataTable object.
After several months the table in the SQL database can grow to several tens
of thousands of records. How should the client-side application respond to
this change, or when should we start to worry about the size of the fetched
data? I know it depends on available memory and so on… But anyway, what is
your typical size for a DataTable object (MBs, number of records…)?



Thanks,
Lubomir
 
Sheng Jiang[MVP]

If the DataTable is too large to be transferred, use paging and/or filtering.
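
A rough sketch of client-side paging with a SqlDataAdapter (the connection
string, table and column names are made up; note that this Fill overload
still reads and discards the skipped rows, so for very large tables it is
usually better to page in the SQL query itself):

// needs: using System.Data; using System.Data.SqlClient;
static DataTable GetOrdersPage(string connectionString, int pageIndex, int pageSize)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT OrderID, CustomerID, OrderDate FROM Orders ORDER BY OrderID", conn);

        DataSet ds = new DataSet();

        // Fill only the requested page of rows into the DataSet.
        adapter.Fill(ds, pageIndex * pageSize, pageSize, "Orders");

        return ds.Tables["Orders"];
    }
}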
 
Cor Ligthert[MVP]

Lubomir,

It may look as if the DataSet is an offline database, but it is not; the idea
is to take only the parts of the database that are needed (which can in some
cases be the whole database).

You can do this by using, for instance, a WHERE clause in your SQL
statement, as in the sketch below.
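
A minimal sketch of that idea (the table, column, and parameter names are
made up), so only the rows one screen or process needs end up in memory:

// needs: using System.Data; using System.Data.SqlClient;
static DataTable GetOrdersForCustomer(string connectionString, int customerId)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT OrderID, OrderDate, Total FROM Orders WHERE CustomerID = @customerId", conn);
        adapter.SelectCommand.Parameters.AddWithValue("@customerId", customerId);

        DataTable orders = new DataTable("Orders");
        adapter.Fill(orders);   // only the filtered rows are loaded
        return orders;
    }
}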

Cor
 
Miha Markic

As Cor said, you should load only the data you need for a certain process,
process it, and store it back. There is no need to keep enormous amounts of
data in a DataSet.
 
Guest

Hi,
Thanks for the answers.

Yes, we are using filters when fetching the data from the database. However,
we are concerned that once the database is loaded with real data, the number
of fetched records will become too big down the road.

On the other hand, maybe we worry too much. What I was looking for was a
picture of the DataTable sizes used by other, more experienced database
developers.

In our case I think the disconnected DataTable will not have more than
100,000 records with 8-10 columns in the worst-case scenario. Most probably
it will be much less.

Regards,
Lubomir
 
Miha Markic

One way to figure out the approximate size is by *binary* serializing it to a
MemoryStream. Once you are done, you can check the size of the stream.
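
A minimal sketch of that, assuming .NET 2.0 or later (the RemotingFormat
property does not exist in 1.x, and without it the BinaryFormatter payload is
mostly XML, so the number comes out larger):

// needs: using System.Data; using System.IO;
//        using System.Runtime.Serialization.Formatters.Binary;
static long GetApproximateSizeInBytes(DataTable table)
{
    // Ask for the compact binary layout instead of the default XML one.
    table.RemotingFormat = SerializationFormat.Binary;

    BinaryFormatter formatter = new BinaryFormatter();
    using (MemoryStream stream = new MemoryStream())
    {
        formatter.Serialize(stream, table);
        return stream.Length;   // rough in-memory / on-the-wire size
    }
}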
 
Miha Markic

Almost forgot: the size of the DataTable doesn't matter if you have enough
RAM. Still, 100,000 records seems like a lot to me. I would prefer working
with smaller sets.
 
Jim Rand

We had to import some data into SQL Server, but we ran into data exception
problems with "weird" data. So we needed a simple way to view the data.
Since there were over 256 columns, we could not use Microsoft Access for
this.

What to do? What to do?

I know: we'll load it into a DataSet and bind a grid to it to see what we
are dealing with.

Worked great.

By the way, the number of rows exceeded 2,000,000. Not bad for a six-year-old
cheap desktop with 256 MB of memory and .NET 1.1. Microsoft obviously did
something very clever under the hood.
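
A rough sketch of that kind of "fill a DataSet and bind a grid" inspection
(the staging table name is made up; the adapter opens and closes the
connection itself, and a WinForms DataGrid binds straight to the filled
table):

// needs: using System.Data; using System.Data.SqlClient; using System.Windows.Forms;
static void LoadIntoGrid(string connectionString, DataGrid grid)
{
    SqlDataAdapter adapter = new SqlDataAdapter(
        "SELECT * FROM ImportStaging", new SqlConnection(connectionString));

    DataSet ds = new DataSet();
    adapter.Fill(ds, "ImportStaging");   // pulls every row into memory

    grid.DataSource = ds.Tables["ImportStaging"];
}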
 
Cor Ligthert[MVP]

Lubomir,

There is no reasonable answer to this other than what you can calculate
yourself. If there is one column with real images in it, then it will surely
be much more than if there are only 1,000 columns that each contain one
integer.
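
As a rough, made-up illustration: 100,000 rows of 10 four-byte integer
columns is about 100,000 × 10 × 4 bytes ≈ 4 MB of raw values (plus per-row
DataTable overhead), while a single column holding 50 KB images over the
same 100,000 rows is already around 5 GB.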

Cor
 
Guest

Hi,

Thanks for all the answers. I wanted to get a feeling for "what is a
standard/usual size and what seems to be too much". I think I have that
feeling now.

Thanks,

Lubomir
 
