LOH behavior

Thread starter: Frank Rizzo

As I understand it, the .NET framework places objects on the Large
Object Heap (LOH) if they are over 85,000 bytes (roughly 85KB) in size.

So for an object like

public class MyClass
{
    private byte[] largeArray = new byte[100000];
}

the MyClass object will go to the regular managed heap, while the
largeArray object will end up on the LOH. This suggests that it is really
difficult to know which pieces of a real-life object will go to the LOH,
since the parent does not necessarily go there.
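This split can be checked empirically (a sketch, assuming a runtime where GC.GetGeneration reports LOH objects as generation 2, which the CLR does, since the LOH is collected as part of generation 2):

```csharp
using System;

class LohSplitDemo
{
    class MyClass
    {
        public byte[] LargeArray = new byte[100000];
    }

    static void Main()
    {
        var wrapper = new MyClass();

        // The wrapper itself only holds a single reference, so it is a
        // small object and starts life in generation 0 on the regular heap.
        Console.WriteLine(GC.GetGeneration(wrapper));            // 0

        // The 100,000-byte array exceeds the 85,000-byte threshold, so it
        // is allocated on the LOH, which the GC reports as generation 2.
        Console.WriteLine(GC.GetGeneration(wrapper.LargeArray)); // 2
    }
}
```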

With this in mind, if I load a DataSet that contains 100MB of data,
what exactly (if anything) goes to the LOH?
Also, how can I see this LOH allocation using the CLR Profiler?


Regards
 
Peter said:
What on earth are you doing with a 100MB DataSet! Yikes!

Massive processing of data. It turns out that processing this type of
data in C# (it does not lend itself to set processing very well) is on
the order of 50 times faster than in SQL.

Regards
 
Hi,

Frank Rizzo said:
With this in mind, if I load a DataSet that contains 100MB of data,
what exactly (if anything) goes to LOH?

First of all, with this amount of data you will have A LOT more to worry
about than the memory; parsing it will be a killer!

That amount of data belongs to a database, not a dataset.

Answering your question though: most probably NOTHING from the data will
go to the LOH, although you did not specify your DB.
The most probable residents of the LOH will be the arrays that the
DataSet holds internally. But your data itself most probably will not be
there.


In short, use a DB
 
Frank Rizzo said:
As I understand it, the .NET framework places objects on the Large
Object Heap (LOH) if they are over 85,000 bytes (roughly 85KB) in size.

So for an object like

public class MyClass
{
    private byte[] largeArray = new byte[100000];
}

the MyClass object will go to the regular managed heap, while the
largeArray object will end up on the LOH. This suggests that it is really
difficult to know which pieces of a real-life object will go to the LOH,
since the parent does not necessarily go there.

With this in mind, if I load a DataSet that contains 100MB of data,
what exactly (if anything) goes to LOH?

Probably nothing. The only things that will end up on the LOH are
objects where the *individual object* is over 85K. In real life, that's
likely to only be huge arrays (like the one above) and long strings.
For reference types, arrays would have to have ~21,000 elements to end
up on the LOH (assuming you're running in a 32-bit CLR) - so if you've
got about that many rows, chances are that array (assuming it's backed
by an array - it's pretty likely) will be on the LOH, but not the rows
themselves.
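The ~21,000 figure can be checked with a little arithmetic (a sketch that ignores the small per-array header overhead): each element of a reference-type array is one pointer wide, so 85,000 / IntPtr.Size gives the approximate element count at which the array itself crosses onto the LOH.

```csharp
using System;

class LohThreshold
{
    static void Main()
    {
        // One pointer per element: 4 bytes in a 32-bit CLR
        // (85,000 / 4 ≈ 21,250 elements, matching the ~21,000 above),
        // 8 bytes in a 64-bit CLR (≈ 10,625 elements).
        int threshold = 85000 / IntPtr.Size;
        Console.WriteLine(threshold);

        // An array comfortably over the threshold lands on the LOH,
        // which the GC reports as generation 2...
        object[] big = new object[threshold + 1000];
        Console.WriteLine(GC.GetGeneration(big));   // 2

        // ...while one well under it stays on the small object heap,
        // as do the individual row objects it references.
        object[] small = new object[threshold / 2];
        Console.WriteLine(GC.GetGeneration(small)); // 0
    }
}
```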
Also, how can I see this LOH allocation using the CLR Profiler?

Couldn't tell you that, I'm afraid - it's a while since I've used it.
 
Ignacio said:
Hi,



First of all, with this amount of data you will have A LOT more to worry
about than the memory; parsing it will be a killer!

Actually, it takes about 20 seconds to bring the data into the client
and 4 seconds to build an object tree. And this is on my two-year-old
laptop. The app will be on a powerful server.
That amount of data belongs to a database, not a dataset.
In short, use a DB

In general, yes; however, for data that does not lend itself to set
processing, C# is way better.
 
Hi Frank,

Since the DataSet holds a lot of references to small pieces of data, I
agree that nearly nothing will be put on the LOH, unless you have large
strings or arrays inside the DataSet itself.
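The large-strings case can be illustrated the same way (a sketch, using the fact that the GC reports LOH objects as generation 2): a string stores two bytes per character, so a single field value needs roughly 42,500 characters before it crosses the ~85,000-byte threshold, far beyond a typical DataSet value.

```csharp
using System;

class LohStrings
{
    static void Main()
    {
        string typicalField = new string('x', 100);   // ~200 bytes of character data
        string hugeField = new string('x', 50000);    // ~100 KB of character data

        // Typical field values stay on the small object heap (gen 0)...
        Console.WriteLine(GC.GetGeneration(typicalField)); // 0

        // ...but a 50,000-character string exceeds 85,000 bytes and is
        // allocated straight onto the LOH (reported as gen 2).
        Console.WriteLine(GC.GetGeneration(hugeField));    // 2
    }
}
```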

To check the Large Object Heap size, look at the Statistics window. I'm
currently using v2.0 of the CLR Profiler; you can see this on page 10 of
the help document.

If anything is unclear, please feel free to let me know.

Kevin Yu
Microsoft Online Community Support

==================================================
Get notification to my posts through email? Please refer to
http://msdn.microsoft.com/subscriptions/managednewsgroups/default.aspx#notifications.
Note: The MSDN Managed Newsgroup support offering is for non-urgent issues
where an initial response from the community or a Microsoft Support
Engineer within 1 business day is acceptable. Please note that each follow
up response may take approximately 2 business days as the support
professional working with you may need further investigation to reach the
most efficient resolution. The offering is not appropriate for situations
that require urgent, real-time or phone-based interactions or complex
project analysis and dump analysis issues. Issues of this nature are best
handled working with a dedicated Microsoft Support Engineer by contacting
Microsoft Customer Support Services (CSS) at
http://msdn.microsoft.com/subscriptions/support/default.aspx.
==================================================

(This posting is provided "AS IS", with no warranties, and confers no
rights.)
 
Hi,

| Ignacio Machin ( .NET/ C# MVP ) wrote:
| > Hi,
| >
| > | >> With this in mind, if I load a DataSet that contains 100MB of data,
| >> what exactly (if anything) goes to the LOH?
| >
| > First of all, with this amount of data you will have A LOT more to worry
| > about than the memory; parsing it will be a killer!
|
| Actually, it takes about 20 seconds to bring the data into the client
| and 4 seconds to build an object tree. And this is on my 2 year old
| laptop. The app will be on a powerful server.

4 s for a 100MB dataset?

I have problems when the dataset goes over 10MB , it takes like 30 s to
load it.
 
Ignacio said:
Hi,

| Ignacio Machin ( .NET/ C# MVP ) wrote:
| > Hi,
| >
| > | >> With this in mind, if I load a DataSet that contains 100MB of data,
| >> what exactly (if anything) goes to the LOH?
| >
| > First of all, with this amount of data you will have A LOT more to worry
| > about than the memory; parsing it will be a killer!
|
| Actually, it takes about 20 seconds to bring the data into the client
| and 4 seconds to build an object tree. And this is on my 2 year old
| laptop. The app will be on a powerful server.

4 s for a 100MB dataset?

I have problems when the dataset goes over 10MB , it takes like 30 s to
load it.

Then the price of creating/adding a new object is too high. Make sure
that the constructor does nothing but simply populate properties.
My dataset is divided into several tables which all bring related
information. Do not use the related-tables feature, because it simply
slows the living hell out of the process. Besides, it is kind of
useless, since all the data in the dataset must be loaded into the
object tree anyway. So I simply loop through each table and use
hashtables (created on the fly) to keep track of related information in
the DataTable objects.
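That scheme can be sketched roughly as follows, with hypothetical Order/OrderLine types and column names (none of these identifiers come from the original posts, and a generic Dictionary stands in for the hashtables): loop over each DataTable once, and use a lookup keyed on the parent ID, instead of DataSet relations, to wire children to parents.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Hypothetical parent/child types; constructors do nothing beyond
// populating fields, as suggested above.
public class Order
{
    public int Id;
    public List<OrderLine> Lines = new List<OrderLine>();
}

public class OrderLine
{
    public string Product;
}

public static class TreeBuilder
{
    // One pass per table; the dictionary gives O(1) parent lookup while
    // wiring children, avoiding DataSet relations entirely.
    public static List<Order> Build(DataTable orders, DataTable lines)
    {
        var byId = new Dictionary<int, Order>();
        var tree = new List<Order>();

        foreach (DataRow row in orders.Rows)
        {
            var order = new Order { Id = (int)row["Id"] };
            byId.Add(order.Id, order);
            tree.Add(order);
        }

        foreach (DataRow row in lines.Rows)
        {
            var line = new OrderLine { Product = (string)row["Product"] };
            byId[(int)row["OrderId"]].Lines.Add(line);
        }

        return tree;
    }
}
```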
 