Throwing bad_alloc when calling new on large class

SteveZ

First off, I am not sure if this belongs in this group or the C# group. It
seems more like a C++ problem to me. Anyways...

I have a C# project which links in an unmanaged C++ DLL, both built with
VS2005. The problem is that the C# program makes a call into the C++ DLL
which new's a large class, and for a class of a certain size the new fails.
Here's a rough example of what I am doing:

// Start C# program code
public enum MyTypes
{
    TYPE1 = 0,
    TYPE2 = 1
}

[DllImport("MyDll.dll")]
public static extern void UnmanagedFunc(MyTypes PassedType);

MyTypes funcArg = MyTypes.TYPE1;
UnmanagedFunc(funcArg);
// End C# program code

// Start unmanaged C++ DLL code
// This next pointer is part of the class declaration and is initialized
// to NULL in the class constructor; I just haven't shown that here.
ParentClass *ClassPtr_;

void UnmanagedClass::UnmanagedFunc(MyTypes PassedType)
{
    if (ClassPtr_)
    {
        delete ClassPtr_;
    }

    switch (PassedType)
    {
    case TYPE1:
        ClassPtr_ = new Class1();
        break;
    case TYPE2:
    default:
        ClassPtr_ = new Class2();
        break;
    }
}
// End unmanaged C++ DLL code

A few things to note. I left out the C++ wrapper code that exposes this
unmanaged function to the managed caller--there is no issue with that code.
Class1 and Class2 derive from ParentClass. Class1's size is around 150 MB,
while Class2's is around 200 MB. Both classes have large static arrays in
them, but Class2 has more arrays, which makes up the 50 MB difference.

When UnmanagedFunc is called with the parameter MyTypes::TYPE1, there is no
problem. However when MyTypes::TYPE2 is used, then the new call throws
std::bad_alloc. I have plenty of physical memory and virtual memory
available, so that should not be a problem. My program size (according to
Task Manager and Performance Monitor) is about 750 MB at the point before
attempting the new. This brings up a few questions in my mind (other than
the obvious, "why doesn't this work?"):

Are unmanaged and managed heaps shared within the context of a process? If
not, could one heap (in this case the managed heap) reserve more space than
it needs such that the other heap (in this case the unmanaged heap) doesn't
have enough heap to new the larger class?

Is there a way to specify the size of the heap in C# (since there is no way
to specify heap size for a DLL that I am aware of)?

If this is a problem with fragmentation, is there any way to "defragment"
the heap space so I am able to allocate the contiguous block that I need?

Many thanks in advance.
-SteveZ
 
Brian Muth

SteveZ said:
A few things to note. I left out the C++ wrapper code that exposes this
unmanaged function to the managed caller--there is no issue with that code.
Class1 and Class2 derive from ParentClass. Class1's size is around 150 MB,
while Class2's is around 200 MB. Both classes have large static arrays in
them, but Class2 has more arrays, which makes up the 50 MB difference.

Those are pretty big classes. Remember, you are asking the operating system to allocate 200 MB of contiguous memory. Because of
memory fragmentation, this simply may not be available. This is probably a very valid exception. I don't mean to be critical, but
requiring this much contiguous memory at once is an indication of poor design.
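If your design permits, one way out is not to require one contiguous 200 MB
block at all. Here's a rough, untested sketch (the class and names are mine,
not from your code) that stores the image data as independently allocated
tiles:

#include <cstddef>
#include <vector>

// Hypothetical sketch: store the image data as a list of independently
// allocated tiles instead of one giant inline array. Each allocation is
// only tileBytes long, so the runtime never has to find 200 MB of
// contiguous address space.
class TiledImageStore
{
public:
    TiledImageStore(std::size_t totalBytes, std::size_t tileBytes)
        : tileBytes_(tileBytes)
    {
        for (std::size_t done = 0; done < totalBytes; done += tileBytes)
        {
            std::size_t n = totalBytes - done;
            if (n > tileBytes) n = tileBytes;
            tiles_.push_back(std::vector<unsigned char>(n));
        }
    }

    unsigned char& byteAt(std::size_t i)   // flat index -> tile + offset
    {
        return tiles_[i / tileBytes_][i % tileBytes_];
    }

private:
    std::size_t tileBytes_;
    std::vector<std::vector<unsigned char> > tiles_;
};

Each tile is only a few MB, which the allocator can usually satisfy even in
a badly fragmented address space.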

Brian
 
SteveZ

Brian Muth said:
Those are pretty big classes. Remember, you are asking the operating system to allocate 200 MB of contiguous memory. Because of
memory fragmentation, this simply may not be available. This is probably a very valid exception. I don't mean to be critical, but
requiring this much contiguous memory at once is an indication of poor design.

Brian

The context of the application somewhat calls for large classes like this,
for various reasons. One is that the unmanaged C++ DLL targets both desktop
PCs and embedded (non-Windows) platforms. There is the option of breaking
the class down into smaller classes, but the extra new calls would increase
the context-switch performance hit in the embedded environment. I would
rather not go down that route.

Not to mention that containing all the structures in those single classes
would add to the readability and maintainability of the code because of how
it is structured. The classes contain large structures which store data from
large images in them--that's the reasoning for the large arrays in the
classes.

What I would really like to know is why a class this large cannot be created
on the heap on the Windows platform. Is it because of poor heap management
and a lot of heap fragmentation? Or is it because of competing heaps (managed
and unmanaged)? It is not a physical memory issue, because the systems I am
working with have 3 GB of RAM and the system's memory usage is much less than
that (per Task Manager). Is there some limit on the size of object that can
be created on the heap? These are similar to the questions I proposed
originally (and the reasoning behind them).

Thanks.
-SteveZ
 
Brian Muth

SteveZ said:

Not to mention that containing all the structures in those single classes
would add to the readability and maintainability of the code because of how
it is structured. The classes contain large structures which store data from
large images in them--that's the reasoning for the large arrays in the
classes.

What I would really like to know is why a class this large cannot be created
on the heap on the Windows platform. Is it because of poor heap management
and a lot of heap fragmentation?

That's what it is. Unless you are using the /3GB switch in your boot.ini file, you are only using 2 GB of physical memory.
Regardless, you have at best 2 GB of virtual memory per application. You can imagine it doesn't take too many allocations and
deallocations of huge chunks of memory before the system fails to find the 200 MB of contiguous memory you are asking for.

One fairly easy way out is to move to a 64-bit OS. You aren't faced with these upper memory boundaries in that environment.
Otherwise you may be stuck with managing your own heap.
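By "managing your own heap" I mean something along these lines (an untested
sketch; the names are illustrative): reserve the block once, very early in
the process lifetime, before DLL loads and managed-heap growth carve up the
address space, then commit it only when the object is actually needed:

#include <windows.h>

// Untested sketch: reserve the big block near startup, commit on demand.
static void* g_bigBlock = NULL;
static const SIZE_T kBigBlockSize = 200 * 1024 * 1024;

void ReserveBigBlockEarly()   // call as early as possible, e.g. from main()
{
    g_bigBlock = VirtualAlloc(NULL, kBigBlockSize, MEM_RESERVE, PAGE_NOACCESS);
}

void* CommitBigBlock()        // call when the large object is actually built
{
    if (g_bigBlock == NULL)
        return NULL;
    return VirtualAlloc(g_bigBlock, kBigBlockSize, MEM_COMMIT, PAGE_READWRITE);
}

The large class could then be constructed into that block with placement new.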


Brian
 
Ben Voigt [C++ MVP]

You mentioned these are static arrays? They shouldn't add to the
per-instance object size at all.

Brian Muth said:
That's what it is. Unless you are using the /3GB switch in your boot.ini
file, you are only using 2 GB of physical memory. Regardless, you have at
best 2 GB of virtual memory per application. You can imagine it
doesn't take too many allocations and

Both these statements are incorrect. 32-bit WinXP uses up to 3.5 GB of
physical memory regardless of the /3GB switch. Each application has either
2 GB or 3 GB of virtual address space depending on the switch, and each
application can have nearly unlimited virtual memory using file mappings.
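For example (an untested sketch; the sizes and helper names are mine): a
pagefile-backed section can hold the full 200 MB while the process maps only
a small window at a time, so no 200 MB contiguous range of address space is
ever needed:

#include <windows.h>

HANDLE CreateImageSection()
{
    return CreateFileMapping(INVALID_HANDLE_VALUE,   // backed by the pagefile
                             NULL, PAGE_READWRITE,
                             0, 200 * 1024 * 1024,   // size: high, low DWORDs
                             NULL);
}

// Map a window of the section; the data persists in the section even when
// unmapped. The offset must be a multiple of the 64 KB allocation granularity.
void* MapWindow(HANDLE section, DWORD offset, DWORD bytes)
{
    return MapViewOfFile(section, FILE_MAP_ALL_ACCESS,
                         0, offset,                  // offset: high, low
                         bytes);
}

// Usage: map 16 MB at offset 64 MB, work on it, then unmap.
//   void* w = MapWindow(section, 64 * 1024 * 1024, 16 * 1024 * 1024);
//   ...
//   UnmapViewOfFile(w);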
 
Brian Muth

Ben Voigt said:
Both these statements are incorrect. 32-bit WinXP uses up to 3.5 GB of
physical memory regardless of the /3GB switch. Each application has either
2 GB or 3 GB of virtual address space depending on the switch, and each
application can have nearly unlimited virtual memory using file mappings.

Thanks for the correction.

Ben Voigt said:
You mentioned these are static arrays? They shouldn't add to the per-instance object size at all.

I'm not convinced the OP actually used the static keyword. If he did, this would indeed be a surprise, since the actual allocation
would probably be quite small, and one wouldn't expect a bad_alloc exception. Perhaps the OP can clarify.
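To make the distinction concrete (a toy example, not the real classes):

// A static member is stored once per class, outside every instance,
// so it adds nothing to what new must allocate.
struct WithStaticArray
{
    static unsigned char data[200 * 1024 * 1024];   // not part of the instance
};

struct WithMemberArray
{
    unsigned char data[200 * 1024 * 1024];          // part of every instance
};

// sizeof(WithStaticArray) == 1: new WithStaticArray allocates one byte.
// sizeof(WithMemberArray) is ~200 MB: new must find that much contiguously.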

Brian
 
SteveZ

Brian Muth said:
Thanks for the correction.

I'm not convinced the OP actually used the static keyword. If he did, this would indeed be a surprise, since the actual allocation
would probably be quite small, and one wouldn't expect a bad_alloc exception. Perhaps the OP can clarify.

Brian

Brian is correct. It was a poor choice of words on my part in the OP. I
should have said that they are fixed-size arrays, not static arrays. So they
are allocated each time the class is created with new.

Since I am not using the complete physical address space and I should have
plenty of virtual memory (as indicated by Ben Voigt), I am still confused as
to why the allocation of the class is failing. I should also mention that
this exception occurs on the first class allocation and is consistent
between program runs. Can the heap be this fragmented every time I run the
program?

Thanks.
-SteveZ
 
Ben Voigt [C++ MVP]

SteveZ said:
Brian is correct. It was a poor choice of words on my part in the OP. I
should have said that they are fixed-size arrays, not static arrays. So they
are allocated each time the class is created with new.

Since I am not using the complete physical address space and I should have
plenty of virtual memory (as indicated by Ben Voigt), I am still confused as
to why the allocation of the class is failing. I should also mention that
this exception occurs on the first class allocation and is consistent
between program runs. Can the heap be this fragmented every time I run the
program?

Your virtual address space is pretty fragmented, because DLLs are linked
with scattered preferred base addresses in the hope of preventing
collisions. (If there were a collision, the loader would have to fix up
every reference to code, global variables, etc., which adds a lot of
overhead to initializing a DLL.)
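You can check this directly. Here's a quick diagnostic sketch that walks the
address space with VirtualQuery and reports the largest free region; if it
comes back under 200 MB, fragmentation is your answer:

#include <windows.h>
#include <stdio.h>

SIZE_T LargestFreeRegion()
{
    MEMORY_BASIC_INFORMATION mbi;
    SIZE_T largest = 0;
    char* p = 0;
    while (VirtualQuery(p, &mbi, sizeof(mbi)))      // fails past user space
    {
        if (mbi.State == MEM_FREE && mbi.RegionSize > largest)
            largest = mbi.RegionSize;
        p = (char*)mbi.BaseAddress + mbi.RegionSize;
    }
    return largest;
}

int main()
{
    printf("Largest free block: %lu MB\n",
           (unsigned long)(LargestFreeRegion() / (1024 * 1024)));
    return 0;
}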
 
